Multimodal UX with Haptics

Immersion Corporation, a pioneer in haptic technology, set out to demonstrate how high-definition haptics could improve safety, usability, and experience in automotive touchscreens. The goal was to reduce visual distraction and help drivers interact more confidently with digital interfaces through touch instead of sight.
The result was a fully functional, large-scale touchscreen demonstrator, showcased at CES, the Tokyo Auto Show, and in private OEM presentations. It received strong praise from Tier 1 suppliers and automotive partners, positioning Immersion as a leader in advanced automotive haptics.
Client:
Immersion
Role:
Principal Multi-modal Designer
Timeline:
12 months
Tools:

Sketch
InVision
Adobe Illustrator
OverFlow
Unity
GitHub
Internal Haptic Design Tools

My Role

I was responsible for the complete UX and visual design of the experience, including haptic mapping, prototyping, and interaction modeling. I worked closely with UX researcher Felipe Almeida, hardware and firmware engineers, and product leadership.

I designed the interface system, interaction flows, animated prototypes, and all visual assets. I also defined the taxonomy and classification of haptic effects, which became a foundation for future demonstrations and customer projects.

Challenge and Goals

Touchscreens in vehicles often increase distraction due to a lack of tactile feedback. The challenge was to demonstrate how haptics could solve this while still feeling intuitive and familiar for drivers.

Goals:

  • Design a safe, intuitive touchscreen experience for real driving contexts
  • Integrate haptics in a way that added clarity without overload
  • Educate OEMs on haptics as a tool for safety, experience, and brand differentiation

I partnered closely with engineers, internal stakeholders, and PhD UX researcher Felipe Almeida throughout concept development, testing, and refinement.

Discovery and Research

To ground our design in real-world driving scenarios, Felipe and I facilitated cross-functional ideation workshops involving engineering, product, sales, and IT.

We conducted three core testing phases:

1 – Wireframe Testing
Purpose: Validate layout and interaction logic
Outcome: Strong clarity and intuitive mental models

2 – UI Mockup Testing (NASA-TLX)
Purpose: Measure cognitive workload
Outcome: Low cognitive load, minor visual refinements

3 – Haptic Interaction Testing
Purpose: Evaluate tactile feedback in full context
Outcome: Positive reception, with refinements made to strength and noise

I iterated rapidly between testing rounds using an agile, build-measure-learn loop.

To inform our approach, I reviewed academic and industry research on:

  • Multimodal feedback and distraction
  • Haptic + visual interaction patterns
  • Skeuomorphism and affordances
  • Driver attention and cognitive load

These insights guided how tactile information was structured, applied, and refined.

The data highlighted how haptics can:

  • Reduce cognitive load
  • Enable eyes-free interaction
  • Provide confirmation and information effects (e.g., on/off states, volume/fan levels)

Design Approach

I focused on the most meaningful in-car tasks: navigation, media, climate, and system controls. I built the information architecture and user flows to minimize glance time and maximize tactile confidence.

I developed a haptic interaction model based on four core categories:

  1. Exploration haptics
    Guided users toward active zones through subtle tactile cues
  2. Confirmation haptics
    Replicated the sensation of switches and buttons
  3. Information haptics
    Communicated system state changes through pressure-based gradients
  4. Skeuomorphic haptics
    Mirrored physical sensations such as dials, knobs, and mechanical feedback

Each interaction was mapped to one or more of these categories, creating a consistent and scalable tactile language for the entire system.
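To make the taxonomy concrete, the four-category model could be sketched as a small data model like the one below. This is an illustrative Python sketch only; the category names come from the case study, but the control names, `HapticMapping` type, and `categories_for` helper are hypothetical and do not reflect Immersion's internal tools.

```python
from enum import Enum, auto
from dataclasses import dataclass

class HapticCategory(Enum):
    EXPLORATION = auto()   # subtle cues guiding the finger toward active zones
    CONFIRMATION = auto()  # switch- and button-like clicks on activation
    INFORMATION = auto()   # pressure-based gradients conveying system state
    SKEUOMORPHIC = auto()  # mimics physical dials, knobs, and detents

@dataclass(frozen=True)
class HapticMapping:
    control: str
    categories: tuple[HapticCategory, ...]

# Each UI control maps to one or more categories,
# giving the whole system a consistent tactile language.
MAPPINGS = [
    HapticMapping("volume_slider", (HapticCategory.INFORMATION,
                                    HapticCategory.CONFIRMATION)),
    HapticMapping("climate_dial", (HapticCategory.SKEUOMORPHIC,
                                   HapticCategory.CONFIRMATION)),
    HapticMapping("nav_map", (HapticCategory.EXPLORATION,)),
]

def categories_for(control: str) -> tuple[HapticCategory, ...]:
    """Look up which tactile categories a control participates in."""
    for m in MAPPINGS:
        if m.control == control:
            return m.categories
    return ()
```

Keeping the mapping in one place like this is what makes the language scalable: a new control is added by declaring its categories, not by inventing new effects.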

This allowed users to perform tasks with greater confidence and less visual dependence, which is especially relevant in Level 3 and Level 4 autonomous driving contexts.


Interaction and Visual Design

I built interactive prototypes in Unity with real-time haptic feedback, collaborating closely with software developers. I also created low- and high-fidelity prototypes in Sketch, InVision, and Illustrator for testing and validation.

The visual design emphasized:

  • Large touch targets
  • Minimal text
  • Clear iconography
  • Color-coded system states
  • Cockpit-aware ergonomics (55-degree screen angle)

Visuals were intentionally restrained to keep the focus on the haptic experience, using subtle skeuomorphic cues only where necessary to support tactile metaphors.

Ongoing user testing continuously informed iteration of visuals, interactions, and haptic effects.


Special Focus

Haptic Integration

Haptics were treated as a core design element, not an add-on. I collaborated directly with Immersion’s engineering team using PowerHap actuators to tune each tactile effect.

Each interaction was designed according to:

  • Urgency
  • Cognitive load
  • Interaction type (tap, drag, slide)

Examples included:

  • Short bursts for toggles
  • Continuous feedback for map dragging
  • Threshold pulses for volume and limits
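The mapping from interaction type to effect shape can be sketched as a simple dispatch, shown below in illustrative Python. The `Waveform` fields and `effect_for` function are hypothetical stand-ins, not the actual actuator API; real tuning was done against PowerHap hardware with Immersion's internal tools.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Waveform:
    duration_ms: int   # how long the effect plays
    amplitude: float   # 0.0-1.0 actuator drive strength
    repeat: bool       # continuous texture vs. one-shot burst

def effect_for(interaction: str, at_threshold: bool = False) -> Waveform:
    """Pick an effect shape for a given interaction type."""
    if interaction == "toggle":
        # Short burst confirming an on/off state change
        return Waveform(duration_ms=20, amplitude=0.8, repeat=False)
    if interaction == "drag":
        # Continuous low-amplitude texture while dragging the map
        return Waveform(duration_ms=10, amplitude=0.3, repeat=True)
    if interaction == "slide":
        # Stronger pulse when a slider hits a limit or threshold
        amp = 1.0 if at_threshold else 0.5
        return Waveform(duration_ms=30, amplitude=amp, repeat=False)
    raise ValueError(f"unknown interaction: {interaction}")
```

Encoding urgency and interaction type as parameters keeps effects comparable across controls, so tuning one dimension (say, burst strength) stays consistent system-wide.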

I also tested multiple “search” waveforms (valley vs. mountain), and found that users preferred valley-based patterns for spatial awareness and guidance.

To ensure consistency and memorability, I formalized four core haptic categories:

  • Exploration
  • Confirmation
  • Information
  • Skeuomorphic

This framework became a reusable reference for future Immersion projects.

Outcome and Impact

The demonstrator received outstanding feedback at CES, the Tokyo Auto Show, and private OEM demonstrations. Automotive partners praised both its realism and usability.

The project contributed directly to:

  • Strategic partnerships in Europe and Asia
  • Follow-up concept work for next-gen cockpits
  • Immersion’s sales and education pipeline
  • Expanded internal haptic libraries

It helped move Immersion from conceptual demos to credible, production-adjacent experiences.


Key Learnings

This project sharpened my ability to design for human perception, where precision and restraint are essential. I learned how to blend visual, tactile, and auditory feedback into a unified experience.

Key takeaways:

  • Haptics must be integral, not decorative
  • Simplicity improves safety and confidence
  • Well-designed skeuomorphism bridges physical and digital worlds
  • Rapid testing is critical in high-risk environments
  • A limited, structured haptic language increases usability

Key Design Principles for Automotive Haptics

  1. Safety and Confidence: Reduce glance time and support eyes-on-the-road interactions.
  2. Information Transfer: Deliver meaningful, consistent signals through touch.
  3. Physical Design Freedom: Enable new interior design opportunities without mechanical buttons.
  4. Branding Through Touch: Allow OEMs to craft distinct, signature haptic sensations.

“Your work is always very thoughtful and well-presented. This project really helped us tell the story of what HD haptics can do, not just how it works.”
Felipe Almeida Ph.D - Sr. UX Researcher @ Immersion
“Filip's designs got many of our clients inspired and excited with new business opportunities.”
Alberto Bonamico - Business Development Director @ Immersion
“His designs and interactions on automotive applications have captured the imaginations of our partners because of the simplicity and haptic feeling.”
Manuel Cruz - VP of UX and Research @ Immersion