
Immersion Corporation, a pioneer in haptic technology, set out to demonstrate how high-definition haptics could improve safety, usability, and confidence in automotive touchscreen interfaces. As vehicles increasingly replace physical controls with digital surfaces, the risk of driver distraction rises due to the lack of tactile feedback and reliance on visual attention.
The objective was to create a realistic, production-adjacent automotive experience that showed how touch-based interfaces could be operated through feel rather than sight, reducing cognitive load and supporting safer in-car interaction. This was not intended as a conceptual demo, but as a functional, credible system that OEMs and Tier 1 suppliers could evaluate as a future interaction model.
The result was a fully functional, large-scale touchscreen demonstrator that integrated visual, tactile, and interaction design into a single multimodal experience. The system was showcased at CES, the Tokyo Auto Show, the Detroit Auto Show, and in private OEM presentations, positioning Immersion as a leader in advanced automotive haptics and next-generation in-car UX.
I led the complete UX and visual design of the experience, working closely with UX research, hardware and firmware engineering, and product leadership. The focus was on building a coherent interaction language where haptics were not an enhancement, but a core interface layer that structured how users perceived, understood, and interacted with the system.
Showcased at CES, the Tokyo Auto Show, the Detroit Auto Show, and private OEM showcases.
Adopted as flagship demonstrator for OEM engagement.
Positioned Immersion as leader in automotive haptics.
Applied research · Product-grade prototyping · Multisensory design · Safety-first UX · Industry influence
I was responsible for the complete UX and visual design of the experience, including haptic mapping, prototyping, and interaction modeling. I worked closely with UX researcher Felipe Almeida, PhD, hardware and firmware engineers, and product leadership.
I designed the interface system, interaction flows, animated prototypes, and all visual assets. I also defined the taxonomy and classification of haptic effects, which became a foundation for future demonstrations and customer projects.
Touchscreens in vehicles often increase distraction due to a lack of tactile feedback. The challenge was to demonstrate how haptics could solve this while still feeling intuitive and familiar for drivers.
The goals were to reduce glance time, lower cognitive load, and enable drivers to operate the interface through feel rather than sight.
I partnered closely with engineers, internal stakeholders, and PhD UX researcher Felipe Almeida throughout concept, testing, and refinement.
To ground our design in real-world driving scenarios, Felipe and I facilitated cross-functional ideation workshops involving engineering, product, sales, and IT.
We conducted three core testing phases:
1 – Wireframe Testing
Purpose: Validate layout and interaction logic
Outcome: Strong clarity and intuitive mental models
2 – UI Mockup Testing (NASA-TLX)
Purpose: Measure cognitive workload
Outcome: Low cognitive load, minor visual refinements
3 – Haptic Interaction Testing
Purpose: Evaluate tactile feedback in full context
Outcome: Positive reception, with refinements made to strength and noise
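Phase 2's workload measure can be quantified with the standard raw (unweighted) NASA-TLX score: the mean of six 0–100 subscale ratings. A minimal sketch, with illustrative ratings rather than actual study data:

```python
# Raw NASA-TLX score: the mean of the six standard 0-100 subscale
# ratings. Subscale names follow the NASA-TLX instrument; the sample
# ratings below are illustrative, not data from this study.

SUBSCALES = (
    "mental_demand", "physical_demand", "temporal_demand",
    "performance", "effort", "frustration",
)

def raw_tlx(ratings: dict) -> float:
    """Raw (unweighted) TLX: the mean of the six subscale ratings."""
    missing = [s for s in SUBSCALES if s not in ratings]
    if missing:
        raise ValueError(f"missing subscales: {missing}")
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

# Hypothetical participant:
score = raw_tlx({
    "mental_demand": 30, "physical_demand": 10, "temporal_demand": 20,
    "performance": 15, "effort": 25, "frustration": 10,
})
print(score)  # mean of the six ratings (about 18.3)
```

The weighted variant of NASA-TLX adds 15 pairwise subscale comparisons; the raw score shown here is the simpler and more commonly reported form.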
I iterated rapidly between testing rounds using an agile, build-measure-learn loop.
To inform our approach, I reviewed academic and industry research on tactile feedback, driver distraction, and in-vehicle interface design.
These insights guided how tactile information was structured, applied, and refined.
The data highlighted how haptics can reduce visual dependence and cognitive load while increasing driver confidence.





I focused on the most meaningful in-car tasks: navigation, media, climate, and system controls. I built the information architecture and user flows to minimize glance time and maximize tactile confidence.
I developed a haptic interaction model organized around four core categories of tactile feedback.
Each interaction was mapped to one or more of these categories, creating a consistent and scalable tactile language for the entire system.
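A tactile language like this can be expressed as a lookup from interface events to effect categories. The category and event names below are hypothetical placeholders, not the actual taxonomy; only the structure (each event mapping to one or more categories) mirrors the model described here:

```python
# Sketch of a haptic taxonomy as a lookup table. Category and event
# names are hypothetical placeholders; the real taxonomy names are
# not reproduced here. Each UI event maps to one or more categories.

HAPTIC_TAXONOMY = {
    # UI event       -> effect categories (one or more per event)
    "button_press":    ("confirmation",),
    "slider_drag":     ("guidance", "boundary"),
    "edge_reached":    ("boundary", "alert"),
    "target_found":    ("guidance", "confirmation"),
}

def effects_for(event: str) -> tuple:
    """Return the effect categories to trigger for a UI event."""
    return HAPTIC_TAXONOMY.get(event, ())

print(effects_for("slider_drag"))  # ('guidance', 'boundary')
```

Keeping the mapping in one table is what makes the language consistent and scalable: new screens reuse existing categories instead of inventing one-off effects.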
This allowed users to perform tasks with greater confidence and less visual dependence, which is especially relevant for SAE Level 3 and Level 4 autonomous environments.




I built interactive prototypes in Unity with real-time haptic feedback, collaborating closely with software developers. I also created low- and high-fidelity prototypes in Sketch, InVision, and Illustrator for testing and validation.
The visual design was intentionally restrained to keep the focus on the haptic experience, using subtle skeuomorphic cues only where necessary to support tactile metaphors.
Ongoing user testing continuously informed iteration of visuals, interactions, and haptic effects.






Haptics were treated as a core design element, not an add-on. I collaborated directly with Immersion’s engineering team using PowerHap actuators to tune each tactile effect.
Each interaction's tactile effect was designed and tuned individually for its role in the interface.
I also tested multiple “search” waveforms (valley vs. mountain), discovering that users preferred valley-based patterns for spatial awareness and guidance.
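The valley/mountain distinction can be illustrated as two intensity envelopes over distance from a target. The exact waveforms were tuned on PowerHap hardware; the linear envelope shapes below are an illustrative assumption, not the shipped curves, under the reading that a valley pattern quiets down as the finger approaches the target while a mountain pattern peaks on it:

```python
# Illustrative "valley" vs "mountain" search envelopes over normalized
# distance from a target (0.0 = on target, 1.0 = far away). Linear
# shapes are an assumption for illustration; the production waveforms
# were hand-tuned on PowerHap actuators.

def valley(distance: float) -> float:
    """Intensity rises with distance: the target feels like a quiet valley."""
    return max(0.0, min(1.0, distance))

def mountain(distance: float) -> float:
    """Intensity peaks on the target and falls off with distance."""
    return max(0.0, 1.0 - min(1.0, distance))

# Sweeping a finger toward the target under each scheme:
for d in (1.0, 0.5, 0.0):
    print(f"d={d:.1f}  valley={valley(d):.2f}  mountain={mountain(d):.2f}")
```

One intuition for the valley preference: silence at the target is an unambiguous "you are here" signal, whereas a mountain peak forces users to judge when the intensity has stopped growing.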
To ensure consistency and memorability, I formalized the system's four core haptic categories into a documented framework.
This framework became a reusable reference for future Immersion projects.
The demonstrator received outstanding feedback at CES, the Tokyo Auto Show, and private OEM demonstrations. Automotive partners praised both its realism and usability.
The project contributed directly to Immersion's OEM engagement, helping move the company from conceptual demos to credible, production-adjacent experiences.


This project sharpened my ability to design for human perception, where precision and restraint are essential. I learned how to blend visual, tactile, and auditory feedback into a unified experience.
“Your work is always very thoughtful and well-presented. This project really helped us tell the story of what HD haptics can do, not just how it works.”
Felipe Almeida, PhD - Sr. UX Researcher @ Immersion
“Filip's designs got many of our clients inspired and excited with new business opportunities.”
Alberto Bonamico - Business Development Director @ Immersion
“His designs and interactions on automotive applications have captured the imaginations of our partners because of the simplicity and haptic feeling.”
Manuel Cruz - VP of UX and Research @ Immersion