Sketch
InVision
Adobe Illustrator
Overflow
Unity
GitHub
Internal Haptic Design Tools
I was responsible for the complete UX and visual design of the interface, including haptic mapping and prototyping. I worked closely with UX researcher Felipe Almeida, hardware and firmware engineers, and product leads. I created the interaction model, UI system, animated prototypes, and all visual assets. I also defined the taxonomy and categorization of haptic effects, which became a foundation for future demos and customer work.
Touchscreens are increasingly common in vehicles, but they lack the tactile feedback drivers rely on to operate controls with minimal distraction. Our challenge was to demonstrate how haptics could solve this issue. We needed to design an experience that would feel futuristic yet familiar, convincing both automotive engineers and end users of its value.
Goals:
I collaborated closely with hardware and software engineers, internal stakeholders, and, most notably, UX researcher Felipe Almeida. Felipe and I worked side by side throughout the project, from initial concept workshops through iterative user testing and final refinements.
To ensure our design direction was grounded in real-world driving scenarios, I co-facilitated ideation workshops with Felipe Almeida. These sessions included cross-disciplinary stakeholders from engineering, sales, and IT, bringing in a rich diversity of perspectives and driving experiences.
We ran three structured user tests across the development process:
Felipe and I collaborated closely during each of these phases. He handled recruiting and facilitation, while I iterated based on findings between test rounds. Our Agile, build-measure-learn approach allowed us to rapidly prototype and refine interactions.
We also ran a series of cross-functional ideation workshops involving engineering, product, marketing, and sales. The sessions were designed to surface common pain points in modern vehicle HMI systems and to understand expectations for future autonomous environments (SAE Level 3 and beyond).
To inform our work, I conducted a review of academic research including:
The data highlighted how haptics can:
These insights shaped how we categorized and implemented different tactile cues throughout the demonstrator.
Based on our findings, I focused on the most meaningful and repeatable in-car actions, including opening doors, navigation, media control, and climate adjustment. I created information architecture maps and user flows to support these primary tasks, intentionally keeping things simple to minimize glance time and distraction.
One of the core challenges I tackled was how to translate complex, layered interaction needs into intuitive and glanceable touchscreen experiences suitable for in-vehicle use. To solve this, I created a haptic interaction model that grouped all effects into four distinct categories, each based on the role of touch feedback in the driver’s flow. These categories formed the foundation for the entire demonstrator and ensured consistency across every screen and interaction:
I tested and tuned these categories against user research findings and the technical capabilities of the actuators. Each interaction in the UI, whether a slider, toggle, or navigation gesture, was assigned one or more haptic effects based on this model. This structure gave the system a tactile language that was simple, memorable, and scalable across different use cases.
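Conceptually, the model works like a lookup from interaction type to haptic role, which then resolves to concrete effect presets. The Python sketch below illustrates the shape of that mapping only; the role names and assignments are placeholders, not the project's actual four categories.

```python
from enum import Enum, auto

# Placeholder role names, not the project's actual four categories.
class HapticRole(Enum):
    CONFIRM = auto()   # acknowledge a completed action, e.g. a toggle press
    GUIDE = auto()     # help the finger locate a target without looking
    ALERT = auto()     # call attention to a state change
    TEXTURE = auto()   # continuous feedback while dragging or scrubbing

# Each UI interaction type resolves to one or more roles, which in
# turn map to concrete effect presets tuned for the actuators.
INTERACTION_MODEL = {
    "toggle": [HapticRole.CONFIRM],
    "slider": [HapticRole.TEXTURE, HapticRole.CONFIRM],
    "nav_gesture": [HapticRole.GUIDE, HapticRole.CONFIRM],
}

def roles_for(interaction: str) -> list:
    """Return the haptic roles assigned to a UI interaction type."""
    return INTERACTION_MODEL.get(interaction, [])

print(roles_for("slider"))  # slider gets texture plus confirmation
```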
As a result, users could perform tasks with less visual confirmation and more confidence, while the experience remained rich and dynamic. This was especially important for future SAE Level 3 and Level 4 autonomous environments, where interaction shifts away from constant visual monitoring and toward multimodal feedback.
Interactive prototypes with haptics were built in Unity, where I collaborated closely with a software developer. This enabled real-time interaction testing with both the visual interface and the embedded haptic feedback. I also created low- and high-fidelity prototypes in InVision, Sketch, and Adobe Illustrator, used for early internal validation and user testing. The project manager made it clear that we were not building a prototype to showcase visuals for automotive HMIs; the visuals needed to support the haptics, not distract from them.
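Conceptually, the dispatch logic is simple: a touch event resolves immediately to an actuator call, so the tactile response stays coupled to the touch. The Python sketch below illustrates that loop in a language-neutral way; play_effect() and the preset values are hypothetical stand-ins, not the actual Unity or driver code.

```python
import time

# Hypothetical effect presets: (normalized amplitude, duration in seconds).
EFFECTS = {
    "press": (0.8, 0.015),
    "release": (0.4, 0.010),
}

def play_effect(name: str) -> None:
    """Stand-in for the actuator driver call in the real build."""
    amplitude, duration = EFFECTS[name]
    print(f"haptic: {name} amp={amplitude:.1f} dur={duration * 1000:.0f} ms")

def on_touch(element: str, phase: str) -> None:
    # Effects fire on the same event that updates the UI, keeping the
    # tactile response perceptually coupled to the touch.
    if phase == "down":
        play_effect("press")
    elif phase == "up":
        play_effect("release")

on_touch("climate_toggle", "down")
time.sleep(0.05)  # finger lifts shortly after
on_touch("climate_toggle", "up")
```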
The UI featured large touch targets and minimal text, with clear iconography and soft animation. Color coding was used to differentiate system modes. Layouts were designed with a 55-degree screen angle and common cockpit ergonomics in mind. The visual design followed a clean, minimal aesthetic built on established patterns. I used familiar iconography and layout conventions to minimize learning effort. Where necessary, I applied skeuomorphic design details to support haptic metaphors, for example, tactile switches that felt like physical ones.
Constant user testing at every fidelity level, run by our UX researcher, provided valuable feedback that allowed me to refine the visuals, the interactions, and, later, the accompanying haptic effects.
Haptic feedback was integrated as a core element of the interaction model, not as an afterthought. I worked with Immersion’s haptics engineering team to define effects for each type of interaction, using PowerHap actuators for detailed, high-definition feedback.
Each key action was matched to a specific haptic signature based on:
My design goal was to convey clear tactile information through vibration patterns. For example:
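One way to think about a haptic signature is as a small bundle of tunable parameters per action. The Python sketch below models that idea; the field names and values are hypothetical illustrations, not the project's actual effect definitions.

```python
from dataclasses import dataclass

@dataclass
class HapticSignature:
    """Hypothetical parametrization of a single haptic effect."""
    frequency_hz: float   # carrier frequency of the vibration
    amplitude: float      # normalized intensity, 0.0 to 1.0
    duration_ms: float    # total length of the effect
    repeats: int = 1      # pulse count for multi-tap patterns

# Illustrative values only: a crisp, short click for a toggle press
# versus a softer, longer double pulse for a mode change.
TOGGLE_CLICK = HapticSignature(frequency_hz=150, amplitude=0.9, duration_ms=12)
MODE_CHANGE = HapticSignature(frequency_hz=80, amplitude=0.5, duration_ms=40, repeats=2)
```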
I also tested two different haptic “search” strategies: valley vs. mountain waveforms. During internal user testing we learned that users preferred the valley shape; it better communicated proximity to interaction zones.
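The two shapes can be read as intensity envelopes over distance to the target: a mountain peaks over the interaction zone, while a valley goes quiet there, so arrival is felt as a distinct drop in vibration. The Python sketch below illustrates both envelopes under that reading; the normalized search radius is an assumption for illustration.

```python
def mountain(distance: float, radius: float = 1.0) -> float:
    """Intensity peaks over the target and falls off with distance."""
    d = min(abs(distance), radius)
    return 1.0 - d / radius

def valley(distance: float, radius: float = 1.0) -> float:
    """Intensity dips over the target, so the finger feels a clear
    'quiet spot' on arrival after vibration during the search."""
    d = min(abs(distance), radius)
    return d / radius

# Sampling both envelopes as a finger slides toward the target:
for d in (1.0, 0.5, 0.0):
    print(f"d={d:.1f}  mountain={mountain(d):.2f}  valley={valley(d):.2f}")
```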
I created four core categories of haptic effects to serve as the design framework:
This taxonomy allowed us to maintain consistency and ensure each haptic effect served a purpose tied to user perception.
The demonstrator received outstanding feedback during its public and private showings. The unit made a strong impression at CES and the Tokyo Motor Show. Automotive partners praised the realism and usability. Internally, the project led to new haptic effect libraries and seeded follow-up projects. It also became a foundational sales and education tool for Immersion’s automotive pipeline. Executives from multiple OEMs highlighted the demo’s clarity, creativity, and physicality. Our work demonstrated how haptics could bring emotional and functional value to digital interfaces.
The project directly contributed to:
Designing for perception requires precision and restraint. I learned how to combine visual, tactile, and auditory feedback into a seamless experience. This project also reinforced the value of working closely with researchers and engineers from day one. Tactile interaction has the power to deeply influence user confidence when done right.
“Your work is always very thoughtful and well-presented. This project really helped us tell the story of what HD haptics can do, not just how it works.”
Felipe Almeida, Ph.D. - Sr. UX Researcher @ Immersion
"Filip's designs got many of our clients inspired and excited with new business opportunities."
Alberto Bonamico - Business Development Director @ Immersion
"His designs and interactions on automotive applications have captured the imaginations of our partners because of the simplicity and haptic feeling."
Manuel Cruz - VP of UX and Research @ Immersion