
Category

Human Machine Interface

DESIGN

A holistic HMI behaviour design for autonomous driving systems (as part of the European Union's Horizon 2020 project Mediator)

Concept

Date

Oct 2020 – present

Project

[Images: MEDIATOR project logo, research and realisation overview, HMI screen impression]

When automated vehicles provide different levels of automation within one system, a challenge arises: how to communicate the driver's responsibility during (and when switching between) these different levels of automation?

 

To tackle this challenge, a Human Machine Interface (HMI) was designed and applied to several use cases (emergency and comfort situations, and mode transfers). The holistic design combines the behaviour and layout of screens, lighting, sounds and other (tactile) components to enhance the communication between the driver and the autonomous driving system. It is designed in such a way that either the driver or the automation can decide which automation level suits a specific situation.

 

Through VR, simulator and on-road experiments, it is being evaluated whether the HMI components, individually and in combination, enable clear communication between the automation and the human driver.

This is an ongoing project and the design has not yet been publicly published; the visuals shown on this page therefore merely give an abstract impression.

Summary

[Image: MEDIATOR process overview]

Process

Public deliverable: 31/03/21

Confidential deliverable: 23/12/21

Public deliverable: TBD

[Image: Design visualisation]

The design

The holistic HMI enables intuitive communication between driver and automation by means of visual, tactile and auditory signals that follow a generic ritual. This control transfer ritual is designed for several use cases following five design principles: a holistic approach, design for user acceptance, design for industry acceptance, design for a generic transfer ritual and design for learned affordances. Above all, it should enhance trust in the system.

[Images: HMI components overview, interior ideation, nudging ideation]

To maintain driver awareness in a vehicle that provides multiple levels of automation, signals should be loud and clear in any situation. That is why the HMI communicates in a multi-sensory, ritual-based manner through three HMI identities: neutral, compassionate and attentive. The first two contribute to a feeling of teamwork between the driver and the automation, which could enhance trust, while the attentive identity allows the HMI to take a higher hierarchical stance and decide what is needed at that particular (alarming) moment.

[Image: HMI identities]

The design embraces users with its multi-sensory communication, enabling contact during any non-driving-related activity. This is especially important when the automation takes over certain driving tasks and the user might not focus on the road and the driver display the entire time. Ambient lighting is one of the aspects providing this communication: it radiates a different colour for each driving mode: manual (SAE 0), assisted (SAE 1 & 2) or piloted (SAE 3 & 4) driving. The same mode-specific colours are used in all visual components, such as the LED strips and the UI design.
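Purely as an illustration of this idea (not the project's actual implementation, and with hypothetical colour values), such a shared mode-to-colour mapping could be sketched as follows:

# Illustrative sketch only: one shared mapping from driving mode to colour,
# reused by every visual HMI component (ambient lighting, LED strips, UI).
# The mode names follow this page; the colour values are placeholders.
from enum import Enum

class DrivingMode(Enum):
    MANUAL = "SAE 0"
    ASSISTED = "SAE 1-2"
    PILOTED = "SAE 3-4"

MODE_COLOUR = {
    DrivingMode.MANUAL: "#D9D9D9",    # placeholder colour for manual driving
    DrivingMode.ASSISTED: "#4FA3FF",  # placeholder colour for assisted driving
    DrivingMode.PILOTED: "#3DDC84",   # placeholder colour for piloted driving
}

def ambient_colour(mode: DrivingMode) -> str:
    """Return the single colour all visual components use for this mode."""
    return MODE_COLOUR[mode]

Keeping this mapping in one place is what lets every visual component stay consistent with the current driving mode.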

 

Next to these visual cues, the HMI can reach the user through tactile signals (an inflatable lower-seat cushion, vibration motors, a retractable seatbelt and haptic feedback in the lever), as well as auditory messages: abstract sounds (indicating notifications, alarming situations and confirmations) and spoken messages that emphasise visual alerts and notifications.
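Again only as a hedged sketch: the channel names below follow the components mentioned on this page, but the way channels are selected per HMI identity is an assumption for illustration, not the MEDIATOR logic itself.

# Illustrative sketch only: dispatching one HMI message over several channels.
# The identities and channel names come from this page; the selection per
# identity is an assumption made for the example.
from dataclasses import dataclass

IDENTITY_CHANNELS = {
    "neutral": ["driver display", "ambient lighting"],
    "compassionate": ["driver display", "ambient lighting", "abstract sound"],
    "attentive": ["driver display", "ambient lighting", "abstract sound",
                  "spoken message", "seat vibration", "retractable seatbelt"],
}

@dataclass
class HmiMessage:
    identity: str  # "neutral", "compassionate" or "attentive"
    text: str      # content shown on the display and/or spoken aloud

def channels_for(message: HmiMessage) -> list:
    """Return the output channels this message would be sent over."""
    return IDENTITY_CHANNELS[message.identity]

# Example: an alarming take-over request uses the attentive identity.
print(channels_for(HmiMessage(identity="attentive", text="Please take over control")))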

[Image: MEDIATOR consortium partners]

Collaboration

The multidisciplinary consortium consists of research institutes, universities, OEMs and suppliers. Together they represent several transport modes: aviation, railway and maritime.

 

Collaboration between all partners took place throughout the research, implementation and testing phases. During the design phase of the HMI, TU Delft mainly collaborated with SWOV and Autoliv.

Responsibility

As the lead design researcher of the TU Delft team, I am responsible for the HMI (behaviour) design, for the communication between the different disciplines, and for ensuring a seamless transfer of the design implementation into the different prototype platforms (on-road and simulators).

 

Furthermore, as one of the task leaders of the final phase, I am in charge of gathering the insights from the analyses and translating them into safe, generic HMI guidelines.

Publications

Aldea, A., Tinga, A., Van Zeumeren, I., Van Nes, N., & Aschenbrenner, D. (2022). Virtual Reality Tool for Human-Machine Interface Evaluation and Development (VRHEAD). 2022 IEEE Intelligent Vehicles Symposium (IV). DOI: 10.1109/IV51971.2022.9827375

 

Tinga, A., van Zeumeren, I., Christoph, M., van Grondelle, E., Cleij, D., Aldea, A., & van Nes, N. (2022). Development and Evaluation of a Human Machine Interface to Support Mode Awareness in Different Automated Driving Modes. SSRN Electronic Journal. DOI: 10.2139/ssrn.4074806

Kim, S., van Grondelle, E., van Zeumeren, I., Mirnig, A., & Stojmenova, K. (2022). Let's Negotiate with Automation: How can Humans and HMIs Negotiate Disagreement on Automated Vehicles? In 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '22 Adjunct), September 17–20, 2022, Seoul, Republic of Korea. ACM, New York, NY, USA, 3 pages. https://doi.org/10.1145/3544999.3550159
