Subliminal cues, warnings, and feedback to augment driving behavior
Emotional state recognition while driving
Concepts and solutions to keep the driver in the loop and to increase user experience
Driver interaction with different stages of automation
HMIs vs. HBIs (Human-Bot Interfaces) vs. BHIs (Bot-Human Interfaces)
Is the automobile ready for AI? Machine Learning & learning systems: what does that mean for future car HMIs?
New NHTSA driver distraction guidelines
User adaptation for cloud-based automotive speech interfaces: speech system architecture including on-board and off-board speech recognizers
Exploring the application of neuroscience to assess driver interactions, cognitive load, and potential driver distraction, and the potential use of Consumer Neuroscience as a tool for designing and assessing the next generation of UX strategies
Automated Driving and interfaces for autonomous driving at Levels 3 & 4
Head-Up Displays (HUDs) and Augmented Reality (AR) concepts
Co-operative Driving / Connected Vehicles / Cognitive vehicles & vehicle perception
Conversational chatbots, personal assistants, personalization UX & information access (search, browsing, etc.)
Modeling techniques for cognitive workload and visual demand estimation
Automation Systems: from operation to monitoring & from driver-to-driver interaction to vehicle-to-vehicle interaction / UI for an autonomous driving car
Future HMI architecture / Cockpit architecture: diversification and common elements in current display & controls configuration
How to develop future displays and controls
Flexible architectures that are able to manage unknown, future threats
Multimodal, speech, audio, gestural, and natural input/output & feedback systems
New interfaces for navigation / Text HMIs – input and output while driving / Sound in Product Design / Sound HMIs