Applications in the car have long moved beyond steering aids and radio tuners. Rich driver assistance and information systems increase comfort and deliver substantial added value. To preserve intuitive and safe operation despite the growing number and complexity of functions, we investigate novel interaction methods that combine different modalities such as touch, speech, eye-gaze, and finger gestures. Control is thereby not limited to systems in one's own car, but can also extend to interaction with the outside environment.
A major part of our research in this area is the creation of a multimodal dialogue platform that helps us investigate particular topics and also serves as a solid basis for building dialogue applications for the car. Key features of our dialogue platform SiAM-dp (Situation-Adaptive Multimodal Dialogue Platform) include
- Integrated coverage of multimodality concepts (fusion and fission)
- Out-of-the-box support for several devices representing common modalities
- Intelligent dialogue system behavior through semantic interpretation of user input
- Situation adaptivity through dynamic behavior depending on user and context
- Personalized driver assistance
- Consideration of user resources (e.g. cognitive load, time)
- Offline evaluation of dialogue runs, giving early insights without a costly user study
- Multi-party support allows inclusion of passengers in the dialogue discourse
- Ability to dynamically connect to external devices as output devices, e.g. electronic road signs and billboards
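To illustrate the multimodal fusion concept listed above, the sketch below resolves a deictic speech command (e.g. "show me that") against a temporally close gaze or touch reference. This is a minimal, hypothetical example of late fusion in Python; the class and function names are our own illustration and do not reflect the actual SiAM-dp API.

```python
from dataclasses import dataclass

@dataclass
class InputEvent:
    modality: str    # assumed modality label, e.g. "speech", "gaze", "touch"
    payload: str     # recognized utterance or referenced object id
    timestamp: float # event time in seconds

def fuse(events, window=1.5):
    """Late fusion sketch: for each deictic speech event, substitute the
    referent from the temporally closest gaze/touch event within `window`."""
    resolved = []
    for ev in events:
        if ev.modality == "speech" and "that" in ev.payload:
            # candidate pointing events within the fusion time window
            refs = [r for r in events
                    if r.modality in ("gaze", "touch")
                    and abs(r.timestamp - ev.timestamp) <= window]
            if refs:
                # pick the reference closest in time to the utterance
                ref = min(refs, key=lambda r: abs(r.timestamp - ev.timestamp))
                resolved.append(ev.payload.replace("that", ref.payload))
    return resolved

events = [
    InputEvent("gaze", "billboard_7", 1.2),
    InputEvent("speech", "show me that", 2.0),
]
print(fuse(events))  # → ['show me billboard_7']
```

A production fusion engine would additionally weight recognition confidences and handle ambiguous or missing references, but the time-window alignment shown here is the core idea.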