Top5: Multimodal Interaction
Tuesday 24 August 2004
Organiser
Laurence NIGAY, University of Grenoble, CLIPS-IMAG Laboratory, France
Laurence.Nigay@imag.fr
Overview
The use of multiple modalities such as speech, gesture, sound and graphics opens a vast world of possibilities for human-machine interaction. Multimodal interaction is a research area that has matured enough to be applied to real application domains and that plays a central role in the next generation of interactive systems: pervasive and mobile computing. This maturity was the driving force for organizing a topical day on multimodal interaction. Our objective is two-fold: 1) by gathering the complementary expertise required, we first aim to present in a coherent way the theories, methods and tools for the design and development of robust and efficient multimodal systems; 2) by soliciting experts in ubiquitous, mobile and web computing, we also aim to demonstrate that multimodality is an integrating vector for several recent interaction paradigms, including perceptual user interfaces, tangible interfaces, augmented reality and ubiquitous computing.
The topical day is geared towards researchers, engineers, developers and practitioners interested in the ergonomic design, software development and usability evaluation of multimodal systems, as well as in their future applications. We therefore decided to provide a review of theories and generic results for the design and development of robust and efficient multimodal systems, and then to focus on some major applications of multimodality. The day is structured into two parts: i) design and development of multimodal user interfaces and ii) domains of multimodality.
Programme
10h30 - 12h:
Design and development of multimodal user interfaces I
Chair: Jean Vanderdonckt
Design Space for Multimodal Interaction - Laurence Nigay (CLIPS-IMAG, France)
Software Design and Development of Multimodal Interaction - Marie-Luce Bourguet (U. London, UK)
13h30 - 14h15:
Design and development of multimodal user interfaces II
Chair: Fabio Paterno
A generic formal specification of fusion of modalities in a multimodal HCI - Yamine Ait Ameur and Nadjet Kamel (U. Poitiers, France)
14h15 - 15h:
Domains of multimodality I
Chair: Laurence Nigay
Multimodality and Multi-Platform Interactive Systems - Fabio Paterno (ISTI-CNR, Italy)
15h30 - 17h:
Domains of multimodality II
Chair: Laurence Nigay
Multimodality and Context-Aware Adaptation - Quentin Limbourg and Jean Vanderdonckt (UCL, Belgium)
Towards multimodal web interaction: Web pages you can speak to and gesture at - Dave Raggett (W3C/Canon), Max Froumentin (W3C) and Philipp Hoschka (W3C)
