Position Paper from Nokia

Vetek Akos and Ram Hariharan

As the processing power of mobile terminals and the related communication infrastructure evolve, mobile devices are increasingly used for a variety of tasks beyond voice communication. This, coupled with the shrinking size of these devices and the requirement posed by mobility of accessing information anytime and anywhere, presents significant new challenges for the design of innovative user interfaces.

We believe that adaptive multimodal interfaces, which allow interaction with these devices using the modalities (either one at a time or in combination) best suited to the user's current needs and abilities, can substantially enhance the user interfaces of existing applications and services, making the interaction more usable and robust. At the same time, the emergence and incorporation of new modalities can lead to the creation of entirely new applications and services. Multimodal interfaces also inherently provide improved accessibility.

For instance, a user of a voice-enabled device with a touch screen can start a session while commuting on the train, continue the interaction while walking to his office, and complete the transaction while seated at his desk. As the user moves between environments with very different characteristics, he is given the opportunity to interact using the modes that are preferred and most appropriate for the situation. For example, while sitting on a train, the use of a stylus and handwriting can offer higher accuracy than speech (due to ambient noise) and protect privacy. While the user is walking, the appropriate input and output modalities would be voice, supplemented by some visual output. Finally, at the office the user can use pen and voice in a synergistic way [1].
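As a thought experiment, the following minimal sketch shows how such context-driven modality selection might be expressed in code. The context attributes, thresholds, and modality names are our own illustrative assumptions, not part of any Nokia product or W3C specification.

    # Illustrative sketch of context-adaptive modality selection
    # (hypothetical names; not from any specification).
    from dataclasses import dataclass

    @dataclass
    class Context:
        noise_level: float   # ambient noise, 0.0 (quiet) to 1.0 (loud)
        is_walking: bool     # user is in motion
        is_private: bool     # surroundings allow speech without being overheard

    def select_modalities(ctx: Context) -> dict:
        """Map the usage context to preferred input/output modalities."""
        if ctx.is_walking:
            # Hands and eyes are busy: favour voice, keep visual output minimal.
            return {"input": ["voice"], "output": ["voice", "visual"]}
        if ctx.noise_level > 0.6 or not ctx.is_private:
            # Noisy or public setting (e.g. a train): stylus/handwriting is
            # more accurate than speech recognition and protects privacy.
            return {"input": ["stylus", "handwriting"], "output": ["visual"]}
        # Quiet, private setting (e.g. the office): combine pen and voice.
        return {"input": ["pen", "voice"], "output": ["visual", "voice"]}

    # Example: the commuter scenario above.
    print(select_modalities(Context(noise_level=0.8, is_walking=False, is_private=False)))
    # -> {'input': ['stylus', 'handwriting'], 'output': ['visual']}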

While the W3C has made much progress in defining the Multimodal Interaction Framework [2], identifying the functional components of multimodal systems and laying the groundwork for the coordination of, and communication between, these components at the browser implementation level, much more needs to be done to enable a standardised way of authoring multimodal applications. Nokia has been keenly following and participating in the related standardisation activities in various fora.
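To make the component view concrete, the sketch below mimics the roles the framework identifies: independent modality components deliver interpreted input to a central interaction manager, which updates the dialogue and drives the output components. All class and method names here are illustrative assumptions of ours; the framework itself does not define an API.

    # Schematic sketch of the component roles identified by the W3C
    # Multimodal Interaction Framework [2]; names are hypothetical.
    from typing import Callable

    class InteractionManager:
        """Coordinates events from input components and drives output."""
        def __init__(self) -> None:
            self.outputs: list[Callable[[str], None]] = []

        def register_output(self, render: Callable[[str], None]) -> None:
            self.outputs.append(render)

        def on_input(self, modality: str, interpretation: str) -> None:
            # In a real system, multimodal fusion and dialogue management
            # would happen here; this sketch simply echoes the event.
            for render in self.outputs:
                render(f"[{modality}] {interpretation}")

    im = InteractionManager()
    im.register_output(lambda text: print("visual:", text))  # e.g. screen
    im.register_output(lambda text: print("voice:", text))   # e.g. TTS prompt

    # Input components (speech recogniser, handwriting recogniser, ...)
    # would post their interpretations to the manager:
    im.on_input("speech", "show my calendar")
    im.on_input("pen", "select 3pm meeting")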

References

  1. Multimodal Interaction Use Cases, W3C Note, 4 December 2002, http://www.w3.org/TR/mmi-use-cases/
  2. W3C Multimodal Interaction Framework, W3C Note, 6 May 2003, http://www.w3.org/TR/mmi-framework