A Novel Multimodal Framework To Support Human-Computer Interaction

PLACIDI, GIUSEPPE
2012-01-01

Abstract

The new tendency in human-computer interaction is to exploit all human expressive forms to enable natural interaction with different applications and devices. Hand gesture and speech modalities are usually the best suited to implementing suitable interfaces in every application context. Developing a framework to define and recognise any set of hand gestures (with or without physical controllers) and to associate it with a set of vocal sentences is still challenging. In fact, on one side, different sets of gestures are characterised by different recognition approaches derived from context, device or application needs. On the other hand, the matching process between a specific gesture and a particular sentence is prone to ambiguous interpretations, errors and coarse simplifications. This paper describes a novel gesture- and speech-based framework to generate a set of bi-modal interfaces designed to be plugged into XML-compatible devices.

Use this identifier to cite or link to this document: https://hdl.handle.net/11697/38639