Human-centered interfaces
Automatic hand gesture recognition, analysis of facial expressions, head and body movement tracking, eye tracking, force sensing, and electroencephalography have recently gained interest as potential modalities for human-centered interfaces.
Some modalities, such as speech and lip movements, are more closely coupled than others, such as speech and hand gestures.
The fusion of these modalities can be explored at different levels of integration, from raw features to final decisions, using distributed detection and data fusion algorithms.
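For concreteness, the sketch below (not from the source) contrasts two integration levels commonly distinguished in data fusion: early (feature-level) fusion, where per-modality feature vectors are concatenated before a single classifier, and late (decision-level) fusion, where each modality is classified separately and the posteriors are combined. The feature dimensions, the toy classifier, and the modality reliability weights are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality features for one observation (dimensions assumed).
speech_features = rng.normal(size=13)  # e.g., 13 acoustic coefficients
lip_features = rng.normal(size=6)      # e.g., 6 lip-shape parameters


def toy_classifier(x: np.ndarray, n_classes: int = 3) -> np.ndarray:
    """Stand-in classifier: fixed random projection followed by a softmax."""
    w = np.random.default_rng(x.size).normal(size=(n_classes, x.size))
    logits = w @ x
    e = np.exp(logits - logits.max())
    return e / e.sum()


# Early (feature-level) fusion: one classifier sees the concatenated vector.
fused_features = np.concatenate([speech_features, lip_features])
early_posterior = toy_classifier(fused_features)

# Late (decision-level) fusion: per-modality posteriors are combined, here
# with a weighted product; the reliability weights are assumed values.
p_speech = toy_classifier(speech_features)
p_lip = toy_classifier(lip_features)
weights = (0.7, 0.3)
late_posterior = (p_speech ** weights[0]) * (p_lip ** weights[1])
late_posterior /= late_posterior.sum()

print("early fusion posterior:", np.round(early_posterior, 3))
print("late fusion posterior: ", np.round(late_posterior, 3))
```

In practice, early fusion can exploit cross-modal correlations (useful for tightly coupled modalities such as speech and lip movements), while late fusion is more robust when one modality fails or the streams are only loosely coupled, as with speech and hand gestures.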