Wireless Gesture Controllers to Affect Information Sonification

Publisher:
ICAD
Publication Type:
Conference Proceeding
Citation:
Proceedings of the International Conference on Auditory Display (ICAD), 2005, pp. 105-112
Issue Date:
2005-01
This paper proposes a framework for gestural interaction with information sonification, allowing users not only to monitor data aurally but also to interact with it, transforming and even modifying the source data in a two-way communication model (Figure 1). Typical data sonification uses automatically generated computational models of information, rendered as parameters of an auditory display, to convey data in an informative representation. It is essentially a one-way data-to-display process, and interpretation by users is usually a passive experience. In contrast, gesture controllers (spatial interaction and gesture-recognition hardware and software) are used by musicians and in augmented-reality systems to affect, manipulate and perform with sounds. Numerous installations and artistic works arise from motion-generated audio.
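
The sketch below is a minimal, hypothetical illustration (not the authors' implementation) of the two-way model the abstract describes: data values are mapped to auditory-display parameters, while a gesture event both reshapes the mapping and writes a transformation back onto the source data. The class and field names (Sonifier, GestureEvent, tilt, grip) are assumptions introduced for illustration only.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class GestureEvent:
    """Placeholder for input from a wireless gesture controller (hypothetical)."""
    tilt: float  # assumed normalised to -1.0 .. 1.0
    grip: float  # assumed normalised to 0.0 .. 1.0


@dataclass
class Sonifier:
    """One-way data-to-display mapping, plus a hook for two-way interaction."""
    data: List[float]
    base_pitch_hz: float = 220.0
    pitch_range_hz: float = 440.0

    def render(self) -> List[dict]:
        """Map each data value to auditory-display parameters (pitch, amplitude)."""
        lo, hi = min(self.data), max(self.data)
        span = (hi - lo) or 1.0
        return [
            {
                "pitch_hz": self.base_pitch_hz + self.pitch_range_hz * (v - lo) / span,
                "amplitude": 0.5,
            }
            for v in self.data
        ]

    def on_gesture(self, event: GestureEvent) -> None:
        """Two-way step: a gesture adjusts the display mapping *and* the data."""
        # Tilt widens or narrows the pitch range used for display.
        self.pitch_range_hz = max(50.0, self.pitch_range_hz * (1.0 + 0.5 * event.tilt))
        # Grip scales the underlying source data itself (data modification).
        self.data = [v * (1.0 + event.grip) for v in self.data]


if __name__ == "__main__":
    s = Sonifier(data=[1.0, 3.0, 2.0, 5.0])
    print("before gesture:", s.render()[:2])
    s.on_gesture(GestureEvent(tilt=0.5, grip=0.2))
    print("after gesture: ", s.render()[:2])
```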