Expressive Musical Interface Design
- Publisher:
- Cambridge Scholars Publishing
- Publication Type:
- Chapter
- Citation:
- Sound Musicianship: Understanding the Crafts of Music, 2012, 1, pp. 189 - 201
- Issue Date:
- 2012-01
Closed Access
Filename | Description | Size
---|---|---
2012007015OK.pdf | | 1.81 MB
This chapter focuses on the interface: the part of the instrument that facilitates interaction between a player and a sound-generating system. It describes some background to musical instrument design, outlines the PIDS design framework, and then details a case study of the design and development of new electronic controllers for the jam2jam music system. In traditional musical instruments, the sound source and interface are integrated, but these two parts are often separate in electronic instruments. The introduction of the MIDI (Musical Instrument Digital Interface) protocol in the early 1980s further facilitated this separation of interface and sound source. At present, the OSC (Open Sound Control) protocol is widespread and allows higher bandwidth and greater precision of control parameters. Because of this separation, however, the natural connection a player once had with the sound source is lost: the interface has to be designed to allow all forms of interaction necessary to support musical expressivity. This is true of electronic systems in general, as studied in the field of Human-Computer Interaction (HCI), but electronic musical instruments are devices that demand extreme sensitivity and connection between player and process. This chapter shows some examples of the reciprocal relationship between HCI and musical instrument design, and how this relationship can mediate musicianship skills.
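The abstract's contrast between MIDI and OSC precision can be made concrete: a standard MIDI control-change value is 7 bits (0-127), while an OSC argument can be a 32-bit float. Below is a minimal sketch of the OSC 1.0 wire format for a single-float message, using only the Python standard library; the address `/jam2jam/density` is an invented example for illustration, not an address documented for the jam2jam system.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate a string and pad it to a 4-byte boundary, per OSC 1.0."""
    data += b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode an OSC message with one float argument:
    padded address, padded type-tag string ",f", big-endian float32."""
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

# A 7-bit MIDI controller quantizes a parameter to 128 steps,
# while the OSC float below carries the value at 32-bit precision.
packet = osc_message("/jam2jam/density", 0.503)
```

The padding rule (every string rounded up to a multiple of four bytes) is what keeps OSC messages word-aligned and cheap to parse on embedded controllers.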