Research Article

Real-Time Gesture-Controlled Physical Modelling Music Synthesis with Tactile Feedback

Abstract

Electronic sound synthesis continues to offer significant potential for the creation of new musical instruments. The traditional approach is, however, limited in two respects: it incorporates only auditory feedback, and it typically makes use of a sound synthesis model (e.g., additive, subtractive, wavetable, or sampling) that is inherently constrained and often nonintuitive to the musician. In a direct attempt to address these issues, this paper describes a system that provides tactile as well as acoustic feedback, with real-time synthesis that invokes a more intuitive response from players because it is based on mass-spring physical modelling. Virtual instruments are set up via a graphical user interface in terms of the physical properties of basic, well-understood sounding objects such as strings, membranes, and solids, which can be interconnected to form complex integrated structures. Acoustic excitation can be applied at any point mass via virtual bowing, plucking, striking, a specified waveform, or any external sound source. Virtual microphones can be placed at any point mass to deliver the acoustic output. These aspects of the instrument are described along with the nature of the resulting acoustic output.
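To illustrate the mass-spring paradigm the abstract refers to, the following is a minimal sketch of a plucked string modelled as a chain of point masses connected by springs, with output taken from a "virtual microphone" at one point mass. This is not the authors' implementation; all parameter names and values (N, mass, k, damping, mic_pos) are assumptions chosen for demonstration only.

```python
import numpy as np

# Minimal mass-spring string sketch (illustrative; not the paper's code).
# N point masses in a chain, springs between neighbours, fixed ends.
N = 100            # number of point masses (assumed)
mass = 1e-3        # kg per point mass (assumed)
k = 5e3            # inter-mass spring constant, N/m (assumed)
damping = 0.5      # velocity-proportional damping coefficient (assumed)
fs = 44100         # audio sample rate, Hz
dt = 1.0 / fs

y = np.zeros(N)    # displacement of each mass
v = np.zeros(N)    # velocity of each mass

# "Pluck": displace one mass, since excitation may be applied at any point mass.
y[N // 4] = 1e-3

# Virtual microphone: record the displacement of one chosen point mass.
mic_pos = 3 * N // 4
out = np.zeros(fs)              # one second of output samples

for n in range(fs):
    # Net spring force from left and right neighbours (ends held at zero),
    # plus simple velocity-proportional damping.
    left = np.concatenate(([0.0], y[:-1]))
    right = np.concatenate((y[1:], [0.0]))
    force = k * (left - 2.0 * y + right) - damping * v
    # Symplectic Euler update: velocities first, then positions.
    v += (force / mass) * dt
    y += v * dt
    y[0] = y[-1] = 0.0          # fixed boundary conditions
    out[n] = y[mic_pos]         # virtual microphone tap
```

A symplectic Euler step is used here for simplicity and is stable for these assumed parameters at audio rate. The system described in the paper goes well beyond this sketch, additionally supporting bowing, striking, arbitrary excitation waveforms, and the interconnection of strings, membranes, and solids into integrated structures.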

Author information

Correspondence to David M. Howard.

Cite this article

Howard, D.M., Rimell, S. Real-Time Gesture-Controlled Physical Modelling Music Synthesis with Tactile Feedback. EURASIP J. Adv. Signal Process. 2004, 830184 (2004). https://doi.org/10.1155/S1110865704311182
