The World's First Digitized Indian String Instrument
By Ajay Kapur, Ari Lazier, Phil Davidson, R. Scott Wilson, and Perry Cook
Princeton University's SoundLab, Stanford University's CCRMA & University of Victoria's MISTIC

Project Description and Goals
The purpose of this project is to use microcontroller technology to create a real-time instrument that models the Sitar. This Electronic Sitar (known as the ESitar) has digitizing sensors, custom-positioned to suit traditional Sitar technique, that convert human musical gestures into digital data a machine can process. These signals can then be used to trigger real-time sound and graphics.
Design Details

With the goal of capturing a wide variety of gestural input data, the ESitar controller combines several families of sensing technology and signal-processing methods. The specific gestures the system captures are the depressed fret number, pluck time, thumb pressure, and three axes of the performer's head tilt. The core of the ESitar's sensing and communication systems is an Atmel AVR ATMega16 microcontroller.
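The text above names the gestures captured and the ATMega16 at the core, but not the firmware itself. As a rough illustration only, the following is a minimal sketch of how such a controller might poll analog sensors and forward their values as MIDI control-change messages over the UART. The channel assignments, controller numbers, 8 MHz clock, helper names (uart_init, adc_read7, midi_cc), and the choice of MIDI over a serial link are all illustrative assumptions rather than the ESitar's actual design, and pluck-time detection is omitted for brevity.

/*
 * Hypothetical sensing-loop sketch for an ATmega16-based controller.
 * Sensor channels, CC numbers, and the use of MIDI over UART are
 * illustrative assumptions, not the ESitar's actual firmware.
 */
#include <avr/io.h>
#include <stdint.h>

#define F_CPU      8000000UL   /* assumed clock: 8 MHz           */
#define MIDI_BAUD  31250UL     /* standard MIDI serial baud rate */

/* Illustrative ADC channel assignments (assumptions). */
enum { CH_FRET = 0, CH_THUMB = 1, CH_TILT_X = 2, CH_TILT_Y = 3, CH_TILT_Z = 4 };

static void uart_init(void)
{
    uint16_t ubrr = (F_CPU / (16UL * MIDI_BAUD)) - 1;
    UBRRH = (uint8_t)(ubrr >> 8);
    UBRRL = (uint8_t)ubrr;
    UCSRB = (1 << TXEN);                                 /* transmit only */
    UCSRC = (1 << URSEL) | (1 << UCSZ1) | (1 << UCSZ0);  /* 8N1 framing   */
}

static void uart_send(uint8_t byte)
{
    while (!(UCSRA & (1 << UDRE)))
        ;                                   /* wait for empty buffer */
    UDR = byte;
}

static void adc_init(void)
{
    ADMUX  = (1 << REFS0);                               /* AVcc reference    */
    ADCSRA = (1 << ADEN) | (1 << ADPS2) | (1 << ADPS1);  /* enable, /64 clock */
}

static uint8_t adc_read7(uint8_t channel)
{
    ADMUX = (ADMUX & 0xE0) | (channel & 0x1F);
    ADCSRA |= (1 << ADSC);                  /* start conversion      */
    while (ADCSRA & (1 << ADSC))
        ;                                   /* wait for completion   */
    return (uint8_t)(ADC >> 3);             /* 10-bit -> 7-bit value */
}

/* Send a MIDI control-change message: status, controller, value. */
static void midi_cc(uint8_t cc, uint8_t value)
{
    uart_send(0xB0);                        /* CC on MIDI channel 1  */
    uart_send(cc & 0x7F);
    uart_send(value & 0x7F);
}

int main(void)
{
    uart_init();
    adc_init();

    /* A real controller would rate-limit and send only on change;
     * here the MIDI baud rate alone throttles the loop. */
    for (;;) {
        midi_cc(20, adc_read7(CH_FRET));    /* fret position estimate */
        midi_cc(21, adc_read7(CH_THUMB));   /* thumb pressure         */
        midi_cc(22, adc_read7(CH_TILT_X));  /* head tilt, x axis      */
        midi_cc(23, adc_read7(CH_TILT_Y));  /* head tilt, y axis      */
        midi_cc(24, adc_read7(CH_TILT_Z));  /* head tilt, z axis      */
    }
}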
Pictures

In Concert
Performed live in Princeton, New Jersey on November 15th, 2004 at the Listening in the Sound Kitchen Computer Music Conference.
Performed live in Hamamatsu, Japan on June 4th, 2004 at the International Conference on New Interfaces for Musical Expression.
Performed live in Victoria, BC, Canada on November 18th, 2004 with 8 Robotic Turntables at the Trimpin Lansdowne Scholar Concert.

New Interfaces for Musical Expression
Please use the following reference to cite this work:

Related Controllers

The Electronic Tabla
The Electronic Dholak

Related Work

Audio-Based Gesture Extraction on the ESitar Controller

Questions

Email: akapur@alumni.princeton.edu