This semester at ITP I am really excited to be taking NIME: New Interfaces for Musical Expression. On the first day of class, our professor, Greg Shakar, asked us to create a musical instrument in one week and prepare a brief performance for the class.
I iterated over several ideas before I arrived at Music Wall. Initially, I wanted to build an idea I have had for some time: a musical instrument called “Beat Burger Helper” that would act as a sort of assistant to beatboxers. The premise is that when the performer makes sounds into the Beat Burger, it would analyze those sounds and play a sample of whatever the performer was attempting to emulate. For instance, making the sound of a kick drum with your mouth would trigger a sample of an actual kick drum. Although this was a fun idea, it turned out to be too complicated to accomplish in the allotted time frame of one week.
The idea for Music Wall came to me as an evolution of an earlier idea: creating music by scanning the audience in the performance space. After a long night of coding, I came up with my initial prototype of “Music Wall,” a piece of software that uses computer vision to scan any surface for objects (blob detection) and turn those objects into discrete musical notes. Essentially, it functions as a sequencer. In the video demo below, I am simply using pieces of grey tape on a white wall to compose a melody. The height of a piece of tape determines the pitch of the note, its horizontal position determines the note's placement in time and its length, and the size of the object determines the volume (larger objects are louder, smaller ones are quieter). At the end of the demo I give a shot of the Processing sketch in action.
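To give a sense of how a mapping like this can work, here is a minimal Java sketch (Processing is Java-based) of turning a detected blob's position and size into note parameters. The class name, pitch range, 16-step grid, and MIDI-style velocity scale are my assumptions for illustration, not the actual Music Wall code.

```java
// Hypothetical sketch of a blob-to-note mapping: height -> pitch,
// horizontal position -> step in the sequence, size -> volume.
public class BlobToNote {
    static final int LOW_PITCH  = 48;  // C3, assumed bottom of the range
    static final int HIGH_PITCH = 72;  // C5, assumed top of the range
    static final int STEPS      = 16;  // assumed number of sequencer columns

    // y is measured from the top of the frame, so a smaller y
    // (a blob higher on the wall) yields a higher pitch.
    static int pitch(float y, float frameHeight) {
        float t = 1.0f - (y / frameHeight);
        return LOW_PITCH + Math.round(t * (HIGH_PITCH - LOW_PITCH));
    }

    // Horizontal position picks which step of the loop the note falls on.
    static int step(float x, float frameWidth) {
        return Math.min(STEPS - 1, (int) (x / frameWidth * STEPS));
    }

    // Blob area scaled into a MIDI-style velocity, 0..127 (bigger = louder).
    static int velocity(float area, float maxArea) {
        return Math.min(127, Math.round(area / maxArea * 127));
    }

    public static void main(String[] args) {
        // A mid-sized blob near the top-left of a 640x480 frame:
        System.out.println(pitch(60, 480));      // high note: 69
        System.out.println(step(40, 640));       // early step: 1
        System.out.println(velocity(500, 2000)); // moderate volume: 32
    }
}
```

In a real sketch these values would feed a sound library on each clock tick, but the core of the instrument is just this mapping from blob geometry to note parameters.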
Sorry for the poor quality and focus of the video. Thanks to my sister Anna for handling the camera (my Droid Incrapable).