This project aims to create a collaborative MIDI interface that requires multiple people to play together, exploring the connections between the body and music, and between the players themselves.
We envision the final presentation as a body or head, where different organs correspond to different MIDI inputs, each triggering distinct sounds. Below are some visual style references.





From a design perspective, each component was intentionally built to require two-handed operation. For example, pressing only the left side of the pressure sensor plays melodic fragments, while pressing both sides together produces the full melody. Similarly, the distance sensor is designed to encourage collaborative use. This setup prevents a single person from operating the entire device alone and promotes coordination among participants; it works best when three to four people perform together.
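A rough sketch of that two-handed logic, assuming an Arduino/Teensy-style microcontroller reading the two halves of the pressure sensor as analog pads. The pin numbers, threshold, note numbers, and the Teensy usbMIDI calls below are placeholders for illustration, not the actual build:

```cpp
// Hypothetical sketch: left pad alone triggers melodic fragments,
// both pads together trigger the full melody clip in Ableton.
const int LEFT_PAD  = A0;   // left half of the pressure sensor (assumed pin)
const int RIGHT_PAD = A1;   // right half of the pressure sensor (assumed pin)
const int THRESHOLD = 300;  // analog reading that counts as "pressed" (assumed)

const byte FRAGMENT_NOTE = 60;  // note mapped to the melodic fragments (assumed)
const byte MELODY_NOTE   = 62;  // note mapped to the full melody (assumed)

bool fragmentPlaying = false;
bool melodyPlaying   = false;

void sendNote(byte note, bool on) {
  // usbMIDI is Teensy's built-in USB-MIDI object; swap in your own MIDI library.
  if (on)  usbMIDI.sendNoteOn(note, 100, 1);
  else     usbMIDI.sendNoteOff(note, 0, 1);
}

void setup() {}

void loop() {
  bool left  = analogRead(LEFT_PAD)  > THRESHOLD;
  bool right = analogRead(RIGHT_PAD) > THRESHOLD;

  bool wantMelody   = left && right;        // two hands: full melody
  bool wantFragment = left && !wantMelody;  // left hand only: fragments

  if (wantFragment != fragmentPlaying) {
    sendNote(FRAGMENT_NOTE, wantFragment);
    fragmentPlaying = wantFragment;
  }
  if (wantMelody != melodyPlaying) {
    sendNote(MELODY_NOTE, wantMelody);
    melodyPlaying = wantMelody;
  }
  delay(10);  // simple debounce / rate limit
}
```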

The step sequencer keeps a base pattern playing continuously, so there is always sound. For now, a lead plays only when the pressure sensor is pressed, but more sounds will be added in the final version. We'd also like to incorporate additional musical genres that can be toggled on and off for a whole new switch-up.
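For anyone unfamiliar with what the sequencer contributes, here is a tiny conceptual sketch of a step-sequencer loop. The pattern, tempo, and printed "trigger" stand in for what Ableton's sequencer actually does and are not taken from our session:

```cpp
#include <chrono>
#include <iostream>
#include <thread>

int main() {
    // One bar of 16th notes; 1 = play a hit on that step (made-up pattern).
    const bool pattern[16] = {1,0,0,0, 1,0,1,0, 1,0,0,0, 1,0,1,1};
    const double bpm = 100.0;  // example tempo, not our actual set
    const auto stepLength = std::chrono::milliseconds(
        static_cast<int>(60000.0 / bpm / 4.0));  // duration of a 16th note

    // The loop never ends, which is the point: the backing pattern is always
    // running, and sensor-triggered parts (like the lead) layer on top.
    for (int step = 0; ; step = (step + 1) % 16) {
        if (pattern[step]) {
            std::cout << "step " << step << ": trigger drum hit\n";
        }
        std::this_thread::sleep_for(stepLength);
    }
}
```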

Resolume Arena was used for the visual aspect of the project. Using a network MIDI connection over my hotspot (NYU WiFi has a firewall), we were able to synchronize Ableton and Resolume Arena. For instance, the visuals speed up or slow down as the BPM changes.
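Assuming the tempo sync runs over MIDI clock (which Resolume can follow), the speed relationship is simple arithmetic: MIDI clock sends 24 pulses per quarter note, so the tick rate Resolume receives scales directly with Ableton's BPM. A small sketch of that math, using an example tempo rather than our actual one:

```cpp
#include <iostream>

int main() {
    const double bpm = 120.0;                         // example tempo
    const double ticksPerSecond = bpm * 24.0 / 60.0;  // MIDI clock = 24 pulses per quarter note
    const double msPerTick = 1000.0 / ticksPerSecond;
    std::cout << "At " << bpm << " BPM, Resolume receives " << ticksPerSecond
              << " clock ticks per second (one every " << msPerTick << " ms)\n";
}
```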
