In C(++) can be found from August 17th to August 30th, 2012 at Gallery 128 at 128 Rivington Street, NYC from 1 PM to 8 PM, Wednesday through Sunday.
If you visited In C(++), you can follow this link to find the images of you and the final sheet music generated by the program.
With Mad Mohre and Sugar Vendil's Nouveau Classical Project, I implemented a musical art installation. Mad told me about her graduate school thesis, in which she conceived of staff lines on the ground that turn unwitting gallery patrons into musical notes. Those notes could then be turned into music in the style of Terry Riley's In C (hence the fantastically nerdy name "In C++").
As the name suggests, the program is written in C++. It relies on openFrameworks for video capture and uses OpenCV for the vision portions. A camera is placed on the ceiling looking down onto vinyl staff lines on the floor. As gallery patrons walk across the staff lines, they become notes in an interactive performance.
The first step is detecting the staff lines. I originally considered using a Hough transform, but there were several reasons not to: the camera is unlikely to move (unless it falls off the ceiling); the lines might not be straight anyway (vinyl applied in a curve, perspective distortion from the camera angle, lens distortion); and re-initializing the lines that way would require gallery patrons to (mostly) clear the floor. It ended up being simplest and most robust to have a person drop into an "edit mode" and click along each line. It takes about two minutes and is very reliable. Since an individual staff line can be composed of multiple line segments, this allows a straight-line approximation of curved staff lines.
Next comes detecting the people. For this I use OpenCV's blob detection on a thresholded capture from the camera. To better detect heads I've considered using HoughCircles, but for now simply detecting high-contrast areas is good enough; it's an art project, anyway! A few heuristics are applied: long, skinny blobs don't count, and neither do giant or tiny ones, since those likely aren't people. Blobs inside other blobs also don't count.
Once blobs have been detected, they can be converted into actual pitches. A blob's pitch is determined by taking its centroid and finding the closest staff-line segment; once that segment is identified, a pitch can be assigned. If even the closest segment exceeds a maximum distance threshold, the blob is discarded as extraneous noise.
Now that blobs have been detected and pitches have been assigned to them, all that remains to be computed is the rhythm. There are a few techniques used to create interesting rhythms within a single measure:
After at least one measure has been captured, the performance can start. Some number of virtual players is selected, usually between 4 and 10 (plus the metronome marimba, which continues for the entire performance). Offline, I curated instrument combinations that sound good together, and the virtual players pick one of those pre-determined combinations. The instrumentation then remains constant for the whole performance.
Players are represented by colorful dots that circle around, showing which measure of recognized music each virtual player is currently playing. The dots also pulse with each note their musician plays, so observers can watch the players progress through the piece by following the dots.
At the end of a performance, a typeset piece of music is generated showing what the musicians were playing. For the video at the top of the page, the following music was generated.