This was a final project for an NYU ITP course called Music Interaction Design. It was my first foray into the world of musical programming beyond just sound filtering, and the nature of the class gave me the chance to explore and have fun with the project.
The idea was to reimagine how we interact with the instrument by making the player move their hands in the exact opposite way of what they expect. The sound was routed from my guitar to a Raspberry Pi (previously mounted onboard the guitar), which ran it through a Pure Data patch that shifted the pitch according to the note being played: a high note gets reflected around middle C to become a low note, and vice versa.
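The patch itself was written in Pure Data, but the reflection logic is simple enough to sketch in Python. This is my own illustrative version, not the project code: a note is mirrored around middle C (MIDI note 60), and the pitch shifter moves the incoming sound by the difference between the mirrored note and the original.

```python
MIDDLE_C = 60  # MIDI note number for middle C

def reflect_around_middle_c(midi_note: int) -> int:
    """Return the MIDI note mirrored around middle C.

    A note k semitones above middle C maps to k semitones below it,
    so the reflection is 2 * 60 - n = 120 - n.
    """
    return 2 * MIDDLE_C - midi_note

def shift_semitones(midi_note: int) -> int:
    """Semitones the pitch shifter must move the incoming note."""
    return reflect_around_middle_c(midi_note) - midi_note

# Example: A4 (MIDI 69) reflects to Eb3 (MIDI 51), a shift of -18
# semitones; middle C maps to itself and needs no shift at all.
print(reflect_around_middle_c(69), shift_semitones(69))
```

One consequence of this mapping is visible in the code: the shift amount changes with every detected note, so any error in pitch detection immediately becomes an error in the output pitch.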
It's extremely hard to play for anyone accustomed to a normal guitar (some of this is admittedly due to technical issues, like the Pure Data patch not being able to recognize more than one note at a time), but if you're playing it for the first time, it's hard to figure out why. If you're playing mechanically, without attention to sound, the notes come out completely wrong. If you're trying to play by ear, you have to listen and remember to move your fingers in the opposite direction to everything you're used to. For an activity that depends as much on muscle memory as anything, this is jarring in the extreme. When I asked my friend Paul to test it the first time, he couldn't figure out why every note sounded wrong, although he did tell me that the action was too high.
The major problem I had in the technical implementation of this project was pitch recognition, as polyphonic pitch detection is still a major subject of research. I found a couple of methods for monophonic detection: performing spectral analysis on the incoming signal to find the fundamental frequency using P.D. Cooper's Spectral Analysis toolkit, and using ml.lib, a machine learning library for Pure Data.
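To give a sense of what monophonic detection involves, here is a deliberately naive fundamental-frequency estimator in Python. It is not the toolkit or ml.lib approach the project used, just a standard autocorrelation sketch: the signal correlates strongly with itself when shifted by one full period, so the lag with the highest autocorrelation gives the fundamental.

```python
import math

def estimate_fundamental(samples, sample_rate):
    """Estimate the fundamental frequency of a monophonic signal.

    Searches for the lag (in samples) that maximizes the signal's
    autocorrelation, restricted to lags corresponding to roughly
    50-1000 Hz, which covers the guitar's range.
    """
    n = len(samples)
    min_lag = int(sample_rate / 1000)  # highest pitch considered
    max_lag = int(sample_rate / 50)    # lowest pitch considered
    best_lag, best_corr = min_lag, float("-inf")
    for lag in range(min_lag, max_lag):
        corr = sum(samples[i] * samples[i + lag] for i in range(n - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sample_rate / best_lag

# Synthesize a single note (A at 220 Hz) and estimate its pitch.
rate = 8000
sig = [math.sin(2 * math.pi * 220 * t / rate) for t in range(2048)]
print(estimate_fundamental(sig, rate))  # close to 220 Hz
```

Even this toy version hints at why the real thing was hard: play two notes at once and the autocorrelation peaks smear together, which is exactly the polyphony problem mentioned above.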
I had never done a machine learning project before, so a lot of time was spent testing different algorithms and figuring out the parameters that would make the pitch detection most accurate. For the final class presentation, the testing had to be done on my computer instead of the miniature amplifier I had built, because the Spectral Toolkit version of the program ended up being slightly more reliable but could not be compiled for the Raspberry Pi.
During user testing, I got multiple responses of "It sounds like Star Wars": the algorithm that did the reflection around middle C had some interesting synth-like effects when the number of semitones it was supposed to pitch shift by changed very quickly, as happened when the pitch detection was unreliable. There were also a few very cool reactions when the testers figured out what the sound was doing (or sometimes didn't). You can see a few of the videos below.
The amplifier, bottom left, is currently being rebuilt with significantly more powerful speakers and a touch screen so players can interact with the patch more easily while playing. A clear offshoot of my original brian guitar project, jimi is a programmable amplifier that takes input from both an instrument line input and any connected MIDI peripherals.