I created this Max patch as a test for sound-reactive visuals. I use Jitter physics to give the balls mass in the virtual world, then map ghost objects at the bottom of the world to impulses driven by the audio coming into or out of the computer. The fffb~ object separates the left and right audio channels into frequency bands, and each band corresponds to one of the ghost objects that send out impulses to move the balls. That way, instead of just flying all over the place, the balls move in directions that directly correspond to how much bass or treble is in the music and which channel it's coming from. The song is by my friend Ula from Poland; it was an ideal choice for the test because of its wide dynamic range.
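The core mapping lives in the Max patch itself, but the idea can be sketched outside of Max. Here's a rough Python illustration of the logic, assuming the band amplitudes have already been extracted per channel (the function name, argument shapes, and the specific direction/lift mapping are all illustrative, not what the patch literally does):

```python
def band_impulses(left_bands, right_bands, strength=1.0):
    """Turn per-channel band amplitudes into (x, y) impulse vectors.

    Illustrative sketch: in the patch, fffb~ splits each channel into
    bands, and each band drives one ghost object's impulse. Here, the
    stereo balance of a band sets its horizontal push, and the total
    band energy sets its upward push.
    """
    impulses = []
    for l, r in zip(left_bands, right_bands):
        x = (l - r) * strength  # louder on the left pushes +x, right pushes -x
        y = (l + r) * strength  # overall band energy lifts the balls
        impulses.append((x, y))
    return impulses

# A bass-heavy left channel and a treble-heavy right channel
# push the low band one way and the high band the other:
print(band_impulses([1.0, 0.0], [0.0, 1.0]))
```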
Here is the patch being used to do visuals at a show.
Today I debuted my new interactive dome piece, Mind Chimes, at ARTS Lab, UNM. The piece generates visuals and music from a live brainwave feed captured by a NeuroSky MindWave Mobile headset. I coded the entire piece in Max/MSP and used vDome, an open-source Max-based dome player, to skin it to the dome. The audio is generated by sending MIDI notes from my brainwave synth to Camel Audio's Alchemy synth instruments. The visuals are generated from the notes being played and change color based on your state of mind. This is a great first iteration, and I look forward to building it out further.
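For a sense of how a pipeline like this can work: the MindWave headsets report "attention" and "meditation" values on a 0–100 scale, and those can be mapped to both a MIDI note and a color. This Python sketch is only an illustration of that kind of mapping, not the patch's actual logic; the scale choice and the red-to-blue color blend are assumptions:

```python
def mind_to_midi(attention, meditation, scale=(60, 62, 64, 67, 69)):
    """Map 0-100 brainwave values to (MIDI note, velocity, RGB color).

    Illustrative sketch: attention picks a pitch from a pentatonic
    scale, meditation sets the velocity, and the color blends from
    red (focused) toward blue (relaxed).
    """
    idx = min(attention * len(scale) // 101, len(scale) - 1)
    note = scale[idx]
    velocity = 40 + meditation * 87 // 100  # keep velocity in 40..127
    color = (attention * 255 // 100, 64, meditation * 255 // 100)
    return note, velocity, color

# Fully focused vs. fully relaxed states land at opposite ends
# of the scale and the color range:
print(mind_to_midi(100, 0))
print(mind_to_midi(0, 100))
```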
There’s no good way to capture a dome piece with standard video but here’s a little clip I shot of my friend going to town with his mind.
The intention of the piece is to recreate a walk home, then emphasize the repetition of similar sounds heard every day in roughly the same order and at roughly the same times.