I created this Max patch as a test for sound-reactive visuals. I use Jitter physics to give the balls mass in a virtual world, then map ghost objects at the bottom of the world to impulses driven by the audio coming into or out of the computer. The fffb~ object splits the left and right audio channels into frequency bands, each of which corresponds to one of the ghost objects that push the balls around. That way, instead of just flying all over the place, the direction the balls move corresponds directly to how much bass or treble is in the music and which channel it comes from. The song is by my friend Ula from Poland; it was an ideal choice for the test because it has a wide dynamic range.
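The band-to-impulse mapping is the core idea, and it can be sketched outside Max. Here is a minimal Python sketch, assuming each filter band reports an amplitude per channel and each ghost object applies a 3D impulse; the function name, ghost positions, and scaling are illustrative, not values from the patch:

```python
# Sketch: map per-band amplitudes from two audio channels to impulse
# vectors for a row of "ghost" objects at the bottom of a physics world.
# Band 0 = bass ... band N-1 = treble; left-channel ghosts sit at x < 0,
# right-channel ghosts at x > 0, and each pushes its impulse straight up,
# scaled by that band's amplitude. All names and positions are made up.

def band_impulses(left_amps, right_amps, strength=5.0):
    """Return a list of (position, impulse) pairs, one per ghost object."""
    ghosts = []
    for ch_sign, amps in ((-1.0, left_amps), (1.0, right_amps)):
        for band, amp in enumerate(amps):
            # Spread ghosts along x: the channel picks the side,
            # the band index picks the slot within that side.
            position = (ch_sign * (1.0 + band), 0.0, 0.0)
            # Impulse points up, with magnitude proportional to amplitude.
            impulse = (0.0, strength * amp, 0.0)
            ghosts.append((position, impulse))
    return ghosts

# Example: strong bass on the left, strong treble on the right.
pairs = band_impulses([0.9, 0.1], [0.1, 0.8])
```

With two bands per channel this yields four ghost objects, and the balls above a busy band get the biggest upward kick, which is the behavior the patch produces.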
Here is the patch being used to do visuals at a show.
For this project I used Millumin to map the geometry, Resolume for the VJing, and Max/MSP to route OSC data between Synapse, which does skeletal tracking and triggering with the Kinect, and Millumin. The video content is a mixture of video loops, a Lucius music video ("Turn It Around"), and my Light Dreams video, which can be found on my channel.
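The Max routing layer is essentially an address-translation table: a gesture message comes in, a playback command goes out. A rough Python sketch of that idea, with placeholder addresses that are not the real Synapse or Millumin OSC namespaces:

```python
# Sketch: translate incoming skeletal-tracking OSC messages into
# control messages for a video player. The address strings below are
# placeholders, not the actual Synapse or Millumin namespaces.

ROUTES = {
    "/gesture/righthand_up": ("/media/next", []),
    "/gesture/lefthand_up": ("/media/previous", []),
}

def route(address, args):
    """Return the outgoing (address, args) pair, or None to drop."""
    if address in ROUTES:
        out_addr, out_args = ROUTES[address]
        return (out_addr, list(out_args) + list(args))
    return None  # unrecognized messages are ignored

msg = route("/gesture/righthand_up", [1])
```

In the actual patch this table lives in Max objects rather than a dictionary, but the routing logic is the same: each recognized gesture address is forwarded as a different outgoing OSC command, and everything else is dropped.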
You can see me triggering different videos with gestures in the PIP on the bottom right.
Virtual Traffic is an interactive art installation that composites pedestrian foot traffic at five high-density areas of the UNM campus into a comprehensive shared experience.
Cameras simultaneously capture pedestrians at five high-traffic locations, and custom software composites the videos together. Different blending effects activate based on traffic density, direction, and position of the pedestrians in the space. This facilitates a virtual interaction between the people in each space, and helps us begin to understand our daily commute in a new way.
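The density-driven part of the compositing can be reduced to a simple rule: busier scenes get a more aggressive blend. A small Python sketch of that idea, with made-up thresholds and blend-mode names rather than the installation's actual parameters:

```python
# Sketch: pick a compositing blend mode per camera feed based on how
# many pedestrians are in the space. Thresholds and mode names are
# illustrative only.

def pick_blend(pedestrian_count):
    """Choose a blend mode from pedestrian density."""
    if pedestrian_count == 0:
        return "normal"      # empty scene: plain crossfade
    if pedestrian_count < 5:
        return "screen"      # light traffic: brighten overlaps
    return "difference"      # heavy traffic: emphasize motion

mode = pick_blend(3)
```

Direction and position feed in the same way, as extra inputs to the rule that selects and positions each effect.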