Arcade Fire – Just a Reflektor: Explorations in Web 2.0

A friend of mine just shared this with me and I had to write about it. It’s an interactive music video built on a handful of web 2.0 technologies that together make for a really impressive interactive experience. First and foremost, you’ll have to check it out for yourself at


Second, after you’ve checked the video out, here is a list of the technologies they used and a short explanation of what each one does.

Web Technologies

Three.js
A JavaScript library that uses WebGL to create fast 2D and 3D graphics.

WebGL
A JavaScript API that gives access to the user’s GPU for image processing, exposed through the HTML5 canvas element.

An open-source project that makes it simple for JavaScript developers to deploy on Google App Engine. It includes a mesh network for connecting multiple devices through WebRTC and WebSockets.

WebRTC
A web technology that allows browsers to communicate directly with each other, enabling near-real-time communication.

getUserMedia()
A web technology that gives JavaScript developers access to the webcam and microphone. Using computer vision, the site can then track your phone’s position in front of the webcam.
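To make that concrete, here is a minimal sketch of the pattern: grab webcam frames with getUserMedia() and find the brightest point in each frame, which is roughly how a site can follow a glowing phone screen held up to the camera. The function names and frame-polling interval are my own illustrative assumptions, not details from the project.

```javascript
// Pure helper: given RGBA pixel data, return the {x, y} of the brightest pixel.
function trackBrightest(pixels, width) {
  let best = -1, x = 0, y = 0;
  for (let i = 0; i < pixels.length; i += 4) {
    // Perceived luminance from the red, green and blue channels.
    const lum = 0.299 * pixels[i] + 0.587 * pixels[i + 1] + 0.114 * pixels[i + 2];
    if (lum > best) {
      best = lum;
      const p = i / 4;
      x = p % width;
      y = Math.floor(p / width);
    }
  }
  return { x, y };
}

// Browser-only part: draw webcam frames to a canvas and run the tracker.
// (Skipped outside the browser, where mediaDevices is unavailable.)
if (typeof navigator !== 'undefined' && navigator.mediaDevices) {
  navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
    const video = document.createElement('video');
    video.srcObject = stream;
    video.play();
    const canvas = document.createElement('canvas');
    const ctx = canvas.getContext('2d');
    setInterval(() => {
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      ctx.drawImage(video, 0, 0);
      const frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
      console.log(trackBrightest(frame.data, frame.width));
    }, 100);
  });
}
```

The real site does far more sophisticated tracking than a single brightest pixel, but the frame-grab-and-scan loop is the core of any in-browser camera-vision approach.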

Web Audio API
A web technology that lets developers analyze and manipulate audio files.

Device Orientation
Your phone’s orientation is tracked using accelerometer and gyroscope data, which is passed to your computer via WebSockets.
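The phone side of that relay can be sketched in a few lines: listen for deviceorientation events and forward each sample to the desktop over a WebSocket. The server URL and JSON message shape here are placeholder assumptions, not the project’s actual protocol.

```javascript
// Pure helpers: pack/unpack an orientation sample as a JSON message.
function packOrientation(alpha, beta, gamma) {
  return JSON.stringify({ type: 'orientation', alpha, beta, gamma });
}
function unpackOrientation(msg) {
  const data = JSON.parse(msg);
  return data.type === 'orientation' ? data : null;
}

// Browser-only part: forward each orientation sample to a relay server.
if (typeof window !== 'undefined' && 'WebSocket' in window) {
  const socket = new WebSocket('wss://example.com/relay'); // placeholder URL
  window.addEventListener('deviceorientation', (e) => {
    // alpha/beta/gamma are the phone's rotation around its z/x/y axes,
    // derived from the accelerometer and gyroscope.
    if (socket.readyState === WebSocket.OPEN) {
      socket.send(packOrientation(e.alpha, e.beta, e.gamma));
    }
  });
}
```

The desktop browser would hold the other end of the socket (or, in this project, a WebRTC mesh connection) and apply the received angles to the on-screen effects.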

Google Technologies

Chrome
Google Chrome’s advanced features, such as WebSockets, WebGL and getUserMedia(), help create an immersive, interactive experience.

App Engine
App Engine lets web developers build and deploy instantly scalable web applications on Google’s infrastructure.

Compute Engine
Compute Engine is used to run the project’s mesh network, keeping both phone and desktop browsers communicating at all times.

Cloud Storage
Video files are stored in Google Cloud Storage for cost-effective file serving online.

Information sourced from
