Optical sound research


Image/video sonification processes

Over the past few years I've developed an interest in optical sound as a technique for image sonification. The original process is fairly straightforward: since a standard film soundtrack works by shining light through the soundtrack area of the film strip onto a photocell, any filmed pattern that modulates that light will produce sound. Artists such as Guy Sherwin and Oskar Fischinger exploited this flexibility to develop new audiovisual relationships through the sonification of filmed images and patterns.
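To make the principle concrete, here is a minimal Python sketch of optical-soundtrack-style sonification (an illustration of the general idea, not the Max/MSP/Jitter patches themselves, and the function name and pattern are my own invention): each image column stands in for one instant of film passing the photocell, and the total light transmitted by that column becomes one audio sample.

```python
def sonify_image(image):
    """Sonify an image the way an optical soundtrack would.

    image: a list of rows, each row a list of pixel brightness values
    in 0..1. Column x models the film at one instant; its average
    brightness models the light reaching the photocell.
    Returns a list of audio samples normalized to -1..1.
    """
    width = len(image[0])
    # Average brightness per column ~ light hitting the photocell
    samples = [sum(row[x] for row in image) / len(image) for x in range(width)]
    # Remove the DC offset (constant light level carries no sound)
    dc = sum(samples) / len(samples)
    centered = [s - dc for s in samples]
    # Normalize peak amplitude to 1.0 for audio output
    peak = max(abs(s) for s in centered) or 1.0
    return [s / peak for s in centered]

# A filmed stripe pattern: alternating dark/light bands, 8 pixels wide,
# repeated across 512 columns. Scanned at a fixed rate this produces
# a steady square-wave-like tone.
pattern = ([0.0] * 8 + [1.0] * 8) * 32
image = [pattern] * 64  # 64 identical rows of the stripe pattern
audio = sonify_image(image)
```

The stripe width directly controls pitch, just as it does on film: narrower bands mean the light alternates more often per unit of playback time, raising the frequency.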

Since my own practice is based in digital video and audio, I wanted to adapt some of these principles for use in pieces such as No-Input Pixels. This page presents a number of examples I've developed in Max/MSP/Jitter that apply these ideas, along with some of the new possibilities that emerge in a digital context. A further discussion of this work can be found in my paper with Carlos Dominguez, "Digitally Extending the Optical Soundtrack."

Max/MSP/Jitter code

Code related to this project can be found at github.com/apdupuis/Optical-Sound-Patches.