The universe is an ensemble of atoms and electromagnetic fields that vibrate and resonate: particles and energies that our brain turns into colors, smells, scents and tastes, generating a flow of emotions that gives meaning to our lives. The whole world we live in is created by the way reality is reflected in our minds.
Cortex explores the sense of sight. As our eyes capture light, our mind turns the frequencies of the colors reflected by objects into electrical impulses. These impulses reach the cortex, where billions of neurons make sense of what we see. Observing Cortex, the viewer feels immersed in a flow of particles travelling at the speed of light: particles that may turn into delicate, barely visible reflections or into violent light storms enveloping the viewer.
To represent the process of light perception, we created a light sculpture that makes the journey of light visible throughout the space. The installation consists of 4 square metal frames hosting, on their inner sides, 16 mirrors that reflect the projected light onto the fog. The mirrors are oriented so as to generate an endless series of light compositions, sometimes symmetric, at other times chaotic and random.
The square shape was chosen for its symbolic meaning: the square is connected to the earth and to the number 4, like the four natural elements water, fire, earth and air. It is on these elements that light is reflected, absorbs color and is transformed into sounds and shapes. The driver of this transformation is an algorithm that analyzes, in real time, photographs of pure, pristine landscapes and ‘informs’ the light accordingly. The pictures were collected during travels across Cambodia, Vietnam, Mongolia, Iceland and Australia. Cortex analyzes the images one by one and, much like the retina of the human eye, turns their RGB and greyscale data into a series of ‘instructions’: a framework of rules that generates the soundscape and models the light. The installation changes over time, evolving along with the images, each time giving its own ‘interpretation’ of what is being shown.
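The text says that both RGB and greyscale data feed the rule system, but does not specify how greyscale is derived. A minimal sketch, assuming the common Rec. 601 luma weights (an assumption, not the documented method):

```python
def to_greyscale(r, g, b):
    """Convert one RGB pixel (0-255) to a greyscale value using the
    Rec. 601 luma weights. The actual weighting used by Cortex is not
    documented; this is an illustrative assumption."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# Example: a mid-grey pixel maps to itself
print(round(to_greyscale(128, 128, 128)))  # 128
```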
The system at the core of Cortex was built in openFrameworks and Max/MSP. Max uses data from the processed images to control the synthesis systems that shape the white noise, generating a specific soundscape for each image.
The images are broken up into 4 matrices of 1×171 pixels, and the RGB values of each pixel are extracted.
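The extraction step above can be sketched as follows. How the 4 rows are chosen and how pixels are sampled is not specified in the text, so evenly spaced sampling is an assumption here; only the counts (4 matrices, 171 pixels each) come from the source.

```python
def extract_matrices(image, n_rows=4, row_len=171):
    """image: 2D list of (r, g, b) tuples. Returns n_rows lists of
    row_len RGB triples, sampled at evenly spaced rows and columns.
    The even spacing is an illustrative assumption."""
    h, w = len(image), len(image[0])
    rows = []
    for i in range(n_rows):
        y = i * (h - 1) // max(n_rows - 1, 1)
        row = [image[y][j * (w - 1) // (row_len - 1)] for j in range(row_len)]
        rows.append(row)
    return rows

# Tiny synthetic image: 8 x 200 pixels, all mid-grey
img = [[(128, 128, 128)] * 200 for _ in range(8)]
mats = extract_matrices(img)
print(len(mats), len(mats[0]))  # 4 171
```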
Every line generated from these values produces a sequence of 171 notes in C major. The notes, played at random intervals between 500 and 2500 ms, control the center frequencies of two band-pass filters with very high Q (resonance) coefficients, transforming the white noise into pitched sound. Throughout the different scenes of the installation, the sound is shaped by waveshaping functions that distort the signal, amplifying it and clipping it to +/- 1. In addition, band-pass filters with random cutoff frequencies and Q coefficients are applied, and their amplitudes are randomly modulated.
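The note-generation step can be sketched like this: each of the 171 values in a line is quantized to the C major scale, and each note is paired with a random delay between 500 and 2500 ms. The pixel-to-pitch mapping and the octave range are assumptions; only the scale, the note count and the timing window come from the text.

```python
import random

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of the C major scale

def line_to_notes(values, base_midi=48, octaves=3):
    """Map 0-255 pixel values onto MIDI notes of the C major scale.
    base_midi and octaves are illustrative assumptions."""
    degrees = len(C_MAJOR) * octaves
    notes = []
    for v in values:
        d = v * (degrees - 1) // 255           # scale-degree index
        octave, step = divmod(d, len(C_MAJOR))
        notes.append(base_midi + 12 * octave + C_MAJOR[step])
    return notes

def note_schedule(notes, rng=random.Random(0)):
    """Pair each note with a random onset delay in ms (500-2500)."""
    return [(n, rng.uniform(500, 2500)) for n in notes]

line = list(range(171))                        # one 171-value line
seq = note_schedule(line_to_notes(line))
print(len(seq))  # 171
```

In the installation these notes would set the center frequencies of the resonant band-pass filters; here they are plain MIDI numbers for clarity.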
The software developed in openFrameworks generates the visual part of the installation, which evolves through real-time analysis of the audio produced by the Max/MSP patch (the two are synchronized via OSC). The sound is analyzed through a buffer of 512 samples per channel, left and right. This analysis highlights the characteristics of the waveforms used to create the soundscape, emphasizing panning, peaks and frequencies, like a brush moved by sound.
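The buffer analysis above can be sketched as a reduction of each 512-sample stereo frame to a peak level and a pan position, the kind of features the text says steer the rendering. The exact features Cortex extracts are not documented; this is illustrative only.

```python
def analyze_buffers(left, right):
    """left, right: 512 floats in [-1, 1]. Returns (peak, pan), where
    pan ranges from -1 (fully left) to +1 (fully right). A simple
    peak-based pan estimate, assumed for illustration."""
    peak_l = max(abs(s) for s in left)
    peak_r = max(abs(s) for s in right)
    peak = max(peak_l, peak_r)
    total = peak_l + peak_r
    pan = 0.0 if total == 0 else (peak_r - peak_l) / total
    return peak, pan

# A signal present only in the right channel pans fully right
left = [0.0] * 512
right = [0.5] * 512
print(analyze_buffers(left, right))  # (0.5, 1.0)
```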
Direction, Executive Production: Luca Camellini, Mattia Carretti
Software Development: Luca Camellini
IT Development: Matteo Mestucci
Sound Design: Riccardo Bazzoni
Production Management: Filippo Aldovini
Prototyping: Daniele Iandolo
Premiere at: Scopitone Festival 2016