The journal starts by explaining the project idea: city data sonified into a musical composition. The project uses public data such as noise, air pollution, and foot traffic as triggers or attenuators for the composition, essentially letting the city create the music. Generative city music. I found these quotes interesting.
Eno calls his approach “generative music,” defining it as “system-propagated music that is in a state of constant flux.” These systems (e.g. Discreet Music and Music for Airports) are driven by phase-shifting looping processes similar to those exhibited in the work of Steve Reich. Reich’s work receives a similar cybernetic analysis from Strange [9]. Lucier’s 1965 Music for Solo Performer, categorized by Pickles as first-order cybernetics, takes a different approach [10]. In the piece, Lucier employs data-driven music composition techniques to map brainwave
data to control musical parameters, specifically percussion patterns. Drawing from these influences, I explored generative music systems, phase-shifted looping, and data-driven music in the Signal to Noise Loops project.
I’ve also looked into Eno and his idea of generative music, such as Music for Airports, in which Eno used long tape loops and let them drift in and out of time with each other, creating compositional elements that decide amongst themselves what is happening. This project is similar in nature and follows in those footsteps, but instead of tape it uses data from cities to decide elements in the composition.
I used noise data from a series of sensors around Dublin to drive a performance system I created with Python, Max for Live, and Ableton Live (Fig.1). At the outset of each performance, I downloaded the most up-to-date data
archives and stored them locally for the duration of the performance. The system explored the cybernetic ideas of looping, human-in-the-loop reflexivity, and evolution. It allowed me, as the performer, to record loops of improvised guitar passages live. The system then manipulated these loops through phase shifting.
I found the way the creator used Ableton and other software with the data to be of interest to me. Could I sonify data from Stave Hill? Since it’s an ecological park, there must be data available for me to use and translate into audible compositional expression. Can Stave Hill speak for itself?
Improvisations are treated as a starting point by the system, which always evolves, mutates, and iterates over the performer’s input. Exactly how that happens is determined, once again, by the data.
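To make sense of the mechanism for myself, here is a minimal Python sketch (my own, not code from the paper) of the kind of data-to-parameter mapping described: normalised noise readings decide how far each recorded loop drifts out of phase on every pass, so the data, not the performer, steers the mutation. The sensor values, dB range, and maximum shift are all invented for illustration.

```python
# Sketch of data-driven phase shifting: city noise readings (assumed dB
# levels) are clamped to 0..1 and scaled into a per-loop phase offset.
# All numbers here are placeholders, not real Dublin sensor data.

def normalize(readings, lo=30.0, hi=90.0):
    """Clamp raw sensor values into the 0..1 range."""
    return [min(max((r - lo) / (hi - lo), 0.0), 1.0) for r in readings]

def phase_offsets(readings, max_shift_ms=250.0):
    """Map each normalized reading to a per-loop phase shift in milliseconds."""
    return [round(n * max_shift_ms, 1) for n in normalize(readings)]

# Hypothetical readings from three sensors: louder sites drift further.
print(phase_offsets([45.2, 78.9, 61.0]))
```

In a real version, something like this would run each time the performance system reads the locally stored data archive, nudging every loop's start point by its offset.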
This echoes Eno’s description of generative music: the initial parameters are decided by the human, and the system is then left to create something amongst itself, mutating and iterating into something new and different without thinking about why.
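To get a feel for what letting Stave Hill “speak for itself” might look like, here is a hypothetical Python sketch: a stream of park readings (the values below are invented placeholders, not real Stave Hill data) is quantised onto a pentatonic scale, producing MIDI note numbers that a DAW such as Ableton could play.

```python
# Sketch of ecological-data sonification: each reading is positioned within
# the observed range, then snapped to a two-octave major pentatonic scale.

PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets of a major pentatonic scale

def to_midi_notes(readings, base_note=60):
    """Map each reading onto the scale by its position within the data range."""
    lo, hi = min(readings), max(readings)
    span = (hi - lo) or 1.0  # avoid division by zero for flat data
    notes = []
    for r in readings:
        step = int((r - lo) / span * (len(PENTATONIC) * 2 - 1))  # two octaves
        octave, degree = divmod(step, len(PENTATONIC))
        notes.append(base_note + 12 * octave + PENTATONIC[degree])
    return notes

# Invented hourly readings (e.g. temperature in °C):
print(to_midi_notes([8.0, 9.5, 13.0, 11.0, 14.0]))  # → [60, 64, 76, 69, 81]
```

The same mapping would work for any of the park’s data streams, which is what makes the idea appealing: the composition changes only because the place itself changes.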