
Evolutionary generative music based on functional mappings from time and control variables.

We use a representation for evolutionary music based on free-form directed acyclic graphs whose nodes execute arithmetic functions. Input nodes supply time variables, abstract control variables, and user control signals; multiple output nodes map numerical results to MIDI data. The motivation is that multiple outputs from a single graph should tend to behave in related ways, a key characteristic of good music.
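As a concrete illustration, here is a minimal sketch (hypothetical, not the paper's implementation) of such an executable graph: nodes apply arithmetic functions to the values of their upstream nodes, input nodes read time and control variables from an environment, and two output nodes map the same shared sub-expression to MIDI pitch and velocity, so the outputs vary in related ways.

```python
import math

class Node:
    """One node of an executable DAG (illustrative sketch)."""
    def __init__(self, fn, *inputs):
        self.fn = fn          # arithmetic function, or a string key for input nodes
        self.inputs = inputs  # upstream nodes (the graph is acyclic)

    def eval(self, env):
        # Input nodes carry a string key into env (time / control signals).
        if isinstance(self.fn, str):
            return env[self.fn]
        return self.fn(*(n.eval(env) for n in self.inputs))

# Build a small graph: inputs are time t and an abstract control variable c.
t = Node("t")
c = Node("c")
osc   = Node(lambda a, b: math.sin(a * b), t, c)  # shared sub-expression
pitch = Node(lambda x: 60 + int(12 * x), osc)     # output 1: MIDI note number
vel   = Node(lambda x: 64 + int(48 * x), osc)     # output 2: MIDI velocity

env = {"t": 1.0, "c": 2.0}
note = (pitch.eval(env), vel.eval(env))
# Both outputs depend on the same sub-graph, so they rise and fall together.
```

Because both output nodes read the same `osc` sub-graph, changing `t` or `c` moves pitch and velocity in a correlated way, which is the "related outputs" property the representation is designed to encourage.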

This gives a kind of separation of concerns: the graph specifies the musical content, while the control signals specify the musical structure (with some overlap between the two, of course). Separating music into form and content in this way enables novel compositional techniques well suited to writing for games and film, as well as for standalone pieces.
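The form/content split can be sketched as follows (a hypothetical toy example, not the system's actual mapping): one fixed mapping function stands in for the graph's "content", and two different control trajectories stand in for the "structure", yielding related but distinct sections.

```python
def content(t, c):
    """Toy stand-in for the graph's output mapping: the control value c
    scales the melodic range around middle C (MIDI note 60)."""
    return 60 + int(c * (t % 8))

# Same content, different control signals = different musical sections.
verse  = [content(t, 0.5) for t in range(8)]  # low control value: narrow range
chorus = [content(t, 1.5) for t in range(8)]  # high control value: wider range
# Both sections share a rising contour, but the control signal shapes
# how far it rises, i.e. the structure of the piece over time.
```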

Where necessary, evolutionary search can be applied, using statistical, target-matching, or purely subjective fitness measures.
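A target-matching fitness measure can be sketched with a simple hill climber (one of many possible search schemes; the names and the genome encoding here are assumptions for illustration). The genome is reduced to a vector of graph constants; the real system can evolve graph topology as well.

```python
import random

def fitness(genome, target):
    # Target-matching fitness: negative squared distance to a desired vector.
    return -sum((g - t) ** 2 for g, t in zip(genome, target))

def mutate(genome, rng, sigma=0.1):
    # Perturb one randomly chosen constant with Gaussian noise.
    i = rng.randrange(len(genome))
    child = list(genome)
    child[i] += rng.gauss(0, sigma)
    return child

def evolve(target, generations=2000, seed=0):
    # (1+1)-style hill climb: keep the child only if fitness improves.
    rng = random.Random(seed)
    best = [rng.uniform(-1, 1) for _ in target]
    for _ in range(generations):
        child = mutate(best, rng)
        if fitness(child, target) > fitness(best, target):
            best = child
    return best

best = evolve(target=[0.5, -0.25, 1.0])
```

A statistical fitness would replace the target vector with, say, a desired distribution of note intervals, and a subjective fitness would replace `fitness` with a human listener's rating.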

My thinking on abstract structure in music, and on using graphs to express functional relationships, was influenced by the Buzzmachines.com community, the Buzz machines I wrote for generative music, and their use by artists such as Tinga; this example is from 2004.

This project was also partly inspired by the work of Amy Hoover and colleagues on NEAT Drummer.

Sound and Music 2015

I collaborated with Dr Katie Crowley (Trinity College Dublin) on an implementation where the input signals are taken from brain-computer interfaces, i.e. brain waves control the music. This work has been submitted to the Sound and Music Computing Conference 2015, with the title "Mapping brain signals to music via executable graphs". Some example pieces, data files, and source code are available.

GECCO 2011

I also collaborated with Prof Una-May O'Reilly (MIT) on a different implementation, published as "An Executable Graph Representation for Evolutionary Generative Music", Digital Entertainment Technologies and Arts Track, GECCO, 2011.

XG Demo pieces available here.

XG Experiment 0 (music and questionnaire) available here.

XG Experiment 1 (music and questionnaire) available here.

Source code available on request.

EvoMUSART 2010

On this project I collaborated with Dr Jianhua Shao (at the time an undergraduate summer intern), who implemented the Jive system (project page), where the input signals are taken from a mouse or a Wiimote. This work was published as "JIVE: A Generative, Interactive, Virtual, Evolutionary Music System", EvoMUSART 2010 (best paper award).