This project was partly inspired by the work of Amy Hoover and colleagues on NEAT Drummer. My thinking on abstract structure in music, and on using graphs to express functional relationships, was also influenced by the Buzzmachines.com community, the Buzz machines I wrote for generative music, and their use by artists such as Tinga (this example dates from 2004).
On this project I collaborated with Jianhua Shao, who implemented the Jive system (project page). This work was published as "JIVE: A Generative, Interactive, Virtual, Evolutionary Music System", EvoMUSART 2010 (best paper award).
I also collaborated with Una-May O'Reilly on a different implementation, first published as "An Executable Graph Representation for Evolutionary Generative Music", Digital Entertainment Technologies and Arts Track, GECCO, 2011. Abstract:
We focus on a representation for evolutionary music based on executable graphs in which nodes execute arithmetic functions. Input nodes supply time variables and abstract control variables, and multiple output nodes are mapped to MIDI data. The motivation is that multiple outputs from a single graph should tend to behave in related ways, a key characteristic of good music. While the graph itself determines the short-term behaviour of the music, the control variables can be used to specify large-scale musical structure. This separation of music into form and content enables novel compositional techniques well-suited to writing for games and film, as well as for standalone pieces. A mapping from integer-array genotypes to executable graph phenotypes means that evolution, both interactive and non-interactive, can be applied. Experiments with and without human listeners support several specific claims concerning the system's benefits.
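The idea in the abstract can be sketched in code. The following is a minimal, hypothetical illustration, not the published implementation: an integer-array genotype is decoded into an acyclic graph whose nodes apply arithmetic functions, input nodes supply a time variable and a control variable, and multiple output nodes read from shared subgraphs, so the outputs tend to vary in related ways. All names, the function set, and the encoding scheme are assumptions made for illustration.

```python
import math

# Hypothetical function set for internal nodes (illustrative only).
FUNCS = [
    lambda a, b: a + b,
    lambda a, b: a * b,
    lambda a, b: math.sin(a) + b,
    lambda a, b: a % (abs(b) + 1.0),
]

def decode(genome, n_inputs, n_outputs):
    """Map an integer-array genotype to an executable-graph phenotype.

    Each internal node is encoded by three integers:
    (function index, source index a, source index b). Sources may only
    refer to earlier nodes, which guarantees the graph is acyclic.
    The final n_outputs genes select which nodes feed the outputs.
    """
    nodes = []  # list of (function, source_a, source_b)
    for i in range(0, len(genome) - n_outputs, 3):
        f, a, b = genome[i:i + 3]
        limit = n_inputs + len(nodes)  # only earlier values are legal sources
        nodes.append((FUNCS[f % len(FUNCS)], a % limit, b % limit))
    outs = [g % (n_inputs + len(nodes)) for g in genome[-n_outputs:]]
    return nodes, outs

def execute(nodes, outs, inputs):
    """Evaluate the graph once; inputs are the time/control variables."""
    values = list(inputs)
    for func, a, b in nodes:
        values.append(func(values[a], values[b]))
    return [values[o] for o in outs]

# Example: inputs are a time variable t and a control variable c; the two
# outputs could then be quantised to, say, MIDI pitch and velocity.
genome = [2, 0, 1, 1, 2, 0, 0, 2, 1, 3, 4]  # three nodes + two output genes
nodes, outs = decode(genome, n_inputs=2, n_outputs=2)
print(execute(nodes, outs, inputs=[0.5, 60.0]))
```

Because both outputs are read from the same graph, changing the control variable `c` shifts them together, which is one way the form/content separation described above could play out: the graph fixes short-term behaviour while a composer-supplied trajectory of control values imposes large-scale structure.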
XG Demo pieces available here.
XG Experiment 0 (music and questionnaire) available here.
XG Experiment 1 (music and questionnaire) available here.
Source code available on request.