This paper describes an interactive musical system that uses a genetic algorithm to foster inspiring collaborations between human musicians and an improvisatory robotic xylophone player.
The robot responds to human input both acoustically and visually, evolving a population of human-generated phrases in real time with a similarity-driven fitness function. It listens to MIDI and audio input from human players and generates melodic responses informed by the analyzed input as well as by internalized knowledge of contextually relevant material.
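To illustrate the kind of mechanism the abstract describes, the following is a minimal sketch of a similarity-driven genetic algorithm over melodic phrases. It is an assumption-based illustration, not the paper's implementation: phrases are modeled as lists of MIDI pitches, the fitness function (`similarity`) is a simple negative mean absolute pitch difference against a target phrase, and the selection, crossover, and mutation choices are all hypothetical.

```python
import random

random.seed(0)

def similarity(phrase, target):
    """Hypothetical fitness: negative mean absolute pitch difference.

    Higher values mean the phrase is more similar to the target.
    """
    return -sum(abs(a - b) for a, b in zip(phrase, target)) / len(target)

def evolve(population, target, generations=50, mutation_rate=0.1):
    """Evolve a phrase population toward the target (sketch only)."""
    for _ in range(generations):
        # Rank phrases by similarity to the target and keep the best half.
        population.sort(key=lambda p: similarity(p, target), reverse=True)
        survivors = population[: len(population) // 2]
        children = []
        while len(survivors) + len(children) < len(population):
            # One-point crossover between two randomly chosen survivors.
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(target))
            child = a[:cut] + b[cut:]
            # Occasionally nudge a note up or down by a semitone.
            child = [
                n + random.choice([-1, 1]) if random.random() < mutation_rate else n
                for n in child
            ]
            children.append(child)
        population = survivors + children
    return max(population, key=lambda p: similarity(p, target))

# Example: evolve random phrases toward a C-major scale (MIDI 60-72).
target = [60, 62, 64, 65, 67, 69, 71, 72]
population = [[random.randrange(48, 84) for _ in target] for _ in range(20)]
best = evolve(population, target)
```

Because survivors carry over unchanged, the best fitness in the population never decreases across generations; in the real system the fitness would be driven by analyzed features of the live human input rather than a fixed target phrase.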
The paper describes the motivation for the project, the hardware and software design, two performances conducted with the system, and several directions for future work.
Authors: Gil Weinberg, Mark Godfrey, Alex Rae, and John Rhoads