Regenerative Music explores new physiological interfaces for musical instruments. The overall goal of this project is to investigate the creation of “Regenerative Music”, in which the computer, instead of taking only active cues from the musician, reads physiological signals (heart beat, respiration, brain waves, etc.) from the musician/performer.
These signals are then used to alter the behaviour of the instrument itself. For instance, filter settings can be applied to the sound, and the musician responds by changing the way they play the instrument. The music will in turn generate an emotional response on the part of the musician/performer, and this emotional response will be detectable by the computer, which then alters the behaviour of the instrument further in response. […]
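The mapping from a physiological signal to a filter setting can be sketched as follows. This is a minimal illustrative example, not the project's actual implementation: the heart-rate range, cutoff range, and filter design are all assumptions made for demonstration.

```python
# Hypothetical sketch: map a physiological signal (heart rate in BPM)
# onto the cutoff of a one-pole low-pass filter applied to an audio buffer.
# The mapping ranges and filter choice are illustrative assumptions.
import math

SAMPLE_RATE = 44100  # Hz

def heart_rate_to_cutoff(bpm, lo=200.0, hi=4000.0):
    """Map heart rate (assumed 40-180 BPM) linearly onto a cutoff range in Hz."""
    t = max(0.0, min(1.0, (bpm - 40.0) / 140.0))
    return lo + t * (hi - lo)

def one_pole_lowpass(samples, cutoff_hz):
    """Simple one-pole low-pass: y[n] = y[n-1] + a * (x[n] - y[n-1])."""
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / SAMPLE_RATE)
    out, y = [], 0.0
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out

# A calmer heart rate yields a lower cutoff, darkening the sound;
# an excited heart rate opens the filter up.
calm = heart_rate_to_cutoff(55)
excited = heart_rate_to_cutoff(150)
```

In a design like this the musician hears the filter darken or brighten with their own physiology and adjusts their playing, closing the loop described above.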
In DECONcert 1, we hooked up 48 people’s EEG signals, which were collectively used to affect the audio environment. Signal averaging was used across groups to clean the signal and to look for collective alpha synchronization (which occurs, for instance, when people close their eyes).
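The averaging-and-detection step could be sketched as below. This is an illustrative reconstruction, not DECONcert's actual signal chain: the sampling rate, band edges, and direct-DFT band-power estimate are assumptions chosen for clarity.

```python
# Illustrative sketch (not the DECONcert implementation): average EEG
# across participants, then estimate alpha-band (8-12 Hz) power of the
# averaged signal with a direct DFT over that band.
import math

FS = 256  # assumed EEG sampling rate, Hz

def group_average(channels):
    """Average N participants' EEG point-by-point: uncorrelated noise
    cancels, while collective (synchronized) activity survives."""
    n = len(channels)
    return [sum(ch[i] for ch in channels) / n for i in range(len(channels[0]))]

def band_power(signal, f_lo=8.0, f_hi=12.0):
    """Mean squared DFT magnitude over the alpha band."""
    N = len(signal)
    bins = [k for k in range(N // 2 + 1) if f_lo <= k * FS / N <= f_hi]
    power = 0.0
    for k in bins:
        re = sum(x * math.cos(2 * math.pi * k * i / N) for i, x in enumerate(signal))
        im = -sum(x * math.sin(2 * math.pi * k * i / N) for i, x in enumerate(signal))
        power += (re * re + im * im) / (N * N)
    return power / max(len(bins), 1)

# Synthetic check: a group sharing a 10 Hz rhythm (as when eyes close
# together) shows strong alpha power in the averaged signal.
t = [i / FS for i in range(FS)]  # one second of samples
synced = [[math.sin(2 * math.pi * 10 * x) for x in t] for _ in range(8)]
alpha = band_power(group_average(synced))
```

A threshold on this alpha measure is one plausible way a system could decide that the group has collectively synchronized.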
DECONcert utilized electroencephalogram (EEG) sensors which sensed electrical activity produced in the brains of the participants. 48 participants were equipped with EEG sensors, and the signals from their brains were used to alter a computationally controlled soundscape. DECONcert allowed the participants to form a feedback loop with the computational process of musical composition. The generated soundscape evokes a response from the participants; the collective response of the group is sensed by the computer, which then alters the music based upon this response. Again the participants hear the music and respond, and again the computer senses and further alters the sound. In this way, collaborative biofeedback is used in conjunction with an intelligent signal processing system to continually regenerate the music on the fly.
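The sense-alter-respond cycle described above can be sketched as a simple control loop. Everything here is hypothetical: the function names, the tempo parameter, and the mapping from collective alpha to tempo are stand-ins for whatever the real system did.

```python
# Hedged sketch of the collaborative biofeedback loop: sense a collective
# measure from the audience, alter a soundscape parameter, repeat.
# The tempo mapping and clamping range are illustrative assumptions.
def biofeedback_loop(sense_collective_alpha, render_soundscape, steps=100):
    tempo = 120.0  # beats per minute, arbitrary starting point
    for _ in range(steps):
        alpha = sense_collective_alpha()  # e.g. group alpha-band power in [0, 1]
        # Higher collective alpha (relaxation) slows the soundscape;
        # the new sound in turn influences the next alpha reading.
        tempo = max(40.0, min(200.0, tempo - 10.0 * (alpha - 0.5)))
        render_soundscape(tempo)
    return tempo
```

With a relaxed audience (high alpha) the tempo drifts toward its floor; with an excited audience it drifts toward its ceiling, and in a live setting the audience's reaction to each change feeds the next iteration.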