It has been more than thirty years since I began practicing computer music. My first experiences were with real-time computer sound synthesis techniques. Later, I began exploring signal processing of instrumental sounds.
In 1986, on a commission from the International Computer Music Conference (ICMC), I composed my first interactive computer music piece, “Five Inventions Accompanied by Computers,” for clarinet, cello, and piano, using two DSP systems whose software I programmed in their microcode languages. It was a reckless and ambitious challenge.
Since then I have been realizing interactive computer music for diverse instruments: during the 1990s with the NeXT computer and the IRCAM Signal Processing Workstation (ISPW) running Max/ISPW, and since 2000 with Macintosh computers running Max/MSP.
A live computer system performs real-time signal processing on instrumental sound played on the stage.
The system samples sound from the instruments, performs digital signal processing on it, and reproduces the transformed sound in the concert hall simultaneously with the original instrumental sound. Diverse real-time techniques are employed to transform the instrumental sound, including time- and frequency-domain manipulation of the incoming sound using FFT/iFFT resynthesis, pitch shifting with feedback, and real-time grain-based frequency modulation.
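As a rough illustration of FFT/iFFT resynthesis with feedback, the sketch below processes one block of audio: it shifts spectral energy upward by a few bins (a crude frequency-domain shift) and mixes part of the previous output back into the input. The block size, bin shift, and feedback amount are illustrative assumptions, not the parameters actually used in the pieces described here.

```python
import numpy as np

def process_block(block, shift_bins=3, feedback=0.4, state=None):
    """One FFT/iFFT resynthesis step (illustrative sketch).

    Shifts the spectrum of `block` up by `shift_bins` bins and mixes
    `feedback` times the previous output back into the input, in the
    spirit of frequency-domain transformation with feedback.
    """
    if state is None:
        state = np.zeros_like(block)
    x = block + feedback * state                 # feed back previous output
    spec = np.fft.rfft(x * np.hanning(len(x)))   # windowed forward FFT
    shifted = np.zeros_like(spec)
    shifted[shift_bins:] = spec[:len(spec) - shift_bins]  # move energy upward
    y = np.fft.irfft(shifted, n=len(block))      # resynthesize to time domain
    return y, y                                  # output and new feedback state

# Usage: a sine at FFT bin 20 comes out with its dominant energy at bin 23.
N = 1024
block = np.sin(2 * np.pi * 20 * np.arange(N) / N)
out, state = process_block(block, shift_bins=3, feedback=0.0)
```

In a real-time setting this per-block transform would run inside an overlap-add loop and be mixed with the dry instrumental signal.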
The live computer system acts as if it were part of the instruments, expanding their timbre and creating a novel musical space in combination with them.