Waveform.AI
A multi-day exploration of sound, improvisation, and artificial intelligence
Waveform.ai was a multi-day interdisciplinary event that brought together music, physics, and artificial intelligence through an immersive program of workshops, concerts, and an interactive installation. Developed collaboratively by the Departments of Physics and Music, Waveform.ai pushed the boundaries of creative expression, treating AI not simply as a tool but as an active collaborator in the improvisational process.
The centerpiece concert featured internationally acclaimed transdisciplinary artist and flutist Melo (Melody Chua), whose experimental approach to sound and performance exemplified the cutting-edge potential of human-machine co-creation. Throughout the performance, musicians played alongside a bespoke digital synthesizer designed by W&M Physics students, while the AI system responded in real time to the evolving aesthetic qualities of the music and to stimuli from the audience.
Workshops and installations invited students, faculty, and the public to explore the intersections of algorithmic composition, responsive design, and live improvisation, encouraging critical reflection on the role of AI in the future of performance and creativity.
In alignment with the Art & Science Exchange’s mission, Waveform.ai showcased the power of interdisciplinary collaboration to open new possibilities for how we make, experience, and understand art in a technological age.
Principal Investigators: Ran Yang and Benjamin Whiting