Publication: https://doi.org/10.5281/zenodo.15527324
This paper describes a system for interactively coding the behaviour of flocks of artificial sound-making agents. The system is based on an animated two-dimensional visualization in which the background represents acoustic or visual information and multiple agents make sound as they traverse the space. The main interface is a domain-specific language for composing agent behaviours from elementary actions; the language is introduced through several basic examples. The system is implemented in Brunzit, a standalone C++ application that supports real-time live coding of music using two different synthesis engines. While the system is still in early development, two musical examples demonstrate its potential.