ICLC 2023 Catalogue PROOF VERSION

BotBop: Integers & Strings

Kasper Jordaens, Dagobert Sondervan, Andrew Claes

Was performed at:

Program Notes

The fourth iteration of BotBop's exploratory journey into the artistic potential of combining AI and live coding deepens the focus, incorporating real-time string arrangement, its visual representation, and the interactive and creative possibilities of AI.

Three live coders team up: Dago Sondervan, who uses multiple live coding environments at the same time; Andrew Claes, an accomplished wind-controller player and patching adept; and Kasper Jordaens, a live visual and data artist. Together they create an interactive environment where improvisation and computer-aided composition can overflow into one another.

Continued development of BotBop's custom software and implementations, built primarily on open-source software, now enables the creation of musical scores in real time. Starting from improvisation, MIDI data is gathered, processed and distributed on stage to a classical string quartet. Algorithmic strategies are combined with machine learning techniques to render a sensible musical output, to be sight-read by the strings. The focus is on dynamic systems optimized to react in real time rather than on pre-generated, 'offline' material, giving this audio-visual performance a distinct edge and blurring the lines between classical composition, electronic music and jazz.

Integers & Strings premiered during Sónar's 'AI & Music S+T+ARTS' festival in Barcelona in 2021, in co-production with BOZAR Brussels.

Abstract


The abstract is displayed here for proof-reading and will only be part of the published proceedings, not of the final version of this web catalogue.

BotBop is a live coding band incorporating different approaches to computer-aided music composition as well as real-time interaction with the running algorithms. Here, the computers do not merely generate sound: they become a 'player' in their own right, interacting with the band members, in part by means of AI.

Against this backdrop of code, MIDI data is generated and translated into a written musical score in real time, arranged for string quartet. The musicians perform live on a synced clock, with the next bars to be played displayed on an iPad over HTML/JS/WebSockets.
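As a rough illustration of such a display pipeline (a minimal sketch, not BotBop's actual code), freshly rendered bars could be broadcast to the connected score displays over WebSockets. The sketch assumes a recent version of the Python websockets package; render_next_bars() and seconds_per_bar() are hypothetical stand-ins for the real score and clock logic.

```python
# Minimal sketch: push rendered bars to connected score displays over WebSockets.
import asyncio
import json
import websockets

CLIENTS = set()  # currently connected displays (e.g. the quartet's iPad)

async def handler(websocket):
    # Register a display and keep the connection open until it drops.
    CLIENTS.add(websocket)
    try:
        await websocket.wait_closed()
    finally:
        CLIENTS.discard(websocket)

async def push_bars():
    bar = 0
    while True:
        # Hypothetical: render the next four bars from the incoming MIDI stream.
        payload = {"from_bar": bar, "bars": render_next_bars(bar, count=4)}
        websockets.broadcast(CLIENTS, json.dumps(payload))
        bar += 4
        await asyncio.sleep(4 * seconds_per_bar())  # stay on the synced clock

async def main():
    async with websockets.serve(handler, "0.0.0.0", 8765):
        await push_bars()

# asyncio.run(main())
```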

The quartet plays 'a prima vista' whatever the algorithm outputs, visualized after a fixed delay of eight measures needed for the AI to infer, interpret and transform the music. Every four bars, the previous bars are erased, enabling the strings to play seamlessly for prolonged periods.
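One way to picture that rolling display, as a purely hypothetical sketch: each generated bar becomes visible only after the fixed eight-measure delay, and the oldest block of four bars is erased in one go so the visible score stays short enough to read at sight.

```python
# Hypothetical sketch of the delayed, rolling score display described above.
from collections import deque

DELAY_BARS = 8   # delay covering AI inference, interpretation, transformation
ERASE_BARS = 4   # how many old bars are wiped from the display at a time

pending = deque()  # bars still waiting out the delay
visible = []       # bars currently shown to the quartet

def on_bar_generated(bar):
    """Feed one bar of generated material (e.g. a list of notes) per call."""
    pending.append(bar)
    # Release a bar to the display once the delay has elapsed.
    if len(pending) > DELAY_BARS:
        visible.append(pending.popleft())
    # Erase the oldest block of bars so the players always read fresh lines.
    if len(visible) > 2 * ERASE_BARS:
        del visible[:ERASE_BARS]
    return visible  # this is what would be pushed to the iPad display
```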

Skilled in musical interpretation, the four classically trained musicians breathe life back into the equation, feeding back to the core BotBop musician/coders, who tweak their setups in Sonic Pi, TidalCycles, Bespoke Synthesizer and Kasper's live-coded data visualization system (Quil). A selection of synths, Dago's semi-modular drum machine and Andrew's wind controller (EWI) data close the circle.

Everyone is live coding, playing music, creating visuals or interpreting data, but it is all bound together by a shared system: a network that makes it possible to share any data, audio or MIDI, so the ensemble operates as a live band, reinforcing and giving direct feedback to one another.

The AI brain consists of a Magenta implementation with an existing pretrained polyphonic model, tweaked to infer MIDI scores on the fly (with less than eight measures of delay). The magic happens when the AI-generated scores are remapped to match incoming notes and rhythms generated live by the computer trio.
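One simplified, hypothetical reading of that remapping step, covering only the pitch side and using plain note tuples rather than the band's actual Magenta data structures: keep the rhythm of the generated score, but snap each generated pitch onto the pitch material most recently heard from the computer trio.

```python
# Hypothetical pitch remapping: snap generated pitches onto live material,
# leaving the generated rhythm (start and duration) untouched.
def remap(ai_notes, live_pitches):
    """ai_notes: list of (pitch, start_beat, duration) tuples.
    live_pitches: MIDI pitches recently heard from the live coders."""
    pool = sorted(set(live_pitches))
    if not pool:
        return list(ai_notes)  # nothing heard yet, pass the score through
    remapped = []
    for pitch, start, duration in ai_notes:
        # Pick the live pitch closest to the generated one.
        nearest = min(pool, key=lambda p: abs(p - pitch))
        remapped.append((nearest, start, duration))
    return remapped

# Example: generated notes snapped onto pitches played live by the trio.
print(remap([(60, 0.0, 1.0), (67, 1.0, 0.5)], live_pitches=[58, 62, 65]))
```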

The Bop-robot is in constant development, gathering snippets of code, harvesting beautiful algorithms and ironing out risky implementations.

While every performance uses similar parts and movements, each one turns out very different and is unique.