ICLC 2023 Catalogue PROOF VERSION

We All Begin in Abstraction

Shelly Knotts, Sojung Bahng (방소정), Kirby Casilli

Was performed at:

Program Notes

“We All Begin in Abstraction…” is an AV networked live coding performance for live coding musician (Shelly Knotts), live video (Sojung Bahng) and movement (Kirby Casilli), which takes inspiration from Legacy Russell’s text Glitch Feminism to explore threads of female embodiment, error and algorithmically mediated life.

In live coding, error constitutes an embodied form of computing, drawing attention away from the screen and towards the body, creating a rupture, a break, in the flow of the system.

In this work we draw attention to the points of interaction between body, machine and algorithm, and to the ruptures and errors that shape the interaction between artists, code and audience in algorithmically driven performance. We use machine learning algorithms to track movement and live video as inputs to a live coded performance, forming a feedback loop between body, algorithm, embodied algorithms and mediated forms.

Abstract


The abstract is displayed here for proof-reading and will only be part of the published proceedings, not of the final version of this web catalogue.

“We All Begin in Abstraction…” is an AV networked live coding performance for live coding musician (Shelly Knotts), live video (Sojung Bahng) and movement (Kirby Casilli), which takes inspiration from Legacy Russell’s text Glitch Feminism to explore threads of female embodiment, error and algorithmically mediated life.

In live coding, error constitutes an embodied form of computing, drawing attention away from the screen and towards the body, creating a rupture, a break, in the flow of the system. As a non-normative body in computer programming, I’m interested in this tension between error and embodiment, in inhabiting, and performing, an alien body in this context.

In this work we draw attention to the points of interaction between body, machine and algorithm, and to the ruptures and errors that shape the interaction between artists, code and audience in algorithmically driven performance. We use machine learning algorithms to track movement and live video as inputs to a live coded performance, forming a feedback loop between body, algorithm, embodied algorithms and mediated forms.

We use machine learning tools to analyse live video received from Kirby and Sojung, producing data streams that I can integrate into my live coding music performance in SuperCollider and JavaScript.
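As a purely illustrative sketch of this kind of routing (the OSC address, value range and synth mapping below are assumptions, not the actual performance patch), a movement stream from the video-analysis process might be received and mapped in SuperCollider like this:

```supercollider
// Hypothetical example: receive a normalised movement value over OSC
// and map it to a parameter of a live-coded sound. The address name,
// value range and synth are illustrative assumptions.
(
s.waitForBoot {
	// control bus holding the most recent movement value (0..1 assumed)
	~movement = Bus.control(s, 1);

	// listen for messages like: /movement 0.42
	OSCdef(\movementIn, { |msg|
		~movement.set(msg[1].clip(0, 1));
	}, '/movement');

	// a simple drone whose filter cutoff follows the tracked movement
	Ndef(\drone, {
		var move = In.kr(~movement.index).lag(0.5);
		var sig = Saw.ar([50, 50.3]) * 0.1;
		LPF.ar(sig, move.linexp(0, 1, 200, 4000));
	}).play;
};
)
```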

The visual component includes layers of live streamed and pre-recorded video contemplating algorithmically mediated embodiment, alongside the live coding and video analysis screens I use during the performance.