Martin Dupras, Simon Holland, Paul Mulholland
The aim of this paper is to explore how electric guitarists can engage in live coding without removing their hands from the guitar, enabling them to implement and apply real-time signal-processing code to their sound. We propose a paradigm in which a guitarist plays a guitar capable of outputting a separate audio signal from each string, supporting the expression of both sound and data, and uses this in combination with a MIDI foot controller to switch between playing and programming modes. Evolving a solution involves addressing two key design challenges: (1) identifying suitable means of translating playing gestures into recognisable abstract data, and (2) developing a flexible mapping strategy that allows players of different styles or practices (e.g. metal vs. jazz) to extend the system to suit their needs. The resulting design, which enables guitarists both to play and to program, is currently undergoing an iterative process of refinement and evaluation.
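The mode-switching mechanism described above could be sketched roughly as follows. This is a minimal illustration only, assuming a foot switch mapped to a single MIDI Control Change number; the class, method names, and CC assignment are hypothetical and do not represent the authors' implementation.

```python
# Illustrative sketch (not the authors' system): a MIDI foot controller sends
# a Control Change message that toggles between playing and programming modes,
# and per-string signals are routed accordingly.

MODE_TOGGLE_CC = 64  # assumed CC number assigned to the foot switch


class GuitarLiveCoder:
    def __init__(self):
        self.mode = "playing"  # start in ordinary playing mode

    def handle_control_change(self, cc_number, value):
        """Toggle mode on foot-switch press (value > 0 means pedal down)."""
        if cc_number == MODE_TOGGLE_CC and value > 0:
            self.mode = "programming" if self.mode == "playing" else "playing"
        return self.mode

    def route_string_signal(self, string_index, samples):
        """Route one string's signal either to audio output (playing mode)
        or to the gesture-to-data translation layer (programming mode)."""
        if self.mode == "playing":
            return ("audio_out", string_index, samples)
        return ("gesture_data", string_index, samples)
```

In this sketch the same playing gesture produces sound in one mode and abstract data in the other, which is the dual use of the per-string signal that the paradigm depends on.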