ICLC 2017 Programme
ICLC 2017 workshops, concerts and paper presentations are open to the general public, free of charge,
and subject to seat availability.
Please note: Schedule is subject to change without notice.
ICLC 2017 Venue locations
Centro Mexicano para la Música y las Artes Sonoras (CMMAS)
Casa de la Cultura (2nd floor)
Av. Morelos Norte 485, Centro.
Morelia, Michoacán, México.
Tel: +52 (443) 317 5679 and +52 (443) 313 8343
Centro Cultural Clavijero
El Nigromante 79, Centro Histórico, C.P. 58000.
Morelia, Michoacán, México.
Tel: +52 443 312 0412
Comité de Asuntos Intangibles
Bartolomé de Las Casas 488, Centro Histórico, C.P. 58000.
Morelia, Michoacán, México.
Tezla Music Gallery
Benito Juárez 194, Centro Histórico, C.P. 58000.
Morelia, Michoacán, México.
Tel: +52 443 312 1023
Jeudi 27
Valentín Gómez Farias 264, Centro Histórico, C.P. 58000.
Morelia, Michoacán, México.
9:00 - 9:30 hrs -> Registration -> CMMAS Hall
9:30 - 10:45 hrs -> Session 1: Short & long papers -> CMMAS Hall
11:00 - 11:55 hrs -> Session 2: Short & long papers -> CMMAS Hall
12:30 - 14:30 hrs -> Workshops -> CMMAS & Comité de Asuntos Intangibles
14:30 - 16:00 hrs -> Lunch
16:00 - 17:30 hrs -> Opening concert -> CMMAS concert hall
20:30 - 01:00 hrs -> Algorave 1 -> Tezla Music Gallery
10:00 - 14:00 hrs -> Workshops -> CMMAS
14:00 - 16:00 hrs -> Lunch
15:30 - 16:30 hrs -> Session 3: Keynote by Alex McLean: Patterns I Have Known And Loved -> CMMAS concert hall
16:40 - 17:40 hrs -> Session 4: Panel “Corporalidad, colectividad y desaceleración en las prácticas del Live Coding” -> CMMAS concert hall
This panel will address the relationship and differences between live coding and the
electroacoustic tradition. It will also discuss ways of understanding collectivity in live
coding and the traits that would distinguish it from other manifestations of digital art and
culture, as well as its differences from and similarities to capitalist discourses on networks
and collaboration economies.
20:00 - 21:35 hrs -> Concert 2 -> Comité de Asuntos Intangibles
9:00 - 9:30 hrs -> Registration -> CMMAS concert hall
9:30 - 10:45 hrs -> Session 5: Short & long papers -> CMMAS concert hall
11:00 - 11:55 hrs -> Session 6: Short & long papers -> CMMAS concert hall
12:30 - 14:30 hrs -> Workshops -> CMMAS & Comité de Asuntos Intangibles
14:30 - 15:30 hrs -> Lunch
15:30 - 17:40 hrs -> Concert 3 -> CMMAS concert hall
18:00 - 20:00 hrs -> Workshop -> Comité de Asuntos Intangibles
20:30 hrs -> Dinner with all delegates -> Tezla Music Gallery
10:00 - 14:00 hrs -> Workshops -> CMMAS
14:00 - 15:30 hrs -> Lunch
15:40 - 16:40 hrs -> Session 7: Keynote by Shelly Knotts: Live Coding Fieldnotes -> CMMAS concert hall
17:30 - 18:30 hrs -> Session 8: Panel "Meshworks" -> CMMAS concert hall
19:00 - 20:55 hrs -> Concert 4 -> "2o Patio", Centro Cultural Clavijero
9:00 - 9:30 hrs -> Registration -> CMMAS concert hall
9:30 - 11:35 hrs -> Session 9: Paper presentations -> CMMAS concert hall
11:35 - 12:15 hrs -> Town Hall meeting -> CMMAS concert hall
12:15 - 14:00 hrs -> Lunch
14:00 - 18:00 hrs -> Workshop -> Comité de Asuntos Intangibles
18:10 - 19:55 hrs -> Concert 5 -> CMMAS concert hall
20:30 - 01:30 hrs -> Algorave 2 -> Jeudi 27
Location -> CMMAS classroom
Facilitator -> Esteban Betancur
CineVivo is a mini programming language designed for developing visual pieces
from pre-recorded videos and cameras, in the idiom of live cinema and with live coding as the main interface. The workshop proposes to build up the syntactic and semantic basis of the CineVivo language and, with that, to create a live cinema piece together with the attendees, using constructed or downloaded material or live cameras.
Location -> Comité de Asuntos Intangibles
Facilitator -> Cecilia Anda & Aide Violeta
This workshop is aimed at those who want a first approach to Live Coding but come from different backgrounds, and at those who want to try implementing coding in their respective areas. There is no need to know how to code or to have previous experience with it, but rather to have an interest in Visual Narratives, Remix Culture or just watching YouTube videos online. The aim of the workshop is to explore and redefine the concepts of culture, remix and digital immediacy by means of Live Visual Coding achieved through YouTube videos. The topic to explore: creating Remix videos in real time, no postproduction required. The workshop will be taught from a Design, Communication and Digital Media perspective, covering how to turn basic programming into playing with code and learning by messing with visual videos. On the whole, creating Remix Culture by combining pre-existing materials (YouTube videos) into a new creative and personalized product.
Location -> CMMAS classroom
Facilitator -> Jaime Alonso Lobato Cardoso
Note: This workshop has a cost of 1000 MXN for the material provided to attendees.
The term computer has been widely used in our society since the invention of automatic electronic machines that perform calculations, but such devices have not always been electronic; they may even be mere aids to calculation, not necessarily automatic. We have the examples of the Inca quipu, or the Mesopotamian abacus. This is also true for concepts such as code and algorithm: they are strongly associated with the modern computer (as the terms are mostly used in this conference), but it is entirely valid to think of Morse code, or of a cooking recipe, as computational code. From this perspective, new research has been developed on computers that do not work with electricity. In this workshop we will learn to prepare Belousov-Zhabotinsky reactions and reflect on how to develop a form-based computing language.
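As a rough illustration of the kind of chemical pattern formation involved (not part of the workshop materials), here is a toy Python sketch of a simplified three-chemical BZ-style reaction on a grid; the update rule and all parameters are illustrative assumptions, not a chemically accurate model.

import numpy as np

def bz_step(a, b, c, alpha=1.2, beta=1.0, gamma=1.0):
    """One step of a toy Belousov-Zhabotinsky simulation: each cell
    averages its 3x3 neighbourhood, then the three 'chemical'
    concentrations catalyse one another cyclically."""
    def local_mean(x):
        # mean over the 3x3 neighbourhood via rolled copies
        return sum(np.roll(np.roll(x, i, 0), j, 1)
                   for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0
    a_m, b_m, c_m = local_mean(a), local_mean(b), local_mean(c)
    a_n = np.clip(a_m * (1 + alpha * b_m - gamma * c_m), 0, 1)
    b_n = np.clip(b_m * (1 + beta * c_m - alpha * a_m), 0, 1)
    c_n = np.clip(c_m * (1 + gamma * a_m - beta * b_m), 0, 1)
    return a_n, b_n, c_n

# random initial concentrations on a 100x100 grid
rng = np.random.default_rng(0)
a, b, c = (rng.random((100, 100)) for _ in range(3))
for _ in range(50):
    a, b, c = bz_step(a, b, c)  # spirals and waves emerge over time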
Location -> CMMAS classroom
Facilitator -> Olivia Jack
The workshop will explore methods for collaboration and modulation in live-coded visuals. Using WebRTC (web-based streaming), participants will receive and modify video and camera feeds from other participants and multiple devices in real time. The methods are inspired by analog, modular video synthesis, in which each browser/device outputs a signal or stream and receives streams from other browsers/devices. Participants are invited to experiment with feedback, glitch, latency, modulation, and network effects, in addition to more algorithmic ways of generating visuals. Open to all experience levels.
Performers: Liliana Rodríguez Alvarado (viola), Adriana López López (viola), Barush Fernández (percussion), Roger Vargas (percussion), Juán Sebastian Lach (keyboard), Rodrigo Treviño Frenk (electric bass), Diego Villaseñor de Cortina (flute), Luis Sánchez (flute), Claudia Cisneros (cello), Oscar René Mayoral Landavazo (cello), Diushi Keri (saxophone), Suleyma Guadalupe Vega Martínez (trumpet), Andrés Alejandro García Moreno (double bass), Ricardo Hernández Díaz (bassoon).
Iterating Absences is a heterogeneous and radical form of live coding performance with an instrumental musical output and a visual AI prediction of the score. The data used to generate the music and the visuals come from an official online database of disappearances of journalists, activists and women in Mexico. A major obstacle to understanding violence in the country is the lack of empathy of national and international civil society. The instrumental music is performed by 15 musicians using an audio-score system produced algorithmically in SuperCollider. The music is rhythm-oriented: statistical data is analyzed to generate a series of temporal micro-canons, producing topological, nonlinear, multiple rhythmic structures that are performed live. We extract frequency and temporal features from the audio-score; these features are passed to an artificial intelligence which tries to predict the image that best matches the input. This method of generating a live composition together with a live representation could reveal meaning at a subconscious level of sound-image representation.
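As a hedged illustration of the micro-canon idea described above (the actual piece drives a SuperCollider audio-score from the real database), a minimal Python sketch mapping a stand-in data series to staggered onset times:

import numpy as np

def micro_canons(data, voices=4, shift=0.125):
    """Map a data series to inter-onset intervals (0.1-0.5 s here,
    an assumed range), then stagger time-shifted copies of the
    resulting onset pattern to form a micro-canon."""
    data = np.asarray(data, dtype=float)
    span = np.ptp(data) or 1.0          # guard against constant data
    iois = 0.1 + 0.4 * (data - data.min()) / span
    onsets = np.cumsum(iois)
    # each voice repeats the onset pattern, displaced in time
    return [onsets + v * shift for v in range(voices)]

score = micro_canons([3, 14, 1, 5, 9, 2, 6], voices=3)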
sampler/sampler is a networked performance investigating the commonalities between sampling practices in 16th century embroidery and contemporary electronic music, explored through contemporary live coding techniques. The performance is a digital emulation of Blackwork stitching techniques using a bespoke heuristic interface, in combination with a pattern sampling and sonification engine in SuperCollider. Through the performance, traditional Blackwork patterns are created and rendered as visual and sonic pattern samples by a performer trained in Blackwork embroidery. These patterns are then sent as samples to a second performer who sequences, plays back and manipulates the pattern samples, exploring their use as both sonic and visual data, using improvised live coding techniques.
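A loose sketch of the pattern-sampling idea in Python, assuming a binary stitch grid read as a step sequence (the piece's bespoke heuristic interface and SuperCollider engine work differently):

# Rows are pitches, columns are sixteenth-note steps; 'x' marks a stitch.
PATTERN = [
    "x...x...x...x...",
    "..x...x...x...x.",
    "x.x.x.x.x.x.x.x.",
]
ROW_PITCHES = [60, 64, 67]  # assumed MIDI note numbers per row

def grid_events(pattern, pitches, step=0.25):
    """Yield (time, pitch) pairs for every stitch in the grid."""
    for row, pitch in zip(pattern, pitches):
        for col, cell in enumerate(row):
            if cell == "x":
                yield (col * step, pitch)

events = sorted(grid_events(PATTERN, ROW_PITCHES))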
Created in October 2013, LiveCodeNet Ensamble is a networked live coding ensemble from Mexico City which explores the possibilities of improvised music and interconnection: interacting, writing and modifying source code on the fly within a collaborative environment to co-create music. The Ensamble is connected through a local network, which implies a mediation of the individual processes that build a collective sound. This enables an artistic practice that exposes the activity of writing code during a collective music improvisation. The main purpose of the Ensamble is to create music through different processes developed by a network of individuals interacting through sound and source code in a computer music context. For the present edition of ICLC, LiveCodeNet Ensamble will perform a networked improvisation. Members of the Ensamble will write, modify and share their source code using the software SuperCollider. https://livecodenetensamble.wordpress.com/
Location -> Comité de Asuntos Intangibles
Facilitator -> Carlos Hasbún
A workshop on SuperCollider language extensions and the development of related tools.
Location -> Comité de Asuntos Intangibles
Facilitator -> Jack Armitage, Joanne Armitage and Shelly Knotts
This hands-on workshop introduces participants to using SuperCollider on Bela, an open source embedded platform for ultra-low latency audio and sensor processing based on the BeagleBone Black. Bela is designed to generate and process audio while connected to the physical world using all sorts of analog and digital sensors. During the workshop we will show how you can adapt your existing SuperCollider-based live coding setup to the Bela platform. We will also show how you can live code or develop and test an instrument interactively, and later move to a fixed setup where your code runs automatically on startup and is controlled remotely.
Noises of forgetting is an installation and live coding performance. It plays with the augmentation of dust particles floating in apparently empty space. Dust consists of small fragments of matter floating in the atmosphere. Its sources are varied, comprising much organic and inorganic matter, from terrestrial to interstellar origins. Its presence is barely visible, yet it is a reminder that all things are in perpetual motion and transformation, and that everything has a common origin and perhaps a common end. Dust betrays the invisible forces constantly at play among things, pulling the world together and apart. The cycles of all things are thus interrelated in complex ways, which we can barely perceive, and hardly conceive or describe. To a certain extent, all things are part of the same whole, yet that greater whole cannot be grasped. There is an intrinsic greatness within dust and its presence, which conveys a certain notion of the sublime, a certain experience of what is left in the world and of what we will leave behind and will be forgotten.
The CodeKlavier is a system which enables the pianist to code through playing the piano as a performative experience. It aims to address two main questions: Can coding be an 'embodied experience' akin to a skilled musician making music? And how can musical 'thinking' be translated into code? Still at an early stage of its development, the CodeKlavier is currently geared to allow pianists to live code their own electronic effects and accompanying sounds through the interface of an acoustic MIDI piano. The CodeKlavier draws upon live coding and pianistic performance practices to forge a new practice which embodies both. In our presentation we will present two versions of the CodeKlavier: “hello world” and Motippets.
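One way to picture the piano-to-code idea, as a minimal sketch only: map incoming MIDI notes to code tokens and print the growing line of source text. The token table and the mido-based plumbing below are assumptions for illustration, not the CodeKlavier's actual mappings.

import mido  # requires the mido package plus a MIDI backend (e.g. python-rtmidi)

# Hypothetical note-to-token table: playing middle C emits "sound", etc.
TOKENS = {60: "sound", 62: "(", 64: '"bd"', 65: ")", 67: ";"}

buffer = []
with mido.open_input() as port:           # default MIDI input port
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0:
            token = TOKENS.get(msg.note)
            if token:
                buffer.append(token)
                print(" ".join(buffer))   # show the growing code line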
pl:2b is an audience driven movement performance wherein observers influence performer action and movement through an interactive web-interface via their mobile devices (iOS and Android). Interaction with the mobile interface generates moment-to-moment sound events while also periodically activating vibrational motors attached to performers’ bodies and sending auditory directives to performers’ headsets, altering the course of the performance in real-time.
Location -> Comité de Asuntos Intangibles
Facilitator -> Luis Navarro, Jamie Beverley & David Ogborn
In this workshop, we will introduce and explore the possibilities of the newly created Estuary platform for collaborative, projectional live coding. Estuary lets people live code musical patterns in a web browser, and is organized around the idea of providing different interfaces for different situations (such as restricted tutorial interfaces, more complex interfaces meant for a solo performer, and collaborative interfaces meant for groups of people playing together over the Internet). Built on top of the TidalCycles language for live coding musical patterns, Estuary also emphasizes the idea of structure editing: often, instead of typing, one simply clicks to perform valid changes to a structure. Estuary requires no special installation, so participants are encouraged to bring their own laptops or tablets; we will learn about Estuary by logging in and using it from the first moment!
Location -> CMMAS classroom
Facilitator -> Nick Rothwell
Outline:
An introductory workshop on the Pulse Sequencer, an open-source Python-based step sequencing environment which traces its roots back 20 years to implementations for Opcode Max version 3, but now works in Max for Live.
Abstract:
The Pulse Sequencer started out as an early live-codeable pattern-based sequencer developed around 1996 as an object for the Max development environment. The sequencer operated in the MIDI domain only, as it predated MSP (Max's audio processing system) by about a year. The design foreshadowed projects like Gibber, featuring a very similar pattern syntax, and was influenced by Pyrite, an embedded language interpreter for Max which went on to become the scripting language for SuperCollider.
Following numerous iterations, the current version is written in Python and runs inside Max for Live, giving it access to the Ableton Live DAW environment. Although it works primarily in the MIDI domain, it can also control device parameters, allowing for some step-oriented audio modulation.
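For readers unfamiliar with pattern-based step sequencing, a minimal plain-Python sketch of the general idea follows; the pattern syntax here is invented for illustration and is not the Pulse Sequencer's actual pattern language or its Max for Live API.

import itertools, time

def parse(pattern):
    """'60 . 62 . 64 .' -> [60, None, 62, None, 64, None]
    where '.' is a rest and a number is a MIDI note."""
    return [None if tok == "." else int(tok) for tok in pattern.split()]

def play(steps, bpm=120, send=print):
    step_dur = 60.0 / bpm / 4            # sixteenth-note duration
    for note in itertools.cycle(steps):  # loop the pattern forever
        if note is not None:
            send(("note_on", note))      # stand-in for a real MIDI send
        time.sleep(step_dur)

# play(parse("60 . 62 . 64 . 67 ."))    # loops until interrupted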
Topics to be covered include:
Skills required:
Requirements: Ableton Live 9.x, Max for Live
Location -> CMMAS classroom
Facilitator -> Evan Raskob
How might livecoding help users control the complex choreography of networked, “smart” gadgets? For example, how might they manage networked refrigerators hooked into smart energy meters, alongside weaponised home security systems linked to indicators of local and global political unrest? A future of networked sensors and computerised gadgets could reduce users to passive actors in a tightly controlled world, or give them previously unseen levels of control over their environments through livecoding. Livecoding used playfully could explore the choreography of complex gestures and smart lighting systems. Product designers could livecode supply chain software to choose environmentally sound, local materials for furniture production. Scientists could disseminate research as livecoded data toolkits [4]. Using the techniques of “futuring” or “scenario planning,” we will develop short outlines of stories in the genre of “design science fiction.” Structured discussions and ideation exercises led by an academic in Product Design and Interaction will challenge participants to develop provocative but plausible future scenarios that may lead to new areas of research. Provoking discussion between computer-based practitioners and product designers around how smart, networked products could and should interact with their users should spark new insight and collaboration between the two disciplines.
Everything is digital. Everything is connected. Everything is algorithmic. Under this scenario, mastery of technology and code acquires new significance. The coder is not only able to create, modify or hack any algorithm; they can potentially abstract thought in algorithmic form and manipulate it through language, through code. The coder rises as a new kind of shaman, the new alchemist, capable of modifying reality through thought. The live coding performance can be seen as a ritual where different thoughts meet, abstracted algorithmically in the form of code. They are shared, manipulated and reconstructed by the coders. The ritual transcends the technological sphere and manifests itself.
[Exterior. Sunset / Nightfall] A park, a street, a square... a city.
#ViVoEnCode #LivingCode is a space intervention in two parts. A relationship is established with the people present, inviting them to "intervene" in the assembly. In this way, they agree to take part in a collective creation experience in which some of the ManifestoDraft's postulates are met ("Show us your screens"... "Algorithms are thoughts"...), but also the participants' own interpretations.
We are immersed in codes of all kinds, all the time: experiential, everyday, conscious and unconscious. Computer technologies, in turn, work through codes that affect our lives; at the same time, we interpret those codes and can modify them.
'pSY.dR34M5:karmaZqncer' is a live-coding audiovisual performance by voodoochild/. The real-time graphics are coded in Processing and controlled via OSC, while the music and sound performance is triggered and coded live from SuperCollider. The performance explores the opposition between clean, wide ambiences driven by strong white-noise basslines and noisy, saturated ambiences generated by waveform and sampling manipulation. Visually, the same opposition is explored through clean three-dimensional structures based on spherical and toroidal coordinates versus bitmap filtering, shaders and rendering operations. The performance seeks to expose the performative and processual aspects of an artwork whose main characteristic is to be produced by typing code, in tension with a conventional human role represented by the guitar player.
Live audiovisual performance with textile controllers being sewn in real time while their behaviors are (re)coded live with the Pure Data platform.
I have been working with computer music artist Sang Won Lee (University of Michigan), integrating gestural control of audio with his browser-based performance and visualisation interface, known as Live Writing. Together, we would like to contribute to ICLC 2017 by proposing the performance of a new audio-visual composition, entitled Jimmy raps, that combines Live Writing, the MYO armband, and the audio/programming components Max and Ableton Suite. Sang and I are co-developers of this performance.
Facilitator -> Rodrigo Velasco
‘On-the-fly e-lit’ is an invitation to explore diverse interrelations between live coding and electronic literature through ‘s2hs2’, an interface that allows us to assign a visual form to each letter on our computer keyboard. Then, sharing OSC messages between TidalCycles and Processing, we will deconstruct text and visuals through live coding, drawing and writing as live processes, looking to celebrate unexpected materialities.
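For a sense of the plumbing involved, a minimal Python sketch of sending per-letter OSC messages follows; the address '/letter', the port number, and the python-osc library are assumptions for illustration (s2hs2 itself shares OSC between TidalCycles and Processing):

from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

client = SimpleUDPClient("127.0.0.1", 12000)  # a Processing sketch listening here

def send_letter(char, x, y):
    """Send one keystroke with a screen position for its visual form."""
    client.send_message("/letter", [char, x, y])

for i, ch in enumerate("live coding"):
    send_letter(ch, 40 * i, 100)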
Random Corpus Binary is an audiovisual work in which the body is interpreted by technology, converted into a random projection on mobile screens that gives rise to uncategorized human forms emerging from the writing of code, randomness and improvisation. Visual arts, sound, software programming and design come together in a single staging in which seven people create through writing code in real time.
A performance interweaving the live coding of computer vision and music with a constant (de-)formation of the human body and its representation in space and time. On the stage, two live coders: one building the computer vision program, the other spinning a musical system; and the performer (dancer) moving her own body, live-hacking the system with her movement, interfering with and being interfered with by an ongoing audio-visual composition. The piece and the performance reflect on the fragility of bodies, matter and space, and on how human and machine perception can encounter similarities and discrepancies while experiencing the environment and themselves. Keywords: Live Coding > Body > Movement > Thinking in Action > Improvisation > Computer Vision > Programming > Making Visible > Process > Performance > Free/Open Source Software
The performance “Disentrenchment” is about making new connections between musical events and breaking up entrenched structures. We start with a sequence of notes in a well-known order (such as the beginning of a piece by W.A. Mozart). Over time, new connections are made, the old order is “un-learned”, and new patterns emerge, perhaps just as a hint, and may just as well disappear immediately in an ever-changing process. The main compositional parameter is a kind of entropy that changes over time, while the original order becomes less and less likely. If not constrained by the limits of the concert situation, this could go on forever.
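A minimal Python sketch of the disentrenchment idea, under stated assumptions (a stand-in melody and a simple first-order rule, not the piece's actual algorithm): an 'entropy' parameter rises over time, making the original order less and less likely.

import random

# A stand-in melody (not the actual Mozart quote used in the piece).
MELODY = [60, 62, 64, 65, 67, 65, 64, 62]

def next_note(prev, entropy):
    """With probability 'entropy', jump to any melody note; otherwise
    follow the original order (from the first occurrence of prev)."""
    if random.random() < entropy:
        return random.choice(MELODY)
    i = MELODY.index(prev)
    return MELODY[(i + 1) % len(MELODY)]

note, stream = MELODY[0], []
for step in range(64):
    stream.append(note)
    note = next_note(note, entropy=step / 64)  # entropy rises over time
print(stream)  # the well-known order dissolves as entropy grows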
'Improvisation in Painting And Composition', started in 2017, is an experimental performance by Yosuke Sakai (painter and programmer) and Hiroto Takeuchi (composer), which connects live painting and live composition via a sound spectrogram. Yosuke Sakai paints Sumi-e to make an image treated as a spectrogram (an image representing time, frequency and amplitude). Sumi-e is a traditional Japanese painting style, expressed through black-and-white gradation and the bleeding of ink. Regarding a painting as a sound spectrogram, we can make a sound from a painting; in that respect, painting can be regarded as coding. The generated sound is passed to the composer's environment, where Hiroto Takeuchi edits and composes a piece of music in real time to be played. Painting with brush and ink is not fully controllable: the sounds it generates are different every time, even if the painter tries to make exactly the same image, so the composer faces that uncertainty and has to handle it. The performance will be highly improvisational.
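The painting-as-spectrogram mapping can be sketched in a few lines of Python: image columns as time frames, rows as oscillator frequencies, brightness as amplitude. This additive resynthesis is an assumption for illustration, not the duo's actual pipeline.

import numpy as np

def image_to_audio(img, sr=44100, dur=5.0, fmin=100.0, fmax=4000.0):
    """Spectrogram-style resynthesis: 'img' is a 2D array in [0, 1],
    row 0 the lowest frequency band, columns read left to right as
    time; one sine oscillator per band, amplitude from brightness."""
    n_bands, n_frames = img.shape
    t = np.linspace(0, dur, int(sr * dur), endpoint=False)
    freqs = np.geomspace(fmin, fmax, n_bands)
    # map each audio sample to its image column (time frame)
    frame_idx = np.minimum((t / dur * n_frames).astype(int), n_frames - 1)
    audio = sum(img[b, frame_idx] * np.sin(2 * np.pi * freqs[b] * t)
                for b in range(n_bands))
    return audio / max(np.abs(audio).max(), 1e-9)  # normalise to [-1, 1]

# e.g. audio = image_to_audio(np.random.rand(24, 64))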
JazzCodes is a real-time composition and improvisation project in which a jazz quartet interacts with a fifth interpreter, a live coder. As real-time composition is the central concept to develop, the only element repeated each time is the way the SuperCollider “class” shows the audience the structural architecture and process of the piece.