ICLC 2017 Program

The ICLC 2017 workshops, concerts, and paper presentations are open to the general public, free of charge, and have limited capacity.

Note: The following conference program is subject to change without prior notice.

Centro Mexicano para la Música y las Artes Sonoras (CMMAS)
Casa de la Cultura (2nd floor)
Av. Morelos Norte 485, Centro.
Morelia, Michoacán, México.
Tel: +52 (443) 317 5679 and +52 (443) 313 8343

Centro Cultural Clavijero
El Nigromante 79, Centro Histórico, c.p. 58000.
Morelia, Michoacán, México.
Tel: +52 443 312 0412

Comité de Asuntos Intangibles
Bartolomé de Las Casas 488, Centro Histórico, c.p. 58000.
Morelia, Michoacán, México.

Tezla Music Gallery
Benito Juárez 194, Centro Histórico, c.p. 58000.
Morelia, Michoacán, México.
Tel: +52 443 312 1023

Jeudi 27
Valentín Gómez Farias 264, Centro Histórico, c.p. 58000.
Morelia, Michoacán, México.

9:00–9:30 hrs -> Registration -> Auditorio CMMAS

9:30–10:45 hrs -> Session 1: Paper presentations -> Auditorio CMMAS

11:00–11:55 hrs -> Session 2: Paper presentations -> Auditorio CMMAS

12:30–14:30 hrs -> Workshops -> CMMAS & Comité de Asuntos Intangibles

  • CineVivo: a mini-language for real-time visuals
    Location -> CMMAS classroom Led by -> Esteban Betancur

    CineVivo is a programming mini-language designed for developing visual pieces using pre-recorded videos and cameras, in the language of live cinema, with live coding as the main interface. The workshop proposes to build up the syntactic and semantic basis of the CineVivo language and, with it, to construct a live cinema piece together with the attendees, using material they have created or downloaded, or live cameras.

  • Live Coding in the Form of Remix Video
    Location -> Comité de Asuntos Intangibles Led by -> Cecilia Anda and Aide Violeta Fuentes

    This workshop is aimed at those who want a first approach to Live Coding but come from different backgrounds, and who want to try implementing coding in their respective areas. There is no need to know how to code or to have previous experience with it; an interest in Visual Narratives, Remix Culture, or just watching YouTube videos online is enough. The aim of the workshop is to explore and redefine the concepts of culture, remix, and digital immediacy by means of live visual coding with YouTube videos. The topic to explore: creating remix videos in real time, with no postproduction required. The workshop will be taught from a Design, Communication, and Digital Media perspective, covering how to turn basic programming into playing with code and learning by messing with videos; on the whole, creating Remix Culture by combining pre-existing materials (YouTube videos) into a new, creative, personalized product.

14:30–16:00 hrs -> Lunch

16:00–17:30 hrs -> Opening Concert -> Auditorio CMMAS

  • Iván Paz and Ian Medina: Live Coding through Perceptual Spots on the Parametric Space

  • En Casa is a performance created by approaching live coding through a parametric conception. It uses a rule system to reduce the operational complexity of remembering the different parameter configurations sought for the specific perceptual categories constituting the parts of the piece. The performer then uses the rules to automate the creation of material during the sections. The output is live mixed through different types of sound sources. The idea of modeling the parametric space is extended to the mixer and the sources, and sets of parameter settings are also chosen for each part. Drawings of the mixer settings are projected together with the code, showing how code and gear configurations shape the piece throughout its parametric space.

  • Joana Chicau: Tango for Us Two/Too

  • “Tango for Us Two/Too” is an assemblage of graphic experiments into a hybrid form of composition, combining principles of choreography with the formal structures of web-coding. As in choreography, web-design also deals with space, time, and movement qualities. It has been defining ways of moving, collectively or individually, through fluid yet complex landscapes of information displays, networked spaces, and multimedia environments. The performance being presented, and the notion of ‘choreographic coding’, is a technical as much as a social, cultural, and aesthetic experiment which can be expanded both at the level of web-design and at that of choreography.

  • Ivan Abreu and Malitzin Cortes: PSEUDO_CODE

  • A performative act of Pseudo_code: concrete real-time poetry, where language is literary and open to free interpretation by the public, generating algorithmic sound compositions; each word hides functions and mathematical possibilities of sound that are visually loaded with metaphors and hidden messages.

  • Shelly Knotts: Performance-Code Your Own Future

  • Code Your Own Future is an algorithmic crystal ball for live coders. Analyses of the performer's previous and current live coding performances are used as a basis to predict the future of the ongoing performance in realtime, and to determine the improvisational originality of the current performance. Music Information Retrieval and text analysis techniques are used to analyse an archive of code and audio files from the performer's previous performances. The live coding performance (using SuperCollider's JITLib to live programme synthesis) is augmented by a visualisation which shows a representation of the past, present and potential future of the current performance. This aims to gently encourage the performer into more innovative improvisation by using archive material to determine the originality of the current performance in relation to the performer's own archive. Realtime audio feature data relating to the past, present and potential future is mapped to greyscale, with features on the y axis and time on the x axis. The performer's current code is projected as it is being typed, alongside the most likely (in orange) and least likely (in blue) possible future code. The piece is part of an ongoing project to develop visualisations which show augmented information relating to the live coding performance.

18:00–20:00 hrs -> Workshop -> Comité de Asuntos Intangibles

20:30–01:00 hrs -> Algorave 1 -> Tezla Music Gallery

  • Spencer Salazar: Pulse
  • Nick Rothwell: Pulse Sequencing
  • Ian Jarvis: skeuomorph
  • Niklas Reppel: gamma808 - tuberías de plomo
  • Atsushi Tadokoro: A Voyage into Synesthetic Space
  • Algo0ritmos: multi-language performance
  • Sandro Miccoli: MicroCode
  • Jose Carlos Hasbún: JoseCaos
  • Rodrigo Frenk: Dead letters
  • Alex McLean: Algorave performance
  • Renick Bell: Algorave improvisation

10:00–14:00 hrs -> Workshops -> CMMAS

  • Developing a non-conventional computer language: Belousov-Zhabotinsky reactions
    Location -> CMMAS classroom Led by -> Jaime Alonso Lobato Cardoso
    Note: This workshop has a materials fee of 1,000 pesos.

    The term computer has been widely used in our society since the invention of automatic electronic machines for performing calculations, but such devices have not always been electronic, or even automatic; some are simply tools that help us calculate. We have the example of the Inca quipu, or the Mesopotamian abacus. This is also true for concepts such as code and algorithm: they are strongly associated with the modern computer (as it is used fundamentally in this conference), but it is entirely correct to think of Morse code as computational code, or of a cooking recipe as an algorithm. From this perspective, new research has been developed on computers that do not work with electricity. In this workshop we will learn to prepare Belousov-Zhabotinsky reactions and reflect on how to develop a form-based computing language.

  • Browser as Modular Synth: live coding distributed and networked visuals
    Location -> CMMAS classroom Led by -> Olivia Jack

    The workshop will explore methods for collaboration and modulation in live-coded visuals. Using WebRTC (web-based streaming), participants will receive and modify video and camera feeds from other participants and multiple devices in real time. The methods are inspired by analog, modular video synthesis, in which each browser/device outputs a signal or stream and receives streams from other browsers/devices. Participants are invited to experiment with feedback, glitch, latency, modulation, and network effects, in addition to more algorithmic ways of generating visuals. Open to all experience levels.

14:00–15:30 hrs -> Lunch

15:30–16:30 hrs -> Session 3: Keynote by Alex McLean -> Auditorio CMMAS

16:40–17:40 hrs -> Session 4: Panel “Corporality, collectivity, and deceleration in live coding practices” -> Auditorio CMMAS

  • Moderator: Rossana Lara
  • Emilio Ocelotl
  • Alejandro Franco
  • Malitzin Cortes
  • Tatiana Duque Durán

This panel will address the relationship and differences between live coding and the electroacoustic tradition. It will also discuss ways of understanding collectivity in live coding and the traits that would distinguish it from other manifestations of digital art and culture, as well as its differences from and similarities to capitalist discourses on networks and collaborative economies.

20:00–21:20 hrs -> Concert 2 -> Comité de Asuntos Intangibles

  • Alejandro Franco & Thomas Sánchez: Iterating Absences/Iterando las ausencias

    Performers: Liliana Rodríguez Alvarado (viola), Adriana López López (viola), Barush Fernández (percussion), Roger Vargas (percussion), Juan Sebastián Lach (keyboard), Rodrigo Treviño Frenk (electric bass), Diego Villaseñor de Cortina (flute), Luis Sánchez (flute), Claudia Cisneros (cello), Oscar René Mayoral Landavazo (cello), Diushi Keri (saxophone), Suleyma Guadalupe Vega Martínez (trumpet), Andrés Alejandro García Moreno (double bass), Ricardo Hernández Díaz (bassoon).

  • Iterating Absences is a heterogeneous and radical form of live coding performance with an instrumental musical output and a visual AI prediction of the score. The data used to generate the music and the visuals are obtained from an official online database of disappearances of journalists, activists, and women in Mexico. A major obstacle to understanding violence in the country is the lack of empathy of national and international civil society. The instrumental music is performed by 15 musicians using an audio-score system produced algorithmically via SuperCollider. The music generated is rhythm-oriented: statistical data are analyzed to generate a series of temporal micro-canons, producing topological, nonlinear, multiple rhythmic structures that are performed live. We extract frequencies and temporal features from the audio-score. The features are passed to an artificial intelligence entity which tries to predict the image that best matches the input. This method of generating a live composition and a live representation could reveal meaning at a subconscious level of sound-image representation.

  • Sean Cotterill & Tony Buckby: sampler/sampler

  • sampler/sampler is a networked performance investigating the commonalities between sampling practices in 16th century embroidery and contemporary electronic music, explored through contemporary live coding techniques. The performance is a digital emulation of Blackwork stitching techniques using a bespoke heuristic interface, in combination with a pattern sampling and sonification engine in SuperCollider. Through the performance, traditional Blackwork patterns are created and rendered as visual and sonic pattern samples by a performer trained in Blackwork embroidery. These patterns are then sent as samples to a second performer who sequences, plays back and manipulates the pattern samples, exploring their use as both sonic and visual data, using improvised live coding techniques.

  • Hernani Villaseñor, Libertad Figueroa, José Carlos Hasbun, Emilio Ocelotl and Eduardo H. Obieta: LiveCodeNet Ensamble

  • Created in October 2013, LiveCodeNet Ensamble is a networked live coding ensemble from Mexico City which explores the possibilities of improvised music and interconnection in order to interact, write, and modify source code on the fly within a collaborative environment to co-create music. The Ensemble is connected through a local network; therefore, a mediation of individual processes that build a collective sound is implied. This makes possible an artistic practice that shows the activity of writing code during a collective music improvisation. The main purpose of the Ensemble is to create music through different processes developed by a network of individuals interacting through sound and source code in a context of computer music. For the present edition of ICLC, LiveCodeNet Ensamble will perform a networked improvisation. Members of the Ensemble will write, modify, and share their source code using the software SuperCollider.

9:00–9:30 hrs -> Registration -> Auditorio CMMAS

9:30–10:45 hrs -> Session 5: Paper presentations -> Auditorio CMMAS

11:00–11:55 hrs -> Session 6: Paper presentations -> Auditorio CMMAS

12:30–14:30 hrs -> Workshops -> CMMAS & Comité de Asuntos Intangibles

  • Interfaces or tools or anything
    Location -> Comité de Asuntos Intangibles Led by -> Carlos Hasbún

    A workshop on SuperCollider language extensions and tool development.

  • SuperCollider on Bela workshop: audio and sensors on an embedded platform
    Location -> Comité de Asuntos Intangibles Led by -> Jack Armitage, Joanne Armitage, and Shelly Knotts

    This hands-on workshop introduces participants to using SuperCollider on Bela, an open source embedded platform for ultra-low latency audio and sensor processing based on the BeagleBone Black. Bela is designed to generate and process audio while connected to the physical world using all sorts of analog and digital sensors. During the workshop we will show how you can adapt your existing SuperCollider-based live coding setup to the Bela platform. We will also show how you can live code, or develop and test an instrument interactively and later move to a fixed setup where your code runs automatically on startup and is controlled remotely.

14:30–15:30 hrs -> Lunch

15:30–17:40 hrs -> Concert 3 -> Auditorio CMMAS

  • Alejandro Franco & Thomas Sánchez: Iterating Absences/Iterando las ausencias
  • Andrés Villa Torres: Noises of Forgetting

  • Noises of Forgetting is an installation and live coding performance. It plays with the augmentation of dust particles floating in apparently empty space. Dust consists of small fragments of matter floating in the atmosphere. Its sources are varied, containing a great deal of organic and inorganic matter, from terrestrial to interstellar origins. Its presence is barely visible, yet it is a reminder that all things are in perpetual motion and transformation, and that everything has a common origin and perhaps a common end. Dust betrays the invisible forces in constant play among things, forces that pull the world together and apart. The cycles of all things are thus interrelated in complex ways, which we can barely perceive and hardly conceive or describe. To a certain extent, all things are part of the same whole, yet the greater whole cannot be grasped. There is an intrinsic greatness within dust and its presence, which conveys a certain notion of the sublime, a certain experience of what is left in the world and of what we will leave behind and will be forgotten.

  • Talk with Anne Veinberg & Felipe Ignacio

  • Anne Veinberg and Felipe Ignacio: CodeKlavier

  • The CodeKlavier is a system which enables the pianist to code through playing the piano as a performative experience. It aims to address two main questions: can coding be an 'embodied experience' akin to a skilled musician making music? And how can you translate musical 'thinking' into code? Still in an early stage of its development, the CodeKlavier is currently geared to allow the pianist to live code their own electronic effects and accompanying sounds through the interface of an acoustic-MIDI piano. The CodeKlavier draws upon live coding and pianistic performance practices to forge a new practice which embodies both. In our presentation we will present two versions of the CodeKlavier: “hello world” and Motippets.

  • Jeremy Stewart: pl:2b

  • pl:2b is an audience driven movement performance wherein observers influence performer action and movement through an interactive web-interface via their mobile devices (iOS and Android). Interaction with the mobile interface generates moment-to-moment sound events while also periodically activating vibrational motors attached to performers’ bodies and sending auditory directives to performers’ headsets, altering the course of the performance in real-time.

18:15–20:15 hrs -> Workshop -> Comité de Asuntos Intangibles

  • Collaboration and Learning with Estuary
    Location -> Comité de Asuntos Intangibles Led by -> Luis Navarro, Jamie Beverley, and David Ogborn

    In this workshop, we will introduce and explore the possibilities of the newly created Estuary platform for collaborative, projectional live coding. Estuary lets people live code musical patterns in a web browser, and is organized around the idea of providing different interfaces for different situations (such as restricted tutorial interfaces, more complex interfaces meant for a solo performer, and collaborative interfaces meant for groups of people playing together over the Internet). Built on top of the TidalCycles language for live coding musical pattern, Estuary also emphasizes the idea of structure editing — often instead of typing one simply clicks to perform valid changes to a structure. Estuary requires no special installation and so participants are encouraged to bring their own laptops or tablets — we will learn about Estuary by logging in and using it from the first moment!

From 20:30 hrs -> Dinner with all participants -> Tezla Music Gallery

10:00–14:00 hrs -> Workshops -> CMMAS

  • Live Coding Ableton Live in Python with the Pulse Sequencer.
    Location -> CMMAS classroom Led by -> Nick Rothwell

    Outline: An introductory workshop on the Pulse Sequencer, an open-source Python-based step sequencing environment which traces its roots back 20 years to implementations for Opcode Max version 3, but now works in Max for Live.

    Abstract: The Pulse Sequencer started out as an early live-codeable pattern-based sequencer developed around 1996 as an object for the Max development environment. The sequencer operated in the MIDI domain only, as it predated MSP (Max's audio processing system) by about a year. The design foreshadowed projects like Gibber, featuring a very similar pattern syntax, and was influenced by Pyrite, an embedded language interpreter for Max which went on to become the scripting language for SuperCollider. Following numerous iterations, the current version is written in Python and runs inside Max for Live, giving it access to the Ableton Live DAW environment. Although it works primarily in the MIDI domain, it can also control device parameters, allowing for some step-oriented audio modulation.

    Topics to be covered include:
    • Basics: building chains and pulses
    • Cycling: loop and step control
    • Assembling: chain syntax and shortcuts
    • Timing patterns
    • Chain functions: indexing, transposing
    • Randomness and selection
    • Realtime inputs: keyboard and controllers
    • Parameter modulation
    Skills required:
    • working knowledge of Ableton Live
    • general editing and scripting skills

    Requirements: Ableton Live 9.x, Max for Live
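
As an illustration of the step-sequencing ideas listed above (chains, cycling, transposition), here is a generic Python sketch; it is not the Pulse Sequencer's actual API, and all names in it are invented for the example:

```python
import itertools

# Generic step-sequencer sketch (NOT the Pulse Sequencer API): a "chain"
# is a list of steps, where each step is a MIDI note number or None (a rest).
def play_chain(chain, transpose=0, steps=8):
    """Cycle through the chain, yielding `steps` notes, transposed in semitones."""
    cycle = itertools.cycle(chain)
    out = []
    for _ in range(steps):
        note = next(cycle)
        out.append(None if note is None else note + transpose)
    return out

base = [60, None, 64, 67]                 # C4, rest, E4, G4
print(play_chain(base, transpose=12, steps=6))
# -> [72, None, 76, 79, 72, None]
```

A real sequencer would emit these notes as timed MIDI events rather than collect them in a list; the cycling and transposition logic is the part this sketch is meant to show.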

  • Fiction Meets Livecoding in a Future of “Smart” Products.
    Location -> CMMAS classroom Led by -> Evan Raskob

    How might livecoding help users control the complex choreography of networked, “smart” gadgets? For example, how might they manage networked home refrigerators hooked into smart energy meters, alongside weaponised home security systems linked to indicators of local and global political unrest? A future of networked sensors and computerised gadgets could potentially reduce users to passive actors in a tightly controlled world, or give them previously unseen levels of control over their environments through livecoding. Livecoding used playfully could explore the choreography of complex gestures and smart lighting systems. Product designers could livecode supply chain software to choose environmentally sound, local materials for furniture production. Scientists could disseminate research through live-coded data toolkits. Using the techniques of “futuring” or “scenario planning,” we will develop short outlines of stories in the genre of “design science fiction.” Structured discussions and ideation exercises led by an academic in Product Design and Interaction will challenge participants to develop provocative but plausible future scenarios that may lead to new areas of research. Provoking discussion between computer-based practitioners and product designers around how smart, networked products could and should interact with their users should spark new insight and collaboration between the two disciplines.

14:00–15:30 hrs -> Lunch

15:30–16:30 hrs -> Session 7: Keynote by Shelly Knotts: Live Coding Fieldnotes -> Auditorio CMMAS

16:40–17:40 hrs -> Session 8: Panel “Meshworks: diversity and equity in live coding practice” -> Auditorio CMMAS

  • Moderator: Aide Violeta Fuentes Barrón
  • Joana Chicau
  • Alexandra Cárdenas
  • Hernani Villaseñor
  • Eldad Tsabary

19:00–20:55 hrs -> Concert 4 -> Second patio, Centro Cultural Clavijero

  • Alejandro Hernández Rosas, Pedro Aristóteles Benítez Vallejo and Emmanuel Anguiano Hernández: Remixing the thought/Remixeando el pensamiento

  • Everything is digital. Everything is connected. Everything is algorithmic. In this scenario, mastery of technology and code acquires new significance. The coder is not only able to create, modify, or hack any algorithm; they can potentially abstract thought into algorithmic form and manipulate it through language, through code. The coder rises as a new kind of shaman, the new alchemist, capable of modifying reality through thought. The live coding performance can be seen as a ritual where different thoughts meet, abstracted algorithmically in code form. They are shared, manipulated, and reconstructed by the coders. It transcends the technological sphere and manifests itself.

  • Visitante: #VivoEnCódigo #LivingCode

  • [Exterior. Dusk/Nightfall] A park, a street, a square... a city.
    #VivoEnCódigo #LivingCode is a two-part piece that intervenes in a space. A relationship is established with the people present, inviting them to “intervene” in the setup. In this way, they agree to take part in an experience of collective creation that fulfills some postulates of the ManifestoDraft (“Show us your screens”… “Algorithms are thoughts”…), while also introducing and mixing in the participants' own interpretations. We are immersed in all kinds of codes, all the time: experiential, everyday, conscious and unconscious. At the same time, computing technologies operate through codes that affect our lives, yet we interpret those codes and can modify them.

  • Christian Oyarzún: 'pSY.dR34M5:karmaZqncer'

  • 'pSY.dR34M5:karmaZqncer' is a live-coding audiovisual performance by voodoochild/. The real-time graphics are coded in Processing and controlled via OSC, while the music and sound performance is triggered and coded live from SuperCollider. The performance explores the opposition between clean, wide ambiences guided by strong white-noise basslines and noisy, saturated ambiences generated by waveform and sampling manipulation. Visually, the same opposition is explored through clean three-dimensional structures based on spherical and toroidal coordinates versus bitmap filtering, shaders, and rendering operations. The performance seeks to expose and make evident the performative and processual aspects of an artwork whose main characteristic has been to be produced through the typing of code, in tension with the conventional human role represented by the guitar player.

  • Paola Torres Núñez del Prado: Textile Patching (live coding with Pure Data and embroidered fabric controllers)

  • Live audiovisual performance with textile controllers being sewn in real time while their behaviors are (re)coded live with the Pure Data platform.

  • D. Andrew Stewart and Sang Won Lee: Jimmy raps with live writing

  • I have been working with computer music artist Sang Won Lee (University of Michigan), integrating gestural control of audio with his browser-based performance and visualisation interface, known as Live Writing. Together, we would like to contribute to ICLC 2017 by proposing the performance of a new audio-visual composition, entitled Jimmy raps, that combines Live Writing, the MYO armband, and the audio/programming components Max and Ableton Suite. Sang and I are co-developers of this performance.

9:00–9:30 hrs -> Registration -> Auditorio CMMAS

9:30–11:35 hrs -> Session 9: Paper presentations -> Auditorio CMMAS

  • Moderator: Juan Sebastián Lach
  • Amble Skuse and Shelly Knotts: Diversity = Algorithmic
  • Jessica Rodríguez & Rolando Rodríguez: Narratives through algorithmic mestizaje. A Mexican memoria
  • Renzo Filinich and Monica Salinero: Sound art in Chile: Code, Technology and Performance
  • Nick Rothwell: Live Coding in the 1990s: The Pulse Sequencer

11:35–12:15 hrs -> Session 10: Open talk with all participants -> Auditorio CMMAS

12:15–14:00 hrs -> Lunch

14:00–18:00 hrs -> Workshop -> Comité de Asuntos Intangibles

  • 'On-the-fly e-lit'
    Led by -> Rodrigo Velasco

    ‘On-the-fly e-lit’ is an invitation to explore diverse interrelations between live coding and electronic literature through ‘s2hs2’, an interface that allows us to assign a visual form to each letter on our computer keyboard. Then, sharing OSC messages between TidalCycles and Processing, we will deconstruct text and visuals through live coding, drawing and writing as live processes, looking to celebrate unexpected materialities.
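
Since the workshop relies on OSC messages passing between TidalCycles and Processing, a minimal sketch of how an OSC 1.0 message is laid out on the wire may be useful. This is an illustrative encoder only; the /letter address and its arguments are invented for the example and are not part of the s2hs2 interface:

```python
import struct

def osc_string(s):
    """Encode a string as OSC requires: ASCII, null-terminated, padded to 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    pad = (4 - len(b) % 4) % 4
    return b + b"\x00" * pad

def osc_message(address, *args):
    """Build an OSC 1.0 message: address, type-tag string, then big-endian args."""
    typetags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            typetags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, float):
            typetags += "f"
            payload += struct.pack(">f", a)
        elif isinstance(a, str):
            typetags += "s"
            payload += osc_string(a)
    return osc_string(address) + osc_string(typetags) + payload

# Hypothetical message: which letter was typed, and a repetition count.
msg = osc_message("/letter", "a", 1)
print(msg)  # 20 bytes: "/letter", ",si", "a", then the int 1
```

In practice one would send these bytes over UDP (e.g. with the `socket` module) to the port Processing is listening on; libraries such as python-osc do this encoding for you.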

18:10–19:55 hrs -> Closing Concert -> Auditorio CMMAS

  • Esteban Betancur, Luis Rodriguez, Tatiana Duran, Luis Castro, Erica Florez, Edwin Cortes & Sara Henao: Random Corpus Binary

  • Random Corpus Binary is an audiovisual work in which the body is interpreted by technology and turned into a random projection on mobile screens, giving rise to uncategorized human forms that emerge from the writing of code, randomness, and improvisation. Visual arts, sound, software programming, and design come together in a single staging in which seven people create through writing code in real time.

  • Dragica Kahlina, Joana Chicau & Andres Villa Torres: Transmutations

  • A performance interweaving the live coding of computer vision and music with a constant (de-)formation of the human body and its representation in space and time. On the stage two live coders: one building the computer vision program; the other spinning a musical system; and the performer (dancer) moving her own body— live hacking the system with her movement, interfering and being interfered by an ongoing audio-visual composition. The piece and the performance reflect on the fragility of the bodies, matter and space. On how human and machine perception can encounter similarities and discrepancies, while experiencing the environment and themselves. Keywords: Live Coding > Body > Movement > Thinking in Action > Improvisation > Computer Vision > Programming > Making Visible > Process > Performance > Free/Open Source Software

  • Niklas Reppel: Disentrenchment

  • The performance “Disentrenchment” is about making new connections between musical events and breaking up entrenched structures. At first, we start with a sequence of notes in a well-known order (such as the beginning of a piece by W.A. Mozart). Over time, new connections are made, the old order is “un-learned”, and new patterns emerge, maybe just as a hint, and might just as well disappear immediately in an ever-changing process. The main compositional parameter is a kind of entropy that changes over time, while the original order becomes more and more unlikely. If not constrained by the limits of the concert situation, this could go on forever.
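
The process described could be sketched as follows; this is a hedged Python illustration of the general idea (rising entropy gradually breaking a fixed order), not the performer's actual system:

```python
import random

def disentrench(notes, steps, entropy, seed=0):
    """Replay `notes` in their original order, but at each step jump to a
    random position with probability entropy(step), so that over time the
    entrenched order becomes more and more unlikely."""
    rng = random.Random(seed)
    i = 0
    out = []
    for step in range(steps):
        if rng.random() < entropy(step):
            i = rng.randrange(len(notes))  # break the entrenched order
        out.append(notes[i])
        i = (i + 1) % len(notes)
    return out

melody = [60, 62, 64, 65, 67]              # a stand-in for the Mozart opening
print(disentrench(melody, 10, lambda s: s / 10))
```

With entropy fixed at zero the original sequence survives intact; as the entropy curve ramps up, fragments of the original order appear only as hints inside an ever-changing stream, matching the process the piece describes.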

  • Yosuke Sakai & Hiroto Takeuchi: Improvisation in Painting And Composition

  • 'Improvisation in Painting And Composition', started in 2017, is an experimental performance by Yosuke Sakai (painter and programmer) and Hiroto Takeuchi (composer), which connects live painting and live composition via sound spectrogram. Yosuke Sakai will paint Sumi-e to make an image treated as a spectrogram (an image representing time, frequency, and amplitude). Sumi-e is a traditional Japanese form of painting, expressed through gradations of black and white and the bleeding of ink. By regarding a painting as a sound spectrogram, we can make a sound from a painting; in that respect, painting can be regarded as coding. The generated sound is passed to the composer's environment, where Hiroto Takeuchi will edit and compose a piece of music in real time, to be played. Painting with brush and ink is not fully controllable; the sounds generated are different every time, even though the painter tries to make exactly the same image, so the composer will face this uncertainty and have to handle it. The performance will be highly improvisational.

  • Jaime Lobato: JazzCodes

  • JazzCodes is a real-time composition and improvisation project where a jazz quartet interacts with a fifth interpreter, a live coder. As real-time composition is the central concept to develop, the only element that repeats each time is how the SuperCollider “class” shows the audience the structural architecture and process of the piece.

20:30–01:40 hrs -> Algorave 2 -> Jeudi 27

  • Ulysses Popple (reactive visuals)
  • Evan Raskob: BITLIP A/V SET
  • Sean Lee: Gallium
  • Cybernetic Orchestra: Cybernetic Orchestra and friends
  • Shawn Lawson and Ryan Smith: EV9D9
  • Jeremy Stewart: be_01
  • Jamie Beverley: CrowdPatching, a distributed algorave performance
  • Jason Levine and May Cheung: Scorpion Mouse Performance at ICLC 2017
  • Sean Cotterill: co34pt vs howto_co34pt_liveCode
  • Thomas Murphy: Half Asleep
  • Jessica Rodriguez, Marianne Teixido Guzmán, Emilio Ocelotl & Luis Navarro Del Angel: Peleas en la Coliseo
  • David Ogborn: + d0kt0r0
  • Shelly Knotts and Joanne Armitage: ALGOBABEZ
  • Jack Armitage (Lil Data): Lil Data
  • Jason Levine, Charles Bicari, Marc Matatya & Rodrigo Velasco: ELECTROLIVE, 'Aural visions on-the-fly'