Live-Coding for Audiovisual Performance by Kaspars Jaudzems

Workshop recommended for online participation

Key-words: SuperCollider, live-coding

What to expect?

This workshop tends towards the practical: the host will give an introduction and instructions and present background and theory; participants will follow the lecture and instructions, do assignments, and create original work based on the new knowledge and skills, producing artworks individually or in groups.

Description:

In this workshop we will investigate how to connect different sources and external devices to SuperCollider with the aim of creating visual components for a live-coding performance or an artwork. We will work on creating a visual installation using sound or sensor data as input.
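As a small taste of that workflow, here is a minimal SuperCollider sketch that tracks the loudness of the sound input and forwards it as OSC to a visuals application. The target port 9000 and the /visuals/level address are assumptions to replace with whatever your visuals software expects.

    (
    // Sketch: analyse the loudness of the sound input on the server and
    // forward it as OSC to a (hypothetical) visuals application.
    s.waitForBoot {
        SynthDef(\ampTracker, {
            var in = SoundIn.ar(0);
            // report the amplitude to the language 20 times per second
            SendReply.kr(Impulse.kr(20), '/amp', Amplitude.kr(in));
        }).add;

        s.sync;
        Synth(\ampTracker);

        // forward each value; port 9000 and /visuals/level are placeholders
        ~visuals = NetAddr("127.0.0.1", 9000);
        OSCdef(\ampToVisuals, { |msg|
            ~visuals.sendMsg("/visuals/level", msg[3]);
        }, '/amp');
    };
    )

Any OSC-capable visuals environment (Processing, openFrameworks, Hydra, TouchDesigner and so on) can listen for these messages and map them onto parameters of the image.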

We will cover a broad spectrum of topics, such as MIDI, OSC and DMX, that can help to connect and synchronize the different parts of an artwork. We will also learn some live-coding code patterns and look at examples and artworks related to these topics.
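As a first example of such a pattern, the sketch below plays a repeating note sequence on external MIDI hardware. The device and port names are placeholders; list the real names with MIDIClient.destinations after running MIDIClient.init.

    (
    // Sketch: drive an external MIDI device from a live-coded pattern.
    // "MyDevice" and "MyPort" are placeholders for your own hardware.
    MIDIClient.init;
    ~midiOut = MIDIOut.newByName("MyDevice", "MyPort");

    Pdef(\midiNotes,
        Pbind(
            \type, \midi,        // send the events as MIDI, not to the server
            \midiout, ~midiOut,
            \chan, 0,
            \midinote, Pseq([60, 63, 67, 70], inf),
            \amp, 0.8,
            \dur, 0.25
        )
    ).play;
    )

Re-evaluating the Pdef while it plays replaces the pattern without stopping it, which is the basic live-coding move.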

Live-coding is used to create sound and image driven by different inputs such as media, light systems, dance, text, voice and sound. It is also used as a computer-music tool in performances and can be combined with algorithmic composition. Sometimes the process of writing the code is made visible to the audience by projecting the performer's screen. Live-coding techniques are also employed outside performances, for producing sounds and audiovisual material for other artworks.

Workshop participants should have a working installation of SuperCollider and/or TidalCycles on their computers before the start of the workshop. Face-to-face participants can join the workshop with their own computer.

Day 1: How to send MIDI from live-coding environments to different hardware, how to receive MIDI, and what can be done with the received data. At the end of the day, we will write some code patterns to control sound and lights.

Day 2: We will look at the OSC standard: how to receive OSC data from different sensors and Android phones, and how to use this data in a live-coding environment to influence sound. We will also investigate how to send OSC data from a live-coding environment to create visuals for a sound performance or artwork (see the first sketch after this schedule).

Day 3: We will learn more advanced ways of creating audiovisual patterns using live-coding, or investigate the DMX standard: how to send DMX data to stage lights. We will cover the basics of stage-lighting concepts such as Fixtures, Scenes and Chases, and run some examples on the available hardware (see the second sketch after this schedule).

Day 4: Practical creative work. Developing artworks individually or in groups.
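To give a flavour of the Day 2 material, here is a minimal sketch that maps incoming accelerometer data from a phone to a filter cutoff. It assumes an app on the phone sending OSC to SuperCollider's default language port 57120 with an address such as /accxyz; both the address and the value range depend on the app and should be adapted.

    (
    // Sketch: map one axis of a phone accelerometer to a filter cutoff.
    // The OSC address '/accxyz' and the value range are assumptions;
    // adapt them to whatever the app on the phone actually sends.
    s.waitForBoot {
        ~drone = { |cutoff = 800|
            RLPF.ar(Saw.ar([50, 50.3], 0.2), cutoff.lag(0.1), 0.3)
        }.play;

        OSCdef(\phoneAccel, { |msg|
            var x = msg[1];    // first accelerometer axis
            ~drone.set(\cutoff, x.linexp(-10, 10, 200, 5000));
        }, '/accxyz');
    };
    )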
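And as a pointer towards the Day 3 material: SuperCollider has no built-in DMX support, so one common approach is to write raw frames to a USB-DMX interface through its serial port. The sketch below is only an outline for an Enttec DMX USB Pro and the packet format documented by the manufacturer (start byte 0x7E, label 6 for sending a DMX packet, a two-byte length, the data, end byte 0xE7); the serial-port path is a placeholder, and other interfaces need a different protocol or an OSC/MIDI-to-DMX bridge.

    (
    // Sketch: set DMX channel 1 to full through an Enttec DMX USB Pro.
    // The device path is a placeholder; check SerialPort.devices for yours.
    var port, channels, length, packet;
    port = SerialPort("/dev/ttyUSB0", baudrate: 57600);

    channels = Int8Array.fill(512, 0);
    channels[0] = 255;                  // channel 1 at full intensity

    length = channels.size + 1;         // +1 for the DMX start code
    packet = Int8Array[0x7E, 6, (length & 0xFF), (length >> 8), 0]
        ++ channels ++ Int8Array[0xE7];

    port.putAll(packet);
    port.close;
    )

For continuous light control, the same kind of frame would be sent repeatedly (for example from a Routine) rather than once.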