Live Audiovisual Performance Set
This page documents my work in live multimedia performance. All audio and video in these works is created, generated, and manipulated live. Most of my works are built around virtual instruments that produce both visual and auditory effects, often developing into self-contained systems with a number of symbiotic relationships. I can perform live sets from 10 minutes (usually a single work) to around 40 minutes, in surround sound or stereo depending on the venue.
2017XOX2: Playing Chaos
Performance (music, dance, theatre, etc.) has traditionally been an act of creating order in a chaotic world. 2017XOX2 embraces chaos as a means of expression. The performer can still shape the audio and visuals expressively and intuitively, but can never fully control the outcome of their actions.
Three interconnected systems form the basis for this project: a particle- and simulation-based visual system, a synthesis-based audio system, and a human performer. The human performs on a MIDI keyboard with access to 49 notes, each of which triggers a particle at a different position along the x axis of the visual system. The initial velocity of the particle is determined by how hard the key is struck; the particle is then free within the system. A particle may collide with the walls of the system, which causes it to break into more particles, and every time a new particle is created it produces a new note in the audio system. The new notes are calculated algorithmically, based on the notes the performer has been playing*.

The performer also has access to forces within the particle system, namely chaotic movement along the x and y axes, which can be applied individually or together by holding down different drum pads. The performer may also change the random seed of the chaotic effect, which causes the particles to congregate in different areas. A video feedback system is also at work; it produces the trails behind the particles and the buildup of colour. The performer may wipe the visual system clear of feedback at any time.
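The key-to-particle mapping and the collision rule described above can be sketched roughly as follows. The function names, canvas width, lowest MIDI note, and the note-selection rule are all illustrative assumptions, not the actual implementation.

```python
import random

NUM_KEYS = 49
WIDTH = 1280  # hypothetical canvas width in pixels

def spawn_particle(midi_note, velocity, lowest_note=36):
    """Map a MIDI key press to a particle on the x axis.

    The key index (0-48) picks the x position; how hard the key is
    struck (MIDI velocity, 0-127) sets the particle's initial speed.
    """
    key_index = midi_note - lowest_note          # 0 .. 48
    x = key_index / (NUM_KEYS - 1) * WIDTH       # spread keys across x
    speed = velocity / 127.0 * 10.0              # strike strength -> speed
    return {"x": x, "y": 0.0, "vx": 0.0, "vy": speed, "note": midi_note}

def collide_with_wall(particle, recent_notes, fragments=3):
    """On a wall collision, break a particle into several new ones.

    Each child particle triggers a new note. As a stand-in for the
    actual algorithm, the pitch is derived from notes the performer
    has recently played, shifted by a consonant interval.
    """
    children = []
    for _ in range(fragments):
        base = random.choice(recent_notes) if recent_notes else particle["note"]
        child = dict(particle)
        child["note"] = base + random.choice([-7, -5, 0, 5, 7])
        child["vx"] = random.uniform(-5.0, 5.0)
        children.append(child)
    return children
```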
The audio system feeds off changes in the visual system, translating its visual characteristics into audio via synthesis. Aside from triggering a new note at every particle birth, the audio system tracks the overall brightness of the visuals, producing a sound corresponding to the amount of red, green, and blue in the image. It also tracks the amount of chaotic energy applied to the system and creates an auditory counterpart to the visual effect. The acts of cancelling the video feedback and changing the random seed of the chaotic forces also have auditory counterparts.
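The brightness-to-sound mapping can be sketched as averaging each colour channel of the current frame and using the result to drive a corresponding sound; this is a minimal illustration under assumed data shapes, not the actual synthesis code.

```python
def channel_levels(frame):
    """Return the mean red, green, and blue of an RGB frame.

    `frame` is a list of (r, g, b) pixels with components in 0-255.
    Each mean is normalised to 0.0-1.0, a range suitable for use as,
    say, the amplitude of one of three oscillators.
    """
    n = len(frame)
    r = sum(p[0] for p in frame) / (n * 255)
    g = sum(p[1] for p in frame) / (n * 255)
    b = sum(p[2] for p in frame) / (n * 255)
    return r, g, b
```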
Exploration//Transposition//Distortion
Exploration//Transposition//Distortion is a work that fuses the kinetic energy of percussion performance with live cutting and manipulation of prerecorded video. The base material for E//T//D is footage of Oxford Circus shot at rush hour, which the performer manipulates live with an electronic drum pad.
NON Concentric ARCs
This work is based around the Leap Motion 3D hand-tracking device. Hand and finger position data is used to generate audio and visuals simultaneously. These gestures are then captured and looped, producing endlessly iterating soundscapes from the interactions between different loops (video documentation coming soon).
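The capture-and-loop idea can be sketched as a buffer of tracked hand positions that, once recorded, replays endlessly frame by frame. The class and method names are hypothetical, chosen only to illustrate the looping behaviour.

```python
class GestureLoop:
    """Record a stream of hand-position frames, then replay it forever."""

    def __init__(self):
        self.frames = []   # recorded (x, y, z) hand positions
        self.pos = 0

    def record(self, frame):
        """Append one tracked hand position to the loop."""
        self.frames.append(frame)

    def next_frame(self):
        """Return the next frame, wrapping around so the loop never ends."""
        frame = self.frames[self.pos]
        self.pos = (self.pos + 1) % len(self.frames)
        return frame
```

Several such loops running at different lengths drift in and out of phase, which is one way the "interactions between different loops" described above can arise.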
Benjamin Heim is an Australian composer and multimedia performer specialising in immersive art-music events and the intersection between technological innovation and expressive performance. Primarily a composer and events producer whose works have been performed around the world, Ben is now making waves in the live multimedia performance sector. In 2017 he performed at Tallinn Music Week and produced PHANTOMS, a unique immersive live-streaming event at the Royal College of Music. His collaborative work Drawing Sound has received much attention in the art world for its innovative fusion of live drawing, machine learning, and generative audio; it was presented at IRCAM's 2017 VERTIGO Conference and has since garnered a mention in Frieze Magazine. Ben is also known for his commercial scores, interactive video game music, and poetry.