For many TouchDesigner developers, seeing incredible concert visuals synced to music on stage for the first time is what led them to discover the software and the larger interactive and immersive field. Concert visuals continue to be a source of inspiration for many, and live shows remain one of the few places where you can see real-time generative graphics on a large scale.
For beginner developers, concert visuals are an interesting challenge: you need to create not only visual interest on stage, but also a sense of synchronization with the music being performed. In this post, we’ll look at two different approaches to creating concert visuals in TouchDesigner, using MIDI data as well as audio data to animate the visual content.
Generative Audio/Visual Environments – Capstone
To put what you learn from this post into practice, check out this course where I guide you through setting up TDAbleton to generate a simple instancing-based visualizer for an ambient track made in Ableton.
Useful Types of Data for Concert Visuals in TouchDesigner
When working to create concert visuals, one of the most common desired outcomes is to synchronize certain visual elements with the music. There are a variety of ways to achieve this, but the two types of data that we’re going to look at are MIDI data and audio signals.

MIDI is a standard used by musical instruments and interface controllers to send event data between devices. Especially when creating concert visuals for electronic music, MIDI data is an easily accessible way to find out the exact moment a particular event happens (a drum hit is triggered, a synthesizer note is played, etc.), so it can be of great use for creating concert visuals that are tightly synchronized with the music.

That said, MIDI data isn’t always available. If you’re dealing with live instruments on stage, or even just trying to synchronize certain elements to a backing track, you often have no choice but to use the audio signal itself as the trigger for certain effects. Fortunately, it’s still possible to build up effects this way, especially if the audio has very rhythmic elements.
In this post, we’re going to focus on using TDAbleton for accessing audio and MIDI data. TDAbleton is a toolset that tightly links TouchDesigner with Ableton Live, offering access to nearly everything going on in an Ableton set, including project transport, tempo, track MIDI, audio levels, and more. That said, many of the concert visual techniques we cover today are still applicable even if you’re using a different set of tools.
We’re going to cover a set of generic techniques that aren’t specifically tied to one particular Ableton set. Along with that, we’re also going to assume that you’ve already set up your Ableton/TouchDesigner projects to communicate. See the TDAbleton documentation page for more info on the setup process.
Using MIDI Data For Concert Visuals
In this effect, MIDI note triggers from a synthesizer playing a melody line in the Ableton set drive both the vertical position and the color of the five boxes. Each box represents one of the five notes in the melody line, from lowest on the left to highest on the right.
CHOP Network

The MIDI data is brought into TouchDesigner via the abletonMIDI COMP, which gives you direct access to the MIDI note output of any track in the Ableton set. Then, a Select CHOP is used to select the individual note channels and remove the rest. The Trigger CHOP is used to generate an adjustable ADSR envelope for each note’s channel. The values here are adjusted to taste; we settled on an Attack Length of 0.1 seconds and a Decay Length of 0.5 seconds.
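If you’re new to the Trigger CHOP, the rough shape of what it produces for each note-on event is sketched below in plain Python. This is only an illustration of the attack/decay curve using the values above, not how the CHOP itself is implemented (the real operator also offers peak, sustain, and release stages).

```python
# Illustrative sketch of the attack/decay envelope produced for one note trigger.
# Times are in seconds and match the values chosen above.
ATTACK = 0.1   # Attack Length: ramp up to full value
DECAY = 0.5    # Decay Length: fall back down (sustain assumed to be 0 here)

def envelope(t):
    """Envelope value t seconds after a note-on event."""
    if t < 0:
        return 0.0
    if t < ATTACK:                        # attack: ramp 0 -> 1
        return t / ATTACK
    if t < ATTACK + DECAY:                # decay: ramp 1 -> 0
        return 1.0 - (t - ATTACK) / DECAY
    return 0.0                            # note has fully decayed
```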
Since this data is going to be used to adjust the Y axis position of each of the boxes, a Math CHOP is added next to control the range of movement. In the Math CHOP, set the To Range parameter to (-1, 0.3), so that the boxes sit slightly lower on the Y axis and have a longer path of travel upwards.
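If it helps to see the math, the Range page of the Math CHOP is just a linear remap. Assuming a From Range of (0, 1) coming out of the Trigger CHOP (an assumption; match it to your own setup), the mapping to (-1, 0.3) looks like this:

```python
# Linear remap equivalent to the Math CHOP's Range page.
FROM_MIN, FROM_MAX = 0.0, 1.0   # assumed From Range
TO_MIN, TO_MAX = -1.0, 0.3      # To Range used in this example

def remap(v):
    """Map v from the From Range into the To Range."""
    normalized = (v - FROM_MIN) / (FROM_MAX - FROM_MIN)
    return TO_MIN + normalized * (TO_MAX - TO_MIN)

# remap(0.0) -> -1.0 (box at rest), remap(1.0) -> 0.3 (box at the peak of a note)
```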
Next, a Shuffle CHOP set to Swap Channels and Samples generates a single CHOP channel (the five note channels collapse into one channel with five samples, one per box) for use in modifying the SOP geometry, which in turn is what actually defines the instance positions. We’ll look at this next.
SOP Network

Above the CHOP network, we have two SOPs which are used to define the position of the instances generated in the Geo COMP. The Line SOP is used to define the initial positions of the boxes (see the image below for settings). Note the expression used for Number of Points: it makes sure that the Line SOP generates the same number of points as the number of samples in null1. Using an expression here gives us the flexibility to add more instances of geometry later on.
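The exact expression isn’t reproduced here, but an expression along these lines (a sketch, assuming the last CHOP in the chain is named null1 as above) is a typical way to do it:

```python
# Number of Points expression on the Line SOP: always match the sample count of null1,
# so adding channels/samples upstream automatically adds points (and instances) here.
op('null1').numSamples
```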
Next, the CHOP To SOP takes the CHOP data from the null1 CHOP and applies it to the Y position of each point contained within the line. The settings are shown below.
The Box SOP that is connected to the Geo COMP has had the Size parameter adjusted to (1, 0.4, 1).
The last step is to set up instancing in the Geo COMP. Turn the Instancing parameter on, and then make the settings changes shown in the two images below to generate, translate, and apply color to instances of the Box SOP.
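If you prefer to do this from a script rather than the parameter dialog, a minimal sketch is shown below. The operator names (geo1, chopto1) and parameter names are assumptions based on recent TouchDesigner builds; the images above remain the reference for the full settings.

```python
# Minimal instancing setup sketch; verify parameter names against your build.
geo = op('geo1')                          # the Geo COMP doing the instancing (name assumed)
geo.par.instancing = True                 # turn instancing on
geo.par.instanceop = op('chopto1').path   # SOP whose points position the instances (name assumed)
# The per-instance Translate and Color attribute menus are then assigned
# in the parameter dialog exactly as shown in the two images above.
```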
All that’s left is to add a rendering pipeline, adjust the camera, and add a feedback effect! Within the feedback loop, we added a Transform TOP and adjusted the Translate Y parameter to -0.03, which creates the stacked effect seen above.
Using Audio Data for Concert Visuals
In this example, rather than using MIDI data, we’ll use the Ableton Level device to pull in the audio levels from a kick track. This data is processed by a combination of the Trigger CHOP and Count CHOP. If you’re not using Ableton and instead have access to live audio input (or even playback of an audio file), you can use the audioAnalysis COMP, found in the Tools folder in the Palette, to generate trigger signals from filtered portions of the audio signal.

Here, we’re using the Trigger CHOP to generate quick pulses when the incoming kick audio level passes a certain threshold. The Trigger CHOP’s Attack Length, Peak Length, Sustain Level, and Release Length have all been set to 0. Decay Length has been set to 0.05 seconds. The Trigger Threshold has been set to 0.9 to filter out any potential for double triggering, but this setting is highly dependent on your Ableton set.

After that, the Count CHOP further processes the signal by counting upwards any time an off-to-on transition is detected. The Limit parameter is set to Loop Min/Max, and the Limit Maximum is set to 16. This maximum was a somewhat arbitrary choice, as the kick drum triggers in this particular Ableton set are irregular. The data is used to rotate the box geometry in the Geometry COMP on the right, and 16 steps gave a more complete sense of rotation than smaller values.
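Conceptually, the Trigger CHOP and Count CHOP combination boils down to the logic sketched below in plain Python: compare the kick level against the threshold, and on each off-to-on transition advance a counter that loops back after 16 steps. This is only an illustration of the behaviour, not a replacement for the CHOPs.

```python
# Plain-Python illustration of the Trigger CHOP + Count CHOP behaviour described above.
THRESHOLD = 0.9   # Trigger Threshold
LOOP_MAX = 16     # Count CHOP Limit Maximum with Limit set to Loop Min/Max

count = 0
previously_on = False

def process_sample(kick_level):
    """Call once per sample of the kick level channel; returns the current step."""
    global count, previously_on
    is_on = kick_level > THRESHOLD
    if is_on and not previously_on:       # off-to-on transition: a new kick hit
        count = (count + 1) % LOOP_MAX    # count up, looping back to 0
    previously_on = is_on
    return count
```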
The final piece of the puzzle is using the data from the Count CHOP to drive the Geo COMP’s Rotate parameters. For this, set ry and rz to the expression op('count2')['kick']*(45/2). This pulls in the Count CHOP’s data and rotates the shape an additional 22.5 degrees on the Y and Z axes with each hit of the kick drum; since the Count CHOP loops at 16 steps, the geometry completes a full 360-degree rotation every 16 kicks.
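If you’d rather assign those expressions from a script, something like the snippet below does the same thing. The Geo COMP name is an assumption; use whatever yours is called.

```python
# Drive the Geo COMP's Rotate Y/Z from the looping kick count (45/2 = 22.5 degrees per hit).
geo = op('geo1')   # name assumed
expr = "op('count2')['kick'] * (45 / 2)"

for p in (geo.par.ry, geo.par.rz):
    p.expr = expr
    p.mode = ParMode.EXPRESSION   # make sure the parameter evaluates the expression
```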
Add a render pipeline and some post processing, and you get the result above! I used a combination of the rgb Delay, solarize and bloom effects from the ImageFilters folder in the Palette.
What’s great about this particular approach is that it doesn’t require access to MIDI data to trigger specific video animations for the performance. On top of that, with the Count CHOP keeping track of the number of triggers, you can build up the complexity and have certain components of the animation occur only on specific steps. There’s a lot of room to explore!
A Final Note: Visualizing the Audio Spectrum Directly
It’s also worth mentioning that you can visualize the audio spectrum directly! We’re not going to go into detail on the methodology here, as Derivative has already put together a great example of this in practice in the POPs Sample package, which you can download from the Experimental Release Notes page of the Derivative documentation. Note that you will need to download the latest experimental release (at the time of writing, 2025.31310) to run these examples.
Once downloaded, open the Overview.toe file and navigate to the Trail_POP Base COMP and then the JoyDivision Base COMP. There you’ll find this effect, which is a cool visualization of the audio spectrum as 3D geometry over a period of two seconds using the Trail POP.
Concert Visuals Tutorial
For a more in-depth look at creating visuals for concerts and stage performances, watch our YouTube tutorial here:
Wrap-Up
As usual, we’ve barely scratched the surface of the vast realm of possibilities available for creating concert visual effects in TouchDesigner! Whether you’re working with audio, MIDI, or other data sources, there are countless opportunities to design beautiful and dynamic visuals that enhance the scene and amplify the art of live musical performance.
DJs, artists, and performers alike love how concert visuals, often delivered through sophisticated projectors and LED walls, transform the stage into an immersive environment. Getting involved in creating them is a great way to pick up new techniques while helping to elevate live event design and the overall audience experience.
We hope that these simple examples give you a bit of direction as you start to think about building your own visuals with the tools you have access to!