The Interactive & Immersive HQ

Using Cameras as Input Devices in TouchDesigner

Today we will take a look at unconventional ways to use cameras as creative input devices in TouchDesigner. Let’s explore a few examples and techniques.

There are tons of ways to use cameras in TouchDesigner for creative purposes. From simple effects to complex computer vision tasks, we can make the most of live signals both in conventional and unconventional ways.

In this article, we will explore four examples that combine several techniques ranging from glitch textures to particle systems. So let’s turn on the camera and start patching!

Camera-based Glitch Generator

One interesting way to use a camera as a creative input is to distort its signal. We can generate acid glitch visuals that work perfectly in VJ sets or abstract art installations. In our first example, we will start with a Rectangle SOP whose X and Y sizes are constantly modified by a CHOP random number generator.

The generator is made up of an LFO CHOP connected to a Count CHOP, with a Noise CHOP modulated by the Count. We then apply a bit of math to force the values positive and multiply them together.
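Outside of TouchDesigner, the behavior of this CHOP chain can be sketched in plain Python. This is just an illustration of the idea, not the actual patch: the function name, the seed, and the `scale` multiplier are all made up for the example.

```python
import random

def rectangle_size(count, scale=2.0, seed=0):
    """Rough stand-in for the CHOP chain: the Count value (incremented
    by the LFO) picks a noise sample, abs() mirrors the math that forces
    the value positive, and the multiply rescales the result."""
    rng = random.Random(seed + count)       # Noise CHOP modulated by the Count
    noise = rng.uniform(-1.0, 1.0)
    return abs(noise) * scale               # make positive, then multiply

# As the count advances, the rectangle's X/Y sizes jump to new values.
sizes = [rectangle_size(c) for c in range(5)]
```

The key point is that the same count always yields the same size, so the rectangle only changes when the LFO advances the Count CHOP.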

The Rectangle SOP goes straight into the Geometry COMP, where the fun begins. To randomly instance the geometry, we use a simple Python script inside a CHOP Execute DAT, triggered by an LFO CHOP set to Pulse mode.

Here is the code:

import random

def onValueChange(channel, sampleIndex, val, prev):
	# Each LFO pulse refills the table with fresh random positions
	table = op('random_values')
	table.clear()
	table.appendRow(['x', 'y'])

	# One row per instance: normalized X and Y values
	for _ in range(600):
		table.appendRow([random.random(), random.random()])
	return

We generate 600 random X and Y values which are stored in a Table DAT. These values are then used in the Instancing parameters to translate, scale, and modulate the color of the incoming rectangle. To add some spice, we use the camera signal for texture instancing. Finally, we render it and voilà, a live acid texture that pairs perfectly with Autechre or Aphex Twin tracks.

Lo-fi Ambient Camera

In our TouchDesigner projects, we usually aim to work with the best possible input resolution. But what happens when we deliberately decide to get rid of it? Here is an example.

Rather than treating the camera as a high-resolution device, in this patch we do the opposite. By lowering the resolution of the Video Device In TOP from 3840×2160 to a dramatic 1×5, we get a lo-fi signal that resembles a contemplative landscape.
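What the resolution drop does to the image can be sketched in plain Python as average pooling: every output pixel becomes the mean of a large block of input pixels. This is a simplified stand-in for the TOP's internal resampling (the function and the tiny test frame are invented for the example):

```python
def downsample(frame, out_w, out_h):
    """Average-pool a grayscale frame (list of rows) down to out_w x out_h,
    mimicking what lowering a TOP's resolution does to the image."""
    in_h, in_w = len(frame), len(frame[0])
    out = []
    for oy in range(out_h):
        y0, y1 = oy * in_h // out_h, (oy + 1) * in_h // out_h
        row = []
        for ox in range(out_w):
            x0, x1 = ox * in_w // out_w, (ox + 1) * in_w // out_w
            block = [frame[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A 4x4 frame collapsed to a single-column, two-row "landscape".
frame = [[0, 0, 0, 0],
         [0, 0, 0, 0],
         [255, 255, 255, 255],
         [255, 255, 255, 255]]
tiny = downsample(frame, 1, 2)  # → [[0.0], [255.0]]
```

At 1×5, every pixel summarizes a fifth of the whole image, which is why the output drifts slowly and reads as an abstract color field rather than a picture.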

To spice things up, we can use a simple technique to add color variations. First, we create four random CHOP channels (using the same random-generator technique as above), merge them, and convert the result to a TOP. Then we add a Reorder TOP, connecting the Video Device In to the first input for the red channel, and the CHOP to TOP to the second and third inputs for the green and blue channels. Finally, we composite the original video input with the Reorder TOP, add a Threshold TOP, and composite it again. The result is a meditative yet evolving landscape made up of just five pixels.
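Per pixel, the Reorder TOP step boils down to building a new RGB triplet from two different sources. A hedged sketch of that idea in plain Python, with the camera supplying red and the random channels supplying green and blue (names and values invented for the example):

```python
import random

def reorder_pixel(camera_value, rng):
    """Per-pixel sketch of the Reorder TOP wiring: red comes from the
    camera signal, green and blue from the random CHOP channels."""
    return (camera_value, rng.random(), rng.random())

rng = random.Random(7)
camera_row = [0.2, 0.5, 0.9]
recolored = [reorder_pixel(v, rng) for v in camera_row]
# Every output pixel keeps the camera's red channel but gets
# randomized green and blue.
```

Because the red channel still follows the camera, the compositing steps afterwards keep the image anchored to the live signal while the random channels push the palette around.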


Background Subtraction and Cached Replicas

If we want to focus our project on the human figure to achieve some cool effects, we can do it with a simple operator pipeline. Let’s see how.

In this example, we connect the Video Device In TOP to the Nvidia Background TOP. Then we composite it with the live signal output to easily perform background subtraction.

Next, we cache the output to delay the incoming video signal using an 8-step delay system. To achieve this, we take advantage of the Replicator COMP. Each replica contains a basic flow made up of a Cache Select TOP (each with a different Cache Index), a Blur TOP, and an Edge TOP whose color is randomly set by a random CHOP system.
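The 8-step delay is essentially a ring buffer of frames, with each Cache Select TOP reading a different index. A plain-Python sketch of that mechanism using a deque (the class and the integer "frames" are illustrative, not TouchDesigner API):

```python
from collections import deque

class FrameCache:
    """Sketch of the Cache / Cache Select pair: keep the last
    `steps` frames and read any of them back by index."""
    def __init__(self, steps=8):
        self.frames = deque(maxlen=steps)

    def push(self, frame):
        self.frames.appendleft(frame)   # newest frame lands at index 0

    def select(self, index):
        # index 0 = newest frame, steps-1 = oldest (like Cache Index)
        return self.frames[index] if index < len(self.frames) else None

cache = FrameCache(steps=8)
for f in range(12):                     # feed 12 dummy "frames"
    cache.push(f)

echoes = [cache.select(i) for i in range(8)]  # → [11, 10, 9, 8, 7, 6, 5, 4]
```

Each replica reading a different index is what produces the trailing echoes of the figure; the blur and colored edges then differentiate the copies visually.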

Finally, we composite all the replicas together and here is the result.

Controlling Particle Systems with Live Cameras

Rather than displaying the outside world, we can use our camera input in TouchDesigner to control a particle system. To do this, we connect the Video Device In TOP to the Optical Flow component (available in the Palette). As the name suggests, this component analyzes motion in the incoming signal by comparing consecutive frames.
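The core idea of frame-to-frame motion detection can be sketched in a few lines of plain Python. This is a deliberate simplification: real optical flow estimates motion direction as well, while this sketch only measures per-pixel change magnitude (function and sample frames are invented for the example).

```python
def frame_difference(prev, curr):
    """Simplified motion measure: per-pixel absolute difference between
    consecutive grayscale frames. (Real optical flow also estimates
    direction, not just how much each pixel changed.)"""
    return [[abs(c - p) for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

prev = [[0, 0], [0, 0]]
curr = [[0, 255], [0, 0]]
motion = frame_difference(prev, curr)  # → [[0, 255], [0, 0]]
```

Regions where nothing moved stay at zero, so only the moving parts of the camera image end up pushing particles around.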

The core component of the patch is the particlesGpu component (available in the Palette as well). It is a complex particle generator with several useful parameters. For our particle system, we need to create a grid for both the particle source and the particle colors.

The particle source is based on two Ramp TOPs, one horizontal and one vertical. We connect them to a Reorder TOP and then to the first input of the particlesGpu. The particle color system follows the same logic, but here we feed the camera input into the Reorder TOP's blue channel before connecting it to the second input of the particlesGpu.
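The two ramps merged through the Reorder TOP effectively encode a normalized (x, y) birth position for each particle in the red and green channels. A plain-Python sketch of that grid (the function name is illustrative):

```python
def position_grid(width, height):
    """Sketch of the two Ramp TOPs merged by a Reorder TOP: red ramps
    0→1 horizontally, green ramps 0→1 vertically, so each pixel encodes
    a normalized (x, y) particle source position."""
    return [[(x / (width - 1), y / (height - 1))
             for x in range(width)]
            for y in range(height)]

grid = position_grid(3, 3)
# Corner pixels map to the corners of the particle field:
# grid[0][0] → (0.0, 0.0), grid[2][2] → (1.0, 1.0)
```

This is why the particles initially spawn as an even grid covering the whole frame before the optical flow starts displacing them.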

Next, we connect the Optical Flow component to the third, fourth, and fifth inputs to control velocity, particle optical flow, and the particle effector. In the particlesGpu parameters, we set the External Type, Turbulence Type, and Rotation Type forces to Velocity.

And here is the result: a shimmering particle system that gently reacts to the camera input.

Wrap Up

Live cameras are a powerful way to generate complex and unconventional results inside TouchDesigner. As we saw, there are several techniques we can use to drive our visuals for real time interaction. So start experimenting and remember, as usual, that the sky is the limit.

Download the patch