The Interactive & Immersive HQ

Datamoshing in TouchDesigner – Part #3

We’ve been getting deeper and deeper into the build of our datamoshing component in TouchDesigner. We’re finally digging into GLSL code and porting over the individual examples from our reference material, and by the end of this post, we’ll have a working shader starting to come together!

Spoiler alert!

If you haven’t read parts one and two, I highly recommend you read them first. Aside from being a fun component and a cool effect, the deliberately long process I’ve taken to build this is educational in its own right, as it gives you a behind-the-scenes look at a process that all high-end pros go through: converting workflows from other applications into TouchDesigner. This post also picks up immediately where part #2 left off.

And before we dive in, a big shoutout to Ompu Co for the great blog post that has served as the reference for this series:

Picking up where we left off

We ended the last post having finished the first example from our reference material. This was basically just a shader that added the values from our motion vector texture (the optical flow) to our original source movie texture:

Here was our final shader:

out vec4 fragColor;

void main()
{
    vec4 col = texture(sTD2DInputs[0], vUV.st);
    vec4 mot = texture(sTD2DInputs[1], vUV.st);

    col += mot;

    fragColor = TDOutputSwizzle(col);

}

Pixel Displacement

The next step from the reference was to use our motion vectors to offset the sampling position of our original texture. The reference code provided is as follows:

fixed4 frag (v2f i) : SV_Target
{
    float4 mot = tex2D(_CameraMotionVectorsTexture, i.uv);
    // add motion vectors directly to UV position for sampling color
    fixed4 col = tex2D(_MainTex, i.uv + mot.rg);
    return col;
}

If we go through this quickly, here’s what we can see:

  • We can ignore the first line, as we replace that with our void main() function
  • The next line samples the motion vectors as normal (we already have that)
  • Instead of adding the motion vectors to the col variable (which samples the original source texture), we add the motion vectors to the UV sampling position

Great! So we only need to change the last line of the shader to match this, though we do need to rearrange our lines of code a little bit. I’m also going to start adding comments here to make it easier to track what we’re doing:

out vec4 fragColor;

void main()
{
    // sample the motion vectors
    vec4 mot = texture(sTD2DInputs[1], vUV.st);

    // sample the colour texture
    // use the motion vector to offset our UV sampling position
    vec4 col = texture(sTD2DInputs[0], vUV.st + mot.rg);

    fragColor = TDOutputSwizzle(col);

}

Now we should see something like this:

Wait, it’s different!

Now we hit a slight detour here, because our result from above isn’t matching the reference! It looks all fuzzy instead of matching the reference’s look. How come?! Well, here’s where we have to make a few tweaks.

The first is that we should switch our GLSL shader to use either 16-bit or 32-bit pixels. I’ve talked about this previously, but a quick way to think about pixel bit depth is that the more bits you have, the larger the range of values each pixel can hold. 8-bit pixels, which are the normal kind you see on screens and when working with videos/images, only have 256 possible values (0–255) per channel. When you see issues like banding in gradients, it’s because there isn’t enough value range to properly represent the gradient, so certain pixel values get rounded up or down and form those bands/breaks in the smooth ramp. Since we’re working with things like motion vectors, which are more data than they are “something visual,” you’ll want to jump up to 16-bit or 32-bit to make sure your data doesn’t get crushed. In this case we can go to the Common page of parameters on the GLSL Multi TOP and set it to use 32-bit float (RGBA):
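The value-range squeeze is easy to demonstrate outside of TouchDesigner. Here’s a rough sketch in plain Python (no TouchDesigner required) of what happens when a small motion-vector value is stored in an 8-bit channel:

```python
# Rough sketch: a small motion-vector value stored in an 8-bit
# channel vs. kept as a float. 8 bits = 256 discrete steps.

def quantize_8bit(value):
    """Store a 0..1 value in 8 bits, then read it back."""
    return round(value * 255) / 255

# A subtle motion offset, no problem for a 32-bit float:
motion = 0.0013

stored = quantize_8bit(motion)
print(f"original: {motion}, 8-bit round trip: {stored:.6f}")

# Many nearby values collapse onto the same 8-bit step, which is
# exactly the kind of crushing that breaks motion-vector data:
assert quantize_8bit(0.0010) == quantize_8bit(0.0013)
```

At 16-bit or 32-bit float those tiny offsets survive, which is why the data-heavy textures in this effect need the higher pixel format.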

What we should also do is set our UV coordinates to extend. Why? Because this is how most apps treat UVs, but in TouchDesigner, by default, the UV coordinates on a GLSL Multi TOP return zero if they overflow in any direction. We can change this on the GLSL page of parameters by setting the Input Extend Mode UV to Repeat:
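To see why this matters, here’s a rough plain-Python sketch of the two extend behaviours (these helper functions are illustrative, not TouchDesigner’s API), using a tiny 4-pixel “texture” row:

```python
# Illustrative sketch (not TouchDesigner API): how an out-of-range
# UV behaves with "zero" extend vs. "repeat" extend.

def sample_zero(texture, u):
    """Out-of-range UVs return 0 (TouchDesigner's default)."""
    if u < 0.0 or u > 1.0:
        return 0.0
    index = min(int(u * len(texture)), len(texture) - 1)
    return texture[index]

def sample_repeat(texture, u):
    """Out-of-range UVs wrap around, like GLSL repeat mode."""
    u = u % 1.0  # keep only the fractional part
    index = min(int(u * len(texture)), len(texture) - 1)
    return texture[index]

row = [0.1, 0.5, 0.9, 1.0]  # a 4-pixel "texture" row

# A motion vector pushes the sample position past the edge:
u = 0.8 + 0.45  # 1.25, outside the 0..1 range

print(sample_zero(row, u))    # 0.0 -> black fringes at the edges
print(sample_repeat(row, u))  # wraps to u = 0.25 and samples row[1]
```

With zero extend, any displaced sample that lands outside the frame comes back black, which pollutes the feedback loop; repeat keeps valid colour flowing in from the wrapped side.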

The next thing is that we have to start reading between the lines a little. In the previous example, even though the reference code mentioned sampling the _MainTex variable which is the original source image, the paragraph in the text directly after mentions:

This is actually the basis for the datamoshing effect. All it’s actually doing is displacing pixels from the previous frame based on vector motion

The part bolded at the end is important: displacing pixels from the previous frame using the motion vectors of the current frame. While we have our feedback loop set up, we haven’t actually used it yet! And it turns out that by this point, because of how Unity works with its pixel buffers, the author is already acting on the previous frame’s pixels, so we’ll make one more change in our code. Where we were creating our vec4 col, instead of sampling input 0, which was our source, we’re going to sample input 2, which is now our previous frame feeding back into the shader:

out vec4 fragColor;

void main()
{
    // sample the motion vectors
    vec4 mot = texture(sTD2DInputs[1], vUV.st);

    // sample the colour texture
    // use the motion vector to offset our UV sampling position
    vec4 col = texture(sTD2DInputs[2], vUV.st + mot.rg);

    fragColor = TDOutputSwizzle(col);

}

Once we do that, we start to get something interesting!

Now we’re starting to cook with gas! One thing to keep in mind from now on is that our feedback will continue to accumulate until we click its Reset parameter. If your screen starts to get too messy and gray, you can use that to get back to a starting state. In the next post, we’ll look at adding reset functionality into our tool.


Tweaking our optical flow and interpolation

Now we’re still not quite matching the reference, and that’s ok! We’ve still got a ways to go. One thing we can do immediately is update the parameters of our Optical Flow component. So much of this effect is based on the motion vectors that we can greatly change the general feeling of it by tweaking a few parameters on the optical flow. The first parameter I like increasing is the Threshold, which determines how much something has to move before it’s actually considered “moving.” At 0, any pixel that changes in any way is considered live and moving, which isn’t quite how datamoshing artifacts work. Let’s set our threshold to 0.1:

Now if a pixel doesn’t move on screen, then it doesn’t generate any output from the optical flow and results in parts of the screen actually staying still, as opposed to the pseudo-fluid look from the previous section.

I also like to increase the Force to something like 2 or 3. This will make the displacements more extreme:
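To make the effect of these two parameters concrete, here’s a rough plain-Python sketch of how a threshold and a force multiplier reshape a motion vector. This is an assumption about the behaviour the parameters produce, not the Optical Flow TOP’s actual implementation:

```python
import math

def shape_motion_vector(vx, vy, threshold=0.1, force=2.0):
    """Sketch: zero out sub-threshold motion, amplify the rest.
    (An assumption about the parameters' behaviour, not the
    Optical Flow TOP's actual internals.)"""
    magnitude = math.hypot(vx, vy)
    if magnitude < threshold:
        return (0.0, 0.0)            # pixel counts as "not moving"
    return (vx * force, vy * force)  # exaggerate real motion

# Barely-moving pixel: wiped out, so that area of the frame holds still.
print(shape_motion_vector(0.02, 0.03))  # (0.0, 0.0)

# Clearly moving pixel: displacement doubled for a harsher smear.
print(shape_motion_vector(0.2, 0.1))    # (0.4, 0.2)
```

The threshold is what produces the frozen, blocky regions characteristic of datamoshing, while the force exaggerates the smearing wherever motion survives the cut.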

You can continue to play with these until you find settings you like! One final thing you can adjust to taste is the interpolation between pixels. By default in TouchDesigner everything is interpolated linearly. In cases like this, I find that linear interpolation just muddies the result, and something like Mipmap or Nearest Pixel interpolation is more appropriate. To change those, we can select our GLSL Multi TOP, go to the Common page of parameters, and update our Viewer Smoothness and Input Smoothness:
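The difference between the two lookups can be sketched in plain Python (these helpers are illustrative, not TouchDesigner’s API): nearest-pixel sampling preserves hard edges, while linear sampling blends neighbouring texels together.

```python
# Illustrative sketch of nearest vs. linear texture lookups
# on a 1D row of texels (not TouchDesigner API).

def sample_nearest(row, u):
    """Nearest-pixel lookup: snap u to a single texel."""
    index = min(int(u * len(row)), len(row) - 1)
    return row[index]

def sample_linear(row, u):
    """Linear lookup: blend the two texels around u."""
    pos = u * (len(row) - 1)
    i = min(int(pos), len(row) - 2)
    t = pos - i
    return row[i] * (1 - t) + row[i + 1] * t

row = [0.0, 1.0, 0.0, 1.0]  # a hard-edged 4-pixel row

print(sample_nearest(row, 0.4))  # exact texel value, edge preserved
print(sample_linear(row, 0.4))   # in-between value, edge smeared
```

For glitchy, blocky artifacts like these, keeping the hard edges is usually what you want, which is why nearest-style smoothness settings tend to read better than the default.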

Wrap up

While we aren’t at the end of our journey yet, this post gets us deep into some fun creative effects that you can continue to experiment with until next week’s post, where we’ll be trying to wrap up the component! Enjoy!