The Interactive & Immersive HQ

2025 TouchDesigner Experimental New Features

In the ever-evolving world of real-time immersive design, TouchDesigner continues to redefine what's possible. The 2025.30060 Experimental TouchDesigner release isn't just an update; it's a creative leap forward. Packed with GPU-accelerated geometry, streamlined DMX workflows, ST2110 support, and major Python expansions, this build gives interactive artists, VJs, and creative programmers new levels of control and flexibility.

Let’s break down the most exciting new features and what they unlock for your next installation, performance, or generative system.

1. POPs (Point Operators) – Geometry Meets GPU

At the core of this release is the brand-new family of Point Operators—POPs. Running entirely on the GPU, POPs introduce a flexible, high-performance framework for manipulating point-based geometry: think particles, point clouds, spline curves, and procedural forms. Each POP carries data-rich point attributes like position (P), color (Color), and normal (N)—with full support for user-defined attributes and dynamic generation.

Whether you’re building particle-driven light sculptures, interactive mesh systems, or real-time audio-reactive geometry, POPs are now your go-to foundation. Bonus: they integrate directly into SOPs, CHOPs, and even laser and DMX pipelines.
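To make the attribute model concrete, here's a small conceptual sketch in plain Python. This is not the TouchDesigner API; the `Point` class and its `custom` dictionary are our own stand-ins for the idea that each point carries named attributes like P, Color, and N alongside user-defined ones.

```python
from dataclasses import dataclass, field

# Conceptual sketch (not the TouchDesigner API): each point carries
# named attributes such as P (position), Color, and N (normal),
# plus arbitrary user-defined attributes.
@dataclass
class Point:
    P: tuple                              # position (x, y, z)
    Color: tuple = (1.0, 1.0, 1.0, 1.0)   # RGBA
    N: tuple = (0.0, 1.0, 0.0)            # normal
    custom: dict = field(default_factory=dict)  # user-defined attributes

# A POP is effectively a GPU-resident collection of such points; here
# we model a tiny point cloud on the CPU and add a per-point attribute.
points = [Point(P=(x * 0.1, 0.0, 0.0)) for x in range(4)]
for i, pt in enumerate(points):
    pt.custom["Mass"] = 1.0 + 0.5 * i  # hypothetical user attribute
```

In the real operators this data lives on the GPU, which is what makes POPs fast enough for dense particle and point-cloud work.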


2. A Smarter DMX Pipeline for LED and Fixture Control

In the past, TouchDesigner’s DMX control relied heavily on CHOP-based workflows. But with the new POP-driven DMX operators, your lighting logic becomes spatial, modular, and far more scalable.

  • DMX Fixture POP lets you instantiate multiple fixtures in 3D space with their full channel profiles.
  • DMX Out POP merges fixtures into optimized universes for output via Art-Net, sACN, KiNET, etc.
  • DMX Map DAT provides a visual grid of your DMX layout, making debugging and optimization a breeze.

This is game-changing for LED walls, kinetic lighting, mapped environments, and architectural installations.
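The universe-merging step the DMX Out POP performs can be illustrated with some plain DMX arithmetic. The `pack_fixtures` helper below is hypothetical (the POP does this internally); the only hard facts it relies on are that a DMX universe holds 512 channels and that addresses are 1-based.

```python
CHANNELS_PER_UNIVERSE = 512  # fixed by the DMX512 standard

def pack_fixtures(fixture_channel_counts):
    """Assign each fixture a (universe, start_address) pair, never
    splitting one fixture across a universe boundary. This mirrors
    the job the DMX Out POP does when merging fixtures into
    optimized universes. Addresses are 1-based, as in DMX."""
    placements, universe, next_addr = [], 0, 1
    for size in fixture_channel_counts:
        if next_addr + size - 1 > CHANNELS_PER_UNIVERSE:
            universe += 1      # fixture won't fit; start a new universe
            next_addr = 1
        placements.append((universe, next_addr))
        next_addr += size
    return placements

# 200 RGB fixtures (3 channels each): 170 fit in universe 0
# (510 channels used), the rest spill into universe 1.
layout = pack_fixtures([3] * 200)
```

Seeing the math makes it clear why a visual tool like the DMX Map DAT helps: once you have hundreds of fixtures, addressing mistakes are easy to make and hard to spot by hand.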

3. Laser CHOP: Sharper Paths, Cleaner Looks

Laser visuals get a huge upgrade in this release. The Laser CHOP now works natively with POPs and introduces the concept of Corner Points vs Guide Points. This means you can define sharper laser lines and avoid visual hotspots by customizing point repetition behaviors.

Paired with improved blanking calculations and per-point interpolation options, this is an ideal setup for anyone building generative laser systems or syncing visuals across AV formats.
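The corner-point idea can be sketched in a few lines. A laser's galvo mirrors have inertia, so dwelling on a point (by repeating it in the scan path) produces a crisp corner instead of a rounded one. The function and its `corner_repeats` parameter below are our own illustration, not the Laser CHOP's actual parameter names.

```python
def build_scan_path(points, corner_repeats=4):
    """Conceptual sketch of corner points: repeating a point makes
    the galvos dwell there, sharpening the corner. Each input point
    is (x, y, is_corner)."""
    path = []
    for x, y, is_corner in points:
        repeats = corner_repeats if is_corner else 1
        path.extend([(x, y)] * repeats)
    return path

# A square outline: every vertex is a corner, so each gets dwelt on.
square = [(0, 0, True), (1, 0, True), (1, 1, True), (0, 1, True)]
path = build_scan_path(square, corner_repeats=3)
```

Repeating points trades scan rate for sharpness, which is exactly why having per-point control over repetition behavior matters for avoiding hotspots.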


4. ST2110 Support – Broadcast-Ready Workflows

For installations that need to interface with high-end AV systems, TouchDesigner now includes native support for ST2110 video workflows via the new ST2110 In/Out TOPs. You can stream in and out using Blackmagic and Deltacast IP devices with timecode-accurate sync—no more hacky setups for broadcast environments.

5. Python Threading + Environment Management

One of the most exciting under-the-hood upgrades is the introduction of the Thread Manager and tdPyEnvManager. You can now safely run Python scripts in separate threads, install external packages (like OpenCV, HuggingFace, etc.), and manage environments from within TouchDesigner.

Need to run a live Stable Diffusion model? Parse API or OSC messages from multiple sources? These tools make asynchronous, multithreaded design much easier and more robust.
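The core pattern the Thread Manager enables looks like this in plain Python: do slow work off the main thread, then hand results back through a thread-safe queue that the main loop drains. The names here are our own; this sketch uses only the standard library, not the Thread Manager API.

```python
import threading
import queue

# Results flow back to the main thread through a thread-safe queue.
results = queue.Queue()

def slow_job(payload):
    """Stand-in for heavy work: network calls, ML inference, parsing."""
    processed = payload.upper()   # pretend this took a while
    results.put(processed)        # thread-safe hand-off to the main thread

t = threading.Thread(target=slow_job, args=("osc message",))
t.start()
t.join()  # in a live patch you would poll the queue each frame instead

msg = None
while not results.empty():
    msg = results.get()
```

Keeping the heavy work off the main thread is what prevents a slow API call or model inference from dropping frames in your render loop.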

6. A Machine Learning Pipeline, Ready to Run

Coming off the back of #5, an included Depth Anything V2 tutorial walks you through using the Thread Manager and Python Env tools to load a depth-estimating neural net inside TouchDesigner. Whether you’re using ML to drive interactivity, analyze camera feeds, or shape real-time geometry, the foundation is now fully integrated.



7. Native 3D Texture Support

TOPs like Composite, Blur, Chroma Key, and Level now support native 3D textures, making them perfect for volumetric content and procedural simulations. This unlocks multi-dimensional data visualization, voxel-style content, and more advanced shader pipelines.

For generative visuals, this means you can now texture and process across X, Y, and Z—not just two dimensions.
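A quick sketch of what "processing across X, Y, and Z" means in practice: a 3D texture is just a volume of voxels, typically stored slice by slice. The indexing helper below is our own illustration of that layout, not TouchDesigner code.

```python
W, H, D = 4, 4, 4  # a tiny 4x4x4 volume (3D texture)

def idx(x, y, z):
    """Flat index into a volume stored slice-by-slice (z-major),
    a common memory layout for 3D texture data."""
    return x + y * W + z * W * H

# One value per voxel; fill a diagonal beam through the volume.
volume = [0.0] * (W * H * D)
for i in range(min(W, H, D)):
    volume[idx(i, i, i)] = 1.0

def sample(x, y, z):
    """Read a voxel across all three axes - conceptually what a
    3D-texture-aware Blur or Composite TOP can now operate on."""
    return volume[idx(x, y, z)]
```

A 2D-only pipeline would have to fake this with stacked slices; native 3D texture support lets operators filter across the Z axis as naturally as across X and Y.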

8. Media Metadata Export

Both image and audio export tools now support embedded metadata:

  • Movie File Out TOP writes EXIF data for formats like PNG, TIFF, and JPEG.
  • Audio File Out CHOP tags metadata for WAV, MP3, OGG, and AIFF.

Combined with new spherical and stereo video flags, this is a big help for installations that require archival, documentation, or external playback.

9. UI + Text Rendering Improvements

From the new “Face Camera” toggle on the Geo Text COMP to drag-to-size columns in Lister, the UI elements in TouchDesigner got more precise and responsive. There’s even a Prevent Display Sleep toggle for the Window COMP—great for unattended installations.


10. A ton of other stuff

There are plenty more awesome things coming in this build:

  • Pattern matching improvements
  • H.266 support
  • Touch In/Out CHOP: multi-sample mode
  • Script CHOP + DAT: modify outside of cook
  • WebRender TOP: DPI & headers
  • ZED SOP/CHOP/TOP overhaul
  • A 'Select' parameter on all Out operators
  • A countdown mode on the Clock CHOP

I'm using that last one to count down until the official release; in the meantime, check out the documentation to explore the many additional features not listed here.
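The Clock CHOP's countdown is just "time remaining until a target, clamped at zero," which takes a couple of lines in plain Python. This is a conceptual sketch, not the CHOP's implementation.

```python
from datetime import datetime, timedelta

def countdown(now, target):
    """Remaining time until target, clamped at zero - the same idea
    as the Clock CHOP's new countdown mode."""
    remaining = target - now
    return max(remaining, timedelta(0))

# Hypothetical release countdown: 30 minutes to go.
now = datetime(2025, 1, 1, 12, 0, 0)
target = datetime(2025, 1, 1, 12, 30, 0)
left = countdown(now, target)
```

The clamp matters: once the target passes, you want the channel to sit at zero rather than go negative.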

Final Thoughts

The 2025 Experimental release is a playground for creative technologists. POPs offer an intuitive and GPU-accelerated way to manipulate geometry. DMX, laser, and texture workflows are more capable than ever. And with the new Python toolkits, you can finally blend real-time interactivity with deep computing without breaking your project.

Whether you’re building projection-mapped architecture, responsive LED floors, AI-driven installations, or spatialized visuals, this release sets a new bar for interactive work.