In last week’s post we talked about syncing multiple systems and the different kinds of sync you’ll find out in the world: weak sync, medium sync, tight sync, and hardware sync. Each has its time and place, and today we’re going to dive into how some of these can be implemented in TouchDesigner.
If you missed the last post, I highly recommend you go back and read through that, otherwise you might not know all the things we’re talking about in this post.
Weak sync is the easiest to implement. It usually involves a protocol that is reliable and connection-oriented. TCP is a good fit because it guarantees message delivery and the systems maintain an active connection between them. This is unlike a protocol like UDP, which isn’t connection-oriented (your systems never actually know if someone is alive on the other side) and doesn’t guarantee delivery (a dropped packet is simply lost). We don’t need to spend too much time on weak sync because there isn’t much to it. In TouchDesigner, we could create a weak sync system by having our controller send data out through Touch Out CHOPs set to Streaming (TCP/IP) in the parameters. The receivers would use Touch In CHOPs to receive that data. It would look something like this, where the left side is the controller and the two processes on the right are the receivers:
You can see from the small gif that whenever I pause and un-pause it, they only ever drift by about a single frame. Between different systems the discrepancy would be a bit bigger, but you can see it can work for simple use cases, especially if, as we mentioned in the last post, screens and displays are edge to edge.
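If you want a feel for what that connection-oriented link looks like outside of TouchDesigner, here is a minimal Python sketch of the weak-sync idea: a “controller” opens a persistent TCP connection to a “receiver” and sends a control message, with TCP handling reliable delivery. This only illustrates the transport concept — the Touch In/Out CHOPs layer their own internal protocol on top of TCP, and the `play cue_03` message is just a made-up example.

```python
# Sketch of a weak-sync style link: a TCP connection from a
# "controller" to a "receiver". TCP gives us delivery guarantees and a
# persistent connection, which is what weak sync relies on.
import socket
import threading

HOST = "127.0.0.1"
received = []

def receiver(server_sock):
    conn, _ = server_sock.accept()      # blocks until the controller connects
    with conn:
        data = conn.recv(1024)          # TCP delivers the bytes reliably
        received.append(data)

# Receiver side: listen on a free local port (port 0 lets the OS pick).
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, 0))
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=receiver, args=(server,))
t.start()

# Controller side: connect and send a (hypothetical) play command.
with socket.create_connection((HOST, port)) as ctrl:
    ctrl.sendall(b"play cue_03")

t.join()
server.close()
print(received[0])  # b'play cue_03'
```

Because the connection stays open, either side immediately notices if the other goes away — exactly the property that makes TCP a good base for weak sync.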
This is where things start to get interesting. Now we’re talking about sending “driving” signals like timecode or frame counters from the controller to the receivers. In this case we’re usually less worried about the reliability of the signal and more concerned with getting the latest values down the wire as fast as possible. In TouchDesigner, you’ll often see folks using either Touch In and Out CHOPs set to Messaging (UDP) or OSC In and Out CHOPs, because UDP is a very fast protocol without a lot of overhead. The driver signal could be generated in many different ways. You might generate timecode, or use something as simple as a Timer CHOP counting frame numbers for you like this:
Similar to before, the left side is our controller and the two processes on the right are the receivers. I’ve set up a Timer CHOP on the controller side, and it’s spitting out the running frame counter over UDP via a Touch Out CHOP to the receivers. The receivers take that signal and use it to drive the index of a Movie File In TOP. Like we mentioned in the last post, this “medium sync” is actually considered pretty good. When you’re working between different applications and systems, this is the kind of sync you’ll use. You might use timecode or some other form of continuous signal, but those signals end up driving all of the content generation on the receiver systems.
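The whole medium-sync loop can be sketched in a few lines of Python: the controller fires the current frame number at a receiver over UDP, fire-and-forget, and the receiver wraps that counter to its clip length to get a movie index. This is only the concept — the Touch CHOPs in Messaging (UDP) mode use their own packet format, and the 8-byte packing and 300-frame clip length here are illustrative assumptions.

```python
# Sketch of a medium-sync driver signal over UDP: no connection, no
# delivery guarantee -- a dropped packet is simply superseded by the
# next frame's packet, which is fine because only the latest value matters.
import socket
import struct

# Receiver side: a plain UDP socket bound to a free local port.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))
recv_addr = receiver.getsockname()

# Controller side: pack the running frame counter and send it.
# No ack, no retry -- just the freshest value, as fast as possible.
controller = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
frame = 754
controller.sendto(struct.pack("!Q", frame), recv_addr)

# Receiver side: unpack the driver frame and wrap it to the clip
# length -- this is the value that would feed a Movie File In TOP's index.
data, _ = receiver.recvfrom(64)
driver_frame = struct.unpack("!Q", data)[0]
clip_length = 300                        # frames in the (assumed) movie file
movie_index = driver_frame % clip_length
print(driver_frame, movie_index)         # 754 154
```

The modulo is what makes every receiver land on the same looping frame regardless of when it started listening — the driver signal carries all the state, so a receiver that joins late snaps straight into sync.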
Tight sync usually isn’t possible between different applications. It would be very difficult to create tight sync between something like TouchDesigner and Max MSP because we’re essentially talking about engine-level synchronization. In TouchDesigner we are able to create tight sync between different systems using Sync In and Sync Out CHOPs. How to use those is a whole other blog post, but the idea is that the controller isn’t just sending data down the pipe: the Sync CHOPs actually take control of the rendering engine inside of TouchDesigner and keep all of the frame steps synchronized. The signal we use to control the content could still be the same one used in medium sync; the difference is the Sync In and Out CHOPs.
To achieve this, the receivers have their Realtime flags turned off, and the Sync CHOPs constantly report their current frame-rendering status back to the controller, which then moderates all of the systems so they move through their render steps together.
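The scheduling idea behind that lockstep can be sketched with a barrier: every process renders its frame, reports completion, and waits, and only when everyone has finished does the next frame step begin. This is a very rough analogy, assuming threads stand in for machines and a `threading.Barrier` stands in for the controller — the Sync CHOPs do this across the network at the engine level with their own protocol.

```python
# Rough sketch of tight sync's lockstep scheduling: no process starts
# frame N+1 until every process has finished frame N.
import threading

NUM_PROCESSES = 3
FRAMES = 4
step_barrier = threading.Barrier(NUM_PROCESSES)  # "controller" releasing each step
log = []
log_lock = threading.Lock()

def render_process(name: str):
    for frame in range(FRAMES):
        with log_lock:
            log.append((frame, name))   # "render" this frame
        step_barrier.wait()             # report done, wait for all the others

threads = [threading.Thread(target=render_process, args=(f"node{i}",))
           for i in range(NUM_PROCESSES)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every process finished frame N before any process started frame N+1,
# so the logged frame numbers never go backwards.
frames_in_order = [f for f, _ in log]
print(frames_in_order == sorted(frames_in_order))  # True
```

Turning off the Realtime flag matters here for the same reason the barrier does: each receiver has to be willing to hold its frame step until the controller says everyone is ready, rather than free-running at its own rate.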
The final type of sync to look at is hardware sync. Unlike the previous kinds of sync we discussed, this relates only to how frames are output from your system and travel to the displays themselves; it has nothing to do with the content or anything internal to your applications. Setting up your displays for sync will differ depending on what kind of system you’re using: Nvidia has Quadro Sync, AMD has its own sync solution, and if you’re running signals over SDI that will have its own setup. The most common setup I’ve worked with over the years is Nvidia’s Quadro cards with Quadro Sync. The nice thing about this is that once you set up the Quadro Sync according to Nvidia’s documentation, the TouchDesigner side is a single toggle on the Window COMP called Hardware Frame Lock:
Once this toggle is on, TouchDesigner’s frame output will be tied into the Quadro Sync system, which ensures that however many systems or GPUs you’re running, they’re all outputting frames to the displays at the same time. Again, it’s important to remember this has nothing to do with content sync, so you’ll likely end up using one of the other forms of content sync along with hardware sync.
In this two-part series we covered a lot of ground. We looked at the different kinds of sync and when each might be used, and we jumped into how these different forms of sync take shape in TouchDesigner. This is by no means an exhaustive how-to, but it should give you the tools to make more informed decisions about your needs and make it much easier to collaborate with other pros in the field. Enjoy!