The Interactive & Immersive HQ

Time for RTX?

You’ve all heard me over the years talking about saving money and trying to make hardware last as long as possible. I often harp on about how CPU performance has only crept forward from one generation to the next, and how GPUs, while getting progressively faster, have stayed largely the same in terms of features. nVidia has thrown a wrench into my plans! The new features in the nVidia RTX series of graphics cards are genuinely exciting, and they’re making me consider upgrading my GPU. Here’s why.

Trickles of features

There have been a lot of cool features trickling out of nVidia that require the RTX series of cards, and more specifically the Tensor Cores that accelerate AI and machine learning workloads. You may have heard of automagic green screen features or audio noise suppression running on the GPU. What makes these special is that they deliver incredibly high-quality results while being fully GPU-accelerated (as you’ll see shortly). They have trickled out as test videos or as features in random apps, but never as a fully cohesive product or package until now.

nVidia Broadcast Engine and Broadcast App

These are the two new packages that finally make it worth upgrading. The nVidia Broadcast App was officially announced early this month and is a standalone desktop app that bundles together all the features we’ve been teased with for months now. It runs on your desktop and can handle the following (there’s a quick sketch after the list of how you might grab its output in code):

  • Removal of keyboard clatter, background noise, air conditioner hum, fan noise, and other non-vocal sounds
  • Background removal without the need for a green screen, as well as background replacement (like the feature on Zoom calls)
  • Auto-framing (using face tracking) that automatically crops and zooms in on you during calls (maybe more useful for streamers than for normal people, but still interesting!)
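One practical detail that makes the Broadcast App easy to slot into existing setups: it presents its cleaned-up output to the rest of your system as a virtual camera and microphone, so anything that can read a webcam can read the processed feed. As a rough illustration (this isn’t an official nVidia workflow, and the device index is just a guess that will vary per machine), here’s how you could pull those frames into Python with OpenCV:

```python
# Rough sketch: read frames from the nVidia Broadcast virtual camera with OpenCV.
# The device index below is an assumption -- enumerate your devices and look for
# the one named something like "Camera (NVIDIA Broadcast)".
import cv2

BROADCAST_CAMERA_INDEX = 1  # assumption: index 0 is usually the physical webcam

cap = cv2.VideoCapture(BROADCAST_CAMERA_INDEX)
if not cap.isOpened():
    raise RuntimeError("Could not open the virtual camera, check the device index")

while True:
    ok, frame = cap.read()  # frames arrive with the AI effects already applied
    if not ok:
        break
    cv2.imshow("nVidia Broadcast feed", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```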

As I mentioned, all of this runs in real time on the GPU and honestly looks and sounds fantastic. Check out these samples:

As cool as the standalone Broadcast App is, the more important piece is the Broadcast Engine. The Broadcast Engine takes all of these features and exposes them through an SDK so that other applications can build them in natively! This is going to change workflows across so many industries. There are even a few partner integrations rolling out over the next few weeks that I’m already excited about.
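To make the “build them in natively” part a little more concrete, the pattern an SDK like this implies is straightforward: the host application creates an effect once at startup, then hands it a frame every tick and composites the result. The Python sketch below is purely illustrative; the BackgroundRemovalEffect class is a stub I wrote for this post to show the shape of that data flow, not the actual nVidia SDK:

```python
# Purely illustrative: BackgroundRemovalEffect is a stand-in stub, NOT the real
# nVidia SDK. It only exists to show where the per-frame SDK call would sit in
# a host application's frame loop.
import numpy as np

class BackgroundRemovalEffect:
    """Stand-in for a GPU-resident effect created once at startup."""

    def __init__(self, width: int, height: int):
        self.width, self.height = width, height

    def run(self, frame: np.ndarray) -> np.ndarray:
        # The real SDK would do AI matting on the GPU; this stub just tacks on
        # a fully opaque alpha channel so the data flow is visible.
        alpha = np.full(frame.shape[:2] + (1,), 255, dtype=frame.dtype)
        return np.concatenate([frame, alpha], axis=2)

# Host app: create the effect once...
effect = BackgroundRemovalEffect(1920, 1080)

# ...then call it once per frame inside the render loop.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # placeholder camera frame
keyed = effect.run(frame)                           # RGBA result, ready to composite
print(keyed.shape)  # (1080, 1920, 4)
```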

NDI Screen Capture HX

Readers of the blog know that I love NDI Tools. Everything in that toolset is useful and makes NDI an overall powerhouse. I use it all the time, especially when it comes to screen capture. Screen capture has been a touchy thing over the years, with some apps working better than others, but NDI Scan Converter has become my favourite. I use it for everything from capturing PowerPoint presentations on laptops and sending them to my server, to recording our workshops in The HQ PRO! One of my biggest complaints recently has been the CPU overhead required to run Scan Converter and get that screen-captured texture. Using nVidia’s new Broadcast Engine, NDI will soon release a new tool called Screen Capture HX that takes advantage of GPU acceleration to essentially remove all CPU overhead from the screen capture workflow. That means more precious CPU cycles for generating content or running your favourite apps.
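Since a lot of us will be consuming that capture in TouchDesigner anyway, here’s roughly what the receiving side looks like. Nothing about this changes with Screen Capture HX, since it shows up as just another NDI sender, but it’s a nice reminder of how little glue code is involved. Treat the parameter name below as an assumption and double-check it against the NDI In TOP page in your build:

```python
# TouchDesigner sketch: pull an NDI screen-capture source into your network.
# Meant for the textport or an Execute DAT, so op() and the TOP classes come
# from TouchDesigner itself. The source parameter name is an assumption --
# verify it on the NDI In TOP parameter page.
container = op('/project1')

ndi_top = container.create(ndiinTOP, 'screen_capture')

# Point it at the sender advertised by Scan Converter / Screen Capture HX.
# 'MY-LAPTOP (Display 1)' is a placeholder; use the name your sender shows.
ndi_top.par.name = 'MY-LAPTOP (Display 1)'  # assumption: parameter may differ in your build
```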

While this may be something exciting for me, I know what will really excite you.

Notch Face Tracking

As big supporters of Notch, we’re incredibly excited that they’re coming out of the gate with integrations of some of the amazing features of the Broadcast Engine. Specifically, the advanced face tracking features are being implemented into the real-time content creation workflow we all know and love.

You can see a quick demo of it in action in this video where the lovely Armin is showing it off:

What’s really cool about the demo is the specifications. Armin mentions a few of them, but I think they’re worth taking note of (there’s a quick back-of-the-envelope sketch after the list):

  • 126 key points tracked on the face, including lips, contour, eyes, eyelids, nose, and more
  • A real-time generated face mesh of over 6,000 polygons that moves with full 6 degrees of freedom
  • Hugely reduced latency, since everything happens on the GPU; previously you could feel up to 4 frames of latency because of the CPU-bound workflow
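To put those numbers in perspective, here’s a quick back-of-the-envelope look in Python. The data layout is my own illustration of what the tracker conceptually hands you (Notch and nVidia have their own formats), but the latency math alone shows why keeping this on the GPU matters:

```python
# Back-of-the-envelope look at the specs above. The data layout is my own
# illustration, not Notch's or nVidia's actual format.
import numpy as np

# 126 tracked key points, each an (x, y, z) position.
key_points = np.zeros((126, 3))
print(key_points.shape)  # (126, 3)

# A 6-DOF head pose: 3 axes of translation plus 3 of rotation.
head_pose = {
    "translate": np.array([0.0, 0.0, 0.0]),  # x, y, z in scene units
    "rotate":    np.array([0.0, 0.0, 0.0]),  # pitch, yaw, roll in degrees
}

# Why shaving off 4 frames of latency matters: at typical show frame rates,
# 4 frames is a very visible lag between a performer and their avatar.
for fps in (30, 60):
    print(f"4 frames at {fps} fps = {4 / fps * 1000:.1f} ms")
# 4 frames at 30 fps = 133.3 ms
# 4 frames at 60 fps = 66.7 ms
```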

Those specs are nothing to sneeze at. If you wanted that level of facial tracking previously, you’d either have to settle for down-sampled, high-latency implementations on the CPU or pay thousands of dollars for high-end face tracking SDKs and software. Now we get the best of both worlds because of this amazing integration of the Broadcast Engine into Notch.

Wrap up

I can’t express how excited I am that the features we’ve been seeing from nVidia are finally rolling out in a nicely packaged format. The nVidia Broadcast App is great for quick streaming setups, but the Broadcast Engine is what will start bringing GPU-accelerated AI and machine learning workflows to all the apps we use and love. Over the next few weeks we’ll be seeing updates to the NDI Tools as well as Notch, and I can’t wait to show off those features live.

Links and further reading

Learn more about the nVidia Broadcast App here:

Learn more about nVidia Broadcast Engine here:

Learn more about NDI’s new GPU-accelerated screen capture here:

Learn more about Notch’s new face tracking integration: