We've covered the many uses of NVIDIA Broadcast at The Interactive & Immersive HQ in the past, but now a lot of the AI features from NVIDIA Maxine are integrated into the Broadcast app.
There are great new effects you can start integrating, so we'll cover the latest and original effects and their particular uses. It's a powerful tool that doesn't require any fancy sensor or camera (aside from an appropriate GPU); any webcam will do.
Latest NVIDIA Video Effects
Eye Contact (beta)
A year ago, NVIDIA announced that the NVIDIA Maxine Augmented Reality SDK offers AI-powered, real-time 3D eye tracking, which simulates eye contact using only your camera input. It wasn't until recently that NVIDIA added the effect to the Broadcast app. So how does it look?
Most of the time, it looks natural. Your eyes look wide, and if you're looking off to the side, it eases the transition back to center rather than snapping. Since it's still in beta, it could be better: I find that if I look up, there are some jittery moments, and with a wide-angle camera such as the Azure Kinect, the result seems unnatural. The eyelids can appear as if they're dancing, depending on where you're looking.
The eye contact feature can be helpful if you're presenting on a video call. Still, if you're recording something such as a tutorial where your face is front and center, it can be risky: there can be moments where your eyelids seem to twitch. That being said, have I used it in a tutorial before? Perhaps. Which video had the eye contact feature on? You tell me!
You can watch a demo video on the Eye Contact feature here.
Vignette
If you want a moodier look, there's a vignette option. With face tracking enabled, the vignette follows your face. The effect isn't particularly exciting, but if you need some slight vignetting, it lets you apply and control it without yet another third-party processing app.
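To make the idea concrete, here's a minimal sketch of a face-tracked vignette in NumPy. This is purely illustrative of the technique, not NVIDIA Broadcast's implementation; the function name and parameters are assumptions.

```python
import numpy as np

def apply_vignette(img, center, strength=0.5, radius=0.75):
    """Darken pixels radially around `center` (normalized x, y in [0, 1]).
    A face tracker would feed the detected face position in as `center`.
    Illustrative sketch only -- not NVIDIA Broadcast's actual method."""
    h, w = img.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    cx, cy = center[0] * w, center[1] * h
    # Distance from the vignette center, normalized by half the diagonal
    d = np.sqrt((xs - cx) ** 2 + (ys - cy) ** 2) / (0.5 * np.hypot(w, h))
    # No darkening inside `radius`; ramp up to full `strength` at the edge
    falloff = np.clip((d - radius) / max(1e-6, 1 - radius), 0, 1)
    gain = 1 - strength * falloff
    return (img * gain[..., None]).astype(img.dtype)
```

Moving `center` each frame from a face-tracking result gives the "vignette follows your face" behavior described above.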
Video Noise Remover (beta)
In the past, NVIDIA Broadcast had a video noise remover, but it's now listed as a beta because NVIDIA is using different technology for it. It's beneficial if you're using a built-in laptop webcam, especially in low light. The noise removal isn't noticeable on an HD camera with optimal lighting, but in a pinch it can really help remove distracting noise, flickering, and artifacts from a live video feed.
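For intuition about why averaging across frames reduces sensor noise, here's a toy temporal denoiser. This is a naive stand-in for the general idea, not NVIDIA's actual AI-based method.

```python
import numpy as np

class TemporalDenoiser:
    """Running-average denoiser: a toy illustration of temporal noise
    reduction. NVIDIA's effect uses AI; this is only the classic baseline."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha   # lower = stronger smoothing (but more ghosting)
        self.state = None

    def push(self, frame):
        frame = frame.astype(np.float32)
        if self.state is None:
            self.state = frame
        else:
            # Exponential moving average: random noise cancels out over time,
            # while the (static) scene content survives
            self.state = self.alpha * frame + (1 - self.alpha) * self.state
        return self.state
```

The classic trade-off, which learned methods try to sidestep, is that heavier smoothing ghosts anything that moves.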
Background Blur
As in Zoom or Google Meet, you can blur your background. What's nice about NVIDIA Broadcast is the slider that controls the strength of the blur, unlike the fixed versions built into apps like Zoom.
There's also an option to choose between performance and quality modes; quality gives a cleaner silhouette, especially around hair. In both cases, the quality and responsiveness are quite a bit better than what you've likely experienced with Zoom or Teams.
Background Replacement
You can swap your background, just as in Zoom and other video chat apps, again with a choice between performance and quality. This is helpful for joining online broadcasts with a branded background, or for having something a bit more unique than the generic background blur.
Background Removal
Background removal was a huge deal when NVIDIA Broadcast added it, and it's still fantastic today. It's relatively clean, and you can achieve an effect that used to require a particular sensor such as an Intel RealSense or Microsoft Kinect. If you ever wonder how we remove our backgrounds in The Interactive & Immersive HQ video tutorials, pssst… this is it. It's easy to use, reliable, and high quality.
Auto Frame
You can also zoom in on your camera frame. The effect has a slider so you can control how tight the crop is. I usually turn this on when using a wide-angle camera in video calls, especially if I don't want my whole background visible.
Combining it with background removal is a fun hack for auto-framing, and it could even be a fun basis for a player-tracking installation.
Note that if someone walks behind you, the framing can jump to the other person and push you out of frame. It can only auto-frame one person at a time. The auto-framing can also be jarring, since the tracking easing isn't great.
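If you build your own auto-framing (say, from a background-removal matte's bounding box), the jarring motion is usually fixed by easing the crop toward the target instead of snapping. A minimal sketch, with illustrative names:

```python
def ease_crop(prev, target, ease=0.15):
    """Move a crop rectangle a fraction of the way toward the tracked target
    each frame (exponential easing). Rects are (x, y, w, h) tuples.
    Sketch of a smoothing approach, not NVIDIA Broadcast's tracker."""
    return tuple(p + ease * (t - p) for p, t in zip(prev, target))

# Per-frame usage: crop = ease_crop(crop, detected_person_box)
```

Lower `ease` values give smoother, slower framing; higher values track faster but jitter more with a noisy detector.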
Combining Effects
You can have multiple effects running at the same time, though certain combinations don't work, such as background replacement with background blur, simply because they would interfere with each other. Eye contact, however, can be combined with any of the background-related effects.
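If you're scripting around effect stacks, the "some pairs conflict" rule is easy to encode as a pairwise check. Only the replacement-vs-blur conflict comes from above; treat the structure as a hypothetical sketch:

```python
from itertools import combinations

# Mutually exclusive pairs. Only the first conflict is documented in the
# article; extend this set for your own setup as needed.
CONFLICTS = {
    frozenset({"background replacement", "background blur"}),
}

def compatible(effects):
    """Return True if no mutually exclusive pair of effects is enabled."""
    return all(frozenset(pair) not in CONFLICTS
               for pair in combinations(effects, 2))
```

For example, eye contact plus background blur passes the check, while replacement plus blur fails it.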
NVIDIA Broadcast in TouchDesigner
How do you use NVIDIA Broadcast in TouchDesigner, you ask? It's simple: drop down a Video Device In TOP and select NVIDIA Broadcast as your device. Then BAM, all the effects are integrated into your TOP.
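If you'd rather script the setup than do it by hand, a sketch like this could run from TouchDesigner's textport. The device label string is an assumption; check the Device parameter menu on your machine:

```python
def setup_broadcast_input(parent_comp):
    """Create a Video Device In TOP reading the NVIDIA Broadcast virtual
    camera. Meant to run inside TouchDesigner, where `videodeviceinTOP` is a
    built-in operator class and `parent_comp` would be a COMP such as
    op('/project1'). The device name below is an assumption and may differ."""
    top = parent_comp.create(videodeviceinTOP, 'broadcast_in')
    top.par.device = 'Camera (NVIDIA Broadcast)'  # label varies by system
    return top
```

Outside of TouchDesigner this function can't run, since the operator classes only exist inside the td environment.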
NVIDIA Broadcast in Other Apps
Using NVIDIA Broadcast in your streams or video conference calls is as easy as selecting NVIDIA Broadcast as your camera input.
Although not perfect, a lot of the NVIDIA Broadcast effects are unique, high quality, and responsive. You can now achieve features that used to require a specialized camera sensor or much more complex processing chains. It makes installation creation more accessible, and even just makes video calls fancier. How do you use NVIDIA Broadcast in your workflow?