Depth Sensors in 2021

As the years pass and newer and newer gear comes out, we often have to re-evaluate what we're using in our installations. Depth sensors are a prime example. The number of companies that have entered and exited the depth sensor market in the last few years is wild. In this post, we're going to look at the recent depth sensor news and offer recommendations based on what we've been using on installations.

Microsoft's Depth Sensors

Let's start with the tried and true: Microsoft Kinect. Now, Microsoft isn't immune to drama in the depth sensor space. The Kinect 2.0, or Xbox One Kinect, is a staple of our industry. I have a number of them in our office and continue to use them on installations. Sounds good, right? The bad news is that Microsoft actually discontinued the Kinect 2 at the end of 2017 / beginning of 2018. That's more than 3 years ago, and we're still using this sensor. It's easy to use, integrates with almost any software at this point, is stable, has great tooling built around it, and has a good quality sensor. At the moment, if you want to use a Kinect 2, you can still buy an Xbox One sensor on websites like Amazon, but annoyingly you'll have to buy a third-party OEM adapter to connect it to your computer. Not the end of the world, but not exactly thrilling for permanent installations. Even with all that said, it's still my go-to sensor.
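To give a sense of how painless the Kinect 2 workflow can be in TouchDesigner: the built-in Kinect CHOP exposes skeleton joints as channels, so driving visuals from a tracked hand is only a few lines of Python in a CHOP Execute DAT. Here's a minimal sketch; the operator names ('kinect1', 'cursor') are placeholders for whatever is in your network, and the channel names follow the Kinect CHOP's joint-channel convention.

```python
# CHOP Execute DAT sketch for TouchDesigner.
# Assumes a Kinect CHOP named 'kinect1' (skeleton tracking enabled)
# is wired into this DAT, and some operator named 'cursor' to drive.

def onValueChange(channel, sampleIndex, val, prev):
    kinect = op('kinect1')
    # The Kinect CHOP publishes joints as channels like 'p1/hand_l:tx'
    hand_x = kinect['p1/hand_l:tx'].eval()
    hand_y = kinect['p1/hand_l:ty'].eval()
    # Drive something in the network with the tracked hand position
    op('cursor').par.tx = hand_x
    op('cursor').par.ty = hand_y
    return
```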

Then there's the Kinect Azure, the new kid on the block. It's what's supposed to replace the Kinect 2… the problem is that it's technically still only a dev kit, and the tooling is still FAR behind what the Kinect 2 has. The promise of this device is that it's easier hardware to work with, has a wider field of view, and uses GPU acceleration for its processing! But like all dev kits, you'll find the tooling involves a lot of DIY, command-line work, or workarounds. The GPU acceleration is great, but it currently adds latency, so it feels slower to use than a Kinect 2. There's a lot of promise for what machine learning can bring to the Kinect Azure, and it will likely surpass the Kinect 2 over time, but at the moment I'd only recommend it in environments where the wider angle will really help you or where a little bit of latency is acceptable. For now, it's not my first choice!
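If you want a taste of that DIY flavor, here's roughly what grabbing a depth frame looks like through pyk4a, a third-party Python wrapper around Microsoft's Azure Kinect Sensor SDK. The config values below are assumptions for illustration; the WFOV depth modes are where the wider field of view comes from.

```python
# Minimal Kinect Azure depth grab via the third-party pyk4a wrapper
# (pip install pyk4a; requires the Azure Kinect Sensor SDK installed).
import pyk4a
from pyk4a import Config, PyK4A

k4a = PyK4A(Config(
    color_resolution=pyk4a.ColorResolution.RES_720P,
    depth_mode=pyk4a.DepthMode.NFOV_UNBINNED,  # WFOV modes give the wider angle
))
k4a.start()

capture = k4a.get_capture()
if capture.depth is not None:
    # capture.depth is a numpy uint16 array of millimeter distances
    print('depth frame:', capture.depth.shape)

k4a.stop()
```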

Intel

RealSense… R.I.P. The news last month was that Intel is fully discontinuing its RealSense line. They haven't announced what might replace it, or whether they'll even bother replacing it. Yay! RealSense used to be the go-to alternative to Kinect when you didn't need skeleton tracking, as the RealSense units came in all kinds of variants to suit your physical installation needs. They were easy to work with, easy to extend over long cable runs, and easy to mount, as they were generally quite small. Overall, they were great units that gave you an option when the Kinect 2 didn't meet the project requirements. Now that they're officially discontinued, I'll most likely avoid RealSense units going forward. I've used them a few times successfully, but if I'm going to use a discontinued product, I might as well use one with tons of community support and tooling, like the Kinect 2.
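For anyone still running RealSense units on existing installs, the "easy to work with" part holds up: Intel's pyrealsense2 wrapper gets you a depth read in a handful of lines. A quick sketch, assuming a D4xx-series camera on default settings:

```python
# Minimal RealSense depth read with pyrealsense2 (pip install pyrealsense2)
import pyrealsense2 as rs

pipeline = rs.pipeline()
pipeline.start()  # default config enables the depth stream on most units
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    if depth:
        # Distance in meters at the center pixel
        w, h = depth.get_width(), depth.get_height()
        print('center distance (m):', depth.get_distance(w // 2, h // 2))
finally:
    pipeline.stop()
```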

Zed

Zed is an interesting company. They've been around for a while, but they never got as much hype as the Kinect sensors or even RealSense. What they lack in hype they make up for in great hardware: I've always thought the Zed cameras have the cleanest depth map of any sensor I've tried. They didn't have as many of the fancy features that come with Kinect or RealSense, like skeleton tracking or feature detection, and I think this is what kept them a bit behind in terms of adoption. Luckily, it seems they found a niche and have continued to develop their sensors. The newer Zed 2 and Zed 2i look like great units to use, AND what I'm most excited about is their commitment to tooling. They're building middleware that can augment the sensor data with machine learning for things like object and person detection, camera orientation data, skeletal tracking, and 3D point cloud scanning. It really feels like Zed is listening to the requests of people who've used Kinect and RealSense cameras, and I'm excited to give their newer sensors a test run and see if they could become the long-term solution for installation work.

The only real downside currently is that the toolkits are still in development and not widely adopted, especially in our community, which means you'll have to do a bit of self-discovery and DIY to get the sensors up and running. On the positive side, the latest builds of TouchDesigner should be able to interact with them through the Zed operators, which were recently updated to use the Zed 2 SDK.
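If you do go the DIY route outside of TouchDesigner, the ZED SDK ships with its own Python wrapper. Here's a rough sketch of grabbing a depth frame from a Zed 2; the depth-mode choice is an assumption, so check the SDK samples for what suits your unit.

```python
# Minimal ZED depth grab with Stereolabs' pyzed wrapper (ships with the ZED SDK)
import pyzed.sl as sl

zed = sl.Camera()
init = sl.InitParameters()
init.depth_mode = sl.DEPTH_MODE.ULTRA  # assumption: highest-quality depth mode

if zed.open(init) == sl.ERROR_CODE.SUCCESS:
    depth = sl.Mat()
    if zed.grab() == sl.ERROR_CODE.SUCCESS:
        # Retrieve the depth measure (32-bit float, meters by default)
        zed.retrieve_measure(depth, sl.MEASURE.DEPTH)
        print('depth resolution:', depth.get_width(), 'x', depth.get_height())
    zed.close()
```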

Depth Sensor Alternatives!

Let's say you're burnt out on depth sensors, or you just want other options. Where should you look? I would suggest machine learning. It has come A LONG way in the last few years, not only in terms of features and capabilities, but also in terms of usability and documentation. One of my favourite machine learning applications is RunwayML. It lets you dive into actually using machine learning models in creative ways in less than 10 minutes. There's no need to install multiple frameworks or dependencies, and no need for any specific hardware or GPU power on your computer, since it runs in the cloud. This means you can jump between tons of different models and setups with a few clicks. If you haven't tried it, it's definitely worth a look, because they have models for everything from background subtraction to skeleton tracking.
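Another nice thing about Runway is that running models can be reached over plain HTTP, so you can query them from any environment, including a Python script or a Web Client DAT in TouchDesigner. The port, route, and JSON fields below are placeholders; Runway's network panel shows the actual endpoint and input/output schema for whichever model you're running.

```python
# Hypothetical sketch of querying a locally-served RunwayML model over HTTP.
# Port, route, and payload keys are placeholders -- check the model's
# network panel in the Runway app for the real ones.
import base64
import requests

with open('frame.png', 'rb') as f:
    image_b64 = base64.b64encode(f.read()).decode('utf-8')

resp = requests.post(
    'http://localhost:8000/query',
    json={'image': 'data:image/png;base64,' + image_b64},
)
print(resp.json())  # e.g. keypoint coordinates, per the model's schema
```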

There are also other ways to get machine learning into your system. You can implement popular machine learning frameworks like TensorFlow natively on your system, or you can lean on integrated toolkits, as you're likely not the only one experimenting with machine learning in your development environment. There are lots of GitHub repos from community members showing how to use machine learning in TouchDesigner.
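As one concrete example of replacing depth-sensor skeleton tracking with machine learning: pretrained pose models on TensorFlow Hub, like MoveNet, can pull keypoints from a plain webcam frame. The sketch below uses the singlepose-lightning variant with a stand-in frame; treat the preprocessing details as a starting point rather than gospel.

```python
# Pose estimation from an RGB frame with MoveNet via TensorFlow Hub
import numpy as np
import tensorflow as tf
import tensorflow_hub as hub

model = hub.load('https://tfhub.dev/google/movenet/singlepose/lightning/4')
movenet = model.signatures['serving_default']

# Stand-in for a webcam frame; in practice, load a real RGB image
frame = np.zeros((480, 640, 3), dtype=np.uint8)
inp = tf.image.resize_with_pad(frame[np.newaxis, ...], 192, 192)
inp = tf.cast(inp, tf.int32)  # MoveNet expects int32 input

# Output shape [1, 1, 17, 3]: 17 keypoints as (y, x, confidence)
keypoints = movenet(inp)['output_0'].numpy()
print(keypoints.shape)
```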

Wrap Up

It's unfortunate that the depth sensor market is in such limbo. We're waiting for dev kits to mature while clinging to discontinued hardware in the meantime. But between the advances in machine learning and units like the Zed cameras, I'm hopeful that we'll continue to get better and better ways of sensing spaces in our interactive and immersive experiences.