With so many different software packages out there, choosing between them can frankly paralyze people. In our last post we talked about TouchDesigner and Notch, and today we’re going to dive into Unity and Unreal. These two game engines have become quite popular in our industry, so let’s dive in!
Let’s start with Unity, since it gained popularity in the interactive and immersive media space before Unreal. It came onto our scene around 4-5 years ago and made quite a splash as one of the first game engines to start dropping its prices. Some folks might remember that before Unreal became the free tool we have now, you had to email Epic to request a copy of the engine. You couldn’t just hit download and get to work! That accessibility was one of the big plays Unity made that helped jumpstart its asset store. And that’s important, because when we think about why Unity is so loved by developers, a lot of the focus is on extensibility and the Unity Asset Store.
It’s quite remarkable how many assets you’ll find in the Asset Store and how cost effective they can be. Everything from models, sounds, textures, and scripting templates to full game templates and even node-based logic programming environments can be found there. For a lot of developers, the ability to get up and running quickly with store assets means faster project turnaround and lower development costs, because you aren’t building everything from scratch by yourself.
Two other great features of Unity are its ability to build your project for just about any platform and its renewed commitment to our industry. Firstly, it doesn’t matter if you want to run on PC, Mac, Linux, iOS, Android, or anything else: as long as your project’s features and plugins are compatible with the target platform, you can compile your project into an easy-to-ship executable package that can simply be double-clicked and run on the target. This makes rolling out projects much easier and more cost effective in terms of licensing. Secondly, Unity has shown a real commitment to our industry with its new features. I’ve seen some amazing work come out of the new VFX Graph, a node-based environment for making real-time effects. There’s Shader Graph, which lets you make materials in a node-based environment. And Unity bought Bolt, a node-based environment for building your game/application logic, and is working on integrating it natively into the engine. Clearly they understand that artists like node-based environments, and they’re deep into the process of creating them.
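To give a sense of how hands-off those builds can be, here’s a minimal sketch using Unity’s documented command-line flags to produce a Windows standalone player without ever opening the editor UI. The project and output paths below are placeholders for your own project, and this assumes the Unity editor binary is on your PATH:

```shell
# Headless Unity build: -batchmode/-nographics run the editor without a UI,
# and -quit exits once the build finishes. Check build.log if anything fails.
Unity -quit -batchmode -nographics \
  -projectPath "/path/to/MyProject" \
  -buildTarget Win64 \
  -buildWindows64Player "/path/to/Builds/MyProject.exe" \
  -logFile build.log
```

Swapping the target flag (for example `-buildOSXUniversalPlayer` for Mac) or calling a custom build method via `-executeMethod` is how teams typically script multi-platform rollouts, but treat this as a config sketch rather than a recipe for your exact project.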
Where does Unity fall short? Well, similar to what we’ll see with Unreal, it’s still a game engine at its heart. While it’s doing a great job of bringing in new features that are exciting for us, you’ll still need to learn some basic game development techniques to really take advantage of the engine. The way projects are structured and the user interface is laid out are all based around game design paradigms. It can also be hard to get a lot of our common hardware integrated into Unity without relying on third-party plugins that sometimes go a long time without updates. In TouchDesigner, a brilliant thing we have is first-party support for tons of hardware: we have nodes from Derivative that can connect us to just about anything, and it’s easy to take that first-party support for granted. As I’ll describe with my Unreal anecdote about webcams shortly, you might find that things that are incredibly easy to do in TouchDesigner or Max are quite tricky and feel very hacky in these game engines.
Now, the nice thing is that we’ve already talked about one game engine. A lot of the elements of Unreal are similar to Unity, so we can keep this section short and focus on the specifics. Unreal had a slightly different trajectory, which is important to know. Whereas Unity was praised for its asset store and extensibility, Unreal is really known for its high-end rendering engine and integrated tools. It’s hard to find a development environment where you get as high-quality a real-time render out of the box as you do with Unreal. Lighting, materials, camera effects, post-processing, ray tracing, path tracing, ambient occlusion, global illumination: whatever you need, it’s usually a checkbox to turn on, and quite high performing as long as you’ve got the hardware for it. This makes Unreal a new go-to tool for industries like architecture, where real-time photorealistic renderings are now easier than ever using built-in libraries like the Twinmotion library. Similarly, the Quixel Megascans are incredibly high-quality assets which, with UE5 coming out soon, will be directly available inside the engine for drag-and-drop usage. It’s incredible to think that this is what free drag-and-drop assets look like in 2021.
The integrated toolkit inside of Unreal has also become hugely popular. While Unity is still working on integrating Bolt into its engine, Unreal has had Blueprints up and running for a while. Blueprints are the node-based environment inside of Unreal for creating logic and scripting your applications without needing code. Much like Unity’s VFX Graph, Unreal has Niagara, a no-code environment for making high-end particle effects and simulations. There’s also a node-based material editor, so you can make complex, custom materials without needing to drop down into an HLSL script. All these things make Unreal an attractive option.
Virtual production teams equally love Unreal, and I think this is where a lot of Unreal’s hype in our industry started. Unreal saw an opening in a big new market and dove in head first: Epic integrated native support for Blackmagic and AJA capture cards and built an entire portion of their site dedicated to virtual production and film/TV work. This attracted a lot of eyes, high-profile use cases like The Mandalorian sealed the deal, and now Unreal is quickly becoming a standard tool for real-time broadcast and virtual production environments.
Where Unreal falls short is quite similar to Unity. It’s still first and foremost a game engine, and you can feel that you’re working in an environment that wasn’t built from the ground up for the work you’re doing. Once you get used to it, it’s fine, but it’s not like working in TouchDesigner or Notch, where you feel very clearly “this environment was made for the work I’m doing.” One of my funniest experiences in Unreal was trying to plug in a webcam. Coming from TouchDesigner, where one node gets me full access to all my video devices, I was slightly shocked that I needed to follow a 30-minute YouTube tutorial just to get a webcam plugged in, and it was still confusing as all hell. You feel this whenever you try to do things that are standard in interactive/immersive projects and find they’re not as straightforward yet in these game engines. The Unreal Marketplace, while quickly growing and full of great assets, is still dwarfed by Unity’s Asset Store. And finally, from what I’ve heard from colleagues, if you’re targeting multiple platforms, Unity’s ability to export executables is a bit more reliable and easier to work with.
Ultimately the choice between Unreal and Unity usually comes down to which features are more useful to you, because both can integrate with TouchDesigner and operate effectively in an interactive real-time environment. How you end up using all of these different tools together depends on your goals and current skill sets. While there are a lot of different tools to choose from, I hope these breakdowns of how the pros look at them help inform your decisions and ensure you pick the right tools for your project. Enjoy!