The Interactive & Immersive HQ

Unreal Engine 5.2 Overview for Immersive Design

Greetings, TouchDesigner developers! In today’s blog we’ll take a look at the latest release of Unreal Engine, focusing on the new and improved features that matter for immersive and interactive design, and highlighting their potential impact on your projects. If you’re interested in immersive design within Unreal Engine, this overview is tailored for you. Let’s get started!

Unreal Engine and Immersive Design

Developed by Epic Games, Unreal Engine has long been a staple of the gaming industry, there’s no doubt about that! However, its capabilities extend far beyond games. For immersive design, it offers a powerful platform for virtual production and for virtual and augmented reality setups, thanks to its ability to generate detailed, responsive environments while providing generous and accessible tools for user interaction.

Now, if you’re not yet familiar with the capabilities and design of Unreal Engine, here are two links that will serve you well:

What is Unreal Engine?

https://www.unrealengine.com/en-US/unreal-engine-5

Unreal Engine 5.2 Release Notes:

https://docs.unrealengine.com/5.2/en-US/unreal-engine-5.2-release-notes/

Now, let’s delve into some of the newest features for immersive design!

Procedural Content Generation Framework

Perhaps one of the most significant new features is the Procedural Content Generation (PCG) framework, a tool for building iterative, procedural scenes. But what does procedural really mean? It refers to a content-generation technique where sets of rules or algorithms are applied to objects instead of placing and editing them by hand. Think of it as a recipe given to your computer: instructions for creating objects (typically many of them) that guarantee efficient and consistent output. Imagine having to design one thousand trees for a scene you’re building. Doing that manually, one by one, would be a monumental task. With this framework, you can do it quickly and efficiently.
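To make the “recipe” idea concrete, here’s a minimal, engine-agnostic Python sketch (purely illustrative; it uses neither the PCG graph nor Unreal’s API) that generates a thousand tree transforms from a seed and a few simple placement rules:

import random

def scatter_trees(count=1000, area=500.0, min_scale=0.8, max_scale=1.3, seed=42):
    """Procedural placement: the same seed and rules always produce the same forest."""
    rng = random.Random(seed)
    trees = []
    for _ in range(count):
        trees.append({
            "x": rng.uniform(-area, area),               # position rule: anywhere in a square
            "y": rng.uniform(-area, area),
            "rotation": rng.uniform(0.0, 360.0),         # rotation rule: any heading
            "scale": rng.uniform(min_scale, max_scale),  # scale rule: slight size variation
        })
    return trees

forest = scatter_trees()
print(len(forest), forest[0])

Change a rule or the seed and the whole forest updates consistently, which is exactly the kind of iteration the PCG framework enables inside the engine.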

Our very own Jack DiLaura has provided a detailed overview of this tool, which you can read here:

https://interactiveimmersive.io/blog/3d/unreal-engine-procedural-content-generation-for-immersive-installations/

Virtual Production

nDisplay

In terms of virtual production, several improvements arrive with this latest release. Among the most exciting is an update to the nDisplay system, which allows you to handle multiple video streams (inputs and outputs) with optimal performance.

Multi-Process and Offscreen Rendering

Unreal’s developers describe this as a “cluster configuration methodology” that improves parallel resource usage across nDisplay render nodes. In practice, this means better performance for machines with multiple GPUs, since different processes can now run in parallel on the same computer.

Virtual Cameras

A crucial improvement to the Virtual Camera in UE is an update to the WebRTC Pixel Streaming system, which lets you send and receive video over the web. From a single server, it is now possible to send multiple video streams simultaneously while keeping them separated and uniquely identified.
This improvement also lets you choose which input device (camera or controller) to use for each virtual camera. So where you previously had access to only a single virtual camera, you can now run multiple cameras at once, each capturing a different scene but all controlled from a single system.

DMX

Several improvements have been added for the control and design of DMX scenes and fixtures. The main refinement is a simplified DMX control console, which allows for quicker debugging and faster control of physical and virtual fixtures. Filtering and populating features have also been added for a smoother workflow and easier property tweaking.
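As a side note for TouchDesigner developers: Unreal’s DMX plugin can receive Art-Net over the network, so those virtual fixtures can also be driven from outside the engine. Here’s a minimal Python sketch of that idea (in TouchDesigner you would normally reach for the DMX Out CHOP instead); the IP address and universe number are placeholder assumptions for your own setup:

import socket
import struct

def send_artnet_dmx(channels, universe=0, ip="192.168.1.10", port=6454):
    """Send one universe of DMX channel values (0-255) as a single ArtDMX packet."""
    data = bytes(channels[:512])
    if len(data) % 2:                        # ArtDMX payload length must be even
        data += b"\x00"
    packet = (
        b"Art-Net\x00"                       # protocol header
        + struct.pack("<H", 0x5000)          # OpCode: ArtDMX (little-endian)
        + struct.pack(">H", 14)              # protocol version 14 (big-endian)
        + bytes([0, 0])                      # sequence (0 = disabled) and physical port
        + struct.pack("<H", universe)        # 15-bit universe: SubUni low byte, Net high byte
        + struct.pack(">H", len(data))       # payload length (big-endian)
        + data
    )
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, (ip, port))
    sock.close()

# Example: set the first three channels of universe 0 to full, half, and off
send_artnet_dmx([255, 128, 0], universe=0)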

The DMX UI has also been improved to allow live show designers and virtual production operators to drive real-time previews, with additions that include:

  • Batch add fixture patches
  • Updated naming conventions (OX vs X)
  • Selection for universes
  • Copy-paste arrays of fixtures
  • Multi-select and move fixture patches
  • MVR Import Options

Also, Media IO improvements now let users capture thumbnails and previews immediately by simply dropping in video files and choosing which frame of the video is exported as the thumbnail.

iOS Stage App

This update brings a native application for iOS devices like the iPad, for use on stage during virtual production projects. It’s mainly designed to offer simplified control of various on-set tasks and needs. Three main categories are now accessible:

Light Cards: Think of these as graphical representations used to simulate lighting conditions of the virtual environment you are working on.

Flags: Also related to lighting, typically these offer the option of blocking and/or shaping light.

Color Correction Windows: Real-time color adjustments for your scene, providing further customization for mood and aesthetic matching within your project.

Machine Learning Deformer Sample

If you are interested in using machine learning (ML) in your projects, this is an exciting one!

Essentially, the ML Deformer sample uses machine learning to help deform digitally created characters, encompassing simulations for muscle, flesh, and cloth based on 3D, 4D, and MRI data!

Check out this cloth simulation:

Although this technology is not yet available for “live puppeteering,” it certainly offers insight into some of the groundwork needed for that use case in the near future. And in AI and ML, the near future is always very near! This will represent an evolution from current deformation techniques built on sensors like the Kinect or on ML skeleton tracking, which, if you have tried to use them… can feel somewhat janky and unnatural!

Niagara Fluid Simulation

You know how popular fluid simulations are in our field of work, right? Well, great improvements have been made to Niagara’s fluid simulations!

The main advance here is better simulation performance: 2D gas and liquid simulations are now 200% faster, while 3D simulations are 60% more efficient! All of this has been made possible by “superior memory management” within Unreal. In short: better template systems, more efficient workflows, and pure enjoyment!

Pixel Streaming

This tool, which allows us to run and interact with packaged Unreal Engine applications (whether on a PC, in a web browser, or via a cloud service) without the need for additional installations, now comes with enhanced VR and XR support! So, if you are using a VR device like a Quest 2 or an HTC Vive, you can stream the Unreal Engine application directly to your headset. But it doesn’t stop there: you can also interact with the VR environment using your device’s controllers, paving the way for futuristic, hassle-free VR experiences!

Keep in mind that not all VR devices have been tested, so some may not be supported. Be sure to research your preferred device!

Unreal Engine and TouchDesigner resources

By now you should be aware of the broad capabilities of software like TouchDesigner and Unreal Engine, and how they extend into realms ranging from virtual production and game design to immersive, interactive, and live show design. This can certainly be an overwhelming panorama to absorb all at once! However, we encourage you to slowly and gracefully discover your own path through, and uses for, these amazing tools. Here at The Interactive & Immersive HQ we are constantly exploring new and accessible resources, so be sure to check our publications for continuous new material. For now, I’ll leave you with trainings by Jack DiLaura and Elburz:

Controlling Virtual Lights in Unreal Engine via OSC from TouchDesigner

Immersive Plugins in Unreal Engine (OSC, OpenColorIO, USD file support, and more)

Texture Sampling with Niagara Particles in Unreal & TouchDesigner

Awesome Unreal Engine Resources For Immersive Artists and Developers

Wrap Up

We hope that your desired path for approaching this amazing software is clearer after this brief overview! Whether you’re a pro or a beginner, it’s useful to stay up to date with the improvements brought to this universe. We deeply encourage you to do so, since it will only benefit your craft!

Be sure to explore knowledge resources, and of course… keep us informed of your advancements or doubts regarding Unreal Engine!

Until next time!