As augmented reality technology evolves and new platforms continue to appear, immersive designers and developers often find themselves asking which platform to use: Web AR, Social AR, or standalone AR apps. In the context of live events, each platform presents unique advantages and challenges from the perspective of both the user and the creator. In this article, we analyze these differences to give you a better understanding of what is possible with each platform.
Web AR
Web-based AR, or Web AR, is a form of AR that runs directly in a web browser. Removing the need for users to download an app lowers the barrier to entry, which can increase engagement at live events.
From the user’s perspective, the main advantage of Web AR is its simplicity and accessibility. Because all they need is a web browser and a stable internet connection, Web AR reduces friction for users, particularly those reluctant to download new applications onto their devices.
From the developer’s perspective, the main benefit of Web AR is the universality of web standards. However, developers may face challenges due to the limitations of current browser-based technologies: Web AR often doesn’t offer the same level of depth sensing, motion tracking, and 3D rendering that specialized AR apps can provide. Programming for Web AR can also be more complex because of the need to account for browser and device compatibility. And even though Web AR doesn’t have the strict size limits of Social AR, because it runs in a browser it becomes the developer’s responsibility to optimize the effect, which can be time-consuming, and to ensure it works across a wide range of devices.
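A common way to handle this fragmentation is to feature-detect AR support before loading a heavy experience and fall back to plain 3D or video otherwise. A minimal sketch using the standard WebXR entry point (`navigator.xr`); the session object is passed in as a parameter here purely so the check degrades gracefully, and the `startAR`/`showFallback` names in the usage comment are hypothetical:

```javascript
// Sketch: detect whether the current browser can run an immersive AR session.
// In a real page, `xr` would be `navigator.xr`.
async function supportsImmersiveAR(xr) {
  if (!xr || typeof xr.isSessionSupported !== "function") {
    return false; // No WebXR at all (e.g. older browsers)
  }
  try {
    return await xr.isSessionSupported("immersive-ar");
  } catch {
    return false; // Treat permission/security errors as "not supported"
  }
}

// Usage in the browser (startAR/showFallback are placeholders):
// supportsImmersiveAR(navigator.xr).then((ok) => (ok ? startAR() : showFallback()));
```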
Some common platforms for developing Web AR are:
- AR.js: A lightweight and efficient library that works with A-Frame and three.js. It’s an open-source option that supports marker-based and location-based AR.
- Niantic 8th Wall: A robust, comprehensive commercial platform that provides many features for Web AR, including SLAM (Simultaneous Localization and Mapping), face filters, and image targets.
- Zappar: Provides a range of utilities for creating Web AR experiences, with compatibility for various web-based platforms, such as Three.js, A-Frame, and Babylon.js.
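To give a sense of how lightweight Web AR can be, a marker-based AR.js scene fits in a few lines of HTML. The sketch below uses A-Frame’s declarative tags with AR.js’s built-in “hiro” marker preset to anchor a box to a printed marker; the CDN URLs and version numbers are illustrative, so pin whichever releases you have actually tested:

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Versions shown are illustrative, not a recommendation -->
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
    <script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
  </head>
  <body style="margin: 0">
    <a-scene embedded arjs>
      <!-- The box renders whenever the camera sees the printed "hiro" marker -->
      <a-marker preset="hiro">
        <a-box position="0 0.5 0" material="color: #4CC3D9"></a-box>
      </a-marker>
      <a-entity camera></a-entity>
    </a-scene>
  </body>
</html>
```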
Social AR
Social AR refers to AR experiences delivered through social media platforms like Instagram and Snapchat. The benefits to users are clear: these experiences live on platforms they already use regularly and understand intuitively. There’s also the potential for virality: if the AR experience encourages user-generated content, it can spread rapidly within the platform. It is also attractive for brands and clients to have an experience they can keep leveraging in their regular social media channels even after the event is over.
For designers and developers, working with these toolkits doesn’t require extensive AR development experience, making them an easier starting point than Web AR or AR apps. However, there are downsides. The capabilities (and sometimes the content) of Social AR experiences are limited by the platform hosting them, and each platform’s proprietary tools mean learning new skills and adapting to new constraints when translating an effect from one platform to another. Although Social AR imposes stringent size limits, this can also be seen as an advantage: once your effect is approved and posted, it works consistently for all users, because back-end optimization is handled by the social platforms themselves.
As for documentation, the major social media platforms that offer AR capabilities, like Instagram, Snapchat and TikTok, provide extensive resources for developers, including active online communities where the companies connect with creators.
The three main platforms currently used to develop Social AR are:
- Meta Spark AR: Meta’s platform for creating AR experiences on Instagram and Facebook. It offers capabilities for face and image tracking, world AR, and much more. Effects are limited to 4 MB.
- Lens Studio: Snapchat’s AR creation tool. Like Meta Spark, it allows for a range of AR experiences, including face and landmark tracking, plus some unique ones like pet detection and body detection. Its biggest draw is its machine learning integration, which lets users bring in their own custom models, and it has the most advanced world tracking of the Social AR tools. Effects are limited to 10 MB locally, or 100 MB with remote assets.
- Effect House: TikTok’s AR creation tool. Like Meta Spark and Lens Studio, it includes face, hand, and image tracking, among other advanced tracking options. It also offers unique post-processing looks and built-in hand gesture detection. Although its machine learning (ML) capabilities are more limited than Lens Studio’s, Effect House does provide its own set of ML tools for users to work with. Effects are limited to 5 MB.
Standalone AR Apps
Standalone AR apps provide the most control over the user experience, with the ability to create complex 3D models, animations, and interactions. This allows designers to create truly immersive and customized experiences. However, in the environment of a live event, the main challenge with AR apps is user adoption, as they require users to download an additional application.
For developers and designers, creating an AR app for live events can be both challenging and rewarding. It requires deep knowledge of AR programming, often necessitating advanced software development skills, but it allows for the creation of truly unique experiences. These apps can take full advantage of smartphone hardware, enabling advanced features like occlusion, physics-based interactions, and spatial audio.
The documentation for creating AR apps for live events is extensive, especially for popular AR development platforms like Unity and ARKit. However, the level of complexity is also significantly higher.
Some common platforms for developing standalone AR apps are:
- Unity: The game engine is widely used for AR development due to its flexibility and robustness. It works well with AR libraries such as Vuforia and ARFoundation (which allows developers to write once and deploy across ARKit and ARCore).
- Unreal Engine: Another powerful game engine used for AR development. It provides high-quality visuals, a robust physics engine, and node-based visual scripting (Blueprints), making it a great choice for creating immersive AR experiences.
- ARKit (Apple) and ARCore (Google): Powerful AR libraries for creating AR experiences on iOS and Android, respectively, which can be integrated into game engines. They offer advanced features like environmental understanding, user interaction, and motion tracking.
- Niantic Lightship: In my opinion, one of the most powerful AR libraries that integrate with Unity. It includes real-time world mapping, semantic segmentation of surfaces, and geolocation. It is the toolset behind the popular game Pokémon GO.
Choosing the right AR platform depends on the objectives of your live event, your target audience’s behavior, and the resources at your disposal.
Web AR can help you quickly engage users with low-friction experiences, Social AR can leverage social networks for greater reach and virality, and AR Apps can provide deeply immersive and custom experiences.
While each choice comes with its own set of challenges, the continuously evolving AR tools and libraries are broadening the capabilities and easing development across all these platforms. The question isn’t whether you should incorporate AR into your next live event, but rather which form of AR will best serve your target audience.