The Next Stage in our Unity and Unreal Engine Tooling
I’m really excited to announce that we’ve released major updates to our Unreal Engine and Unity plugins.
They’ll make it easier for everyone to prototype and build using our tech straight out of the box. These updates are also part of a longer-term project to invest in and evolve our game engine tooling for VR. Expect much more to come!
What are we doing with Unreal Engine?
There are some very significant updates to our Unreal plugin (V4.0). Major features previously only accessible in Unity are now available to Unreal VR developers.
- Our award-winning Interaction Engine is now compatible with Unreal Engine. This is a layer that exists between the game engine and real-world physics, making interaction with virtual objects in VR feel natural, satisfying, and easy to use.
- Our Hands module is also compatible. Easily bind Ultraleap data to your own hand assets, or use our optimized and pre-rigged hand models.
- We’ve added functionality for UI Input. You can now retrofit Unreal Motion Graphics (UMG) UIs so that they can be interacted with using hand tracking.
- Our Unreal Engine documentation has been overhauled and expanded. It also has a new home on our comprehensive developer resources site.
Download our new Unreal plugin here.
What about Unity?
We’ve made some back-end changes in release 5.0 of our Unity plugin for VR that will allow us to make more regular releases. However, we’ve worked hard to do this in a way that doesn’t negatively impact our large and active Unity community.
There are also some exciting new features:
1. We’re making your workflows easier by providing our Unity Plugin either as a Unity package or via the Unity Package Manager (UPM).
2. The Unity 5.0 plugin will be compatible with Unity 2020 LTS, meaning we can support content that has been built for different render pipelines:
- Universal Render Pipeline (URP): Optimizes content to be less processor-intensive. Hand tracking content built in this way works better for mobile HMDs, such as those built with the Qualcomm Snapdragon XR2.
- High-Definition Render Pipeline (HDRP): Allows you to build with high-definition graphics. This is particularly important for developers working with hand tracking on high-end enterprise headsets such as Varjo’s VR-3/XR-3.
3. We’re introducing preview packages, allowing developers to test and give input on features still in development.
4. Our Unity documentation has been overhauled and expanded. It also has a new home on our comprehensive developer resources site.
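For those taking the UPM route, installation typically comes down to a scoped-registry entry in your project's Packages/manifest.json. The registry URL, package identifier, and version below are assumptions based on common UPM conventions; check the plugin's own install instructions for the exact names.

```json
{
  "scopedRegistries": [
    {
      "name": "Ultraleap",
      "url": "https://package.openupm.com",
      "scopes": ["com.ultraleap"]
    }
  ],
  "dependencies": {
    "com.ultraleap.tracking": "5.0.0"
  }
}
```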
Download our new Unity plugin here.
Why are we doing this now?
Every year Gartner releases a new version of their Hype Cycle for Emerging Technologies. This relatively simple graphic helps companies understand how mature Gartner sees emerging technologies as being.
Increased demand for VR has brought with it a steep rise in demand for more intuitive ways to interact. VR controllers aren’t for everyone, or for every use case. Hand tracking is one of the solutions that meets this need.
Hand tracking itself has not yet appeared on Gartner’s published Hype Cycle. But if we break the sections of the cycle down in more detail, we can make an educated guess at where hand tracking sits today:
We’ve seen a general increase in headsets with integrated hand tracking coming to market, and a corresponding increase in hand tracking enabled content. As a result, we can make a reasonable assumption that hand tracking is on the Slope of Enlightenment.
This is a stage at which “methodologies and best practices [are] developing”. Our team have been developing hand tracking VR applications longer than anyone else. Via our documentation and game engine tooling, we’ll be sharing this learning as widely as we can.
We’ll be clearly outlining the features in our demos, exposing them with code snippets, and providing example scenes that show interactions in context. This will take a bit of time to work through, but you can expect to see significant changes over the next 12 months.
What’s next?
New features for both Unity and Unreal Engine will be released far more frequently going forward.
In addition to Gemini for Windows, Gemini for headsets built on the Qualcomm Snapdragon XR2 chipset will go on general release next year. This will bring with it Unity and Unreal Engine compatible plugins.
We’re also working on versions of the plugins that are compatible with the OpenXR hand tracking API. This will let Unity and Unreal Engine VR developers deploy easily to multiple headsets.
In the meantime, be sure to check out our XR design guidelines which cover all aspects of designing for hands in XR.
Tell me what you think!
I’d love to hear your feedback on our new Unity and Unreal Engine tooling. What do you like? What’s not working so well for you? And what features do you want next?
And don’t forget to share the awesome things you create with us at @ultraleap_devs.
Happy building!