At Facebook Connect, the Unity Labs team shared a new proof of concept, Unity Slices: Table, to show how Passthrough VR can support mixed reality social experiences focused on tangible interfaces.
This upcoming proof of concept, a vertical slice game built for Oculus App Lab, brings up to four players together for classic tabletop fun. Over the past year, our team at Unity Labs has been prototyping and iterating on this experience, and we’re excited to give you a behind-the-scenes look at how we made it happen.
Building a prototype
The first milestone in building the prototype was aligning the virtual representation of a table to a physical table or desk. Since Oculus Quest 2 does not yet offer a way to accurately detect planes in Unity, the team adopted a manual approach using the controllers, so that users could quickly and precisely align the virtual desk with their physical table.
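The geometry behind that manual alignment can be sketched with a couple of controller-sampled points. The function below is an illustrative Python sketch of the idea, not the project's actual Unity code; the names and the two-point sampling scheme are assumptions.

```python
import math

def align_table(point_a, point_b):
    """Derive a table anchor from two controller-sampled points along the
    table's front edge (x, y, z world coordinates). Returns the anchor
    position (midpoint of the edge) and the table's yaw, i.e. its
    rotation about the vertical axis. Illustrative sketch only."""
    ax, ay, az = point_a
    bx, by, bz = point_b
    # Midpoint of the sampled edge becomes the anchor position.
    center = ((ax + bx) / 2, (ay + by) / 2, (az + bz) / 2)
    # Yaw is the heading of the edge in the horizontal (x, z) plane.
    yaw = math.atan2(bz - az, bx - ax)
    return center, yaw
```

Two taps with a controller are enough to pin down both where the table sits and which way it faces, which is why a manual flow like this can still feel quick and precise.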
Once the alignment was complete, the team needed to find a way to network it. While this might seem simple, there are a number of factors to consider when centering your social experience around a table. Whether you’re in the same space or connected remotely, the tangible table interface is shared and needs to look and feel right for everyone participating.
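One common way to make a shared table "look and feel right" for everyone is to network game state in table-local coordinates, letting each client map that shared frame onto its own locally aligned table. This is a hedged Python sketch of that coordinate mapping; the helper name and the assumption that the project networks state this way are illustrative, not confirmed details.

```python
import math

def table_to_world(local_pos, anchor_pos, yaw):
    """Convert a position expressed in a shared table-local frame into one
    client's world space, using that client's own table alignment
    (anchor position plus yaw about the vertical axis).
    Illustrative sketch, not the project's actual networking code."""
    lx, ly, lz = local_pos
    c, s = math.cos(yaw), math.sin(yaw)
    # Rotate about the vertical axis, then translate to the local anchor.
    wx = anchor_pos[0] + lx * c - lz * s
    wz = anchor_pos[2] + lx * s + lz * c
    return (wx, anchor_pos[1] + ly, wz)
```

With this split, two remote players can have tables of different positions and orientations in their own rooms, yet a piece at the "same spot" on the table lines up for both of them.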
With support from Oculus Quest 2’s articulated hand tracking, the team managed to build a system that allowed them to turn any table into a giant touchscreen. But this is VR, and the aim was to do more than build flat interfaces, so the team began experimenting with reactive 3D objects.
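At its core, turning a table into a touchscreen means deciding when a tracked fingertip counts as touching the surface. A minimal sketch of that test, with hysteresis so jittery hand tracking doesn't flicker between touch and release, might look like the following; the thresholds and function name are illustrative guesses, not values from the project.

```python
def touch_state(fingertip_y, table_y, pressed=False,
                press_mm=8.0, release_mm=15.0):
    """Classify a tracked fingertip as touching the table surface.
    Uses two thresholds (hysteresis): a tight one to begin a press and a
    looser one to end it, so tracking jitter doesn't cause flicker.
    Threshold values here are illustrative assumptions."""
    height_mm = (fingertip_y - table_y) * 1000.0
    if pressed:
        return height_mm < release_mm   # stay pressed until clearly lifted
    return height_mm < press_mm         # require near-contact to press
```

Running this per fingertip, per frame, is enough to turn plane alignment plus hand tracking into a giant multi-touch surface.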
The team primarily tested a game of chess, prototyping voluminous 3D pieces that collapsed vertically as the user’s hand approached. However, early user testing revealed that the shape of the objects could lead players to misunderstand how to interact with them. Since the pieces collapsed vertically, users thought they had to raise their hand and point downward to interact with the piece they wanted to move. This was a problem because hand tracking systems don’t work as well when they can’t clearly discern the top silhouette of your hand.
Despite their reluctance to give up the appeal of futuristic 3D interfaces, the team decided to make the pieces users interacted with flat. This had two key benefits: first, it was easier to select the piece a user was targeting; second, users no longer felt the urge to contort their hands; they simply touched and dragged.
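Once the pieces are flat, selection reduces to a 2D nearest-neighbor test on the tabletop, which is part of why targeting got easier. Here is a hedged Python sketch of that idea; the function name, data layout, and the 5 cm touch radius are assumptions for illustration.

```python
def pick_piece(touch_xz, pieces, radius=0.05):
    """Select the flat piece nearest to a touch point on the table
    surface, or None if nothing is within the touch radius.
    `pieces` maps a piece name to its (x, z) tabletop position.
    Illustrative sketch; the 5 cm radius is an assumed value."""
    best, best_d2 = None, radius * radius
    for name, (px, pz) in pieces.items():
        d2 = (px - touch_xz[0]) ** 2 + (pz - touch_xz[1]) ** 2
        if d2 <= best_d2:
            best, best_d2 = name, d2
    return best
```

A drag then just re-runs the touch test each frame and moves the selected piece to the current touch point, with no 3D hand pose to disambiguate.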
They learned that, while interactions along a surface should remain simple and loosely similar to traditional touchscreen interfaces, there is new potential in what can be derived from a user’s hand position relative to an object in 3D space. Unlike a touchscreen, these interfaces can light up in anticipation of being touched, which makes them more playful and predictable.
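That anticipatory lighting can be as simple as mapping the hand's height above an element to a glow intensity. The sketch below is one illustrative way to do it in Python; the 20 cm falloff and the ease-in curve are assumed choices, not details from the project.

```python
def hover_glow(hand_height, max_height=0.20):
    """Map the hand's height above an interface element (in meters) to a
    0..1 glow intensity, so the element brightens as the hand approaches,
    before any touch occurs. Falloff distance is an illustrative value."""
    if hand_height >= max_height:
        return 0.0
    t = 1.0 - hand_height / max_height
    return t * t  # ease-in: brightens faster as the hand gets close
```

Because the glow responds continuously rather than flipping on at contact, users can predict what a touch will hit before committing to it.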
It took the team a while to get to a system that worked smoothly, but the first time they hopped into a networked session with expressive avatars and could both see and hear the other person tapping their table over voice chat, as if they were in the same room, was truly mind-blowing. It felt almost magical to bring this tangible part of their reality into a shared experience.
If you’re not yet familiar with Passthrough VR, it is a novel way of bringing AR to VR headsets by showing a video feed of the real world inside the headset. While remote virtual experiences are already pretty fantastic, Passthrough also enables you to share your physical space with others in the same virtual experience. Not only can you see the avatar of a friend connecting from a distant place, but you can also observe the people sitting right next to you.
Once the team incorporated Passthrough into the experience, they were faced with a variety of implementation options. With traditional AR, content simply sits on top of the real world, and the illusion of presence is easily broken: move your hand up to the content, for instance, and you notice it hiding your hand. With Passthrough, however, the team could selectively control how the real and virtual worlds merged, giving them the ability to create an occlusion mesh that allows your hands to feel as though they truly blend with the experience.
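Conceptually, an occlusion mesh yields a per-pixel mask that decides how much of the Passthrough feed should show in front of virtual content. This is a hedged, per-pixel Python sketch of that compositing idea, not the project's renderer (which would do this on the GPU in a shader).

```python
def composite_pixel(virtual_rgb, passthrough_rgb, hand_alpha):
    """Blend one pixel of the virtual scene with the Passthrough feed.
    `hand_alpha` (0..1) comes from rendering the hand occlusion mesh:
    1 means the pixel is fully covered by the real hand, so the hand
    shows in front of virtual content instead of being hidden behind it.
    Conceptual sketch of occlusion compositing only."""
    return tuple(p * hand_alpha + v * (1.0 - hand_alpha)
                 for v, p in zip(virtual_rgb, passthrough_rgb))
```

A soft (fractional) alpha at the mesh edges keeps the boundary between the real hand and virtual content from looking like a hard cut-out.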
Going further with this idea of moving beyond overlaid content, and controlling how the real and virtual interact, the team began playing with the idea of portals into virtual worlds that you could stick your head and hands right into.
The team landed on a bubble of reality encircling the table. When you’re engaged in a game or app, the virtual content has your full attention, and the virtual world around that content can similarly come to life. But when you look away from the table, the real world is present, so you can easily eat or drink without taking your headset off. Watching hands transform as they move in and out of the bubble has been a favorite XR experience among the Labs team members.
As they further fleshed out these three ways to experience reality, the team wanted to give users the freedom to move between them with a single, simple interaction. That’s how they landed on a slider-based approach. The power to move freely between virtual and physical reality makes for an impressive experience that has to be tried to be understood.
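One way to combine the bubble with a user-facing slider is to compute a per-point virtual/real mix: the bubble around the table defines a default blend, and the slider pushes it toward either extreme. The sketch below is an assumed formulation in Python; the radii, the feathered edge, and the slider semantics (0 = fully real, 0.5 = bubble mode, 1 = fully virtual) are all illustrative, not the project's actual parameters.

```python
def reality_blend(dist_from_table, slider, bubble_radius=1.2, feather=0.3):
    """Per-point mix between the virtual world (1.0) and Passthrough (0.0).
    Inside the bubble around the table you see the virtual world; outside
    it, the real one; the user slider overrides toward either extreme.
    All constants and semantics here are illustrative assumptions."""
    # Smooth falloff at the bubble edge instead of a hard cut.
    edge = (bubble_radius + feather - dist_from_table) / feather
    bubble = min(1.0, max(0.0, edge))
    # slider: 0 = fully real, 0.5 = pure bubble mode, 1 = fully virtual.
    if slider <= 0.5:
        return bubble * (slider / 0.5)
    return bubble + (1.0 - bubble) * ((slider - 0.5) / 0.5)
```

At the midpoint the slider leaves the bubble untouched, while dragging it to either end fades the whole scene to fully real or fully virtual, which matches the "three ways to experience reality" reachable through one control.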
Getting your hands on Passthrough and Unity Slices: Table
You can also start building your own Oculus Passthrough experiences with the experimental functionality found in the Oculus Integration SDK on the Unity Asset Store.