Microsoft, Meta, and Accenture in VR: Toward an Open Metaverse

Microsoft and Meta in VR

At the recent Meta Connect, Zuckerberg ended his keynote by emphasizing Meta's value of openness: being more like Android than iOS. And to their credit, Meta has done a great job of being open and partnering with others in the Metaverse. This is fantastic for the industry and even better for the interoperability of VR apps.

The shockwave-producing announcement at Meta Connect this year was that Accenture, Microsoft, and Meta are partnering up in the Metaverse. Essentially, Accenture says that they know how to work in the Metaverse, Microsoft has the collaboration software needed in the Metaverse, and Meta has the hardware for the Metaverse. This means that you'll soon be able to use Teams and Microsoft Office on Meta headsets, in virtual reality. To further sweeten the deal, you'll also be able to play Xbox games in your headset thanks to the new Xbox Cloud Gaming integration.

Ultimately, this announcement is great for the VR industry and serves as a stamp of approval for our enterprise customers. Partnerships that increase interoperability are always welcome, but they often look better on the surface than they truly are. In this post, I'll discuss why we need interoperability, what interoperability we need, and what's still missing.

Why is interoperability essential in VR?

Unlike the non-immersive world of laptops and phones, [standalone] VR is extremely resource-constrained and at the same time has an extremely high visual quality requirement. In the non-immersive world, you have enough computational power and memory to run multiple apps simultaneously and easily switch between them.

VR has a high bar for acceptable user experience. If the quality bar is not met, users will think the interface is broken and prefer old interfaces, similar to the sentiment around touchscreen phones before 2007.

What makes a good user experience in virtual reality?

Smooth and high-quality graphics. The acceptable level for graphics in virtual reality is 10x higher than that of non-immersive tech, and we are 4-8 years away from the computational power that can run apps in VR the way we run them on non-immersive devices today.

*Sidenote: the visual requirements for AR applications are likely much lower than for VR because with VR you are in the world created by the app whereas, with AR, the app is in the world created by you. For AR the likeliest UX barrier is the input interface.

Consequently, the resource constraints and high minimum acceptable visual quality bar mean that for the next 10 years, VR apps should draw on every ounce of computational power available on the hardware – i.e., apps should run one at a time.

But for productivity use cases in VR, users need a quick way to switch between experiences without first going to the Meta app store page. Imagine having to go back to the Mac home screen and relaunching Slack every time you want to switch back from Chrome – this is the current state of VR.

Given the constraints, what we need in VR is a way to launch apps directly from within any other app, with specific login credentials moving securely between each app. This can be achieved by having a highly interoperable VR ecosystem: one in which more power is given to developers, allowing them to ensure that the minimum acceptable user experience bar is met.

Exactly what interoperability do we need?

Two types of interoperability are needed in the VR world. 

The first is between VR apps – allowing apps to launch each other with specific parameters/payloads. Meta is applying a stopgap solution with deep links, but it's not scalable and puts too much extra work on developers. The ideal solution would be a new operating system for immersive devices that maps an application's entry points to external calls automatically. That is, when an app is created, functions that can be called when launching the app should be generated almost automatically – very similar to the API calls we make today.

The goal is to be able to launch your Walkabout Minigolf server from an app like Remio. To pass the user credentials between the apps and do automatic login. To pass a payload to the receiving app that defines what happens when the app is exited; e.g., if you exit Walkabout, relaunch Remio directly into your company’s server for the team to debrief and wrap up the meeting. In fact, for our enterprise customers, it’s equally important to have this inter-app launching feature be available and activated for entire teams. With deep links or Android’s native app launching capabilities, we can already achieve this first type of interoperability. It's a lot of work, however, because it needs a custom setup for each enterprise customer and each third-party app – not very scalable. 
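To make the idea concrete, here is a minimal sketch in Python of what building and parsing such a parameterized deep link could look like. The scheme, host, and parameter names (`walkabout://`, `server_id`, `auth_token`, `on_exit`, `remio://`) are entirely hypothetical – neither Walkabout nor Remio publishes such an API – but the shape mirrors how deep links carry payloads today:

```python
from urllib.parse import urlencode, urlparse, parse_qs

def build_deep_link(scheme: str, host: str, params: dict) -> str:
    """Build a parameterized deep link URI for launching another VR app."""
    return f"{scheme}://{host}?{urlencode(params)}"

def parse_deep_link(uri: str) -> dict:
    """Recover the launch parameters on the receiving app's side."""
    parsed = urlparse(uri)
    # parse_qs returns lists of values; flatten single-valued params
    return {k: v[0] for k, v in parse_qs(parsed.query).items()}

# Hypothetical example: launch a Walkabout server from Remio, pass
# credentials, and tell it where to send the user when they exit.
link = build_deep_link("walkabout", "join", {
    "server_id": "team-golf-42",        # hypothetical parameter names
    "auth_token": "opaque-sso-token",   # credentials moving between apps
    "on_exit": "remio://join?room=debrief",  # relaunch Remio on exit
})
params = parse_deep_link(link)
```

The point of the sketch is the `on_exit` payload: the receiving app gets not just launch parameters but a return address, which is what makes the round trip (meeting, minigolf, debrief) possible without the user touching a store page.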

The second type of interoperability is between “apps” and “worlds”. Apps are what we have available on our smartphones: Spotify, Chrome, and YouTube. Worlds are what VR developers usually create: Rec Room, Altspace, and Remio. In VR, apps are used inside worlds.


“He who defends everything, defends nothing.”
— Frederick the Great


Creating a high-quality virtual reality world is difficult, and creating apps to use within this world only adds to the difficulty and complexity. The best outcome for end-users and the VR ecosystem would be for app creators to open up their work for other developers to leverage – e.g., a YouTube or Spotify SDK allowing any VR world to launch these apps inside it. Note that this would need a level of customization uncommon in the non-immersive world – in many cases, you’d want to give some UI control to the world creator so that they can ensure continuity and preserve immersion. Another good example is the Oculus browser, which is one of the best in VR. Many other VR apps have built an in-app browser based on Chromium, but none of the implementations I've tested matches the user experience of the Oculus browser. The same goes for the Oculus keyboard, voice input, and their in-app equivalents.
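As an illustration of the kind of SDK surface this would require, here is a hedged Python sketch – all class and parameter names are hypothetical, as no such SDK exists today – of an embeddable media app that cedes UI theming and exit behavior to the hosting world, the customization the paragraph above argues for:

```python
from dataclasses import dataclass
from typing import Callable

# All names below are hypothetical illustrations, not a real SDK.

@dataclass
class Theme:
    """UI hints the host world passes in so the embedded app blends in."""
    primary_color: str
    font: str

class EmbeddedMediaApp:
    """Sketch of an embeddable app (e.g. a video player) that gives
    some UI control to the hosting VR world to preserve immersion."""

    def __init__(self, theme: Theme, on_exit: Callable[[], None]):
        self.theme = theme        # world creator controls look and feel
        self._on_exit = on_exit   # world decides what happens on close
        self.playing = None

    def play(self, media_id: str) -> str:
        self.playing = media_id
        return f"playing {media_id} with font {self.theme.font}"

    def close(self):
        # Instead of dumping the user to a system menu, hand control
        # back to the world so immersion is preserved.
        self._on_exit()

# A world embeds the app, themes it, and keeps control of exit behavior:
events = []
app = EmbeddedMediaApp(
    Theme(primary_color="#112233", font="Inter"),
    on_exit=lambda: events.append("returned to world"),
)
status = app.play("talk-001")
app.close()
```

The design choice worth noting is that the host world, not the embedded app, owns both the theme and the exit path – the inverse of how smartphone apps work, and exactly the inversion needed to keep users inside the world's experience.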

*Sidenote: Apps can be used “outside” of VR worlds, but then we rapidly run into computational constraints again. The partnership between Microsoft and Meta seems to be going in this direction. Users would be able to open their Teams app from anywhere in the VR world, but the experience would not be native to the world you are in. A stopgap solution, but not the right solution. At Meta Connect, they also announced that you could open the Oculus browser from within any app going forward.

What’s missing?

As mentioned, in terms of being open, Meta has done well, but there is uncertainty in probably one of the most important parts of the Metaverse – avatars. Thus far, Meta has been vocal about their avatars being available on non-immersive platforms such as Instagram and even on Zoom, but they have been uncharacteristically quiet about avatar support for other VR hardware. If Meta avatars aren’t supported on all virtual reality devices, any developer using or planning to use the Meta avatars (which are great, by the way) for their content will be locked into the Meta hardware ecosystem.

Hence, I’m grateful for the existence and recent funding of Ready Player Me. The Metaverse desperately needs more companies like this that focus on one specific area and own it well enough to compete with the large platform players. Ready Player Me avatars aren’t ready for VR yet – they are unoptimized and require months of developer effort to port, but I’d be surprised if making their avatars VR-ready isn’t on the near-term roadmap. Ultimately, healthy competition like this on all fronts of the Metaverse is the key to an open Metaverse.

