A Hands-On Review of Working in the Meta Quest Pro: Insights and Challenges

When it first became apparent that Meta was building the most performant high-end XR device yet, the assumption was that it would change everything for the future of computing platforms. For most of 2022, many of us thought that when the Quest Pro arrived, we'd finally have a mixed-reality (XR) headset that could replace a computer screen. I've been fortunate to have my hands on the Quest Pro for a while now, and I like it, but it's definitely not the big change most of us expected. The major question, however, is whether this headset is at least ready to replace your screen for work.

In this post I'll share my experience of working in the Meta Quest Pro. Let me start by saying that I'm incredibly picky about pixelated text on screens. So much so that I use the LG UltraFine 24-inch monitor, which sacrifices screen size for a higher pixels-per-inch (PPI) density. It's the only affordable screen with a PPI comparable to Apple's Mac screens (before you jump to conclusions: I'm not an Apple fan, I'm neutral, but their screens are great quality). My sensitivity to PPI means I have high expectations for VR display quality. Despite this, I know plenty of people who work 40+ hours a week in their VR headset.

I put the Quest Pro to the test by dedicating four weekends to doing all my work exclusively through the headset. I experimented with both Meta's Horizon Workrooms and ImmersedVR as my monitor app. I was hoping to try vSpatial too, but my Windows machine was buried away and all I had was my Mac during this time. Unfortunately, vSpatial is only available for Windows.

Let me give you some context for those who are new to working in VR headsets. Essentially, you connect your headset to your usual computer, and it becomes both a screen and an environment for you to work in. This connection is wireless, so no cords required, but it does mean there may be a slight lag between what you do on your keyboard/mouse and what you see on the virtual screen.

With that context, I'll start with the lessons learned that are independent of which virtual screen app you use. After that, we'll dive into the details of both platforms I tested.


App-Independent Observations

Resolution

While the Quest Pro has better resolution than the Quest 2, it still falls short of retina display. A quick rundown on retina display in VR: it's a combination of the screen's pixels per inch (PPI) and the normal viewing distance. For retina display on a smartphone, you need around 300 PPI at a normal viewing distance of 25-30 cm, which translates to roughly 57 pixels per degree (PPD). The highest PPD the human eye can resolve is about 60. In VR, PPD is the most important metric, because the screen sits so close to your eyes. The Quest Pro has a PPD of 22.69, which may not sound great but isn't as noticeable in practice. Still, it falls short of retina display, and the resulting slight blurriness irks me.
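For the curious, the PPI-to-PPD conversion is simple trigonometry: one degree of visual angle covers a sliver of screen whose width depends on the viewing distance. Here's a quick sketch in Python (the 300 PPI and ~28 cm figures are the smartphone example from above):

```python
import math

def ppd(ppi: float, distance_cm: float) -> float:
    """Pixels per degree for a flat screen viewed head-on.

    One degree of visual angle spans 2 * d * tan(0.5 deg) of screen
    at distance d; multiplying by pixel density gives pixels/degree.
    """
    distance_in = distance_cm / 2.54  # PPI is pixels per *inch*
    span_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * span_per_degree

# Smartphone "retina" example: ~300 PPI viewed from ~28 cm
print(round(ppd(300, 28), 1))  # ~57.7, matching the 57 PPD figure
```

Note that this formula doesn't apply directly to a headset, since the lenses magnify the display, which is why headset makers quote PPD rather than PPI in the first place.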

The bright side is, if Apple stays true to their reputation for quality screens and the much-rumored Apple XR headset comes out this year, we'll finally have a standalone XR device with retina display! (Again for the record, I'm neutral when it comes to Apple, but their screens are of great quality.)

All that being said, the resolution is more than enough for video calls in VR. You can either be represented by an avatar, show your real face with the headset on, or switch to audio-only for yourself. Videos from other callers will look the same as they would on a computer screen.

Although I don't notice this with games, the high-resolution sweet spot is annoying when working, because your natural behaviour is to keep your head stationary and move your eyes to different parts of the screen. When you glance at the very edge of the screen, you start to notice a decrease in resolution. Foveated rendering could help here, but apart from Red Matter 2, I haven't noticed it in any app yet. This is a double-edged sword: either you need a resolution high enough that a virtual screen small enough to fit in the sweet spot is still readable, or you need a sweet spot big enough to hold a screen large enough to overcome the low-PPD problem.

User Experience

I wrote this post from within the headset and took a few photos from the first-person perspective to show what it looks like in VR. However, accessing those photos is a real hassle: I first have to share them from within the headset, then retrieve them from Facebook, and finally paste them here. It would be much easier if I could just take a picture and have it immediately available on my computer for pasting.

Speaking of UX pains, two-factor authentication (2FA) for websites is a big pain while working in VR. This could easily be solved with biometrics or software on the headset (the possibilities are endless), but developers haven't gotten there yet. Let me dive into the details of the pain. The Quest Pro isn't as easy to flip up against your forehead as other headsets, so my instinct is to keep it in place and read the 2FA code off my phone. On my first attempt, I strained my eyes so badly that I immediately developed a headache. High-resolution passthrough would also solve this issue, if you could actually read text on your phone through passthrough.

The same goes for when my partner quickly asks me to look at her new dress: it would be great to turn around and see the dress through the headset. Also, the face recognition on my phone doesn't work while I have the headset on, so I have to take it off to unlock my screen. And then the lenses get dirty if you rest the headset on top of your head.

Lenses

A big issue for me on the Quest 2 was the lenses fogging up. The Quest Pro has largely solved this, though it still happens occasionally. I don't know if there's a good solution here, because I know people who wear glasses experience the same issue, and I also experience it with my Ray-Ban Stories.

The interpupillary distance (IPD) adjustment should be automatic. In my earlier review of the Quest Pro, I thought it was understandable for Meta to leave out automatic IPD adjustment to save on weight and power consumption, since most users would only set their IPD once. I was wrong. Every time you put on the headset, it sits in a slightly different position, which shifts the sweet spot for your eyes, so you really should reset the IPD every time you put it on. This is especially true for the Quest Pro, because the lenses slip out of place quite easily; even turning your head too fast can move them. Apps like Workrooms do guide you to the perfect IPD each time you start the app, but for the reasons listed above, I have to flip the headset up regularly during work sessions, and when I bring it down again the position has changed while the IPD guide doesn't reappear. Also, my IPD is 60 mm, and the lenses are a bit too bulky, so they press against my nose and drift back to about 61 mm every now and then while I'm wearing the headset.

As mentioned above, the headset doesn't always sit on your head the same way, even more so than the Quest 2. Sometimes the lenses are super close to my face, other times too far away. The tilt of the headset seems to change quite a bit because it doesn't hug your face as snugly as the Quest 2 does. A nose fitting or the full light blocker might solve this problem.


Other

One advantage of the Quest Pro is that it promotes better posture when working on your computer. This is especially handy when traveling, as a laptop often forces you to hunch over and strain your neck and shoulders. With the Quest Pro, you can sit upright in a natural, healthy posture while working on multiple large screens, no matter where you are. The headset is still slightly too heavy, uncomfortable, and tight to be the preferred choice, but it's almost there.

Your proximity to your Wi-Fi router can significantly impact input delay. In my case, the home router was about 10 meters from my computer, resulting in an annoying delay. During most of my testing I used my phone as a Wi-Fi hotspot for both my laptop and the headset, and the results were perfect. (A mobile hotspot has its own failure mode, though: your partner picking up your phone and walking to a different room to order Chipotle.)
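If you want to check whether your network is the culprit before moving your router, a quick round-trip-time check from your computer to the headset is telling. Here's a minimal sketch for macOS/Linux (the IP address is a placeholder; your headset's actual address is shown in its Wi-Fi settings):

```python
import re
import subprocess

# Placeholder address: replace with your headset's IP, shown on the
# headset when you open the details of your connected Wi-Fi network.
HEADSET_IP = "192.168.1.42"

# Send 20 pings (macOS/Linux syntax) and capture the output.
result = subprocess.run(
    ["ping", "-c", "20", HEADSET_IP],
    capture_output=True, text=True,
)

# Both macOS and Linux print a "min/avg/max" summary; grab the average.
match = re.search(r"= [\d.]+/([\d.]+)/", result.stdout)
if match:
    print(f"Average round-trip time to headset: {match.group(1)} ms")
else:
    print("No summary found; is the headset on and on the same network?")
```

This only measures the network round trip, not the full encode/stream/decode pipeline, but large or spiky numbers here tend to show up as visible lag on the virtual screen.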

That covers all the general lessons. Next, let's delve into the lessons learned specific to the applications.


Horizon Workrooms

Connecting your computer to Workrooms is the simplest process of the three monitor apps I have tried so far. The user experience is fantastic: it takes only about three clicks to get set up. However, getting multiple team members into the same workroom is not straightforward, which is a significant issue that Remio has addressed effectively.

Screens

While the ease of screen setup is a definite advantage, it comes at the cost of flexibility. Workrooms doesn't allow any customization of the monitor setup, which makes it challenging for those who prefer multiple screens or a larger single screen. You cannot change the size or position of your virtual monitor in Workrooms, which is particularly frustrating since the previous version of the app did let you increase the screen size.

Another annoyance is the permanent visibility of the menu panel.


Environment

The virtual environment has its ups and downs. Nearby objects are well rendered and smooth, but the surrounding environment isn't a skybox, which gives it a blurry appearance, similar to the Oculus home, that always makes me question the quality of my eyesight. The gain from not using a skybox is that you can add motion to the environment: moving waterfalls, clouds, trains, and mist in the distance.


My dream for immersive environments is to have live 360° video streamed into your headset from idyllic places around the world, like Kogel Bay in South Africa.

The full passthrough home is a nice touch, and combined with the open sides of the headset, it makes you much more aware of your surroundings. Working in the Quest 2, you could sometimes get a frightening surprise when someone approached you from the side of your desk. However, the passthrough home can also be annoying and glitchy depending on your surroundings. For instance, my desk faces a window with burglar bars, and the bars continually bounced and jittered in my view, causing headaches.

Other

Workrooms has a handy feature that automatically detects your hands on the keyboard and displays them in passthrough, so you can see your real hands. Some folks may prefer to see their avatar's hands instead, but that's not an option yet.

Audio streamed from the computer through this connection is disappointing and choppy. I tried listening to Spotify on my computer through the headset speakers, and it didn't sound great. I think a different speaker configuration would improve the audio quality.

ImmersedVR

Onboarding and setup are much better than they were six months ago when I used Immersed on my Quest 2: fewer clicks to get set up.

Screens

Monitor customization in Immersed is much better, letting you change almost anything, and the screen resolution is improved compared to Workrooms. With all this flexibility come some challenges, though. First, setting everything up just right takes a significant amount of time; second, it's hard to know whether a monitor is perfectly aligned, so you always wonder if the screen is tilted by two degrees. The screen manipulation can also take some getting used to.

In contrast, Workrooms uses your desk setup to align the screen, which always feels comfortable, but it requires you to set up your desk every time you use the app, which can be annoying. The ideal solution, which seems inevitable, would be to automatically detect your desk and lock the screens to that position and rotation.

One advantage of Immersed is that you can make the screen much larger, making text easier to read than in Workrooms.

Environment

Immersed boasts remarkable resolution for its skyboxes and distant surroundings, which I appreciate greatly, given my frustration with low-resolution skyboxes in our own VR development journey. The high-resolution skyboxes in Immersed create a visually pleasing experience, in contrast to the slightly irksome blurry environment in Workrooms.

Unlike games, where your body is fully immersed and involved, I found using Immersed with a virtual environment much less immersive because of the Quest Pro's open sides. And I don't want to use passthrough, because it's still a bit annoying to see your surroundings through a blurry, grainy filter.

Other

The delay when using Immersed does seem to be about 10% worse than in Workrooms (i.e. roughly 10% slower reactions), but it's still not a hindrance.

One frustrating aspect of Immersed is that it blacks out your physical computer screen. This is problematic when you're trying to troubleshoot the application or share something with someone nearby. During my few hours in Immersed, there were two instances where the virtual screen froze for about a minute, and I couldn't figure out the cause. Because my Mac's screen was blacked out, I couldn't continue working on the Mac either and lost some valuable time.

Conclusion

The VR industry is still grappling with the question of what's preventing widespread adoption: hardware or software. Based on my own experience with two major issues, pixels per degree (PPD) and passthrough, I'd say hardware is the bigger hindrance to using VR as a monitor.

As previously mentioned, I know many individuals who work in VR for more than 40 hours a week. I consider the Quest Pro to be the best headset for VR work at the moment, but I don't think it will be embraced by the general market. I'm eagerly awaiting advancements in this space, as I think we're close to having a lightweight pair of glasses that can replace heavy, clunky monitors.

“Most people overestimate what they can achieve in a year and underestimate what they can achieve in ten years.”


