Meta is testing a foundational upgrade for its Quest VR headsets: it lets you tap and scroll through virtual elements with your fingers, no controllers required. You'll be able to perform actions you may already know from your smartphone, like scrolling up and down pages, activating buttons, and typing on an onscreen keyboard, with just your fingers.
The new experimental feature is called Direct Touch, and it's part of the Quest v50 software update, which is currently rolling out. After weeks of waiting, the update finally arrived on my headset, and I was excited enough to turn the feature on immediately.
The Quest 2's hand tracking uses the headset's external cameras to track your hands, which appear in VR as shadowy, hand-like silhouettes. (CEO Mark Zuckerberg's video of Direct Touch shows more detail in the hand and arm.) The headset uses these silhouettes to estimate when your hand will touch a menu or window. With Direct Touch, you make contact with virtual elements, and they scroll or light up when you do. Scrolling can be a bit jerky, but it's usually far more responsive than I expected.
Typing with Direct Touch, though, is a pain. Tap a part of the UI that accepts text, and the Quest's keyboard appears under the window; you can then "press" specific keys to spell out what you want. But with nowhere to rest your fingers or hands, it's hard to tell where, or even what, you're typing. Imagine the minimal feedback of the iPad's onscreen keyboard, and now imagine there's no glass. Even when I hunt-and-peck in VR to write a single word, the UI sometimes registers a key other than the one I intended. The keyboard does suggest words as you type, which helps in a pinch.
The Quest's web browser is the best showcase of Direct Touch's controls, both the poor typing and the good scrolling. The search engine will usually correct whatever spelling errors I make in a web search. Scrolling up and down works fine, as does tapping on links. Strangely, The Verge's homepage won't scroll past the Quest's Top Stories list, but tapping on any of the six stories I can see works better than I expected.
Most of the other Quest apps I tried were usable with Direct Touch. But many Quest Store apps, including Meta's own Horizon Worlds social network, haven't been updated to work with your hands; without a controller, they wouldn't open at all. I didn't expect apps like Beat Saber to play well without controllers, but I'd hoped to at least have the chance to try.
For now, Direct Touch feels like an experiment. With every mid-air poke, I doubt whether my hand will actually "touch" the Quest's virtual UI, and using it for more than a few seconds at a stretch gets frustrating. Holding my arms out just to move around the UI is tiring. Meta's other controller-free gestures (which involve pinching) are more reliable, but I find them less intuitive.
Direct Touch is a very cool idea
Despite all this, I think the idea behind Direct Touch is still very cool. Scrolling and tapping on virtual surfaces in my VR headset makes me feel like I'm living out a sci-fi fantasy, even though my words per minute drop by 99 percent and I never quite trust that a tap will do what I expect. When it works, Direct Touch is much more user-friendly than the Quest's controllers. That's a big asterisk, but just popping on the headset and scrolling through a document with my hands eliminates much of the friction I associate with using the Quest. Still, Direct Touch can be finicky, so I make sure the controllers are always nearby.
It's also easy to see the potential of this technology, especially if Meta's yet-to-be-realized AR glasses come to fruition; you won't want to carry a controller to wear those glasses out in the world. And Meta's devices may not be the only ones we use with our hands in the air: Apple's long-rumored mixed reality headset could reportedly let users type on onscreen keyboards, which suggests Apple is exploring these kinds of interactions, too.
For now, I'll mostly stick with the Quest's controllers. But if I need to quickly check something on my headset, I might leave the controllers on my table and use my hands instead. It may take three times as long, but it's a hell of a lot more fun.