Apple’s “Designing for Spatial Input” developer session offers an in-depth look at the ways users will ultimately control its new Vision Pro head-mounted display, including a virtual keyboard you can type on in mid-air. In the session, two members of Apple’s design team walk prospective developers through best practices for building apps for the new platform.
Apple generally wants users to interact with the headset by looking at UI elements and making small hand gestures while their arms rest relaxed in their lap. But Apple designer Israel Pastrana Vicente acknowledged in the session that “some tasks can be better suited for direct interaction,” such as reaching out to touch UI elements, a feature Apple calls “direct touch.” Apple also supports physical keyboards, trackpads, and game controllers.
“Some tasks require direct interaction”
Now let’s talk about the Vision Pro’s virtual keyboard. Apple designer Eugene Krivoruchko says it should provide plenty of audio and visual feedback to compensate for the “missing tactile” information you’d get from a physical peripheral. Buttons display a hover state when a finger is held above the keyboard, and their highlight gets brighter as the finger approaches the button’s surface. That brightness acts as a distance cue and guides the finger toward its target. On contact, the state change is fast and responsive and is accompanied by a matching spatial sound effect.
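The proximity-highlight behavior Krivoruchko describes could be modeled roughly as follows. This is purely an illustrative sketch of the logic, not Apple’s actual visionOS API; the function names and distance thresholds are invented:

```python
# Illustrative model of the hover/highlight behavior described above.
# All names and thresholds here are invented for illustration; this is
# not the visionOS API.

HOVER_DISTANCE = 0.08   # assumed: metres above the key at which hover begins
CONTACT_DISTANCE = 0.0  # fingertip touching the key surface

def key_state(finger_distance: float) -> dict:
    """Map fingertip-to-key distance to a visual/audio feedback state."""
    if finger_distance <= CONTACT_DISTANCE:
        # Fast, responsive state change on contact, paired with a
        # spatial sound effect.
        return {"state": "pressed", "highlight": 1.0, "play_sound": True}
    if finger_distance <= HOVER_DISTANCE:
        # The highlight brightens as the finger nears the key surface,
        # acting as a distance cue that guides the finger to its target.
        brightness = 1.0 - finger_distance / HOVER_DISTANCE
        return {"state": "hovered", "highlight": brightness, "play_sound": False}
    return {"state": "idle", "highlight": 0.0, "play_sound": False}
```

The key idea is that highlight intensity varies continuously with distance, so the user gets depth feedback before the “press” ever happens.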
Meta has released a similar experimental feature that lets Quest users tap on virtual keyboards and menu buttons. UploadVR notes, however, that Apple’s Vision Pro, with its depth sensors, is likely to perform better than Meta’s implementation.
Direct touch can also be used to interact with other parts of the system. Apple’s demo showed a wearer making a tap gesture to draw a heart and write a word with a pen in Markup. Krivoruchko says that although the interaction is mainly driven by the user’s hand, eye tracking enhances the gesture: you control the brush cursor with your hand, much like a mouse, but if you look at the opposite side of the canvas and tap, the cursor jumps there, landing exactly where you’re looking. This creates a feeling of precision, the designer says, and lets you quickly cover a large canvas.
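The gaze-assisted cursor behavior described above can be sketched as a simple rule: the hand moves the cursor incrementally, but a tap while looking far from the cursor teleports it to the gaze point. Again, this is an invented illustration of the logic, not Apple’s actual API, and the threshold value is an assumption:

```python
# Illustrative sketch of the gaze-assisted cursor behavior described
# above. All names and values are invented; this is not Apple's API.

JUMP_THRESHOLD = 0.3  # assumed: gaze must be at least this far (in
                      # normalized canvas units) from the cursor before
                      # a tap teleports the cursor to the gaze point

def _distance(a, b):
    """Euclidean distance between two (x, y) points."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def update_cursor(cursor, hand_delta, gaze, tap):
    """Return the new cursor position on the canvas.

    cursor, gaze: (x, y) tuples in canvas coordinates.
    hand_delta:   relative hand movement since the last frame.
    tap:          whether a tap gesture occurred this frame.
    """
    if tap and _distance(cursor, gaze) > JUMP_THRESHOLD:
        # Tap while looking elsewhere: the cursor jumps to where
        # you're looking, landing exactly on the gaze point.
        return gaze
    # Otherwise the hand drives the cursor, similar to a mouse.
    return (cursor[0] + hand_delta[0], cursor[1] + hand_delta[1])
```

Tying the jump to a distance threshold means small strokes stay fully hand-driven, while tapping across the canvas gives a quick, gaze-accurate repositioning.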
Our hands-on experience, along with developer sessions like these, is starting to bring the Vision Pro experience into focus.