The Vision Pro is a mixed-reality headset with an alloy frame and a fabric head strap that holds a “spatial computer” over the user’s eyes to project a computer interface.
A piece of laminated glass curves over the front of the headset and houses a variety of sensors and cameras that capture the exterior surroundings and project them onto the displays within the headset, so users can see what is going on around them.
Computing systems within the headset simultaneously project a digital user interface into the room, so that applications such as web browsers appear to be floating three-dimensionally wherever the user looks.
This digital interface is controlled by eye movements: the eyes act as a “cursor”, selecting the various apps and screens, while the hands, without any additional hardware, make small gestures to select options, change screens, arrange multiple windows in the spatial array and expand or contract the various screens.
The interface also includes a virtual keyboard that appears in front of users, who can type directly on it, with audio settings that make the keystrokes feel more physical.
Voice commands can also be used, and the headset is equipped with spatial-audio speakers. It can be connected directly to a power source or to a portable external battery that allows for up to two hours of use.
Its frame has a dial on the upper right that acts as a power and “home” button and controls the “blend” between the virtual interface and the physical environment shown inside the headset.
When the dial is turned up, the virtual elements can blot out the physical surroundings entirely, immersing the user in a 360-degree digital environment built from visual assets filmed with spatial image-capture technology.
Apple’s iPhone 15 Pro models can capture this spatial video directly, and the footage can be uploaded to the Vision Pro, allowing users to send each other manually captured 3D environments.
Hundreds of apps have already been built for the interface, in addition to those available on previous Apple devices, with more expected from third-party developers as the device gains users.
Existing hardware, such as the MacBook laptop, can also be used with the device, allowing users to integrate physical keyboards with the interface.
Integration with the physical environment is an important element of the interface. Even when the dial is turned all the way up, spatial-awareness sensors indicate to the user when someone in the physical environment interacts with them.
When this happens, the person nearby appears through the veil of the interface.
The laminated glass front of the device also enables functions that indicate how aware the user is of their surroundings.
When the headset is in full use, the front screen displays a colourful pattern; when the person needs to speak to someone, their eyes, captured by internal cameras, are projected onto the laminated screen in real time.
When the design was announced in June 2023, Apple CEO Tim Cook said the device would usher in a “new era for computing”.
Hundreds of people gathered at Apple’s Fifth Avenue retail store in Manhattan last Friday for the release of the company’s marquee headset. Cook was present for the launch, and press and potential customers were ushered into the subterranean store, designed by Bohlin Cywinski Jackson.
Because the device needs some customisation to fit each user’s face, a sales representative walked every customer through the interface, and Apple installed a series of special benches to facilitate this process.
Other mixed-reality headsets include a device by Finnish technology company Varjo, which was announced in 2018.
The photography is courtesy of Apple.