A first try of Apple’s $3,500 Vision Pro Headset

    I got a taste of Apple’s vision for the future of computing on Monday. For about half an hour, I wore the $3,500 Vision Pro, the company’s first high-tech goggles, due for release next year.

    I walked away with mixed feelings, including a nagging sense of skepticism.

    On the one hand, I was impressed with the quality of the headset, which Apple touts as ushering in an era of “spatial computing,” where digital data merges with the physical world to unlock new possibilities. Imagine wearing a headset to assemble furniture while instructions are digitally projected onto the parts, or cooking a meal with a recipe displayed out of the corner of your eye.

    Apple’s device had high-resolution video, intuitive controls, and a comfortable fit, a combination that felt superior to the headsets I have tried over the past decade from Meta, Magic Leap, Sony, and others.

    But after wearing the new headset to view photos and interact with a virtual dinosaur, I also felt there wasn’t much new to see here. And the experience triggered an “ick” factor I’d never had before with an Apple product. More on this later.

    Let me start at the beginning. After Apple unveiled the headset on Monday, its first major new release since the Apple Watch in 2015, I got to try a pre-production model of the Vision Pro. Apple employees took me to a private room at the company’s Silicon Valley headquarters and sat me on a couch for a demo.

    The Vision Pro, which resembles ski goggles, has a white USB cable that plugs into a silver battery pack I tucked into the pocket of my jeans. To put it on my face, I turned a knob on the side of the headset to adjust the fit and attached a Velcro strap above my head.

    I pressed a metal button on the front of the device to turn it on. Then I went through a setup process, following a moving dot so the headset could calibrate to my eyes. The Vision Pro has a series of sensors to track eye movements, hand gestures and voice commands, which are the main ways to operate it. Looking at an icon is like hovering over it with a mouse cursor; to select it, you tap your thumb and forefinger together in a quick pinch, the equivalent of clicking a mouse.

    The pinch gesture was also used to grab and move apps around the screen. It was intuitive and felt less fiddly than waving the motion controllers that typically come with competing headsets.

    But it raised questions. What other hand gestures would the headset recognize for playing games? And how good would voice controls be when Siri’s voice transcription on the iPhone still doesn’t work properly? Apple hasn’t yet said what other gestures will be supported, and I wasn’t allowed to try the voice controls.

    Then it was time for the app demos to show how the headset can enrich our daily lives and help us stay connected.

    Apple first guided me through viewing photos and a video of a birthday party on the headset. I could turn a dial on the front of the Vision Pro counterclockwise to make the photo backgrounds more transparent and see the real world, including the Apple employees around me, or clockwise to make the photo more opaque to immerse myself.

    Apple also let me open a meditation app in the headset that showed 3D animations while soothing music played and a voice instructed me to breathe. But the meditation couldn’t prepare me for what came next: a video call.

    A small window popped up: a notification of a FaceTime call from another Apple employee wearing the headset. I stared at the answer button and pinched my fingers to take the call.

    The Apple employee in the video call used a “persona,” an animated 3D avatar of herself that the headset created using a scan of her face. Apple portrays video conferencing through the personas as a more intimate way for people to communicate and even collaborate in a virtual space.

    The Apple employee’s facial expressions looked lifelike, and her mouth movements were in sync with her speech. But from the way her avatar had been rendered, with a uniform texture across her face and no shadows, I could tell it was fake. It looked like the video holograms I’d seen in sci-fi movies like Minority Report.

    In the FaceTime session, the Apple employee and I were supposed to work together on a 3D model in an app called Freeform. But I stared at it blankly, thinking about what I was seeing. After three years of being largely isolated during the pandemic, Apple wanted me to engage with what was essentially a deepfake video of a real person. I felt myself shutting down. My “ick” feeling was probably what technologists have long called the uncanny valley, the unease a person feels when a machine creation looks almost, but not quite, human.

    A technological tour de force? Yes. A feature I would like to use with others every day? Probably not soon.

    To round off the demonstration with something fun, Apple showed a simulation of a dinosaur that moved toward me when I held out my hand. I’ve seen plenty of digital dinosaurs in virtual reality (almost every headset maker that has given me a VR demo in the last seven years has shown a Jurassic Park-style simulation), and I wasn’t thrilled with this one.

    After the demo I drove home during rush hour and processed the experience.

    Over dinner, I talked to my wife about the Vision Pro. Apple’s goggles, I said, looked and felt better than the competing headsets. But I wasn’t sure whether that mattered.

    Other headsets, from Meta and Sony’s PlayStation division, were much cheaper and already quite powerful and entertaining, especially for playing video games. But whenever we had guests over for dinner and they tried those goggles, they lost interest after less than half an hour because the experience was exhausting and they felt socially disconnected from the group.

    Would it matter that they could turn the dial on the front of the headset to see the real world while wearing it? I suspect it would still feel isolating, since they would probably be the only person in the room wearing one.

    But more important to me was the idea of connecting with others, including family members and colleagues, through Apple’s headsets.

    “Your mother is getting old,” I said to my wife. “When you FaceTime with her, would you rather see a deepfake digital avatar of her, or a messier video call where she holds the phone’s camera at an unflattering angle to her face?”

    “That last one,” she said without hesitation. “Although I’d much rather see her in person.”