I Tried the New Apple Vision Pro

The biggest tech company in the world is entering an entirely new realm. After years of rumors and hype, Apple announced Vision Pro, a headset the company calls its first spatial computer, last June.

Ask just about anyone else, and they’ll call it virtual reality (or mixed or extended reality, depending on your source). Whatever it is, it’ll be available for tech fans soon, as Apple set a launch date of February 2. I had a chance to test out the Vision Pro this week at Apple’s NYC showroom ahead of its availability for pre-order this Friday. After my experience, I’ll just call it The Future.

I tried the Vision Pro in a controlled demo setting, which took about an hour from fitting to finish. I don’t have a great deal of experience with other headsets you might consider to be comparable to Apple’s device—just a few go-rounds on the Meta Quest 2 unit currently collecting dust beneath my couch—so I wasn’t entirely familiar with the feeling of having a computer strapped to my face. What I did recall from my time with the other unit was a lot of waving my arms and cartoony graphics. That was not what awaited me here.

The Apple Vision Pro’s Fit

Getting set up for the Vision Pro starts with a process familiar to iPhone users: a facial scan, similar to the one used for the Face ID biometric feature. I was told this would be used to determine the proper fit for me. Since I don't have any issues with my vision, that was all I needed. The headset can't be worn over glasses, though, so glasses-wearers will need to add their prescription info at this point in the process.

My head is on the smaller side and has always been tough to fit, and this was no exception. The initial scan indicated that I might be able to use the larger strap on its tightest setting, or the smaller one at its loosest. Sure enough, the larger strap didn't fit, so I opted for the smaller one. Switching between them took just a click of a magnetic clasp. I could adjust the fit with a knob on the rear of the strap, and its padded loop was comfortable. After having the Vision Pro strapped on for the better part of an hour, however, I felt a slight strain in my neck. I switched to the double strap for the last portion of the demo and was immediately much more comfortable.

I don’t think this will be a huge issue for most users, and I say this as someone with experience wearing heavy objects on my head. When I played football, the first few days of every season would be spent acclimating to the added weight of the helmet. After that period, I wouldn’t notice the added weight at all. The Vision Pro isn’t quite as heavy as a football helmet (although it is more front-loaded on the face), but I wouldn’t be surprised if there were an acclimation period here, too.

I did laugh at the thought that if the Vision Pro is adopted en masse, its use might help to combat the effects of tech neck (the craning posture adopted by screen device users) as the headset wearers develop their neck muscles looking forward instead of downward.

Using the Apple Vision Pro

This is one of the most hackneyed conventions in tech reviewing, but I'm not sure how else to say it: Using the Vision Pro is straight out of a sci-fi movie. A built-in suite of LEDs and infrared cameras tracks eye movement, so navigating the UI (which will be instantly familiar to anyone who has used anything in Apple's software ecosystem) is as simple as looking at an app icon.

Once the object of my attention was in my sight, I noticed a quick audio and visual response. From there, all I needed to do to launch a program was tap my thumb and forefinger together (a simpler version of the Apple Watch's Double Tap gesture). I didn't have to lift my hand into my frame of vision, and I didn't have to reach to grab at insubstantial pixels. I looked, I tapped, and I was in the app. I picked this up quickly, needing just a few tries to figure it out. I did run into small moments when I tapped to no response, thanks to my habit of crossing my arms when I sit; the system worked better when my left and right hands were clearly separate.

Once I was in the apps, navigation remained easy. The demo covered the Photos app, where I learned some more gestures while marveling at the pictures that filled my entire field of vision. Pinching the edge of a photo's frame with one hand and pulling it toward me let me zoom in. Scrolling through a collection of images and video was simple, too: just pinch and flick the wrist up or down to shift through the grid.

Using the Vision Pro is like jumping inside an iPad and interacting with the OS with your hands. Familiar motions, mediated through touchscreens for the 15-plus years we've used smartphones and tablets, are now untethered from those surfaces. The interaction between human and device is just a bit more seamless, and I was giddy thinking about the potential uses that will be developed once the tech makes its way out into the world. I also felt like Tom Cruise in Minority Report, pinching and scrolling my way through the demo.

The other standout feature is one of Apple's major points of focus for the device: I could see and interact with people in the room with me. I wasn't just in a black box, and I wasn't wholly detached from the reality of my surroundings. This isn't because the device's screen is transparent; instead, it uses a pair of cameras to capture the room (and people) around me. How much I perceived of that space depended on what I was viewing on the device. When one of the other people in the room caught my attention while I had an app open, he slowly phased into focus as my eyes locked onto him. Once I shifted my eyes away, he faded back out. It was a bit surreal, but much preferable to my other experience using a headset device, when I had no idea that my dog had entered the room and was looking for attention. I later saw how the Vision Pro looks from the outside when I approached an Apple employee wearing one at the demo. A sheen of light covered the screen, then receded to reveal his eyes as he looked at me. I appreciate the ability to interact, even if the effect remains slightly uncanny.

The Apple Vision Pro’s Programs

The demo walked me through some fairly simple Vision Pro programs, including its standard browser and photo and video viewing experiences. I can already imagine myself spending hours looking back through photos I've taken over the years in this massive format (I do this enough already with my phone), but the Spatial photo and video format is really special. These frames, which you can capture using the Vision Pro or an iPhone 15 Pro, play back in immersive 3D.

The Spatial photos were a bit trippy, but the video was remarkable. The demo placed me at a table in a family's kitchen, watching a mother talk to her kids while the dad cleaned up at the sink in the background. The sensation was immediately affecting; I was there, ready to chime in. I imagined how this could feel with my own nephews sitting at the table; clearly, this will be a potent tool for people to relive memories with their loved ones. There was a distortion at the edge of these clips, though, that reminded me of the way filmmakers often present memory in their art. It also recalled a more cautionary piece of fiction: "The Entire History of You," the Black Mirror episode that depicts the tragic potential of crystal-clear memory recall on demand. The Vision Pro is a long way from always-on implantable video recording, though, so I won't worry just yet.

Viewing other media was similarly impressive. I watched a clip of The Super Mario Bros. Movie in a mountain environment, then, much to the delight of my seven-year-old self, the trailer for Star Wars: A New Hope from a landspeeder on Tatooine as part of the Disney+ app.

The last, biggest standout for me was Immersive Video. I was shown a clip highlighting the feature, which Apple calls a new entertainment format featuring 180-degree 3D 8K recordings captured with Spatial Audio. The sizzle reel cycled between documentary footage, an Alicia Keys performance, and live sports. I have never experienced anything like it.

When a slackliner stepped off a ledge, I felt like I was suspended in midair with her. Baby rhinos sidled up to me, and I anticipated them rolling into my lap. I was in the crowd along the first-base line of an MLB game, watching the umpire hawk a loogie into the dirt, then electrified when a soccer player kicked the ball through a goalie's hands into the net. I don't even like soccer, but it was thrilling. If Apple is able to make a major play with Immersive Video, it could very well change the way we engage with moving pictures, including how we watch live events.

Apple Vision Pro Final Thoughts

I walked away from my demo feeling like I had just been given a look at the next big thing, but that doesn't mean there were no drawbacks. As I mentioned earlier, the fit wasn't exactly perfect. Swapping the standard Solo Knit Band for the more supportive Dual Loop Band fixed that for me, but it would be preferable if I didn't have to make as many concessions. The device is also tethered to a battery pack (Apple estimates two hours of general use and 2.5 hours of video playback when fully wireless, so most people will likely want to keep it plugged in unless they're on the go). Standing up and slipping the pack into my pocket was easy enough, but that wouldn't be the case if I wanted to jump around. The unit was also a bit heavier than I'd like for a fitness setting with lots of movement (and there are no active fitness features being touted at launch), so exercise buffs will probably have to wait for a hardware update for extended reality workouts.

I also realized that I was feeling queasy as the demo progressed. I've had some issues with motion sickness before, mostly on roller coasters or while paragliding, but never when I wasn't moving. I spent the later sections of the demo gritting my teeth. I wasn't sure whether my vision correction was totally right, either; the sensation wasn't unlike putting on a pair of glasses with the wrong prescription. I could see everything just fine, but I could feel my gorge rising. Finally, just before the end of the session, I asked for a break and took the unit off. It was that powerful.

Once it was off, I realized I had been doing all this on an empty stomach, after an intense workout earlier in the day. I had a quick snack and a drink of water, and I felt much better once the unit went back on. I'll chalk that up to user error, but I can also imagine that users might have to acclimate to the Vision Pro in a way they don't with their computers, tablets, and phones. Next time, I'll come to it with a full stomach.

As remarkable a piece of tech as the Vision Pro is, the $3,499 price tag can't just be ignored (and if your eyes aren't perfect, tack on an extra $99 or $149 for reader or prescription lens inserts). That's equivalent to multiple laptops and iPhones (not to mention more than a month of my rent), or, if you do think it's fair to compare it to other headsets, two to three times the cost of the most expensive units currently on the market. That sticker shock will be a tough pill to swallow for lots of casual tech fans. That said, I do remember when a good TV cost about that much (and high-end 8K models can still hit similar and higher price points).

Still, the Vision Pro is the type of device that fully captures the imagination. I can’t stop talking about it. I’m envisioning use cases beyond what I saw and how the device will fit into the daily lives of its adopters, and I keep coming up with more potential ways it might be used. More than anything, I was impressed by the way Apple was able to translate its familiar interface into an entirely new framework. That’s the breakthrough here. The hardware should improve to be more user-friendly, both in terms of its function and its cost. That happens with every type of device. The key is that the system behind that hardware is already there. If enough people are ready to enter this new field of reality—whether you call it virtual, mixed, extended, or anything else that fits—we could be on the verge of another massive shift in how we interact with computers and the world around us.

Brett Williams, NASM

Brett Williams, a senior editor at Men’s Health, is a NASM-CPT certified trainer and former pro football player and tech reporter. You can find his work elsewhere at Mashable, Thrillist, and other outlets.
