The reviews are in and the technical press praises the Apple Vision Pro headset for delivering on the company’s promises. It’s well designed, the video and sound are surprisingly accurate, and the “Minority Report”-style gesture interface is future-proof. No one knows exactly what it’s for, or if even the Readiest Players One will spend $3,500 on it, but hey, those are gadgets for you.
Yet this gadget marks a new frontier. The Vision Pro, like Meta's similarly equipped Quest 3 and Quest Pro headsets, uses so-called "passthrough": video cameras and other sensors capture images of the outside world and reproduce them inside the device. They give you a synthetic environment made to resemble the real one, with Apple apps and other unreal elements layered on top of it. Apple and Meta hope that this virtual world will be so attractive that you won't just visit it. They hope you will live there.
That could unfortunately have some very strange and very messy consequences for the human brain. Researchers have found that widespread, long-term immersion in VR headsets can literally change the way we perceive the world – and each other. “We now have companies advocating that you spend many hours in it every day,” says Jeremy Bailenson, director of the Virtual Human Interaction Lab at Stanford. “You have a lot of people, and they wear it for many, many hours. And everything is scaled up.”
Meaning: Our brains are about to undergo a massive, society-wide experiment that could rewire our sense of the world around us, and make it even harder to agree on what reality means.
The short-term effects of virtual reality are well known. People in synthetic environments tend to misjudge distance, both far and near. That's no surprise: even in the real, three-dimensional universe, our ability to determine how close or far something is depends on all kinds of external cues. Virtual environments, with their lower resolution and synthetic 3D, make things worse, which is especially bad if you're one of those users who posts videos of yourself skateboarding or driving while wearing a mixed-reality headset. You think your hands are in one place, but actually they're in another, and soon you're driving your Honda Civic through a supermarket.
Objects seen through a headset can also take on a funhouse-mirror quality. This is called distortion: things warp, changing size, shape, or color, especially when you move your head. A video display can't match the processing speed and reliability of your eyes and brain.
These are all, as the IT people say, known issues. For a few minutes or an hour, long enough to play a game or watch a movie, they're minor annoyances. But wear perception-altering goggles for days, as Bailenson's team of researchers did, and the problems get worse. Way worse.
The team spent a few weeks wearing Vision Pros and Quests around campus, trying to do all the things they would have done without them (with a minder nearby in case they tripped or walked into a wall). They experienced "simulator sickness": nausea, headaches, dizziness. That was odd, considering how experienced they all were with headsets of every kind. And they felt all the effects of distance and distortion: they misjudged how far elevator buttons were from their fingers, or had trouble bringing food to their mouths. But as any of us would, they adapted. Their brains and muscles learned to compensate for their new view of the world.
That sounds like a solution, but it isn't. When people adapt to a perceptual change for long enough, the real world starts to look wrong in the opposite direction. If you wore glasses that flipped your vision upside down, you'd have to readjust once the glasses came off. The longer you spend in a funhouse world, the longer the strange perceptual aftereffects last. So people who spend their workday in a Vision Pro may go home at night with a miscalibrated targeting system and what feels like a mushroom hangover.
This is where passthrough video becomes uniquely important. Old-school cyberpunk imagined virtual reality as an all-encompassing synthetic environment. New-school techies, meanwhile, proposed an augmented reality of digital pop-ups floating on clear lenses, Google Glass-style. But both approaches have their limits. Full, sense-isolating VR hasn't advanced much beyond niche entertainment, while AR tends to make both the apps and the real world look bad. Visually, passthrough is the least-bad solution. Socially, its consequences are scarier.
Because passthrough captures and then re-renders reality, it can have an unnerving, distancing effect over time. When Bailenson's colleagues actually tried to talk to people, the world turned into one giant, confusing Zoom. Video chats are, as we've all experienced, plagued by lagging responses and missed social cues. Conversations lose their subtlety, but for a meeting that's good enough. Passthrough amplifies the effect: the people you talk to start to seem unreal. Up close they look like avatars. Farther away they become just part of the background.
Bailenson describes the feeling as a sense of social absence. Other people just aren't quite there. He doesn't put it this way, but I'll wave the caution flag: long-term use of passthrough headsets could make it easier to think of other people as less than human, non-player characters in a gamified uncanny valley.
We all live in our own perception bubbles. Every person has slightly different sensory thresholds: we see colors a little differently, hear at different levels of acuity, are more or less sensitive to different smells. And we process all that with brains that are uniquely tuned, first by our genes, and then by a lifetime of neural changes, of thought and action.
But overall we agree on some common points. Even if your blue looks a little different than mine, we can agree on the color of the sky. Maybe my tolerance for chili peppers is higher than yours, but we both know when we’re eating them.
Headsets make the walls of those sensory bubbles even thicker and harder to bridge. We already lack common political ground. With millions of Americans wearing VR headsets for hours on end, we may not even be able to agree on a shared physical reality. The headsets will bring things into our visual world that no one else can see. The objects are not objective.
And that’s not all. “These headsets can not only add things to the real world, they can also remove them,” says Bailenson. He first realized VR’s strange editing feature when he played a game on the Quest 3 that “turned off” parts of the real walls around him and replaced them with a virtual scene. “I’ve been doing VR and AR for a while,” he says, “and I’ve never seen deletion work so well in my life.”
At first that seems pretty great. Are you stuck in a crowded bus? Remove everyone and replace them with the first-class cabin of a jumbo jet. Do you hate intrusive billboards? Replace all commercial images with soothing vistas of your choice.
But what happens when the technology gets good enough to remove, say, homeless people? Or Pride flags? You can see where I'm going with this: literal erasure. When the science fiction writer William Gibson came up with the concept of cyberspace, he described it as a "consensual hallucination." This is the exact opposite: billions of discrete, unshared hallucinations, one for every special snowflake.
“What we’re going to experience is that by using these headsets in public, common ground disappears,” Bailenson says. “People will be in the same physical place and experiencing visually different versions of the world at the same time. We will lose common ground.”
Granted: Everyone always panics about new consumer technology, and the panic is almost always the same. The new form of sensory input will harm the children! It's a dangerous distraction! It's socially alienating! They said it about the iPhone, about the Walkman... hell, half a millennium ago they said it about the book. New technologies come along, and we adapt.
And I don’t have to lean hard on my nerdiness to come up with fun sci-fi applications for passthrough. The real potential here is the ability to see the world’s invisible information metastructure: translation overlays; pop-up tags that display people’s names and pronouns and where you know them from; walking routes; x-ray vision linked to the user manual for assembling an Ikea coffee table. Link my shopping list to the aisles I need to visit in the supermarket. Maybe even expand my vision beyond what my flesh-sack eyes can do, and let me see into the ultraviolet, or sense electric fields. Passthrough has limits, but it can also have superpowers.
Like me, Bailenson is no hater; he likes VR and thinks the new headsets are exciting. He knows that over time the displays will gain resolution and render faster, and that new algorithms will minimize distortion. It's not the technology he's worried about. It's how deeply we'll immerse ourselves in it.
“The world will be all right,” he says. “People adapt to the media. These headsets are incredible. But philosophically, I don’t believe we should wear these headsets for hours every day.”
We've been here before, and very recently. About a decade ago, no one thought about the unintended consequences of throwing millions of people onto impossible-to-moderate social networks. And we all know how that turned out. Now we're about to put millions of people into helmets that will give each of us our own editable reality. That's why the kind of research Bailenson is doing on passthrough headsets matters so much. "I encourage all scientists to act with some urgency to understand them," he says.
In the meantime, while he's at it, maybe remember to take off that Vision Pro every now and then. The longer you keep it on, the more you turn yourself into a human guinea pig, one with very, very poor depth perception.
Adam Rogers is a senior correspondent at Business Insider.