Thursday, 18 September, 2025 UTC


Summary

Meta Horizon Hyperscape, rolling out now in the US, lets you capture a real-world scene with your Quest 3 and visit it in VR with photorealistic graphics.
At Connect 2024 last year, Meta released a Quest 3 demo app, Horizon Hyperscape Demo, showcasing six near-photorealistic scans of real-world environments. At the time, Meta said it eventually planned to let you scan your own environments.
Now, the company is delivering on that promise.
There are three steps to creating a Horizon Hyperscape capture. The first two take place in the headset, and the third on Meta's servers.
Inside your Quest 3 or Quest 3S, the capture process starts by having you pan your head around the room to create a scene mesh, the same way you would when setting up the headset for room-aware mixed reality. This takes 10 to 30 seconds, depending on the size and complexity of your room.
The next step is by far the most laborious. You walk around the room and clear away the overlaid 3D mesh by bringing your headset close to every part of it. This takes several minutes.
From here, your work is over, but the Hyperscape isn't ready yet. The scanned data is uploaded to Meta's servers, and 2 to 4 hours later you get a notification that your scan is ready to view.
As with other volumetric scene capture and reconstruction technology we've seen in recent years, Horizon Hyperscape is made possible by Gaussian splatting.
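For context, Gaussian splatting represents a scene not as a textured mesh but as a huge cloud of translucent 3D Gaussians. To color a pixel, the renderer projects the Gaussians that overlap it and alpha-blends them front to back; in the standard formulation from the original 3D Gaussian Splatting research (not anything Meta has published about Hyperscape specifically), the pixel color is

    C = \sum_{i \in \mathcal{N}} c_i \, \alpha_i \prod_{j=1}^{i-1} (1 - \alpha_j)

where c_i is the color of the i-th overlapping Gaussian and α_i is its opacity after projection to 2D.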
And like in the demo from last year, the rendered Hyperscapes are cloud streamed from Meta's servers, leveraging the technology it internally calls Project Avalanche. None of the hard work involved with Horizon Hyperscape happens on-device.
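To give a rough sense of how this kind of split rendering is typically structured, here is a simplified, hypothetical Python sketch (every name in it is invented; it is not Meta's actual Project Avalanche code): the headset only tracks your pose and decodes video, while the server does the expensive splat rendering.

    from dataclasses import dataclass

    @dataclass
    class HeadPose:
        """6DoF head pose from the headset's tracking system."""
        position: tuple     # (x, y, z) in meters
        orientation: tuple  # quaternion (x, y, z, w)

    def render_splats_on_server(pose: HeadPose) -> bytes:
        """Stand-in for the server side: rasterize the Gaussian-splat scene
        from this viewpoint and encode the result as a video frame."""
        return f"encoded frame rendered for position {pose.position}".encode()

    def decode_and_display(frame: bytes) -> None:
        """Stand-in for the client side: decode the streamed frame and show it."""
        print(frame.decode())

    def client_frame_loop(num_frames: int = 3) -> None:
        """Per-frame loop on the headset: upload the latest pose, receive a
        remotely rendered frame, and display it. Heavy rendering stays remote."""
        for i in range(num_frames):
            pose = HeadPose(position=(0.0, 1.6, 0.01 * i),
                            orientation=(0.0, 0.0, 0.0, 1.0))
            frame = render_splats_on_server(pose)  # a network round trip in reality
            decode_and_display(frame)

    client_frame_loop()

In a real pipeline the render call would be a network round trip, and the headset would typically also reproject each received frame to its very latest head pose to hide streaming latency.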
Screenshots of the virtual versions of four scanned environments.
I tried Horizon Hyperscape in person at Meta Connect 2025. I first visited a room in real life, then visited its Horizon Hyperscape capture in VR. I also virtually visited four captures of farther-off places, including the UFC Octagon and Gordon Ramsay's home kitchen.
While this wasn't the sharpest VR I've ever seen, since Quest 3 has roughly a quarter of the angular resolution of Meta's Tiramisu prototype, it was the most graphically realistic. That these captures were created on the same Quest 3 headsets many of us have at home, and can be viewed on them too, is nothing short of groundbreaking.
It wasn't perfect, to be clear. I could see blur on some details like small text, and by performing a deep squat I could see significant distortion underneath furniture, in areas the headset never saw during the scan. But these minor imperfections aside, this was still the most realistic automatic scene capture I've ever seen.
One thing that blew me away is how well Horizon Hyperscape captures the backgrounds of scenes. Looking out the window of Gordon Ramsay's home, I could see his garden at the correct distance and perspective, with remarkable detail given how far it is from where the capture was taken. No other automatic scene capture technology I've tried handles this anywhere near as well.
In a future update, Meta plans to make Horizon Hyperscape multiplayer, via Horizon Worlds, letting you scan an environment and invite friends over to visit it. There's no specific release timeline for that capability, but Meta says it's coming "soon".
Meta Horizon Hyperscape Capture (Beta) is "rolling out" on the Meta Horizon Store in the US, with support for Quest 3 and Quest 3S. More countries are also coming "soon".