Valve confirmed that it's not currently working on a new first-party VR game.
Valve today officially announced Steam Frame, a “streaming-first” standalone VR headset launching in “early 2026”. While the company is aiming to make your existing Steam library more valuable, the announcement naturally raised the question: following 2020's Half-Life: Alyx, is Valve developing new VR games for the headset?
Speaking to UploadVR during our recent visit, Valve told us that it's “not talking about content today.” However, Road to VR says that "a member of the Steam Frame team" denied that it has any VR content in development, offering what the publication described as a "simple and definitive no".
While Alyx wasn't a launch title for the Valve Index headset, the groundbreaking title arrived less than a year after the headset did. Before that, Valve had confirmed it was developing a flagship VR game, whereas Steam Frame will seemingly rely on existing and third-party titles.
As for Steam Frame itself, the newly announced headset uses a lightweight modular design and runs a VR version of Valve's SteamOS, the operating system Valve previously shipped on Steam Deck. The headset also uses an updated version of the Proton compatibility layer, meaning it can run almost any Linux, Windows, or Android game.
If you've spent money on games from Steam, Valve's aim with Steam Frame is to make that library more valuable.
50 percent of Steam users still run their games at 1080p, according to the latest Steam Hardware Survey. How many of those gamers have never seen a monitor refreshing faster than 60Hz? Or a modern VR display running its content at 120Hz?
Setting aside the subset of people, mainly gamers, who know the difference between 30Hz and 60Hz firsthand, how many people worldwide have never seen motion displayed on a screen faster than 60Hz?
In 2026, Valve will follow the Steam Deck by starting to sell its most portable standalone PC ever. Steam Frame is worn on your face, with experimental display support up to 144Hz.
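To put those refresh rates in perspective, here's a quick back-of-the-envelope calculation (plain arithmetic, not a Valve figure): frame time is simply 1000 ms divided by the refresh rate, so the window each frame occupies on screen shrinks fast as Hz climbs.

```python
# Frame time shrinks as refresh rate rises: 1000 ms divided by Hz.
for hz in (30, 60, 90, 120, 144):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
```

At 144Hz a frame is on screen for under 7 ms, less than half the 16.7 ms that most 60Hz monitor owners have ever seen.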
What Is Steam Frame?
Steam Frame and controllers, image provided by Valve.
Steam Frame is a VR headset, personal computer, and probably a whole lot of other things once its community verifies Valve's claims that you can swap out the operating system or plug other accessories into its high-speed nose port.
"We don't block anyone from doing what they want to with their device," Valve's Lawrence Yang said.
Developers and Steam Frame buyers can use high frame rates in different ways, but at its core this headset is a portable stereoscopic display attached to its own compute unit. There's dedicated hardware to enable high quality streaming from nearby PCs also running Steam and, in standalone mode, you can still get something like a 1440p picture on what feels like a 70-inch virtual display running at 90, 120, or potentially 144Hz.
UploadVR played Hades 2 at Valve HQ in a demo that conveyed the power of having a virtual display comfortably floating anywhere with smoothly animated content displaying in the Steam Frame at rates usually only associated with high end gaming monitors tethered to desks. Arms at your sides with a controller in each hand, Valve says Steam Frame works in the dark too.
Steam Frame, Steam Machine, Steam Controller, and Steam Deck, image supplied by Valve
"All of these games can support arbitrary resolutions and refresh rates," Valve's Jeremy Selan said during our briefing. "We're thinking about it as very much a per-game setting. Some games just by the nature of the play want to be very high refresh and more modest res. When we were showing you Hades, it was at 1440p at 90 Hertz."
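In code terms, that per-game framing might look like a simple settings table. This is purely illustrative (the game IDs and all numbers except Hades 2's demoed 1440p/90Hz are made up), not Valve's actual configuration format:

```python
# Hypothetical per-game display settings, illustrating the resolution vs.
# refresh-rate tradeoff Valve describes. Not Valve's real config format.
PER_GAME_DISPLAY = {
    "hades_2":      {"resolution": (2560, 1440), "refresh_hz": 90},   # shown at 1440p/90
    "rhythm_game":  {"resolution": (1920, 1080), "refresh_hz": 144},  # favor refresh
    "detail_heavy": {"resolution": (3840, 2160), "refresh_hz": 72},   # favor resolution
}

def display_mode(game_id, default=((2560, 1440), 90)):
    """Return (resolution, refresh_hz) for a game, falling back to a default."""
    entry = PER_GAME_DISPLAY.get(game_id)
    if entry is None:
        return default
    return entry["resolution"], entry["refresh_hz"]

print(display_mode("hades_2"))       # ((2560, 1440), 90)
print(display_mode("unknown_game"))  # falls back to the default
```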
Hades 2 remains a strong memory from my time in Steam Frame at Valve headquarters because of that high frame rate, and it bears calling out amid all our coverage that having “VR and non-VR” games running at an extremely high frame rate – even reclined on a couch or bed – will be an eye-opening experience for many people, including Steam Deck owners.
"While it is a wireless streaming-first headset," Yang told UploadVR, "we did want you to be able to play your Steam library when you're not next to your PC or laptop, so we made Steam Frame a PC by itself. It has an ARM chip that's running SteamOS."
Could playing existing games at high frame rates in a lightweight standalone headset make it hard to ever go back to 60 or 30 fps screens?
For long-time UploadVR readers as well as newbies learning about the market for the first time, it’s important we convey just how meaningful this feature alone might be in daily use as Valve works to optimize the system and rally developers into supporting new surfaces for their games on Steam.
Steam Frame, image supplied by Valve
During our time with Valve I dug into questions of openness, tracing the path from the Vive and wired PC VR-only Index to the standalone Steam Frame with wireless streaming.
“This is a Linux OS, this is SteamOS brought to ARM. If there are other operating systems, for instance, that support these chip sets, you're welcome to do so,” Selan said. “You'd at that point be responsible for the tracking stack and everything else, but very much in the spirit of Steam Deck, this is your device, your computer, you own it, you can mod it and extend it in any way.”
Steam Input Alignment
Steam Frame Controllers that ship with Steam Frame.
Valve looks to align controller input on Steam Frame with traditional gamepads and the Steam Deck, with backward compatibility with existing VR content on both Quest and PC. That’s why there are two index-finger buttons instead of one on each controller, matching the shoulder and trigger buttons on traditional gamepads.
Valve’s Jeff Leinbaugh introduced the controllers by saying their design is “going along with all the goals of the headset and a lot of our hardware. We want this device to work with all of your VR games but also all of your non-VR games and just make your whole Steam catalog more valuable, deliver you a bunch of value no matter what you happen to be playing.”
For flat games, the new Steam Controller can be used with Steam Frame instead of the tracked controllers.
What Is Steam Frame's Price?
"It's a premium headset, but we're really aiming to be cheaper than Index," Selan said. "While I said we're gonna be premium, we're still trying to be very cost considerate."
We're in for months of debate over the definition of "cheaper than Index", not through any fault of Valve's, but because that's not a comparison many people know how to make conceptually.
Valve Index was $999 for a room-scale VR kit, plus a user-provided Windows PC to drive it. Steam Frame is a standalone headset from Valve. Many people, in their heads, will be comparing the price of a component to the price of a complete computer.
Trade-Offs: Wi-Fi & Display
Steam Frame and controllers.
Steam Frame's creators acknowledge trade-offs in its design, such as the lack of HDR or a true-black display technology like the Steam Deck OLED's. My colleague David Heaney asked about the potential of a higher-end headset one day exchanging the LCD in this design for OLED or HDR.
"I think about HDR every day," Selan said.
Where Apple brings its iPad app and Apple TV content libraries to Vision Pro as the cornerstone of its leap into VR, while building new Apple Immersive Video content along the way, Valve representatives declined to talk about content made for Steam Frame by Valve's own developers.
Valve is “optimistic” about bringing Half-Life: Alyx to standalone at some point. Until then, that's what the streaming focus is about. The story now from Valve is about putting its hardware in end-to-end wireless control of Steam games in more places while improving frame rates, latency and resolution wherever possible along the way.
What makes good Apple Immersive Video so powerful is the sheer volume of photons hitting camera sensors and then being reconstructed for your view on a virtual display at a high resolution and frame rate. Almost nobody notices when an iPad app that typically runs at 30 fps on a physical tablet also runs at 30 fps on a virtual display, but you’ll jump when something comes at you in 180-degree video delivered at Apple Vision Pro’s frame rates and quality levels. You’ll have to stay tuned for our review of the M5 Vision Pro, released in 2025, to see if that specification bump from the M2 model of 2024 has a meaningful effect on frame rates across the broader Apple software ecosystem. Still, you should keep some of this in mind as Valve seeks to make an impact with new hardware centered on Steam games in 2026.
"We ask ourselves at Valve, what can we do well?" Selan said. "We keep coming back to the Steam games that you already own."
Developer Feedback & Community
Valve is looking to its developer and user communities to do the heavy lifting in centering Steam in the market for VR games running on ARM. Developer applications for Steam Frame kits are open today.
The market for VR content on ARM is currently dominated by Quest, but many top apps have ports of their Android-based software packages on other storefronts, like those from Pico or HTC. As of last month, this is a market also being formally chased by Android XR for the Google Play Store.
"The same way Steam OS has been fully expanded and extended by the community. Our hope is to do that same thing for VR. So this would be considered an open PC," Selan said. "This would be like the biggest open VR headset device, and like everyone can richly work together to make that better and better."
The Best Headset Demo Ever?
Steam Frame's soft-back battery pack slips into the facial interface for tight packing.
Valve didn't show experiences like The Lab with mini games like Longbow from 2016, or even Beat Saber from 2018, that might've indicated tracking regressions compared with the Valve Index or HTC Vive laser base stations surrounding a play area. We didn't even see SteamVR Home, just a compositing system for content in SteamOS on ARM.
Apple planted its flag in VR hardware as the future of personal computing with its first public demo of Vision Pro in the middle of 2023, with software showing full control over photos, videos, FaceTime and more. Plenty of software Apple is known for, like Final Cut Pro and GarageBand, still isn't present in its headset from 2025.
Valve plants a flag for all of Steam in VR with its first public demo of Steam Frame near the end of 2025. There's a long path of optimization ahead to make Steam games of all kinds run well with Valve's new headset and input.
Pulling up a Linux desktop in VR for the first time in a headset as lightweight as Steam Frame conveys something about this medium and its steadfast believers, and I was in awe to see it with my own eyes. You can use whatever terminology you want to describe this medium, but progress never stops.
We'll be watching Valve's optimization developments closely across all platforms.
Closed vs. Open
The latest generation of VR headsets sees us logging into Samsung, Google, Steam, Apple or Meta accounts to access large quantities of digital content. That doesn't feel truly "open", even if along the way we're getting some unlocked bootloaders and the promise of OS-swapping.
Earlier this year, I did the absurd thing of buying a 2 terabyte microSD card and sticking it into my Steam Deck. I've installed everything I could possibly want onto it, including hundreds of gigabytes of content I never expect to work properly in Deck mode. Now, instead of waiting hours to download a 100 gigabyte game from Valve's network to a freshly reformatted PC, I can simply transfer it locally right from the Deck.
I had that card with me at Valve HQ because I had another hope for the Steam library stored on it. Valve warned "no screwdrivers" ahead of our demos and said not to take anything apart. So, while deeply curious, I didn't stick my card into the headsets there and try to log myself into Steam Frame.
That said, I look forward to the day I can pop that card out of my Deck and into a Steam Frame and see what experiences work smoothly as I click play on absolutely everything. Valve told us they're planning to distribute review units sometime early in 2026.
Near the end of my time at Valve HQ in 2025, six years after I first visited their offices in 2019, I asked them to frame for me the difference between a computer that's "closed" and one that's "open."
Valve's Pierre-Loup Griffais responded.
"If folks on an experience that's more curated and more closed off are having a good experience, that's fine. But in general, we see that people that are trying to experience a variety of games in different ways, there's a bunch of stuff that they might wanna do that we haven't thought of. And what we always observe is that there's a ton of value that is usually distributed laterally in the community, where users between themselves will share stuff that will make the experience better. And that is only possible in an open platform. Like we don't want all the value in a platform like that to be flowing up and down through us, and for us to be determining what's a good experience or not on behalf of all those users that might have different opinions and different aspirations."
"So it's really important for us to keep that open because it creates those kinds of effects that eventually leads to a better experience. Also, anyone that's using this stuff can also go and contribute patches and develop on it. And so we're excited to be able to have stuff get even better because people now want to contribute to it."
"In fact, a lot of the developers that are working on open source have started because they were users and they just want to improve a specific aspect and they go deep into it."
"The lines between user and developer has always been very blurry for us. We've come from a world where some of our most popular game properties actually started out as mods. And modding on PC was always like a strong thing that we were always trying to support. Because so many good concepts and new game genres, you know, free to play, mobile, all that stuff came through mods, initially. If you look at the history of video games, different genres, different ways to experience games, different peripherals, a lot of it came from PC because PC was an open platform where different companies could innovate in different ways, but also users could mod. People that created closed off platforms based on some of those concepts, they're gonna take some of those concepts and kind of freeze them in time. And then PC's gonna keep moving forward because it's open and we have all this value. And we are just applying PC to VR. So it's nothing new for us. We've always applied PC to VR. It's just some folks have opted to like branch it off in different directions, but I think we're just doing the same thing as we've always been doing."
From Valve Index To Steam Frame
In 2019, when I first tried the Index, a Valve representative told me "this is going to ruin you" before I tried Beat Saber running at 144Hz.
Beat Saber's developer was acquired by Facebook later that same year and, sure, at some point in Steam Frame I'd like to see how Meta's cornerstone title performs in terms of tracking and frame rate.
In retrospect, Beat Saber at 144Hz did ruin me in the sense that, once I brought Index into my home, I don't think I really experienced those frame rates in any substantial way, running an NVIDIA RTX 2080 for most of the headset's lifetime.
In Steam Frame, Valve manages end-to-end controller inputs and frame delivery not just in the standalone mode, but when streaming from PCs like Steam Machine too. Even a few minutes playing your favorite game with a few milliseconds less lag in input, or higher frame rates visually, will feel like invisible magic while meaningfully adding value to your day. Ultimately, that's exactly why Valve is making Steam Frame.
UploadVR's Ian Hamilton and David Heaney went hands-on with Steam Frame at Valve HQ.
If you missed it, Valve just officially announced Steam Frame, a "streaming-first" standalone VR headset launching in "early 2026".
Steam Frame has a lightweight modular design and runs a VR version of Valve's SteamOS, the Linux-based operating system used in Steam Deck. With an evolved version of the Proton compatibility layer it can run almost any Linux, Windows, or Android game, including SteamVR games. Many titles won't perform well on the mobile chipset, though, so Steam Frame has a wireless dongle in the box to leverage the power of your gaming PC – hence Valve's "streaming-first" positioning.
The headset does not require or support base stations. It tracks itself and its included controllers using four onboard greyscale tracking cameras, two of which can be used for monochrome passthrough, and it also has eye tracking for foveated streaming.
Steam Frame will replace Valve Index on the market, which the company confirmed to UploadVR is no longer in production, and joins Valve's "family" of hardware products, which will also soon include a Steam Machine consolized PC and a new Steam Controller.
You can find a full rundown of the design, features, and specifications of Steam Frame in our news article here. This article describes our impressions of using the headset at Valve HQ, where we were invited to a hardware briefing that included hands-on time with the new Steam Controller, Steam Machine, and the Steam Frame headset.
Ian's time with Steam Frame was mostly spent in standalone titles on SteamOS, while David's time was entirely in Half-Life: Alyx streamed from a nearby gaming PC using the wireless adapter included in the Steam Frame box. Here's what they thought of their time with Steam Frame.
Ian's Impressions: Standalone Use
In quick succession I played Ghost Town, Walkabout, Moss 2, and Gorn 2 in the lightweight standalone SteamOS headset, and I also briefly tried some Half-Life: Alyx streaming from a nearby Windows PC.
Ghost Town is one of the best VR games of the year and Valve says I played the PC VR version – the version made for x86 processors – completely in standalone through a compatibility layer. Walkabout Mini Golf's build was more fully featured than the one shown during the demo day at Samsung a couple weeks earlier, allowing me to putt with one controller in full VR joined by an iPhone player logged into the same room code via the Pocket Edition of the game. I enjoyed waving at Quill in Moss 2 and, in Gorn 2, I punched barbarians with my fists using the analog sticks to move myself out of the way of their attempts to hit me. Playing mostly seated, they all worked smoothly with Steam Frame running as a standalone personal computer – no streaming from a PC.
Photo by UploadVR at Valve HQ.
Portal 2 ran on a large virtual display, as if on a giant Steam Deck, with what seemed like a very high frame rate. That was a really nice, responsive experience. So was stretching out my farm in Stardew Valley to keep an eye on most of the farm at once. Both of these flat games are pretty powerful to see running well directly on such a lightweight device alongside any number of standalone VR games.
I opened the Linux desktop, went to Chrome and voice searched for the No Time For Caution scene from Interstellar on YouTube. I kicked off my shoes at Valve HQ (apologizing for doing so) and stretched out horizontally on a couch. I propped up a pillow behind my head and left the controllers on my stomach with the screen stretched across the sky. Matt Damon said "there is a moment" and I watched him blast into the gray of space with my controller drifting off with him.
Any regressions in controller tracking compared with Valve Index and its SteamVR 2.0 base stations may grate on developers and players who’ve come to expect rock-solid tracking from laser base stations positioned around a play area. I tried nothing like Longbow, for example, from Valve's original Lab experience, nor something with lots of physics objects like Boneworks, nor anything with fast motion like Beat Saber.
Steam Frame controller (photo by UploadVR at Valve HQ).
With the controller in my hand, my index finger had some difficulty reaching both the index shoulder and trigger buttons while also keeping my middle finger on the grip button. Grip straps should be sold optionally at launch, and there are capacitive sensors along the base of the controller intended to detect when the fourth and fifth fingers release. I saw it in action in Half-Life: Alyx, with Alyx’s pinky and ring finger occasionally moving as I released my grip from that part of the controller. It didn’t seem super responsive, but it also wasn't strapped to my hand, and the grips of the Index controllers were never particularly responsive either. Input from the fourth and fifth digits hasn’t proved necessary to game developers for half a decade, so I'm not too worried about it being well supported here. Still, we'll be watching developer feedback on the Steam Frame controllers closely.
Steam Frame with a non-shipping clear prototype of the modular compute unit to show the components at right (photo by UploadVR at Valve HQ).
IPD adjustment is done via a wheel on the top of the headset and, after I got it set right, I largely forgot the headset's weight as it disappeared, split between the rear and front in a remarkable feat of engineering. There’s no battery up front and no adjustment knob at the back (you pull on the soft straps at the side to adjust the fit), with the thin dual-cell battery held behind cushy foam at the rear. In hand, the compute unit feels a bit like I imagine a mainline Apple Vision might, and the rear component of Valve's headset can collapse inside the front for more compact travel than any other headset I'd want to use.
The back half of the Steam Frame can fit inside of the facial interface for transport, making this the most compact design for travel (photo by UploadVR at Valve HQ).
On the head, Steam Frame is a relief compared to every headset with a battery hanging on the front of your face; the absence of a battery there is easily the most impactful feature of its design. Even though Google and Samsung move Galaxy XR's battery to a tethered pack, as Apple does with Vision Pro, I found Steam Frame’s cushy back-mounted battery design to be an enormous relief, particularly after spending four days in Android XR’s first headset.
Of course, that’s only after a few minutes watching a movie reclined on a couch while missing OLED displays every second, but Steam Frame feels like glasses or perhaps even a sleep mask because of how well spread out its weight feels across the head.
A Steam Frame Wireless Adapter comes in the box with each headset intended to manage the link to a nearby PC, including to the planned Steam Machine. We’ll be looking for the Steam Frame-verified label on VR games for Steam in the year ahead, and looking to test what it means to truly pump Steam throughout the home with dedicated Valve-managed wireless connections. There’s a lot of space for developers to play here in SteamOS, jumping off a Steam Machine or Deck and into a Frame.
Steam Frame next to the Steam Machine with e-paper faceplate (photo by UploadVR at Valve HQ).
Valve has a lot to accomplish here during a turbulent time in global relations, and specifics like cost and availability aren’t finalized. Valve representatives think they can get Half-Life: Alyx running well in standalone, but they’re not promising it yet, and it’s clear there’s still a lot for them to do.
David's Impressions: Wireless PC VR
My two Steam Frame demo sessions involved streaming Half-Life: Alyx from a nearby gaming PC that had the headset's included wireless adapter connected to a USB port.
A hands-on demo can never definitively reveal whether a headset is comfortable to wear for hours, but even in the relatively short time I used Steam Frame it felt significantly lighter and less burdensome than any other fully-featured standalone headset. The visor itself weighs just 185 grams, a remarkable achievement, and the entire unit including the rear battery just 440 grams, meaning the weight is incredibly well distributed across your head.
Further, the material Valve is using for the facial interface and rear padding is an evolved version of the ultra-snug fabric used in the Index, which, even six years and dozens of aftermarket accessories for other headsets later, still feels the softest on my face.
Me wearing Steam Frame at Valve HQ.
While I'm cautious about making sweeping conclusions until I have the headset in my home, my initial impression is that Steam Frame is the most comfortable VR headset yet, for my face at least.
When it comes to making Steam Frame an ideal headset for connecting to SteamVR on your PC, Valve is using a combination of both hardware and software cleverness to refine the compressed wireless streaming experience.
Steam Frame has two separate wireless radios. One is used as a client, connecting to your home Wi-Fi network on the 5GHz band for the general internet connection of SteamOS. The other is for a 6GHz Wi-Fi 6E hotspot, created by the headset, that SteamVR on your PC automatically connects to via the USB adapter included in the box. It's a dedicated point-to-point connection between Steam Frame and your PC.
This gives Valve precise firmware-level control over the entire network stack for wireless PC VR, and eliminates the problems you might experience using other standalone headsets for this, such as being bottlenecked by a router that's either too far away, blocked by too many walls, congested by other traffic, or just supplied by your ISP because it was cheap, not because it's any good.
The other feature Valve has implemented to make the wireless PC VR experience as good as it can possibly be is foveated encoding. Steam Frame has built-in eye tracking, and when you're using PC VR it's always used to encode the video stream in higher resolution where you're currently looking.
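As a rough sketch of the concept (region-based quality allocation in general, not Valve's actual encoder pipeline), foveated encoding divides the frame into tiles and spends bits where the eyes are: tiles near the gaze point get high quality, while peripheral tiles get less.

```python
import math

def tile_quality(grid, gaze, inner=1.5, outer=3.0):
    """Assign an encoder quality tier to each tile of a grid based on
    distance from the gaze point: 'high' near the fovea, 'medium' in a
    ring around it, 'low' in the periphery."""
    gx, gy = gaze
    tiers = {}
    for x in range(grid):
        for y in range(grid):
            d = math.hypot(x - gx, y - gy)
            tiers[(x, y)] = "high" if d <= inner else "medium" if d <= outer else "low"
    return tiers

# Eye tracking reports the user looking near the center of an 8x8 tile grid.
tiers = tile_quality(8, gaze=(3.5, 3.5))
print(tiers[(3, 3)], tiers[(5, 2)], tiers[(0, 7)])  # high medium low
```

Each new gaze sample re-centers the high-quality region, which is why Valve's point about lower latency matters: the faster the loop from eye tracker to encoder, the less often your fovea lands on a low-quality tile.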
Steam Frame has the wireless adapter in the box (photo credit UploadVR).
The result of this hardware and software effort, in my demos, was a relatively high detail and stable image that felt as if it could have been arriving from a DisplayPort cable.
The exception to this stability was that in the second demo room, I saw a frame skip issue at a regular interval. When I asked Valve's staffers about it, they traced it to an unexplained frame spike on the Windows PC side, showing me the SteamVR performance graph on the PC monitor so I could visually confirm it. The issue didn't occur in the first demo room, and is unlikely to be inherent to the product.
Steam Frame's pancake lenses made the image look clear and sharp throughout, with a similar feeling to Quest 3's lenses (including the wide eyebox) but with a slightly taller field of view, and that increased vertical field of view meaningfully contributed to an increased feeling of immersion. There did appear to be some minor pupil swim, however, meaning the geometric stability of the scene ever so slightly shifted as I panned my head.
I asked Valve's Jeremy Selan about the idea of using dynamic distortion correction, having the eye tracking continuously update the lens distortion coefficients, and he told me that they "haven't found the need" but "could if we wanted to". It probably isn't a big enough issue for most people to notice or care about.
Steam Frame's lenses (image by UploadVR at Valve HQ).
The only real problem with the image I saw was its poor contrast, given Valve's description of Steam Frame as a "premium headset".
If you currently use an OLED or micro-OLED headset for PC VR, or even LCDs with Mini-LED backlighting, Steam Frame's contrast would be a huge downgrade. Valve is using regular 2160×2160 LCD panels, with no local dimming of any kind, and in the dark sewers of City 17 I saw the same detail crunch I see with any other plain LCD headset.
To be clear, this is the contrast experience that the majority of SteamVR users have today. But much of this usage comes from headsets that were bought for around $300. Valve isn't yet giving a price for Steam Frame, but said it's aiming to sell it for less than the $1000 Index full-kit. To what degree the regular LCD panels are an acceptable tradeoff will depend on exactly where Steam Frame's price lands.
Photo by UploadVR at Valve HQ.
Overall, Steam Frame felt like a device optimized to be the ideal wireless PC VR experience but without being unaffordable for too many people. It's incredibly comfortable, its wireless adapter bypasses the common issues of home Wi-Fi networks, and its lenses are sharp and clear. It lacks the ultra-high-detail and rich contrast of 4K micro-OLED headsets, but it's also set to lack their multi-thousand-dollar price tag.
I suspect Steam Frame could be the headset to finally convince many tethered PC VR diehards to make the leap to wireless, and I'm eager to spend more time with the headset to see how it performs over multi-hour sessions in a real home environment.
My colleague Ian Hamilton and I went hands-on with Steam Frame at Valve HQ, and you can read our impressions here. This article, on the other hand, provides a full rundown of the design, specifications, and features of Steam Frame, based on the information provided to us by Valve.
Lightweight Modular Design
Steam Frame will come with a replaceable battery strap featuring built-in dual-driver speakers and a 21.6 Wh rear battery.
The strap itself is fabric and the rear battery unit has soft padding, meaning it can "collapse" against the lenses for portability and naturally deform when your head is resting on a chair, sofa, or bed.
There's an optional front-to-back top strap, but it's not attached by default.
Steam Frame and the Steam Frame Controllers (image from Valve).
The core frontbox of Steam Frame weighs just 185 grams, Valve says, while the entire system with the default included facial interface, speakers, strap, and rear battery weighs 440 grams.
That makes Steam Frame the lightest fully-featured standalone VR headset to date.
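There's simple physics behind why the raw weight figure undersells the comfort win. As a hedged illustration (the lever-arm distances below are my own guesses, and the strap and speaker mass is lumped into the rear; none of these splits are Valve-provided), placing mass behind the head cancels out most of the forward torque your neck would otherwise fight:

```python
# Torque about the head's pivot ~ mass x lever arm (units: kg * cm).
# Lever-arm distances are illustrative guesses, not Valve measurements.
FRONT_ARM_CM = 10  # visor sits forward of the pivot
REAR_ARM_CM = 8    # rear battery sits behind it, counteracting the front

def net_torque_kg_cm(front_g, rear_g):
    return (front_g / 1000) * FRONT_ARM_CM - (rear_g / 1000) * REAR_ARM_CM

# All 440 g hung up front vs. Steam Frame's split: the 185 g visor forward,
# with the remaining 255 g (battery, strap, speakers) assumed at the rear.
print(net_torque_kg_cm(440, 0))    # front-loaded: all torque pulls forward
print(net_torque_kg_cm(185, 255))  # split design: net torque near zero
```

Under these assumed distances, the front-loaded case pulls forward with roughly twenty times the net torque of the split design, which lines up with the "forgot the weight" sensation described in the hands-on impressions.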
The rear battery of Steam Frame's included default strap (image from Valve).
Steam Frame is a modular system, and Valve will make the CAD and electrical specifications available to third parties to build custom facial interfaces and headstraps. Someone could, for example, build a rigid strap with an open interface, or a fully soft strap with a tethered battery. Expect a range of accessories.
2K LCDs & Pancake Lenses
Steam Frame features dual 2160×2160 LCD panels, meaning it has twice as many pixels as the Valve Index and roughly the same as Meta Quest 3.
The panels have a configurable refresh rate between 72Hz and 120Hz, with an "experimental" 144Hz mode, just like the Index.
Steam Frame's lenses (image by UploadVR at Valve HQ).
Valve says the multi-element pancake lenses in front of the panels offer "very good sharpness across the full field of view", which the company describes as "slightly less than Index", and "conservatively" 110 degrees horizontal and vertical.
Lens separation is manually adjusted via a wheel on the top of the headset, letting wearers match their interpupillary distance (IPD) for visual comfort.
Wireless PC Adapter With Foveated Streaming
Steam Frame does not support DisplayPort or HDMI in. It is not a tethered headset. Instead, Valve is going all-in on compressed wireless streaming, aiming to perfect it with a combination of clever hardware and software.
The headset has two separate wireless radios. One is used as a client, connecting to your home Wi-Fi network on the 2.4GHz or 5GHz band for the general internet connection of SteamOS. The other is for a 6GHz Wi-Fi 6E hotspot, created by the headset, that SteamVR on your PC automatically connects to via the USB adapter included in the box. It's a dedicated point-to-point connection between Steam Frame and your PC.
The wireless adapter is included in the box (photo by UploadVR at Valve HQ).
This gives Valve precise firmware-level control over the entire network stack for wireless PC VR, and eliminates the problems you might experience using other standalone headsets for this, such as being bottlenecked by a router that's either too far away, blocked by too many walls, congested by other traffic, or just supplied by your ISP because it was cheap, not because it's any good.
Of course, some enthusiasts already have a high-quality Wi-Fi setup for PC VR, with a high-end router or access point in the room where they play. Valve tells us that such people can continue to use their setup instead of the adapter if they really want, but suspects they won't choose to.
The other feature Valve has implemented to make the wireless PC VR experience as good as it can possibly be is foveated encoding. Steam Frame has built-in eye tracking, and when you're using PC VR it's always used to encode the video stream in higher resolution where you're currently looking.
While this feature has existed as part of Steam Link VR for Quest Pro since the app launched in December 2023, Valve says on Steam Frame the foveated streaming has lower latency and greater precision, thanks to the company controlling the entire rendering stack on the headset side.
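Valve hasn't published its encoder details, but the core idea of foveated encoding can be sketched as spending more quality where the eye is pointed. A minimal illustration, with all names and numbers hypothetical rather than anything from Valve's actual pipeline:

```python
import math

def tile_quality(tile_center, gaze, base_q=50, min_q=20, falloff_deg=30.0):
    """Pick an encoder quality level for a video tile based on its angular
    distance (in degrees) from the user's gaze point. Numbers are made up
    for illustration; real encoders use more sophisticated falloff curves."""
    eccentricity = math.hypot(tile_center[0] - gaze[0], tile_center[1] - gaze[1])
    t = min(eccentricity / falloff_deg, 1.0)  # 0 at the fovea, 1 at/past falloff
    return round(base_q - t * (base_q - min_q))

# The tile under the gaze gets full quality; the far periphery gets the floor.
```

The payoff is that the bits saved in the periphery, where the eye can't resolve detail anyway, can be spent on the region you're actually looking at.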
Linux, Windows & Android Apps Standalone
Steam Frame can run Linux, Windows, and Android applications through a combination of compatibility layers and emulation.
As with other SteamOS devices such as Steam Deck, Steam Frame can run Linux titles natively as well as Windows applications via Proton, the compatibility layer Valve has been working on for almost a decade now in collaboration with CodeWeavers.
But while Steam Deck is an x86 device, the same CPU architecture as a gaming PC, Steam Frame uses the mobile-focused ARM architecture. That brings a huge advantage: Steam Frame can natively run Android APKs, including those you download in the web browser, as long as they don't require Google Play Services. But it also means that Steam Frame can't natively run x86 applications, which the majority of Steam games are.
Photo by UploadVR at Valve HQ.
To solve this, Valve has been investing in FEX, an open-source tool for emulating x86 applications on ARM Linux devices that it has integrated into Proton on Steam Frame. The company tells UploadVR that the performance impact here is "shockingly small" – on the order of a few percent.
The ability to run x86 Windows applications means that Steam Frame can, in theory, run almost any VR title on Steam.
However, the key word here is "run". Steam Frame features a roughly 10-watt chipset originally designed for smartphones, with only a fraction of the power of the gaming PC hardware that most SteamVR titles were designed for. You can run visually simple, well-optimized titles at relatively low graphics settings, and a "Steam Frame Verified" tag will mark such titles, but for high-fidelity VR gaming, such as playing Half-Life: Alyx, you'll want to leverage your PC.
Snapdragon 8 Gen 3 + 16GB RAM
Steam Frame is powered by Qualcomm's Snapdragon 8 Gen 3 chipset, paired with 16GB of LPDDR5X RAM.
Two models will be sold, one with 256GB UFS storage and the other with 1TB, and there's also a microSD card slot for expanded storage. In fact, you can even transfer the microSD card from your Steam Deck or Steam Machine, and your games will automatically be available to play.
Non-shipping transparent internal prototype (photo by UploadVR at Valve HQ).
So just how powerful is Steam Frame's chip? Well, the XR2 Gen 2 series used in pretty much every other non-Apple headset features the Adreno 740 GPU from the 8 Gen 2 smartphone chip, and the 8 Gen 3 is the successor from the year after with the newer Adreno 750.
On paper, Steam Frame's Adreno 750 GPU is 25% more powerful than the Adreno 740 in Meta Quest 3, and this difference increases to over 30% when you factor in the fact that Quest 3 slightly underclocks its GPU, while Valve confirmed that Steam Frame does not. Further, the effective performance difference will be even greater in titles that leverage eye-tracked foveated rendering.
The CPU, on the other hand, is much more difficult to compare, as the XR2 Gen 2 uses a non-standard core configuration and 2D benchmarks run on headsets don't induce the maximum clock speed. But based on what we know about the chips, expect Steam Frame to deliver around 50% better single-threaded performance than Quest 3 and around 100% better multithreaded performance.
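The GPU figures above are simple arithmetic. The exact amount by which Quest 3 underclocks its GPU isn't public, so the fraction below is an assumption chosen only to illustrate how the "over 30%" figure arises:

```python
def relative_gpu_perf(on_paper_ratio=1.25, quest3_clock_fraction=0.95):
    """On-paper Adreno 750 vs 740 ratio, adjusted for Quest 3 running its
    GPU below nominal clocks. The 0.95 fraction is an assumption; Meta
    doesn't publish the exact underclock."""
    return on_paper_ratio / quest3_clock_fraction

ratio = relative_gpu_perf()  # ~1.32, i.e. "over 30%" faster
```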
Essentially, from a standalone performance perspective Steam Frame is notably more powerful than other non-Apple standalone headsets, though still significantly less powerful than a gaming PC.
SLAM Tracking & Monochrome Passthrough
Steam Frame has four outwards-facing greyscale fisheye cameras for inside-out headset and controller tracking via computer vision. You don't need SteamVR base stations, and the headset doesn't support them anyway.
Two of the cameras are on the top corners, and the other two are on the front, near the bottom, widely spaced.
One of Steam Frame's greyscale fisheye tracking cameras (image from Valve).
To make headset tracking work in the dark, Steam Frame also features infrared illuminators, bathing your environment in IR light that the cameras can see.
You can choose to see the real world around you via the two front cameras at any time, though the view is monochrome, and lower resolution than the passthrough on headsets with dedicated mixed reality cameras. But combined with the IR illuminators, the advantage is that it lets you see in the dark.
Front Expansion Port
While Steam Frame has only low-resolution monochrome passthrough by default, it has a user-accessible front expansion port that in theory enables color cameras, depth sensors, face tracking sensors and more to be added.
Valve says the port offers a dual 2.5Gbps MIPI camera interface and also supports a one-lane Gen 4 PCIe data port for other peripherals.
"There is certainly enough flexibility in this port to do anything people are interested in doing", Valve's Jeremy Selan told UploadVR.
Included Controllers With Gamepad Parity
The included Steam Frame Controllers have a ringless design similar to Meta's Touch Plus controllers, and are likewise tracked by the headset via infrared LEDs under the plastic. However, while Touch Plus controllers have 8 IR LEDs each (7 on the face and 1 on the handle), Steam Frame Controllers have 18 each, dispersed across the face, handle, and bottom, which should make them more resistant to occlusion.
The bigger difference between Touch Plus and Steam Frame Controllers is the inputs. Valve has put all four A/B/X/Y buttons on the right controller and a D-Pad on the left controller, while both have an index bumper in addition to the index trigger.
Steam Frame Controllers (image from Valve).
The idea here is that the Steam Frame Controllers have all the same inputs as a regular gamepad, meaning they can be used for both VR and flatscreen gaming. You can switch between VR and flatscreen seamlessly, and you'll need less space in your bag when traveling.
Steam Frame Controllers feature capacitive finger sensing on all inputs and the handle, as well as advanced tunneling magnetoresistance (TMR) thumbsticks. TMR technology means they should have improved precision and responsiveness compared to traditional potentiometer thumbsticks, and should be significantly more resistant to drift – an issue that plagued the Valve Index Controllers.
Unlike the Index controllers, Steam Frame Controllers don't have built-in hand grip straps. But Valve says it will sell them as an optional accessory for people who want them, a similar strategy to Meta.
Steam Frame and the Steam Frame Controllers (photo by UploadVR).
As with Touch Plus controllers, the Steam Frame Controllers are powered by a single AA battery. They should last roughly 40 hours, though this is highly dependent on how much the haptic actuator gets activated.
Steam Frame does not currently support controller-free hand tracking. It requires some form of controller.
Spec Sheet & Competitors Comparison
Here's a full list of Steam Frame's specs, directly compared to Meta Quest 3 and Samsung Galaxy XR for context:
| Spec | Valve Steam Frame | Meta Quest 3 | Samsung Galaxy XR |
|---|---|---|---|
| Displays | 2160×2160 LCD | 2064×2208 LCD | 3552×3840 micro-OLED |
| Refresh Rates | 72-120Hz (144Hz experimental) | 60-120Hz (90Hz Home, 72Hz app default) | 60-90Hz (72Hz default) |
| Stated FOV | 110°H × 110°V | 110°H × 96°V | 109°H × 100°V |
| Platform | SteamOS (Valve) | Horizon OS (Meta) | Android XR (Google) |
| Chipset | Qualcomm Snapdragon 8 Gen 3 | Qualcomm Snapdragon XR2 Gen 2 | Qualcomm Snapdragon XR2+ Gen 2 |
| RAM | 16GB | 8GB | 16GB |
| Strap | Soft + battery (modular) | Soft (modular) | Rigid plastic (fixed) |
| Face Pad | Upper face (enclosed) | Upper face (enclosed) | Forehead (open default) |
| Weight | 185g visor, 440g total | 397g visor, 515g total | 545g total |
| Battery | Rear pad | Internal | Tethered external |
| IPD | Manual (dial) | Manual (dial) | Automatic (motorized) |
| Hand Tracking | ❌ | ✅ | ✅ |
| Eye Tracking | ✅ | ❌ | ✅ |
| Face Tracking | ❌ | ❌ | ✅ |
| Torso & Arm Tracking | ❌ | ✅ | ❌ |
| Color Passthrough | ❌ | 4MP | 6.5MP |
| IR Illuminators | ✅ | ❌ | ✅ |
| Active Depth Sensor | ❌ | ❌ | dToF |
| Wi-Fi | 7 (dual radios) | 6E | 7 |
| PC Wireless Adapter | ✅ (6GHz Wi-Fi 6E) | Discontinued (5GHz Wi-Fi 6) | ❌ |
| Default Store | Steam | Horizon Store | Google Play |
| Unlock | PIN | PIN | Iris |
| Data Ports | 1x USB-C (USB 2) + 2x MIPI / Gen 4 PCIe | 1x USB-C (USB 3.0) | 1x USB-C |
| Storage | 256GB / 1TB | 512GB | 256GB |
| MicroSD Slot | ✅ | ❌ | ❌ |
| Controllers | Steam Frame Controllers | Touch Plus | +$250 |
| Price | TBD | $500 (512GB) | $1800 (256GB) |
Steam Machine
While Steam Frame supports any gaming PC that can run SteamVR titles, Valve is also releasing its own desktop PC running SteamOS, which, as well as being able to act as a living room console, could make getting into PC VR a more streamlined experience than ever.
Steam Machine is more than 6 times more powerful than Steam Deck, Valve tells UploadVR, with a discrete CPU and GPU, not a unified APU architecture.
The RAM and storage are user-upgradable, Valve confirmed, while the CPU and GPU are soldered on.
You'll "eventually" be able to wake Steam Machine via a Steam Frame, without needing a physical display or other peripherals attached, though Valve couldn't say whether this functionality will be available at launch. When this does arrive, it means you'll be able to just grab your Steam Frame and jump straight into high-performance PC VR at any time in seconds.
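Valve hasn't said what mechanism this wake feature will use. One long-established way to wake a PC over a network is Wake-on-LAN, which any client could implement in a few lines. A sketch of the standard magic packet, purely illustrative and not Valve's actual implementation:

```python
import socket

def build_magic_packet(mac: str) -> bytes:
    """A standard Wake-on-LAN 'magic packet': 6 bytes of 0xFF followed by
    the target's MAC address repeated 16 times (102 bytes total)."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Broadcast the magic packet over UDP; a NIC configured for WoL
    then powers the machine on."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(build_magic_packet(mac), (broadcast, port))
```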
"Aiming" For Cheaper Than Index
Valve isn't yet giving a specific price for Steam Frame or Steam Machine, saying that it doesn't yet know and referencing the volatility of the current macroeconomic environment.
The company did however tell UploadVR that it's aiming to sell Steam Frame for less than the $1000 Index full-kit.
"As soon as we know pricing, we'll be sharing", Valve said.
The soon-to-be Steam hardware family (image from Valve).
Steam Frame is set to launch in "early 2026", alongside the new Steam Machine and Steam Controller. It will be available in all the same countries where Steam Deck is sold today, and fully replaces Index in Valve's lineup.
If you're a developer, you can apply for early access to a Steam Frame kit today, though there are limited units available.
Requisition VR: Hunt & Extract, a PvPvE extraction shooter where you duct tape items to form new weapons, gets its SteamVR relaunch today.
You may recall that, following its full Steam release in 2023, the physics-based co-op game Requisition VR received a major revamp with July's Quest launch, which turned it into a PvPvE (player vs. player vs. environment) extraction shooter with a post-apocalyptic setting. That new edition arrives on PC VR later today.
Hunt & Extract maintains the original game's crafting system, where you use duct tape to combine objects like sticks, cleavers, and more for new weapons. Environmental traps are also available but can attract zombie hordes, and you team up with friends to defeat AI and human opponents alike. Any loot obtained during these runs is then used to upgrade your base and weapons.
Developer Spheroom says it has been rebuilding Requisition VR “from the ground up” over the last two years, describing Hunt & Extract as “an entirely new game” that began as an update before growing into a “full-on reinvention.” The studio confirmed it's launching it as a separate edition to preserve the original version for its fans.
Requisition VR: Hunt & Extract is out today on PC VR, and it's live now on Quest. Owners of the original Requisition VR can also claim the new game for free via the official Discord server.
2023 saw Triangle Factory release Breachers, a popular 5v5 tactical shooter often described as VR’s answer to Rainbow Six Siege. Now, the studio is scaling up its ambitions with Forefront, a 32-player shooter that aims to bring the large-scale warfare and destructible environments of games like Battlefield to virtual reality.
Recently launched in early access, I’ve spent the past week storming the beaches of Forefront, and it successfully replicates many of the elements that make games like Battlefield so compelling. With just a few small changes, I can see Forefront becoming one of the more popular and successful VR games of its type.
The Premise
Forefront takes place in the near future of 2035, a time when an energy corporation known as O.R.E. has declared war against nondescript local governments over control of a rare mineral. The story is perfunctory, a plausible backdrop for vague military entities to shoot at one another. You play as a dude on one side of the conflict, and you’ve got to shoot the dudes on the other side.
It's structured around large-scale, squad-based multiplayer battles that pit infantry and land, sea, and air-based vehicles against each other. Each match unfolds with two 16-player teams fighting for control over sprawling maps full of semi-destructible environments.
Players choose one of four classes (currently): assault, engineer, medic, and recon (sniper). These class types will be familiar to anyone who has played shooters in the last 25 years, and each comes with its own set of primary and secondary weapons, abilities, and supplementary equipment. For example, the assault class can toss down an ammo resupply box, while a medic can bring out defibrillators to revive fallen teammates.
Much of this will read as standard to the genre. And it is. The difference comes from VR.
A montage of Forefront gameplay clips captured by UploadVR on Quest 3S
Gameplay
Forefront is rich with tactile mechanics. You manually reload weapons, grabbing and tossing an empty mag, replacing it with one ripped from your flak jacket, and chambering a round. Tossing a grenade requires stowing your primary weapon, yanking the grenade from its place on your body, and pointing with your free hand to direct where and how far you want the egg to go. Zip lines and parachutes need to be gripped.
Weapons respond differently depending on how they’re held (one or two-handed), and physical movement plays a large part in whether you’ll win a firefight. This immersion is a real strength. ADS (aiming down sights) feels realistic in VR. Tossing a grenade feels weighty. Despite a learning curve and the occasional awkward fumbling inherent in complex VR environments, the hands-on gameplay works really well to engage you in the moment.
The arsenal of weapons and vehicles is vast and impressive. Shotguns, SMGs, handguns, RPGs, assault rifles, and sniper rifles - everything that shooter fans likely expect is here. Tanks and helicopters and gunboats allow for drivers, passengers, and gunners to rip across the large environments fast and loud. There’s even a jet ski for when you want to fire up the Wave Race 64 soundtrack and take a break from all the fighting.
Progression is also familiar. Accomplishing in-game objectives, eliminating enemies, earning assists, etc., award experience for both you and your weapons. Leveling up provides character upgrades, load-outs, equipment unlocks, and more weapons.
Online network issues were non-existent in my time with Forefront on Quest 3S. There were plenty of available lobbies and matches, sessions packed with full squads, quick matchmaking, and reliable connections.
A classic Battlefield-esque moment...
Visually, it's impressive: the lighting is generally striking, and if you take a moment to notice, the environments are actually rather beautiful. Sound design is handled well too, with convincing directional audio.
In short, the structure and mechanics of Forefront bring together the tactical class roles, robust gunplay, vehicular warfare, and compelling progression mechanics of the established games in the genre. If the developers’ target was “Battlefield in VR,” Forefront is mostly dead on. It’s a great and uniquely effective game that brings VR shooters to a more even footing with their flatscreen counterparts.
But Forefront is not perfect.
The Recon (sniper) class could use some work.
Collateral Damage
The Recon class is woeful, almost useless. Sniper rifles are the only primary weapons the class can use, and as they currently stand they're underpowered (not a one-shot kill) and almost impossible to use effectively compared with other classes.
While aiming through the scope is novel, since you have to physically raise the gun up to your eye like we've often seen in VR, actually hitting your target is extremely difficult. I've made some absolutely ridiculous snipes in non-VR gaming, but holding the rifle here feels jittery and imprecise with no aim smoothing option like Sniper Elite VR offers. You could argue that this makes sniping more realistic, but is Forefront a marksman training app or a video game? After a few hours with the sniper rifle, I abandoned the class secure in my opinion that the devs should add a very subtle aim assist.
I could also complain about the vehicles being rather pointless, since they're extremely fragile and short-lived. Also, the fully interactive guns can be finicky to use - instead of grabbing the stock, for example, our in-game hands grip the magazine or bolt. I’d also love to see the landscapes more densely populated by buildings and foliage.
Beyond these issues, which could be dismissed as nitpicks or matters of taste, there are several more glaring omissions of established genre norms. For example, Forefront doesn’t currently have any meaningfully implemented directional damage indicator, which means you often can’t tell which direction you’re taking fire from. This is pretty irritating. There’s no pinging system, which makes communicating with teammates difficult for those who don’t want to mic up. Hit markers are vague, almost to the point of total irrelevance. And there’s no party system.
Of course, Forefront is currently in early access, so there's still room to grow. Forefront’s current roadmap is extensive and displayed prominently on a whiteboard propped up on the deck of the virtual aircraft carrier that serves as the game’s main menu, and some of my issues are already noted. If Triangle Factory checks 60% of these boxes, they’ll have addressed 99% of the game’s current shortcomings.
But don’t be misled by those last few paragraphs of complaints; the problems noted are ultimately minor. Forefront's core gameplay is solid, almost perfect for what it aims to be. Combat is exciting and tense, its VR gunplay is tactile and satisfying, and its environments are dynamic and engaging. Right now, it’s difficult to recommend another large-scale shooter over Forefront.
Forefront is out now in early access on Quest, Steam, and Pico.
VRider SBK Connect is a new companion app for the VR superbike racer, letting you race friends who own the full game for free on Quest.
Developed by Funny Tales, VRider SBK is an officially licensed VR racing game based on the Superbike World Championship that first appeared last year on Quest. Following this summer's PS VR2 and PC VR launch, the studio has released a Quest-exclusive companion app that lets your friends join you without everyone owning the full game.
Joined by a simultaneous update for the full game, VRider SBK owners can now create and host private multiplayer race rooms, letting Connect players join using a room code. Custom rules can also be selected to adjust the match type, number of players, circuit of choice, number of laps, and reputation settings.
If you've only downloaded VRider SBK Connect and don't have access to the main game, Connect also offers tutorials and the Hot Lap Practice mode, which lets you train on all of its official tracks with every bike unlocked. Connect owners can later switch to the full release via a paid in-game upgrade.
VRider SBK Connect is out now on the Meta Quest platform. The full game is also available on Steam, Pico, and PlayStation VR2, though Connect isn't offered on those platforms.
The PlayStation VR2 Sense Controllers are now sold by Apple, priced at $250, and the charging stand is included.
Apple added support for Sony's tracked controllers to Vision Pro headsets with visionOS 26, which released in September, but Sony itself doesn't sell them separately from its $400 VR headset.
The PS VR2 Sense controller support of visionOS includes 6DoF positional tracking, capacitive finger touch detection, and basic vibration support. The precision haptics of the controllers are not supported, however, and nor are their unique resistive triggers.
One of the first Vision Pro games to support the PS VR2 Sense controllers was the indie title Ping Pong Club, which we tested when visionOS 26 launched.
And three weeks ago, Resolution Games launched a title leveraging the controllers called Pickle Pro, a pickleball game with both local and remote SharePlay, so you can play against other Vision Pro owners in the same room or remotely over the internet as Personas.
The PlayStation VR2 Sense Controllers are available on the online Apple Store in the US, priced at $250, with Sony's official charging stand included in the package.
There's no word yet on availability outside the US.
Technically, PS VR2 headset owners who lose or damage both controllers could also buy the package from Apple instead of a new headset, though it would probably be a better idea to get used replacements on a marketplace like eBay instead.
The Los Angeles store is located on Melrose Avenue. Meta describes it as its "flagship" retail location and says it spans 20,000 square feet, with multiple levels "specifically designed to highlight the features and benefits of our hardware".
Meta is also opening temporary "pop-up spaces" in New York and Las Vegas to demo its smart glasses:
The Vegas pop-up is relatively small, a 560 square foot space inside the Wynn, and opened last month.
The New York 5th Avenue pop-up will be much larger, at 5000 square feet, and is set to open "soon".
Meta Lab
Meta says it also plans to open a series of smart glasses "micro-stores" that may be similar to the hardware vending machines it had at Connect 2025. Snap tried that just under a decade ago for its original Spectacles smart glasses, but like the product itself, it didn't catch on.
All of this is in addition to the thousands of stores where Meta's smart glasses are already demoed and sold, thanks to its partnership with EssilorLuxottica, the owner of Ray-Ban and Oakley. But Meta's stores have the potential to include more technical staff who are aware of the intricate details of the devices, and they can include its Quest headsets too.
The 14-minute 'Flight Ready' Apple Immersive Video puts you on the flight deck of the USS Nimitz aircraft carrier as Super Hornets launch and land.
The USS Nimitz is the lead ship of the 10 Nimitz-class aircraft carriers of the US Navy, one of its 11 total current supercarriers. The Nimitz has been involved in the Iran hostage crisis, the Gulf of Sidra incident, the Gulf War, the Iraqi no-fly zones enforcement, and the 21st-century wars in Iraq and Afghanistan. In 2004 two of its Super Hornets reported that they encountered the now-famous "tic tac" UFO, a rapidly maneuvering white oblong flying object with no obvious means of propulsion.
What Is Apple Immersive Video?
The Apple Immersive Video format is 180° stereoscopic 3D video with 4K×4K per-eye resolution, 90FPS, high dynamic range (HDR), and spatial audio. It's typically served at a higher bitrate than many other immersive video platforms use.
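Back-of-envelope arithmetic shows why bitrate and compression matter so much for this format. The figures below assume a square 4096×4096 per-eye resolution and 10-bit RGB (30 bits per pixel), both approximations for illustration rather than Apple's published numbers:

```python
def raw_gbps(width=4096, height=4096, eyes=2, fps=90, bits_per_pixel=30):
    """Back-of-envelope uncompressed bandwidth for 4K-per-eye, 90FPS
    stereo video. The resolution and 10-bit RGB color depth are
    assumptions for illustration."""
    return width * height * eyes * fps * bits_per_pixel / 1e9

# ~91 Gbit/s uncompressed: hundreds of times more than any consumer
# internet connection, hence the need for aggressive video compression.
```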
We highly praised Apple Immersive Video in our Vision Pro review. It's not possible to cast or record Apple Immersive Video though, so you'll have to take our word for it unless you have access to a Vision Pro.
The new Flight Ready immersive documentary walks you through the flight deck of the USS Nimitz as it prepares for deployment, including the role of the pilots and crew on the deck itself and in the tower.
"The flight deck of a naval aircraft carrier is the most chaotic place on Earth", the narrator declares as the film pulls you in.
The video features many close-up immersive shots of Super Hornets launching and landing, as well as sweeping aerial views of the Nimitz sailing through the ocean as the jets fly by at low level.
It's an impressive use of the immersive video format that will appeal to any fan of military documentaries, and may induce some nostalgia for those who served at sea.
You can find Flight Ready in the Apple TV app, for free, exclusively on Apple Vision Pro.
Lumines Arise is the puzzle series' best entry since the original PSP game, channeling Tetris Effect with a stunning presentation. Here's our full review.
For all the incredible things VR developers have achieved across the years chasing immersion and embodiment, it's a testament to Enhance's design that Tetris Effect remains a regular fixture in my VR library. This transcendent transformation of the classic 2D puzzler blew me away years ago with a wonderful audiovisual experience. Now, Lumines Arise walks a similar path.
The Facts
What is it?: A rhythmic puzzle game where you match 2x2 blocks to clear the grid, featuring optional VR support. Platforms: PS VR2, PC VR (Reviewed on PS VR2 via PS5 Pro) Release Date: Out now Developer: Enhance, Monstars Inc. Publisher: Enhance Price: $39.99
Lumines has always been its own series ever since the 2004 PSP game, which ironically owes its existence to series creator Tetsuya Mizuguchi being unable to secure Tetris' licensing rights. Calling Lumines Arise a spiritual successor to Tetris Effect feels strange in that regard, yet it's very difficult to avoid directly comparing them. The parallels are striking, especially in its slick presentation.
Gameplay captured by UploadVR on PlayStation VR2.
Make no mistake, VR is the definitive way to experience Lumines Arise. VR lets these stages truly come alive, providing a stronger sense of depth and presence in a way flatscreen platforms can't achieve. Each stage packs unique animated effects that show clear VR-focused design, delivering memorable visuals ranging from rhythmic geckos to a giant man in shadow, all set to its arresting soundtrack.
Journey Mode is the heart of this adventure, normally offering four or five levels per area that play in a continuous sequence. Arise delivers a great range of unique stage designs, though stage transitions don't always feel like a natural continuation of each other. Game Over means you can retry from the beginning or continue from your current stage; the former is preferable for high score chasers, since continuing resets your score.
Lumines is about forming 2x2 squares of a single color by dropping 2x2 blocks with mixed color patterns onto your grid. Completed squares then disappear as a timeline sweeps across the screen, its speed matching the soundtrack's tempo. It's easy to learn, difficult to master, and so satisfying when everything lines up perfectly, with larger squares earning extra points.
Every block placement is crucial, since carelessness is a fast track to failure. You can't switch out squares, so you're stuck placing whatever comes your way. Fortunately, there's a helpful new Burst mechanic you can activate at 50% charge to temporarily slow time. It's useful for quickly clearing the grid of one specific color, though I faced some very infrequent framerate drops during it on PS VR2.
Another handy tool here is a connecting block symbolized by a crosshair, which clears all blocks of the same color directly touching each other. These infrequent appearances can be a major lifesaver if grids start stacking up high, and watching dozens of squares immediately disappear brings an immediate sense of relief.
Impressive DualSense integration benefits Arise's gameplay well on PS5, with haptic feedback feeling like a crucial part of experiencing each song as you get into the rhythm. I do wish the Burst move had adaptive trigger support for that little bit of extra feedback, though it's a minor issue when the rest feels this good.
I'd recommend using DualSense (or a gamepad on Steam) since gameplay really doesn't need motion controllers. Having a D-pad is a massive help for precision when the speed increases, something the Sense controllers lack. On a related note, it's great to see Lumines Arise lets you adjust the sensitivity for the left and right analog sticks. More on that in the comfort section below.
Comfort
Lumines Arise is one of the more comfortable VR experiences for newcomers that you'll find, since gameplay doesn't use artificial locomotion.
DualSense and VR headset vibration can be turned off from the options menu, also supporting adjustable strengths for each. Some UI elements like the time limit display can also be switched off, though this doesn't apply to every mode. VR camera distance from your physical position and the playfield is customizable, and background motion can be turned off.
The accessibility tab lets you turn off spiders and snakes from appearing. Text size is adjustable, a color filter is available, and various visual effects can also be changed. Brightness and audio levels can be changed on a slider, too. Sensitivity sliders are available for both the left and right analog sticks. Lumines Arise also supports remappable controls.
There's a welcome strategic challenge that rarely overwhelms even when quick decisions are needed, yet its inherent gameplay design leaves Lumines Arise feeling slightly trickier than Tetris Effect. It's a captivating experience in its own right; both are great games, and Arise boasts an even stronger presentation that's more expressive than its predecessor in chasing synesthesia.
Perhaps it's unfair to overly compare the two despite the obvious similarities. However, Lumines' core mechanics fall slightly short of those same highs for me that Tetris Effect provides. If we're purely talking gameplay, the older title remains slightly more compelling. What's here is still highly enjoyable though, and this is very much a “your mileage may vary” situation.
Completing Journey Mode unlocks an endless Survival mode, which tasks you with beating the entire campaign in a single run. It's more challenging given that stages don't reset the blocks each time. If anything, it's a fine excuse to go back through Hydelic's incredible songs again - Arise, Serpent Clash, Autumn Fall, and Sunset Beach are some personal highlights. This electronic soundtrack has a wonderful range, and it's hard not to feel captivated in these moments. You can feel the emotion poured into this.
Clearing stages and other tasks earns currency that unlocks customization items for your character, Loomii, who represents you in the multiplayer hub. That same hub also features a 1v1 ranked mode called Burst Battle with crossplay support, and its only major difference from solo modes is that clearing squares sends additional blocks to your opponent. It's a great inclusion for playing online publicly or with friends alike, and creating custom rooms with adjustable rule sets provides a suitable unranked option, too.
There are a few additional options around the Hub, the most notable being the Leaderboard League with two modes. One is your standard 'Time Attack' mode that's simply clearing as many squares as possible in a time limit, while 'Dig Down' slowly increases gameplay speed as more blocks emerge from the bottom for some frantic fun.
Lumines Arise: How Does It Compare On PC VR?
This review is based on the PlayStation VR2 version via PS5 Pro, though I received Steam access not long before launch. Testing this with a Meta Quest 3 via the Virtual Desktop and Steam Link apps, I encountered no performance issues on the highest settings.
My desktop exceeds the recommended requirements, which you can find alongside the minimum specs on the Steam page. It uses an Intel Core i9-12900 16-core processor (up to 5.1GHz), 32GB of Corsair Vengeance DDR5-5200 RAM, and an Nvidia GeForce RTX 4070 Ti Super with 16GB of VRAM.
Other available options include the missions area, which helpfully teaches you more advanced Lumines techniques through tutorials, and clearing these also earns more currency. Challenges add some decent variety to the mix too, tasking you with goals like activating Burst as many times as possible.
If you'd rather sit back and enjoy the show, a 'Theatre' option lets you do exactly that while the game plays automatically. That's easy to miss though, as it's tucked away in the Playlist section, which lets you create dedicated sets of songs you've unlocked by clearing them in Journey. Annoyingly, creating a playlist is also the only way to jump into an individual song by selecting a single track, so I hope Enhance adds a quick play mode in a future update.
Lumines Arise - Final Verdict
Lumines Arise is the best entry yet in this long-running puzzle series, building upon Enhance's work in Tetris Effect to create a mesmerizing audiovisual spectacle that shines even brighter in a VR headset. It's not as transformative as the older title and the core gameplay doesn't quite hit those same high notes, but that hasn't stopped Enhance from delivering a highly memorable journey. If you enjoy puzzle games, you won't want to miss this.
UploadVR uses a 5-Star rating system for our game reviews – you can read a breakdown of each star rating in our review guidelines.
Spatially tracked styluses from Logitech for Apple Vision Pro and Quest paint an unfinished picture for creative input in headsets.
Priced around $130 from Logitech, the MX Ink is for Quest 2, 3, and 3S headsets, while Muse is for Apple Vision Pro. The Ink has been available for Meta headsets for more than a year, and Muse debuted recently, with a beta of visionOS 26.2 adding system-level support for the input for the first time.
Logitech Muse at left and Logitech MX Ink at right.
The Logitech MX Ink for Quest has been supported in Horizon OS's menu systems for some time, letting it act as an easy laser pointer or remote control across the entire Quest menu experience. You can launch apps like Figmin XR, Vermillion, ShapesXR and Gravity Sketch with the Ink in one hand, just clicking a button with your index finger.
MX Ink with pressure-sensitive main button, as well as buttons behind and in front.
On Quest, apps usually also want a Touch controller in the off hand. I activated a setting in the app Jigsaw Night on Quest 3 to show interaction with the MX Ink in the dominant hand and simultaneous hand tracking in the off hand.
Dual-mode input with MX Ink and hand tracking on Quest is rare in apps on Horizon OS as of November 2025, but Jigsaw Night from Steve Lukas supports it.
The Logitech Muse for Vision Pro is much newer, released near the launch of visionOS 26, and marks the beginning of Apple's support for "motion tracking in six degrees of freedom" as a way to change how you "work, create, and collaborate with Apple Vision Pro". As of visionOS 26.2, apps like Freeform and Notes bundled by Apple support the Muse pen, allowing for quick and intuitive sketching on a vertical 2D surface anywhere. Apps are rolling out on an ongoing basis that integrate the pen as a 3D tracked object.
Both Ink and Muse charge over USB-C connections near the rear of the stylus. The MX Ink also has an optional charging stand called the MX Inkwell, selling for $50, that the stylus can fit into. If you plug the pens into a long USB-C cord they appear to charge without issue while in use. That's not something a Quest controller or a PlayStation VR2 controller supports. Battery life is likely to vary by use, especially as new apps release using all the features. I haven’t encountered any battery anxiety, particularly after discovering they could charge in use.
Logitech Muse in Freeform for Vision Pro as of visionOS 26.2
In theory, MX Ink and Muse can travel in pockets where a wider controller can't fit. In practice, at least as of this writing, creative software on Quest seems to demand traveling with the Ink plus at least one Touch controller. Hand tracking, where it's supported at all, seems to be treated by many apps as a separate input mode rather than working hand in hand with the pen.
In 2016, we could paint in 3D in Tilt Brush using a Vive wand as a precision instrument, albeit quite a bulky one. Nearly a decade later, with 2026 approaching, a much smaller pocket-sized wand can access Open Brush on Quest. Yet when the MX Ink is in use, I still see the shape of that old bulky tool rendered in my hand, standing in for the marker-sized object I'm actually holding.
Apple and Meta both see a future for 3D tracked precision input for headsets, but their operating systems and developer ecosystems are still actively developing support for tools like the MX Ink and Muse alongside other input systems.
The Best Input For Thrasher?
Surprisingly, MX Ink support in Thrasher on Quest is wildly fun. Conceptually, any game involving wands, whether for conducting music or wielding magic, would be a nice match for either of these tools.
Thrasher is a key example of how spatial inputs and their tracking systems are in a completely different place from where they were in 2016 and 2017. Holding the MX Ink at my side in a dark room while playing Thrasher, the 3D object in my hand seemingly drifts into strange orientations. Nonetheless, the slithering, thrashing beast under my control moves as expected across the flat vertical plane in front of me, consistent with each flick of the controller. Other developers can't expect their games to fit the same constraints that make control work so well in Thrasher.
MX Ink is well supported in some of Quest's best creative apps, but holding a Touch controller in the off hand doesn't make a whole lot of sense in the long term when hand tracking works well at the system level.
On Vision Pro, PlayStation VR2 controllers and Muse arrived as spatial inputs within weeks of one another. As of visionOS 26.2's latest testing release this week, each input supports system level control secondary to hand tracking and gaze. Some Apple developers are just now starting to think about their precision input integrations with Vision Pro, while some Meta developers are wondering why they should update theirs with Quest. Circuit Flux became one of the first apps on Apple's platform to support both Sony and Logitech spatial input systems, and Meta is offering developers a chance at funding with a competition.
As developers roll out additional support for MX Ink and Muse, and operating systems improve overall in their support for these inputs alongside hand tracking, we'll plan to cover this picture more as it starts to fill in. Please reach out if you’re doing something interesting with these tools.
Meta Ray-Ban Display is an early glimpse of a future where mobile computing doesn't mean looking down and taking something out of your pocket.
Most people never leave home without their phone, and take it out of their pocket so often that it's even become a replacement for fidgeting. The smartphone is so far the ultimate mobile computing device, an omnitool for communication, photography, navigation, gaming, and entertainment. It's also your alarm clock, calendar, music player, wallet, and flashlight. Globally, more people own a smartphone than TVs and cars combined. To get philosophical for a moment, the smartphone has become humanity's second cognitive organ.
The problem is that taking out your phone harshly disconnects you from the world around you. You have to crane your neck down and disengage from what you were otherwise doing, your attention consumed by the digital world of the little black rectangle.
In recent years, multiple startups have tried and failed to solve this problem. The smug founders of Humane came to liberate you from your phone with a $700 jacket pin, while Rabbit r1 promised the "large action model" on its $200 pocket device could handle your daily life instead.
The truth, and the reason why these companies failed, is that most people adore their phones, and are borderline addicted to the immense value they provide. And the screen of the smartphone is a feature, not a bug. People love being able to view content in high-resolution color anywhere they go, and despite the cries of a small minority of dissenters, the models with the biggest screens sell the best.
"If you come at the king, you best not miss", as the phrase goes.
The only form factor that seems to have any real chance of one day truly replacing the smartphone is AR glasses, which could eventually provide even larger screens that effectively float in midair, anywhere the wearer wants, any time they want. But while prototypes exist, no one yet knows how to affordably produce wide field of view true AR glasses in a form factor that you'd want to wear all day. In the meantime, we're getting HUD glasses instead.
HUD glasses can't place virtual 3D objects into the real world, nor even anchor 2D virtual interfaces to it. Instead, they provide a small display fixed somewhere in your vision. And in the case of many first-generation products, like Meta Ray-Ban Display, that display is only visible to one of your eyes.
Meta Ray-Ban Display is also highly reliant on your nearby phone for connectivity, so it isn't intended to be a replacement for it as a device. It is, however, meant to replace some of the usage of your phone, preventing the need to take it out of your pocket and keeping your head pointed up with your hands mostly free. So does it succeed? And is it a valuable addition to your life? I've been wearing it daily for around a month now to find out.
(UploadVR purchased Meta Ray-Ban Display at retail with our own funds, while Meta provided us with the correctly sized Meta Neural Band for review.)
Comfort & Form Factor
Unlike a VR headset that you might use at home or on a plane for a few hours, the pitch for smart glasses is that you can wear them all day, throughout your daily life. Even when they run out of battery, they can still act as your sunglasses or even prescription eyewear (for an extra $200 and weeks of waiting).
As such, it's crucial that they have a design you'd be okay with wearing in public, and that they're comfortable enough to not hate having them on your face.
Meta Ray-Ban Display weighs 69 grams, compared to the 52 grams of the regular Ray-Ban Meta glasses, and 45 grams of the non-smart Ray-Ban equivalent. It's also noticeably bulkier, with thicker rims and far thicker temples.
Ray-Ban Meta vs Meta Ray-Ban Display vs Xreal One Pro
In my month with Meta Ray-Ban Display I've worn it almost every day throughout my daily life, sometimes for more than 8 hours at a time, and I experienced no real discomfort. The additional weight seems to be mostly in the temples, not the rims, while the nose pads are large and made out of a soft material. If anything, because the larger temples distribute the weight over a greater area and are more flexible, I think I even find Meta Ray-Ban Display slightly more comfortable than the regular Ray-Ban Meta glasses.
So, for my head at least, physical comfort is not an issue with Meta Ray-Ban Display. But what has been an issue is the social acceptability of its thick design.
With the regular Ray-Ban Meta glasses, people unfamiliar with them almost never clocked that I was wearing smart glasses. The temples are slightly thicker than usual, but the rims are essentially the same. It's only the camera that gave them away. With Meta Ray-Ban Display, it's apparent that I'm not wearing regular glasses. It's chunky, and everyone notices.
In some circles, thick-framed glasses are a bold but valid fashion choice. For most people, they look comically out of place. I've asked friends, loved ones, and acquaintances for their brutally honest opinions. Some compared it to looking like the glasses drawn on an archetypal "nerd" in an old cartoon, while only a few said that the look works because it matches current fashion trends. And my unit is the smaller of the two available sizes.
Ray-Ban Meta vs Meta Ray-Ban Display vs Xreal One Pro
Meta Ray-Ban Display also comes in two colors, 'Black' and 'Sand', and a confounding factor here is that the black is glossy, not matte. I'm told this decision was made because glossy black was the most popular variant of the regular Ray-Ban Meta glasses. But combined with the size, the glossy finish makes Meta Ray-Ban Display look cheap in a way that an $800 product really shouldn't, like a prop for a throwaway Halloween costume.
So Meta Ray-Ban Display is physically very comfortable, but not socially. More on that soon.
The Monocular Display
The fixed HUD in Meta Ray-Ban Display covers around 14 degrees of your vision horizontally and vertically (20 degrees diagonal). To understand roughly how wide that is, extend your right arm fully straight and then turn just your hand 90 degrees inward, keeping the rest of your arm straight. To understand how tall, do the same but turn your hand upwards or downwards.
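To put rough numbers on that analogy (a back-of-envelope sketch; the ~60cm eye-to-hand distance at full arm extension is my assumption, not a figure from Meta), the width w subtended by an angle θ at distance d is:

```latex
w = 2d \tan\!\left(\tfrac{\theta}{2}\right)
  = 2 \times 60\,\mathrm{cm} \times \tan(7^\circ)
  \approx 14.7\,\mathrm{cm}
```

So a 14-degree field at roughly arm's length spans about 15cm, close to the width of an adult hand turned sideways, which is why the hand-at-arm's-length trick gives a usable estimate.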
What you see within those 20 degrees is a clear, high-detail image, though ever so slightly soft rather than fully sharp, with higher angular resolution than even Apple Vision Pro. There's a slight glare that causes icons, for example, to mildly bleed into the empty space around them, but it's very minor and not distracting.
More notably, the significant difference between a waveguide like this and the interfaces you might see in a mixed reality VR headset is that it's very translucent, with a ghostly feel. You can see the real world through it at all times.
The display's perceived opacity and brightness are highly variable, though, because with waveguides they depend on the environmental light level, and the system also rapidly and automatically adjusts display brightness using the ambient light sensor. You can manually adjust the brightness if you want, but the system is very good at picking the appropriate level at all times, so I never do.
With a 5,000-nit maximum, the display is even visible in daytime sunlight, albeit with a very ghostly, translucent feel. As the photochromic lenses transition to dark, the perceived opacity slightly increases.
One notable quirk is that because an LCOS is essentially (to greatly simplify things) an LCD microdisplay, if you're in a very dark environment you'll see a faint glow throughout the display area when it's on, like an LCD monitor trying to show black. But in anything above almost pitch black, you won't see this.
So Meta Ray-Ban Display's display is surprisingly good, and lacks the distracting visual artifacts seen in many of the waveguide AR headsets of the 2010s. But there is a massive, glaring problem: it's only visible to your right eye.
No through-the-lens approach accurately depicts what the HUD looks like, so here's Meta's generic marketing clip instead.
Meta Ray-Ban Display is a monocular device. Your left eye sees nothing at all.
Other than your nose, which your visual system is hardwired to understand, there is no analog in nature for one eye seeing something the other doesn't. It just feels wrong, and induces a constant minor feeling of eyestrain when I look at the display for more than a few seconds.
I can put up with it for a few seconds at a time, and have gotten slightly more used to it over time, but I would never want to watch a video or conduct a video call like this. I've also put the glasses on more than a dozen people now, and while some of them could just about tolerate the monocular display, others found it hurt their eyes within seconds.
I suspect that this is a core reason why Meta Ray-Ban Display is only available to buy after a retail demo. This just isn't a visually comfortable product for many people, and Meta likely wants to avoid mass returns.
Bloomberg's Mark Gurman and supply-chain analyst Ming-Chi Kuo have both claimed that Meta plans to retire Meta Ray-Ban Display in 2027 upon launching a binocular successor, with significantly ramped up marketing, production, and availability. By closing my left eye, I can already get a pretty good feel for just how much more visually comfortable the next generation could be.
No Light Leak! But Is That A Good Thing?
Almost all of the brightness of Meta Ray-Ban Display stays on your side of the glasses – 98% according to Meta. The display is not visible to people looking at you. I've repeatedly asked friends whether they can even tell if I have the display on or off, and none have been able to so far. The clickbait YouTube thumbnails you may have seen are fake.
This is partially due to the low "light leak" of the geometric waveguide in Meta Ray-Ban Display, but it's also because of the automatic brightness adjustment. If you manually turn up the brightness to an uncomfortably high level, which I can't imagine anyone intentionally doing, you can make the display slightly visible externally, though not its content (it just looks like a scrambled pattern). But again, even this requires an adjustment that no regular user would reasonably make.
All this said, while I initially assumed that the low light leak was an important feature of Meta Ray-Ban Display, I've come to see the inability of nearby people to know whether you're looking at the HUD as somewhat of a bug.
When you're with another person and take out your phone, that's an unambiguous indicator that you're diverting attention from them. Similarly, while Apple Vision Pro shows a rendered view of your eyes, if virtual content is occluding the person you're looking at, Apple intentionally renders an occluding pattern. Why? To clearly signal that you're looking at virtual content, letting the person know when they do or don't have your full attention.
When someone wearing an Apple Vision Pro is looking at virtual content that partially occludes you, you'll see a pattern on the display in front of their rendered eyes (center image above). With Meta Ray-Ban Display, you don't know whether the wearer is looking at the HUD or you.
With Meta Ray-Ban Display, there is no such signal. People who spend a lot of time with you can eventually figure out that you're looking at the HUD when your eyes are looking slightly down and to the right, but it's far more ambiguous, and this is not conducive to social acceptability. Are you fully present with them or are you not? They can't clearly tell.
And the worst case scenario is, when looking at the HUD, to a person sitting in front of you it can, in some specific circumstances, appear as if you're just looking at their chest. Yikes.
I'm not saying that I want other people to be able to see the content of my display, as that would be a terrible privacy flaw. But I do wish there was an external glow on the lens when the display is on. I don't want the other person to have to guess whether I'm fully present, or whether I'm looking at the HUD or their body.
The Interface & Meta Neural Band
Like the regular Ray-Ban Meta glasses, you can control Meta Ray-Ban Display with Meta AI by using your voice, or use the button and touchpad on the side for basic controls like capturing images or videos and playing or pausing music. But unlike any other smart glasses to date, it also comes with an sEMG wristband in the box, Meta Neural Band.
In its current form, Meta Neural Band is set up to detect five gestures:
Thumb to middle finger pinch: double tap to toggle the display on/off, single tap to go back to the system menu, or hold for quick shortcuts to the 3 menu tabs.
Thumb to index finger pinch: how you "click".
Thumb to side of index finger, double tap: how you invoke Meta AI.
Thumb swiping against the side of your index finger, like a virtual d-pad: how you scroll.
Thumb to index finger pinch & twist: how you adjust volume or camera zoom, as you would with a physical volume knob.
How Does Meta Neural Band Work?
Meta Neural Band works by sensing the activation of the muscles in your wrist which drive your finger movements, a technique called surface electromyography (sEMG).
sEMG enables precise finger tracking with very little power draw, and without the need to be in view of a camera.
The wristband has an IPX7 water resistance rating, and charges with an included proprietary magnetic contact pin charger.
While in the above clips I have my arms extended to illustrate the gestures, the beauty of sEMG is that you don't need to. Your hand can be at your side, resting on your leg, or even in your pocket. And it works even in complete darkness.
The gesture recognition is almost flawless, with close to 100% accuracy. The one exception is that, rarely, it will fail to pick up a sideways directional swipe while I'm walking. But in general, Meta Neural Band works incredibly well. The volume adjustment gesture, for example, where you pinch and twist an imaginary knob, feels like magic.
And whether I'm washing my hands, eating food, or driving a car, I've never had the display wake by accident; the middle finger double-tap gesture only ever triggers when I intend it. The only accidental activation I've encountered is that sometimes, if I'm tapping my phone with my thumb, Meta AI will trigger. It's hard to imagine solving this without the low-level access to the phone's OS that only companies like Apple and Google have.
Wake (top left), Scroll (top right), Click (bottom left), and Volume/Zoom (bottom right)
The Meta Neural Band gestures control the HUD interface, which looks much like that of a smartwatch. It has 3 tabs, which you scroll between horizontally:
The center home tab (the default) shows the date and time, your notifications, and a Meta AI button, with tiny shortcuts to active audio or navigation at the top.
The right tab is the two-column applet library: WhatsApp, Instagram, Messenger, Messages, Calls, Camera, Music, Photos, Captions, Maps, Tutorials, and a game called Hypertrail. Four rows are shown at a time, and you navigate vertically to see the rest.
The left tab features quick controls and settings like volume, brightness, Do Not Disturb, as well as shortcuts to Captions, Camera, and Music.
When I first used Meta Ray-Ban Display I found the interface to be "just too much", often requiring too many sequential gestures to do what you want. While I still broadly hold this view a month later, I have found myself becoming more used to quickly performing the correct sequence with experience, and I've discovered that if you pinch and hold your thumb to your middle finger, you get shortcuts to the three main menu tabs, which can speed things up.
I still think there's plenty of room for improvement in Meta Ray-Ban Display's interface, though, from a simplicity perspective, and I repeat my assertion that the menu should have two tabs, not three. The Meta AI finger gesture makes the Meta AI button on the center tab redundant, for example, and when you don't have any notifications, the home tab feels like a waste of space.
This clip from Meta shows the 3 tabs of the system interface, and how you swipe between them.
A lot of the friction here will eventually be solved with the integration of eye tracking. Instead of needing to swipe around menus, you'll be able to just look at what you want and pinch, akin to the advantages of a touchscreen over arrow keys on a phone. But for now, it can sometimes feel like using MP3 players before the iPod, or smartphones before the iPhone. sEMG is obviously going to be a huge part of the future of computing. But I strongly suspect it will only be one half of the interaction answer, with eye tracking making the whole.
A major improvement since the Meta Connect demo, though, is performance. While I still wouldn't describe Meta Ray-Ban Display as very snappy, the abject interface lag I encountered at Connect is gone in the shipping consumer model. The most noticeable remaining delay is in waking the display, which often takes a second or two after the gesture.
Meta Neural Band size comparison with Fitbit Luxe.
Coming back to Meta Neural Band for a second, the only real problem with it is that it's something else you need to wear and charge (with a proprietary cable). I always wear a Fitbit on my left wrist, and now I wear the Meta Neural Band on my right too.
That's not to say that Meta Neural Band is a comfort burden. I find it no more or less comfortable than I did the Pixel Watch I used to own, and having tried a Whoop it feels similar to that too. And while it does leave a minor mark on my wrist, so do the straps of those other devices.
But it's another thing to remember to put on charge at night, another cable to remember to bring when traveling, and another USB-C port needed at my bedside.
Ideally, I should only have to wear and charge one wrist device. But today, Meta Neural Band is solely an sEMG input device, and nothing more.
Traveling means bringing the proprietary wristband charging cable with you.
The band already has an accelerometer inside, so in theory, a software update could let it track your daily step count. And if a future version could add a heart rate sensor for fitness, health, and sleep tracking, I wouldn't need my Fitbit at all anymore. But we're not there yet. And wearing a second wrist device just for input is a big ask for any product.
Features & Use Cases
There are six primary use cases of Meta Ray-Ban Display. It's a camera, a communications device, an on-foot GPS navigator, an assistive captions and translations tool, a personal audio player, and an AI assistant that can (if you want) see what you see.
So how well does it do each of these things?
Capturing Photos & Videos
When the regular Ray-Ban Meta glasses first launched, they were primarily pitched as camera glasses, like their predecessor the Ray-Ban Stories, and this remains one of the biggest use cases even as Meta now calls the product category "AI glasses". But without a display, you didn't get a preview of what you were capturing, nor could you check whether the result was any good until it synced to the phone app. Sometimes you got lucky, and other times you failed to even frame your subjects, an issue not helped by the camera on almost all smart glasses sitting on a temple rather than being centered.
Meta depiction of photography.
Meta Ray-Ban Display has 32GB of storage for media, and syncs everything to your phone via Wi-Fi 6 when you open the Meta AI app. But the experience of capturing media is fundamentally better, because you get a live visual preview of exactly what's in frame. And with the Meta Neural Band, you can capture without raising your arm, as well as adjust the zoom on the fly by pinching your index finger to your thumb and twisting, the same gesture used to adjust the volume.
The camera quality isn't as good as your phone, given that the sensor has to fit into the temple of glasses. It appears to be the same camera as the regular Ray-Ban Meta glasses, and while it can produce great results in daytime, in low-light environments or with heavy zoom you'll get a relatively grainy output. You can see some sample shots and clips in my colleague Ian Hamilton's launch-week impressions piece.
Regardless, the ability to capture precisely framed shots without needing to hold up a smartphone is Meta Ray-Ban Display at its best. It lets you stay in the moment and capture memories at the same time, without worrying that your shot missed what you wanted to include. And it works completely standalone, even if your phone is out of battery.
With a couple of swipes and taps you can also easily send your captured media to a friend, something that required extensive voice commands with the displayless glasses. And this brings us on to messaging.
Messaging
While Meta doesn't have an existing device ecosystem like Apple and Google, what it does have is the most popular messaging platform and the two most popular social networks in the world, all three of which are integrated into Meta Ray-Ban Display.
You can opt to have incoming WhatsApp, Messenger, Instagram, and text messages pop up on the display, and so without needing to look down at a smartwatch or take your phone out of your pocket you can "screen" them to decide which are important enough to respond to immediately. Tap your thumb to your middle finger to dismiss a message, or to your index finger to open it.
The notification appears at the very bottom of the display area, which is already slightly below your sightline, so it doesn't block your view of what's in front of you. And of course, unlike with a phone, no one around you can see it. There's also a setting to automatically detect when you're in a moving vehicle, so if you start driving a car you won't be interrupted.
Meta depiction of messaging.
If you want to respond to a message, there are 4 options in the interface: dictate, voice note, suggested emoji, or suggested reply.
I don't send voice notes, and I don't find the suggested emojis or replies useful, but I do love the dictation. It's powered by an on-device speech recognition model with relatively low latency and surprisingly great accuracy, in my experience. Even with my Northern Irish accent that some other systems (and people) can find difficult to understand, I'm able to dictate full responses without needing to type. And what's most impressive is that it works even when you speak quietly, thanks to the six-microphone array that includes a dedicated contact mic positioned just a few inches above your lips. That's yet another advantage of the glasses form factor.
Still, there are plenty of messages that I wouldn't want to dictate even softly in public, and situations when I want to use combinations of punctuation and letters that don't have a spoken form. Meta plans to release a software update in December that will let you enter text by finger-tracing letters on a physical surface, such as your leg, bringing some of its more advanced sEMG research out of the lab and into the product. It sounds straight out of science fiction, and we'll bring you impressions of it when the update rolls out. But it's not there today.
What About iMessage?
If you're an iPhone user, you're probably wondering whether all this works with iMessage.
I use an Android phone, from which Meta Ray-Ban Display can receive and send both 1-on-1 and group text messages, including both SMS and RCS.
For iPhone, I'm told, you can receive and send only 1-on-1 iMessages, with no support for group threads. And this support is limited to only pop-up notifications - you won't see a 'Messages' applet in the list.
These limitations, to be clear, are imposed by Apple, and Meta would gladly support the missing features if Apple let it.
As well as receiving new messages as pop-up notifications, you can also access your 10 most recent threads on each messaging platform at any time by swiping to it on the apps list. In the apps, you can scroll through past messages and send new ones, including your captured photos and videos stored on the glasses.
The problem with accessing your past message threads, and with viewing photos and videos you've been sent, is how slowly they load. Open WhatsApp on your phone and you'll typically see your messages update within a fraction of a second. On Meta Ray-Ban Display, you'll see everything as it was the last time the applet was opened for upwards of 10 seconds, while media can take over a minute at times, or sometimes seemingly never loads at all. And it's badly missing any kind of loading indicator.
So while in theory you can use Meta Ray-Ban Display to catch up on the dozens of Reels that one unemployed friend sends you all day on the go (assuming your eyes can put up with the monocular display for that long), in practice, as with waiting for fast-moving group chats to finally load, I often found it faster to take my phone out of my pocket and open the real app.
The cause of this slowness seems to be that Meta Ray-Ban Display is entirely reliant on your phone's Bluetooth for internet connectivity. This connection speed issue is something I ran into repeatedly on Meta Ray-Ban Display, and made me wish it had its own cellular connection. In fact, I'm increasingly convinced that cellular will be a hard requirement for successful HUD and AR glasses.
Audio & Video Calls
For over two years now, I've made and taken almost every phone call, both personal and professional, on smart glasses. I hate in-ear and over-ear audio devices for calls because it feels unnatural to not hear my own voice clearly, and the people on the other end love the output of the optimally-positioned microphone array.
With Meta Ray-Ban Display, the addition of the HUD and wristband lets you see how long a call has gone on for, and end it without having to raise your hand. You can also view and call recently called numbers in the Calls applet.
Meta depiction of video calling.
On the regular Ray-Ban Meta glasses you can also share your first-person view in a video call, and the big new calling feature of Meta Ray-Ban Display is that you can now see the other person too.
But again, this great-in-theory video calling feature is ruined by the fact that the internet connection is routed to Meta Ray-Ban Display via your phone's Bluetooth. Even with my phone and the person I'm calling on very strong internet connections, the view on both sides was pixelated and exhibited constant stuttering. Bluetooth just isn't meant for this.
On-Foot Navigation
One of the features I've most wanted HUD glasses (and eventually AR glasses) for is pedestrian navigation. There are few things I hate more in this world than arriving in a new city and having to constantly look down at my phone or watch while walking, almost bumping into people and poles. Worse, in cities with dense skyscrapers, the GPS accuracy degrades to dozens of meters, and I hopelessly watch the little blue dot on my phone bounce around the neighborhood.
In theory, Meta Ray-Ban Display solves at least the first problem, but with a massive caveat you absolutely must be aware of if you're thinking of buying it for this use case.
Meta depiction of navigation.
You can open the Maps applet anywhere, and you'll see a minimap (powered by OpenStreetMap and Overture) with nearby venues and landmarks. You can zoom all the way in to the street level, or out to the city level, using the same pinch-and-twist gesture used for volume control. You can also search for places using speech recognition, and scroll through the results by swiping your fingers.
The problem is that everywhere except inside the 28 cities Meta explicitly supports, you won't be able to initiate navigation. Instead, you just have an option to send it to your phone, where you can tap to open it in Google Maps, defeating the purpose. It's a rare product where core functionality is geofenced to a handful of cities.
I have been able to use Meta's navigation feature multiple times when visiting London, and found it genuinely very useful when in New York for the Samsung Galaxy XR launch event. I love the idea here, and where it works, the implementation isn't bad. Not having to look down to navigate is exactly what I wanted. But it works in so few places, relative to my life at least, that I just can't stop wishing it had Google Maps. And it's hard to imagine not switching to whatever HUD glasses have Google Maps first.
The Supported Cities
USA • Atlanta, Georgia • Austin, Texas • Boston, Massachusetts • Chicago, Illinois • Dallas, Texas • Fort Worth, Texas • Houston, Texas • Los Angeles, California • Miami, Florida • New York City, New York • Orlando, Florida • Philadelphia, Pennsylvania • Phoenix, Arizona • San Antonio, Texas • San Diego, California • San Francisco, California • San Jose, California • Seattle, Washington • Washington, D.C.
Canada • Toronto • Montreal • Vancouver
It's baffling that Meta decided to roll its own navigation system. It may be the right bet in the long term, but in the short term it compromises the product and leaves the goal wide open for Google to deliver a significantly better experience. Meta already has a wide-ranging partnership with Microsoft – why didn't it license Bing Maps for navigation? Or why not acquire a provider like TomTom?
It somewhat reminds me of the Apple Maps launch debacle, except in a parallel universe where it arrived alongside a first-generation iPhone that didn't have Google Maps.
As for the other problem with navigating in cities, the GPS issue, Meta Ray-Ban Display offers no solution. In theory, Meta could leverage the camera to calibrate the position and orientation, a technique often called VPS, but it doesn't today, and would likely require the company to build up a huge imagery dataset similar to Google Street View.
Meta AI & Its Dedicated Gesture
Meta has been marketing its smart glasses as "AI glasses" for some time now, riding the wave of hype that has for better and for worse almost entirely taken over the tech industry.
I didn't use Meta AI often on the regular Ray-Ban Meta glasses because I hate saying "Hey Meta", just as I hate saying "Hey Google", especially in public. With Meta Ray-Ban Display, you can still invoke the AI that way if you want, but there's also a dedicated gesture: just slightly curl your fingers inward and double-tap the side of your index finger with your thumb.
There's something very satisfying about this gesture. It feels more natural than the pinches used for everything else. And it has me using Meta AI far more often than I would if I had to keep saying "Hey Meta".
With the display, you also get visual aids in your responses. Ask about the weather in a place, for example, and you'll see a five-day forecast appear. For most other queries, you'll see the response appear as text. In some situations I wish I could disable the voice response and just get the text, and while I can do so by temporarily reducing the volume to zero, that isn't a very elegant solution.
The real problem with Meta AI is that it's still Meta AI. It just isn't as advanced as OpenAI's GPT-5 or Google's Gemini 2.5, sometimes failing at queries where it needs to make a cognitive leap based on context, and fundamentally lacking the ability to think before it responds for complex requests.
Occasionally, I've run into situations where I wanted the convenience of asking advanced AI about something without taking out my phone, but ended up doing so after Meta AI just couldn't get the answer right. Ideally, I'd be able to say "Hey Meta, ask [ChatGPT/Gemini]". But that's not supported.
This is part of the reason Mark Zuckerberg is spending billions of dollars acquiring top AI talent for Meta Superintelligence Labs.
Audio Playback & Control
The primary way I use the regular Ray-Ban Meta glasses is for listening to podcasts and audiobooks. Smart glasses speakers don't do music justice, but they're great for spoken word content, and having your ears fully open to the real world is ideal.
What's different on Meta Ray-Ban Display is that you can far more easily and precisely adjust the volume. Rather than needing to raise your arm up to your head and awkwardly swipe your finger along the temple, you can just wake the display, pinch and hold your index finger to your thumb, and twist. It's a satisfying gesture that feels natural and precise.
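To make the pinch-and-twist interaction concrete, here's a hypothetical sketch of how a twist gesture could be mapped to volume. The function name, the degrees-per-step ratio, and the step size are all illustrative assumptions, not Meta's actual implementation or API:

```python
# Hypothetical sketch: mapping wrist rotation (in degrees) to a volume
# level in the 0..1 range. All names and constants are assumptions.
def twist_to_volume(current_volume: float, twist_degrees: float,
                    degrees_per_step: float = 15.0,
                    step: float = 0.05) -> float:
    """Each `degrees_per_step` of rotation nudges volume by `step`, clamped to 0..1."""
    steps = twist_degrees / degrees_per_step
    return min(1.0, max(0.0, current_volume + steps * step))

print(round(twist_to_volume(0.5, 30.0), 2))   # twist up two steps -> 0.6
print(round(twist_to_volume(0.1, -45.0), 2))  # twist down three steps -> clamped to 0.0
```

The appeal of a relative mapping like this is that the same gesture works no matter where the volume currently sits, which matches how the twist gesture also doubles for map zoom.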
The HUD also shows you a thumbnail for the content you're viewing, as well as how far you're into it and how long is left. It's a nice addition, and one less reason to take out my phone.
Live Captions & Translation
For accessibility, one of the biggest marketed features of Meta Ray-Ban Display is Live Captions.
The real-time speech transcription has decent accuracy, with the exception of niche proper nouns, and fairly low latency, always giving you at least the gist of what the person you're looking at is saying. And yes, I do just mean the person you're looking at. It's remarkable how well the system ignores any audio not in front of you, leveraging the microphone array to cancel it out. Look away from someone and the captions will stop. Look back and they'll continue. It really does work.
Meta depiction of live captions.
Just one swipe over from the virtual button that starts Live Captions is the one that starts Live Translation, and this feature blew me away. Having a friend speak Spanish and seeing an English translation of what they're saying feels like magic, and I could see it being immensely useful when traveling abroad. For the other person to understand you, by the way, you just hand them your phone with the Meta AI app open, and it shows them what you're saying, in their language. Brilliant.
Yes, you can do live translation with just a phone, or audio-only devices like AirPods and Pixel Buds.
Unfortunately, it only supports English, French, Spanish, and Italian. This is another example of where Google's services, namely Google Translate, would be immensely valuable.
The bigger problem with both Live Captions and Live Translation is that because the display is slightly below and to the right of your sightline, you can't look directly at someone when using the features. It's far better than having to look down at a phone, but ideally I'd want the text to appear centered and much higher, so that I could appear to be keeping eye contact with them while seemingly-magically understanding what they're saying. This would require different hardware, though.
The Big Missing Feature
While I'm glad that Meta Ray-Ban Display lets me decide whether it's worth taking my phone out of my pocket for inbound personal messages, what I want the most out of HUD glasses is the ability to screen my emails and Slack messages.
There is no applet store on Meta Ray-Ban Display, and Meta's upcoming SDK for phone apps to access its smart glasses won't support sending imagery to the HUD. For the foreseeable future, any new "apps" will have to come directly from Meta, and I call them "applets" because each really only does one thing and is essentially part of the OS.
(The company says it plans to add two new apps soon, a Teleprompter and a dedicated IG Reels experience.)
So a Slack applet won't be happening anytime soon. But there's a far easier way I could get what I want here.
Many smartwatches (yes, including third-party ones on iPhone) already let you view your phone notifications, so it's definitely technically possible. I asked Meta why it doesn't do this for Meta Ray-Ban Display, and the company told me it came down to not wanting to overwhelm the user. I don't understand this answer, though, since, as with smartwatches, Meta could let you select exactly which apps you do and don't want notifications from, just as you already can for WhatsApp, Messenger, Instagram, and texts.
If I had this feature, letting me screen Slack notifications and emails, I would take my phone out of my pocket far less often. If any similar pair of glasses arrived with this feature, I'd switch over immediately.
The Case Folds Down & Has Huge Potential
Just like with AirPods, for smart glasses the included battery case is almost as important as the device itself. Meta Ray-Ban Display has an official "mixed use" battery life of 6 hours, which I've found to be fairly accurate, while the case provides 4 full charges for a total of 30 hours of use between needing to charge it with a cable.
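The stated figures are internally consistent: one 6-hour charge in the glasses plus four full recharges from the case works out to the quoted 30 hours. A quick check:

```python
# Battery math from Meta's official "mixed use" spec, as quoted above.
glasses_hours = 6    # one full charge of the glasses
case_recharges = 4   # full recharges stored in the folding case

# Initial charge plus four top-ups from the case.
total_hours = glasses_hours * (1 + case_recharges)
print(total_hours)  # 30 hours between needing a cable
```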
The problem with the regular Ray-Ban and Oakley Meta glasses cases is that they're far too bulky to fit in almost any jacket pocket. The Meta Ray-Ban Display case has an elegant solution for this, likely inspired by what Snap did in 2019 for the Spectacles 3.
How Meta Ray-Ban Display's case folds when you're wearing the glasses.
When containing the glasses, the case is a triangular prism. But when not, it folds down into something with very similar dimensions to a typical smartphone, just slightly taller and less wide. This means that not only does it fit in most jacket pockets, it even fits in my jeans pocket. So when I'm dressed relatively light and using Meta Ray-Ban Display as my sunglasses, for example, I can keep the case in my pocket while on the move and put the glasses back in it when in the shade. The glasses, case, and wristband together are the product, and the folding design makes the whole system significantly more portable.
In fact, the glasses and case make such a great pair that I'm eager for the case to do more than just act as a battery and container.
When I need to take the glasses off to wash my face, take a shower, or just let them charge, docking them in the case should turn the duo into a smart speaker, letting me continue to listen to audio, take calls, and prompt Meta AI as I would with Alexa on an Amazon Echo. This could be achieved with no extra hardware, but ideally Meta would add a speaker to the case.
When folded, the case (center) is flat enough to fit in a pocket.
It also seems like the battery case could be the perfect way to bring cellular connectivity to a future version of Meta Ray-Ban Display without nuking the battery life of the glasses. The case would send and receive cellular signals, and relay the data to the glasses via Bluetooth for low-bandwidth tasks and Wi-Fi for high-bandwidth.
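The split described above amounts to a simple routing rule: keep the power-hungry radio in the case, and pick the case-to-glasses link per task. Here's a hypothetical sketch of that idea; the function name and the bandwidth threshold are illustrative assumptions, not anything Meta has announced:

```python
# Hypothetical sketch: the case holds the cellular modem and relays data
# to the glasses over whichever link suits the task's bandwidth needs.
# All names and the threshold value are assumptions for illustration.
def pick_link(task_kbps: float, wifi_threshold_kbps: float = 1000.0) -> str:
    """Light traffic rides power-efficient Bluetooth; heavy traffic uses Wi-Fi."""
    return "wifi" if task_kbps >= wifi_threshold_kbps else "bluetooth"

print(pick_link(32))    # a notification pop-up -> "bluetooth"
print(pick_link(4000))  # a video call stream -> "wifi"
```

The design win would be that the glasses only power up Wi-Fi for bursts like video calls or media sync, preserving their battery life while the case absorbs the cellular drain.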
Interestingly, Meta has a partnership with Verizon to soon sell Meta Ray-Ban Display in stores. Could this evolve into a cellular partnership for the next generation?
Conclusions: Is It Worth $800?
Meta Ray-Ban Display is very much a first-generation product, an early attempt at a category that, after a month of testing, I'm convinced could eventually become a key part of the lives of at least hundreds of millions of people, if not billions.
In its current form, it's an amazing way to capture photos and videos without leaving the moment, and a great way to decide whether WhatsApp messages are worth your time without taking out your phone. In the cities where it's supported, the on-foot navigation is genuinely useful, and for the languages supported, the live translation feature could change what it means to travel. Like displayless smart glasses, it's also a great way to listen to spoken audio and take calls.
But the monocular display just doesn't feel right, too much of what Meta is trying to do is hampered by the device's lack of cellular connectivity, and the lack of established services like Gmail, Google Maps and Google Translate makes Meta Ray-Ban Display so much less useful than HUD glasses theoretically could be.
Further, while the Meta Neural Band works incredibly well, future versions need to replicate the functionality of smartwatches, instead of asking buyers to justify wearing yet another device on their wrist.
If you're an early adopter who loves having the latest technology and you don't mind looking odd in public, can live with the flaws I've outlined, and are fine with wearing a dedicated input device on your wrist, there's nothing else quite like Meta Ray-Ban Display, and the novelty could make up for the issues.
For everyone else, I recommend waiting for future generations of HUD glasses, ideally with binocular displays and either cellular connectivity or a seamless automatic phone Wi-Fi sharing system that I suspect only Apple and Google, the makers of your phone's OS, can pull off.
Just like with its Quest headsets, Meta is set to see fierce competition from the mobile platform incumbents in this space, and it'll be fascinating to see how the company responds and evolves its products through the rest of this decade and beyond.
Escape from Hadrian’s Wall is a 5th-century VR puzzle game where you navigate the titular location with magical abilities.
Developed by Jim Gray Productions, Escape from Hadrian's Wall is a historical fantasy puzzler set in 402 A.D. Britannia during the Roman occupation, diving into Celtic legends. As a nameless prisoner held captive inside one of its forts, you become a witch's apprentice and use magical artifacts to solve the puzzles within. It's out today on Quest and PC VR.
Exploring the dungeons beneath Hadrian's Wall, this campaign sees you using magical cards and tools as you manipulate elements like earth, air, fire, and water to solve these puzzles. Jim Gray Productions states the full game features 38 puzzles in total, which include 18 Elemental Golem fights carried out through card battles.
Recently featured in the last Steam Next Fest, the free PC VR demo remains available to download and has seen several updates since its initial launch, introducing additional accessibility features and a French localization. That same demo is also available on Quest, and further language support is promised at a later date.
Escape from Hadrian's Wall launches today on PC VR and Quest.
Every June and December, UploadVR connects with dozens of developers and publishers within the XR industry to highlight the best that virtual reality has to offer. It’s showcase season once again, and we’re excited to announce:
The Showcase will premiere December 5th @10am PT on the IGN and UploadVR YouTube channels.
Kudos to everyone who participated in this past summer’s event - none of this would be possible without the support of the VR community and the trust of developers and publishers who give us their secret showcase announcements. A huge thank you also goes out to last season’s sponsors: Elsewhere Electric, Dixotomia, Fruit Golf, Nightclub Simulator VR, and Virtual Skate. You have helped UploadVR continue to bring you the latest and greatest in all things VR, XR, AR, KR... ZR... (what acronyms are we supposed to use nowadays?) and supported your fellow developers by giving us the means to make the showcase. Thank you!
As always, we’re searching for exclusive content, reveals, and announcements. If you have a new game or exclusive news and you want to submit a video for this season, fill out the application!
Here’s some additional information about the UploadVR Showcase - Winter 2025:
How Do I Watch the Showcase?
Subscribe to our YouTube channel to receive notifications once the showcase goes live. You can also follow us on X, Bluesky, and Instagram for the latest updates.
How Do I Submit My Game to the Showcase?
To submit a game or sign up as a sponsor, please fill out this form. Applying to the showcase tells us what you intend to announce in your video; you do not need to have a video created when you apply. We will respond with our level of interest in the project and tell you the next steps for submitting your video.
How Does UploadVR Select What is in The Show?
We’re looking for originality, oddity, interest, and impact. The projects we highlight are an amalgamation of large and smaller-scale works. While some submissions may not fit this season’s showcase, we’re always open to future submissions.
Content must be kept under an embargo so that announcements are exclusive to the premiere.
When Will I Know If My Application Was Accepted?
The UploadVR team reviews applications as they come in, and submitters can expect a reply that states our level of interest in what you've described.
The ideal deadline for videos is Thursday, November 20th, 2025. However, we will accept final video submissions until Thursday, November 27th, 2025.
Videos should be in 1080p or 4K and 30-60fps.
There is no deadline for submitting an application, although you probably shouldn't apply the morning of the show.
If your project won’t be ready by the end of November, we encourage you to submit for the next season, or chat with us about coverage and collaboration by emailing tips@uploadvr.com.
When Are Selections Made?
If your project has been accepted, we’ll contact you as soon as we review your application. Please only contact us about the status of your submission if you haven’t heard from us within 7 days of submitting.
Inu Atsume VR is a virtual pet simulator by Hit-Point, creators of the popular cat-collecting game Neko Atsume Purrfect, and it's launching on Quest soon.
Similar in style to the studio's feline-filled experience, Neko Atsume Purrfect, Inu Atsume VR offers puppy-loving players the chance to complete a Dog Encyclopedia and compete with their newfound companions across three competitions. It's playable in both VR and MR, with the mixed reality mode allowing pups to roam freely around your living space without running the risk of a mess.
To find new pets, you can visit a park called the “Square,” where you can throw frisbees to earn the attention of your desired pet and play together. By continuously showering the pup with praise and attention, it will gradually inch closer and eventually become your new friend.
From here, the canine companions can be trained up and taught tricks like Shake, all while receiving gifts that fill out the virtual space. Those with a keen eye for interior design can also customize their in-game home by tweaking the colors of walls and doors.
While the store page lists a 'November 2025' release window, the official website confirms Inu Atsume VR will make its Quest debut on November 20 for $14.99.
Exploration sim Cave Crave added an arcade mode and new horror map in its latest update, and a PC VR release will follow soon.
Developed by 3R Games, Cave Crave sees you exploring tight tunnels and caves as you try to find an escape, marking walls with chalk and using various tools. While this update will arrive “soon” on PS VR2, Quest players can now jump into a new Arcade Mode that turns this into a competitive race against time, where you aim for the quickest run on the online leaderboards.
As for Cave Crave's optional Horror Mode, that's been updated with a brand new map called 'Abyss,' where your goal is to simply make it back alive. 3R Games states that it's been “inspired by cosmic dread and subterranean monstrosities straight out of a Lovecraftian nightmare,” warning of something “ancient and malevolent” hiding in the dark.
This follows the recent addition of Utah's Nutty Putty Cave as a free update on both platforms, a real-life cave closed in 2009 after the death of John Edward Jones. 3R Games says this was recreated using the official cave map and additional data without gamifying it, stating its aim to offer a “respectful, authentic way” to explore this permanently closed site.
Cave Crave is out now on PlayStation VR2 and Quest, while the Steam version is “scheduled to launch within the next few weeks.”
Less than a week since arriving in early access, VR FPS Forefront took #6 for top-earning games by weekly revenue on Quest.
Launched on November 6 in early access, Forefront is a 16v16 VR shooter from Triangle Factory that features semi-destructible maps where you split into four-person squads. Four days after that initial launch, it's reached #6 at the time of writing with a 4.6-star rating on the Meta Horizon Store after 490 user reviews, while Steam lists a “very positive” rating at 298 reviews.
Elsewhere in the charts, the top 10 earners this week remain a mostly familiar sight that's a mix of paid apps and free-to-play titles. UG is at #1 and now boasts the most user reviews on the Horizon Store at 172k. That now surpasses Gorilla Tag, which is currently at 164k user reviews.
Meta Horizon Store: Top-earning games this week by revenue as of November 10, 2025
Beat Saber holds #2, which we'd speculate was further boosted by the recent Spooky Scary Skeletons DLC for Halloween, and that's followed respectively by Animal Company, VRChat, and Gorilla Tag. Rounding out the top 10 in order after Forefront are Blade & Sorcery: Nomad, PokerStars - Vegas Infinite, and Bonelab. #10 keeps changing between FitXR and Golf+, so we cannot determine which one officially holds that position.
We'll continue monitoring these standings, and this list may evolve as the week goes on. You can find the full charts here, which cover the top 50 games and account for all forms of revenue. It's a different approach to the top 50 best-selling Quest games of all time charts, which only factor in paid app sales without including DLC, and that recently saw Assassin's Creed Nexus join the list.
This article was updated shortly after publication when the top 50 games became viewable instead of the top 49. #10 was briefly listed as FitXR but this was changed to Golf+ after the stats were refreshed.
Thrasher receives its remastered edition with a visual update, flatscreen mode, and more today on Steam.
Released on Quest and Apple Vision Pro last year, Thrasher is a cosmic action racer that tasks you with controlling a space eel through obstacle-filled levels, and we previously named it our favorite Apple Vision Pro game of 2024. Following September's PC VR demo release, developer Puddle has launched it today on Steam, with a price drop to $9.99 on all platforms.
As detailed in September, Thrasher's remastered Steam release promises improved visuals compared to the standalone platforms. Puddle states the new PC VR controls are more responsive too, letting you pick either controllers or hand tracking. UX and UI changes are also included, there's an optional flatscreen mode on PC with gamepad and mouse controls, and the game runs on Steam Deck at 90 FPS.
Other changes include a new Play+ mode that aims to provide a harder challenge for advanced players, while Time Trials test your speed at clearing levels with no combo bonuses. When asked by UploadVR if these modes will eventually come to Quest or Apple Vision Pro, Puddle advised it has no updates to share about other platforms at this time.
The Steam release also follows Puddle releasing Thrasher's remastered version as a launch title for Samsung Galaxy XR, joining the list of Android XR games currently available. Much like the Steam edition, this also runs at 90 fps on Samsung's headset with the new modes and support for both hand tracking and controllers.
This article was updated shortly after publication with a response from Puddle and following the official launch of a Samsung Galaxy XR port. It was updated again when the Steam release launched.
Constellations: Touch the Stars lets you scan the night sky with a connect-the-dots experience.
It's the latest experience from developer Grant Hinkson via Parietal Lab, who previously released Connectome earlier this year using the same “connect the dots” engine. Constellations: Touch the Stars includes all 88 constellations recognized by the International Astronomical Union (IAU), letting you trace and connect each constellation until the pattern is complete.
It's been designed using a “hands-first” philosophy with hand tracking support, using a thumb tap motion to bring constellations forward. The sky is positioned based on the user’s location to determine which constellations you see, and Constellations: Touch the Stars comes with fully immersive environments in early access.
Further updates are planned following the initial launch, such as a mixed reality stargazing mode that sees stars overlaid against their real positions. This pulls up constellation names and data using the immersive view's overlay. Other promised features are a 'lie-back' mode for looking up at the stars while lying down, social stargazing with friends, and creating your own constellation patterns.
Constellations will launch in early access on Meta Quest 3/3S, arriving in the first half of December. Pre-early access builds are also available by joining the official Discord server.
Retro-futuristic puzzler UnLoop reaches PC VR in early access next week.
Published by CM Games (Into the Radius) and developed by Superposition NULL, UnLoop is a sci-fi puzzle game built around self-cooperation and time manipulation that's reminiscent of We Are One. Set on a remote space station called the Temporal Research Hub, you create copies of yourself each loop and replay your past actions in real time as you retrieve data.
Following its full release on Quest and Pico, CM Games has chosen early access on Steam to gather feedback about “optimization, player experience, graphics, and to address possible PCVR-related feature requests.” It still contains content parity with the standalone edition, and a Version 1.1 update is planned this December that promises new puzzles and a story continuation.
On the hardware side, UnLoop on Steam will initially support using Quest, Pico, and Valve Index headsets. A Steam FAQ confirms the developer will explore compatibility with additional headsets and controllers depending on community feedback, and PC VR visual improvements are also planned.
We had positive impressions in our UnLoop hands-on back in September, considering it a “clever self-co-op experience” held back by a “few rough edges.”
UnLoop looks to be a promising head-scratcher for players who love time-looping puzzles and self-orchestrated hijinks. Its core concept is compelling and clever, but a few rough edges keep it from being a standout recommendation just yet. With a bit of polish and hopefully some patches, this could be one to loop back to.
Homeworld: Vast Reaches brings the real-time strategy game to SteamVR with upgraded visuals.
Developed by FarBridge, Homeworld: Vast Reaches takes place between the events of Homeworld 1 and Homeworld 2, setting adventurous astronauts on a fresh journey within the series’ universe. You play as Tyrra Soban, a new Fleet Commander, guided by Karan S’jet as they tackle an unknown evil. Originally launched on Quest last year, it's out today on Steam after originally targeting an October 23 launch.
Traditionally a flatscreen series, Homeworld: Vast Reaches adapts the controls for VR, allowing players to immerse themselves in their space war strategies up close and from 'any angle.' Using a virtual command module located on your wrist, clever tacticians can create ships and direct formations in your search for victory.
In addition to visual improvements, the SteamVR version introduces new Challenge Levels designed to test experienced players. Those levels are also available on Quest with a new free update.
“When we launched on Meta Quest initially, some core strategy players reported they had mastered the gameplay in Vast Reaches and wanted harder missions, so we built three new Challenge Levels for this new version with them in mind,” said FarBridge Creative Director Richard Rouse. “Get ready!”
In our previous impressions on Quest, we felt Homeworld: Vast Reaches maintained the strategic depth and storytelling chops of its predecessors.
“This new adventure successfully translates the complex, strategic gameplay of the Homeworld series, all while bringing the franchise into a new and immersive medium, making Vast Reaches a standout title in the VR RTS genre and one that we feel is a must-play for both fans of the long-running series and newcomers to VR and MR gaming.”
Homeworld: Vast Reaches is out now on Steam and Quest.
RUSH: Apex Edition brings the 2017 wingsuit racer back today on PlayStation VR2. Read on for our full impressions.
The Binary Mill has been going all in on PlayStation VR2 this last year, delivering high quality ports for Into Black and Resist while taking full advantage of PS5 Pro enhancements. More than eight years since RUSH first appeared on Gear VR, later followed by subsequent ports and updates, it's now returned with some welcome changes, like expanding online multiplayer to support 12 players.
RUSH: Apex Edition adds some appreciated visual upgrades like revamped lighting and textures, and it looks great in motion. Subtle touches like your mask showing frost in the corners as you glide through this icy mountain are rather nice, though I do wish the landings were smoother as you reach the end. Performance feels great at a native 120fps on PS5 Pro, while the base PS5 supports 90fps.
Four solo modes are included alongside online multiplayer. Standard 'Races' against the AI earn medals for a top three finish, which convert into points that unlock more courses and wingsuit customization options. 'Time Attack' involves beating your own scores, while 'Score Challenge' adds an interesting twist where gliding through a specific part of a checkpoint ring earns a better score. Finally, 'Free Flight' mode lets you explore without any course restrictions.
Races are RUSH's biggest draw. Visually diverse environments set the scene well, and each hosts dozens of courses that follow different paths, though those courses begin feeling very similar after a while. Even so, there's an initial rush (no pun intended) as you descend, gliding your way across these courses in hopes of finishing first. Missing a checkpoint adds a five-second penalty, forcing you to follow a specific path to have any chance of winning.
The initial platform jump uses gaze tracking to confirm you're looking forward, asking you to hold your gaze for three seconds before the race begins. That's tracked by a pointer, and while you can swap to a less noticeable one, not using eye tracking here feels like a missed opportunity. Moments like this show the game's aging foundations, something that also applies to the control scheme.
Screenshot captured by UploadVR on PlayStation VR2
Steering while gliding involves lifting your arms to different positions: raise both at once to ascend, lower them to descend, or alternate your hand movements to turn left and right. It's a functional but basic approach that leaves you holding your arms out, though it's a better choice than using analog sticks. For greater immersion, switching on a fan feels great as the cold “wind” hits you while racing.
You can build up speed boosts in two ways: reaching checkpoints or gliding close to a wall or the ground, and I enjoy how RUSH: Apex Edition rewards risk takers with the latter. It's a critical balancing act, as those boosts can be the difference between first and second place, but a single collision is all it takes to end your run. Boosting also benefits from adaptive trigger support on PlayStation VR2.
Descending through these courses remains satisfying, though that feeling becomes fleeting in longer stints. I'm having plenty of fun messing around in the lobbies where you can shoot some hoops, or shoot other players with dart guns; I'm just not compelled to stick it out much longer with the main game.
Given that PlayStation VR2 lacks backward compatibility with the original PlayStation VR, I'm pleased more games are getting a second life, though RUSH's aging gameplay makes it a harder recommendation in 2025. Still, Apex Edition is a great remaster effort from The Binary Mill that's the best way to play.
Forefront, a 32-player VR FPS from the Breachers studio, is out now in early access on Quest, Steam, and Pico.
Developed by Triangle Factory, Forefront is a 16v16 VR shooter with expansive, semi-destructible maps where each team splits into four-person squads. Featuring four playable classes, four maps, a friends system, alongside customization and attachments for weapons, it's now entered early access on all three platforms with cross-platform multiplayer support.
Forefront takes place in a near-future setting of 2035, where an energy corporation called O.R.E. has gone to war with local governments over control of a rare mineral. Battles feature over 20 types of weapons and 10 vehicles covering land, air, and sea, while you can choose between four classes with their own unique weapons and gadgets: Assault, Engineer, Medic, or Sniper.
Detailing its release plans in a Steam FAQ, Triangle Factory states that Forefront will remain in Early Access for approximately "8-12 months." Planned additions for the full release include more maps, vehicles, and gadgets, joined by class perks and performance improvements. Updated PC VR graphics are also mentioned, and the studio plans to "gradually raise the price" as new content gets introduced.
Forefront's current roadmap
Forefront is out now in early access on Quest, Steam, and Pico. We'll be bringing you our full early access impressions as soon as we can.