Meta Ray-Ban Display Review: First Generation Heads-Up Mobile Computing

Meta Ray-Ban Display is an early glimpse of a future where mobile computing doesn't mean looking down and taking something out of your pocket.

Most people never leave home without their phone and take it out of their pocket so often that it's become a replacement for fidgeting. Today, the smartphone is the ultimate mobile computing device, an omnitool for communication, photography, navigation, gaming, and entertainment. It's your alarm clock, calendar, music player, wallet, and flashlight. Globally, more people own smartphones than own TVs and cars combined. To get philosophical for a moment, the smartphone has become humanity's second cognitive organ.

The problem is that taking out your phone harshly disconnects you from the world around you. You have to crane your neck down and disengage from what you were otherwise doing, with your attention consumed by the digital world of the little black rectangle.

Photograph by Margaret Burin of ABC News.

In recent years, multiple startups have tried and failed to solve this problem. The smug founders of Humane came to liberate you from your phone with a $700 jacket pin, while Rabbit r1 promised the "large action model" on its $200 pocket device could handle your daily life instead.

The truth, and the reason why these companies failed, is that most people adore their phones, and are borderline addicted to the immense value they provide. And the screen of the smartphone is a feature, not a bug. People love being able to view content in high-resolution color anywhere they go, and despite the cries of a small minority of dissenters, the models with the biggest screens sell the best.

"If you come at the king, you best not miss", as the phrase goes.

The only form factor that seems to have any real chance of one day truly replacing the smartphone is AR glasses, which could eventually provide even larger screens that effectively float in midair, anywhere the wearer wants, any time they want. But while prototypes exist, no one yet knows how to affordably produce true AR glasses in a form factor that you'd want to wear all day. In the meantime, we're getting HUD glasses instead.

HUD glasses can't place virtual 3D objects into the real world, nor even anchor 2D virtual interfaces to it. Instead, they provide a small display fixed somewhere in your vision. And in the case of many of the first-generation products, like Meta Ray-Ban Display, that display is only visible to one of your eyes.

Meta Ray-Ban Display is also highly reliant on your nearby phone for connectivity, so it isn't intended to be a replacement for it as a device. It is, however, meant to replace some of the usage of your phone, preventing the need to take it out of your pocket and keeping your head pointed up with your hands mostly free. So does it succeed? And is it a valuable addition to your life? I've been wearing it daily for around a month now to find out.

(UploadVR purchased Meta Ray-Ban Display at retail with our own funds, while Meta provided us with the correctly sized Meta Neural Band for review.)

Comfort & Form Factor

Unlike a VR headset that you might use at home or on a plane for a few hours, the pitch for smart glasses is that you can wear them all day, throughout your daily life. Even when they run out of battery, they can still act as your sunglasses or even prescription eyewear.

Meta Ray-Ban Display Prescription Lenses: What You Need To Know
Looking to use Meta Ray-Ban Display as your everyday prescription glasses? Here’s a rundown of what prescriptions it supports, and how that works.
UploadVR · David Heaney

As such, it's crucial that they have a design you'd be okay with wearing in public, and that they're comfortable enough to not hate having them on your face.

Meta Ray-Ban Display weighs 69 grams, compared to the 52 grams of the regular Ray-Ban Meta glasses, and 45 grams of the non-smart Ray-Ban equivalent. It's also noticeably bulkier, with thicker rims and far thicker temples.

Ray-Ban Meta vs Meta Ray-Ban Display vs Xreal One Pro

In my month with Meta Ray-Ban Display I've worn it almost every day throughout my daily life, sometimes for more than 8 hours at a time, and I experienced no real discomfort. The additional weight seems to be mostly in the temples, not the rims, while the nose pads are large and made out of a soft material. If anything, because the larger temples distribute the weight over a larger area and are more flexible, I think I even find Meta Ray-Ban Display slightly more comfortable than the regular Ray-Ban Meta glasses.

So, for my face at least, physical comfort is not an issue with Meta Ray-Ban Display. But what has been an issue is the social acceptability of its thick design.

With the regular Ray-Ban Meta glasses, people unfamiliar with them almost never clocked that I was wearing smart glasses. The temples are slightly thicker than usual, but the rims are essentially the same. It's only the camera that gives them away. With Meta Ray-Ban Display, it's apparent that I'm not wearing regular glasses. It's chunky, and everyone notices.

In some circles, thick-framed glasses are a bold but valid fashion choice. On most people, though, they look comically out of place. I've asked friends, loved ones, and acquaintances for their brutally honest opinions. Some compared the look to the glasses drawn on an archetypal "nerd" in an old cartoon, while only a few said it works because it matches current fashion trends. And my unit is the smaller of the two available sizes.

Ray-Ban Meta vs Meta Ray-Ban Display vs Xreal One Pro

Meta Ray-Ban Display also comes in two colors, 'Black' and 'Sand', and a confounding factor here is that the black is glossy, not matte. I'm told this decision was made because glossy was the most popular finish for the regular Ray-Ban Meta glasses. But combined with the size, the glossy finish on Meta Ray-Ban Display makes it look cheap in a way that an $800 product really shouldn't, like a prop for a throwaway Halloween costume.

So Meta Ray-Ban Display is physically comfortable, but not socially. More on that soon.

The Monocular Display

The fixed HUD in Meta Ray-Ban Display covers around 14 degrees of your vision horizontally and vertically (20 degrees diagonal). To understand how wide that is, extend your right arm fully straight and then turn just your hand 90 degrees inward, keeping the rest of your arm straight. To understand how tall, do the same but turn your hand downwards.

When it comes to what you see within that field of view, it's a clear, high-detail image, though ever so slightly soft rather than fully sharp, with a higher angular resolution than even Apple Vision Pro. There's a minor but non-distracting glare that causes icons, for example, to bleed mildly into the empty space around them.

More notably, the significant difference between a waveguide display like this and the interfaces you might see in a mixed reality VR headset is that it's very translucent, with a ghostly feel. You can see the real world through it.

Display System Specs

  • Display Type: Full-Color LCOS + Geometric Reflective Waveguide (Monocular)
  • Resolution: 600×600
  • Angular Resolution: 42 pixels per degree
  • Field Of View: 14°H × 14°V (20° D)
  • Peak Brightness: 5000 nits (automatically adjusts)
  • Frontal Light Leak: 2%
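The spec sheet's numbers are internally consistent, which you can verify with a couple of lines of arithmetic:

```python
import math

# Sanity-checking the published specs against each other.
fov_h = fov_v = 14.0   # degrees, horizontal and vertical field of view
res = 600              # pixels per axis

diag = math.hypot(fov_h, fov_v)   # diagonal of a 14° x 14° square field
ppd = res / fov_h                 # angular resolution in pixels per degree

print(round(diag, 1))  # 19.8, marketed as 20° diagonal
print(round(ppd, 1))   # 42.9, marketed as 42 pixels per degree
```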

The display's perceived opacity and brightness are highly variable, because with waveguides both depend on the ambient light level, and the system also automatically and rapidly adjusts the LCOS brightness using the ambient light sensor. You can adjust the brightness manually if you want, but the system is so good at deciding the appropriate level at all times that I never do.
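As a minimal sketch of what that automatic adjustment might look like, here's a hypothetical lux-to-nits mapping; Meta hasn't published its actual curve, so the function name, floor value, and log-scale interpolation below are all assumptions:

```python
import math

def auto_brightness(ambient_lux: float, max_nits: float = 5000.0) -> float:
    """Map an ambient light reading (lux) to a display brightness (nits).

    Hypothetical curve: perceived brightness scales roughly
    logarithmically with luminance, so interpolate on a log scale.
    """
    lux = max(0.0, min(ambient_lux, 100_000.0))  # clamp to a plausible sensor range
    frac = math.log10(1.0 + lux) / math.log10(1.0 + 100_000.0)
    return 50.0 + frac * (max_nits - 50.0)  # floor keeps the HUD visible indoors

print(round(auto_brightness(0)))        # 50 (pitch-black room)
print(round(auto_brightness(100_000)))  # 5000 (direct sunlight)
```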

With a 5000 nit maximum, it's even visible in daytime sunlight, though it has a very ghostly translucent feel. As the photochromic lenses transition to dark, the perceived opacity slightly increases.

One notable quirk is that because an LCOS is essentially an LCD microdisplay, if you're in a pitch black room you'll see a faint glow throughout the display area when it's on, like an LCD monitor trying to show black. But in anything above pitch black, you won't see this.

So Meta Ray-Ban Display's display is surprisingly good, and lacks the distracting visual artifacts seen in many of the waveguide headsets of the 2010s. But, of course, there is a massive, glaring problem: it's only visible to your right eye.


No through-the-lens approach accurately depicts what the HUD looks like, so here's Meta's generic depiction instead.

Meta Ray-Ban Display is a monocular device. Your left eye sees nothing at all.

There is no situation in nature where, with both eyes open, one eye sees something the other doesn't. It just feels wrong, and it induces a constant minor eyestrain when I look at the display for more than a few seconds, so I would never want to watch a video or conduct a video call on it (more on those use cases later). I've also put the glasses on more than a dozen people now, and while some could just about tolerate the monocular display, others found it hurt their eyes within seconds.

I suspect that this is a core reason why Meta Ray-Ban Display is only available after a retail demo. This just isn't a visually comfortable product for many people, and Meta likely wants to avoid returns.

Bloomberg's Mark Gurman and supply-chain analyst Ming-Chi Kuo have both claimed that Meta plans to launch a binocular successor to Meta Ray-Ban Display in 2027, at which point the company is expected to significantly ramp up marketing, production, and availability. By closing my left eye, I can already get a pretty good feel for just how much better the next generation will be.

No Light Leak! But Is That A Good Thing?

Almost all of the brightness of Meta Ray-Ban Display stays on your side of the glasses – 98% according to Meta. The display is not visible to people looking at you. I've repeatedly asked friends whether they can even tell if I have the display on or off, and none have been able to so far. The clickbait YouTube thumbnails you may have seen are fake.

This is partially due to the low "light leak" of the geometric waveguide in Meta Ray-Ban Display, but it's also because of the automatic brightness adjustment. If you manually turn up the brightness to an uncomfortably high level, which I can't imagine anyone intentionally doing, you can make the display slightly visible externally, though not its content. It just looks like a scrambled pattern. But again, even this requires an adjustment that no regular user would reasonably make.

All this said, while I initially assumed that the low light leak was an important feature of Meta Ray-Ban Display, I've come to see nearby people's inability to tell whether you're looking at the HUD as somewhat of a bug.

When you're with another person and take out your phone, that's an unambiguous indicator that you're diverting attention from them. Similarly, while Apple Vision Pro shows a rendered view of your eyes, when virtual content is occluding the person you're looking at, Apple intentionally renders a pattern on the front display. Why? To signal that you're looking at virtual content, making it clear when the person does and doesn't have your full attention.

When someone wearing an Apple Vision Pro is looking at virtual content that partially occludes you, you'll see a pattern on the front display in front of their rendered eyes (center image). With Meta Ray-Ban Display, you have no idea whether the wearer is looking at the HUD or at you.

With Meta Ray-Ban Display, there is no such signal. People who spend a lot of time with you can eventually figure out that you're looking at the HUD when your eyes are looking down and to the right, but it's far more ambiguous, and this is not conducive to social acceptability. Are you present with them or are you not? They can't clearly tell.

And the worst case scenario is, when looking at the HUD, to a person sitting in front of you it can, in some circumstances, appear as if you're just looking at their chest. Yikes.

I'm not saying that I want people to be able to see the content of the display, as that would be a terrible privacy flaw. But I do wish there was an external glow on the lens when the display is on. I don't want the other person to have to guess whether I'm fully present or not, and whether I'm looking at the HUD or their body.

The Interface & Meta Neural Band

Like the regular Ray-Ban Meta glasses, you can control Meta Ray-Ban Display with Meta AI by using your voice, or use the button and touchpad on the side for basic controls like capturing images or videos and playing or pausing music. But unlike any other smart glasses to date, it also comes with an sEMG wristband in the box, Meta Neural Band.

In its current form, Meta Neural Band is set up to detect five gestures:

  • Thumb to middle finger pinch: double-tap to toggle the display on/off, single tap to go back to the system menu, or hold for quick shortcuts to the three menu tabs.
  • Thumb to index finger pinch: how you "click".
  • Thumb double-tap on the side of your index finger: invoke Meta AI.
  • Thumb swipe along the side of your index finger, like a virtual d-pad: how you scroll.
  • Thumb to index finger pinch & twist: adjust volume or camera zoom, as you would a physical volume knob.
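A useful mental model is that the band emits a small set of discrete gesture events, which the glasses map to actions. A toy dispatcher along those lines — the event names are my own invention, since there is no public Meta Neural Band API:

```python
# Toy model of gesture-event dispatch. Event names and action strings
# are illustrative only, not a real SDK.
GESTURE_ACTIONS = {
    "middle_pinch_double":   "toggle display on/off",
    "middle_pinch_single":   "back to system menu",
    "middle_pinch_hold":     "menu tab shortcuts",
    "index_pinch":           "click",
    "index_side_double_tap": "invoke Meta AI",
    "index_swipe":           "scroll",
    "pinch_and_twist":       "adjust volume / camera zoom",
}

def dispatch(event: str) -> str:
    """Return the action for a recognized gesture, or 'ignored'."""
    return GESTURE_ACTIONS.get(event, "ignored")

print(dispatch("index_pinch"))  # click
print(dispatch("fist_clench"))  # ignored
```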

How Does Meta Neural Band Work?

Meta Neural Band works by sensing the activation of the muscles in your wrist that drive your finger movements, a technique called surface electromyography (sEMG).

sEMG enables precise finger tracking with very little power draw, and without the need to be in view of a camera.

The wristband has an IPX7 water resistance rating, and charges with an included proprietary magnetic contact pin charger.

While in the above clips I have my arms extended to illustrate the gestures, the beauty of sEMG is that you don't need to. Your hand can be at your side, resting on your leg, or even in your pocket. And it works even in complete darkness.

The gesture recognition is almost flawless, with close to 100% accuracy. The one exception: rarely, while I'm walking, it will fail to pick up a sideways directional swipe. But in general Meta Neural Band works incredibly well. The volume adjustment gesture, for example, where you pinch and twist an imaginary knob, feels like magic.


Wake (top left), Scroll (top right), Click (bottom left), and Volume/Zoom (bottom right)

The Meta Neural Band gestures control the HUD interface, which looks much like that of a smart watch. It has 3 tabs, which you horizontally scroll between:

  • The center home tab (the default) shows the date and time, your notifications, and a Meta AI button, with tiny shortcuts to active audio or navigation at the top.
  • The right tab is the two-column app library: WhatsApp, Instagram, Messenger, Messages, Calls, Camera, Music, Photos, Captions, Maps, Tutorials, and a game called Hypertrail. Four rows are shown at a time, and you navigate vertically to see the rest.
  • The left tab features quick controls and settings like volume, brightness, Do Not Disturb, as well as shortcuts to Captions, Camera, and Music.

When I first used Meta Ray-Ban Display, I found the interface to be "just too much", often requiring too many sequential gestures to do what you want. A month later I still broadly hold this view, though with experience I've become quicker at performing the right sequence, and I've discovered that pinching and holding your thumb to your middle finger brings up shortcuts to the three main menu tabs, which can speed things up.

I still think there's plenty of room to simplify Meta Ray-Ban Display's interface, though, and I repeat my assertion that the menu should have two tabs, not three. The Meta AI finger gesture makes the Meta AI button on the center tab redundant, for example, and when you don't have any notifications, the center tab feels like a waste of space.


This clip from Meta shows the 3 tabs of the system interface, and how you swipe between them.

A lot of the friction here will eventually be solved with the integration of eye tracking. Instead of needing to swipe around menus, you'll be able to just look at what you want and pinch, akin to the advantages of a touchscreen over arrow keys on a phone. But for now, it can sometimes feel like using MP3 players before the iPod, or smartphones before the iPhone. sEMG is obviously going to be a huge part of the future of computing. But I strongly suspect it will only be one half of the interaction answer, with eye tracking making the whole.

A major improvement since the Meta Connect demo, though, is performance. While I still wouldn't describe Meta Ray-Ban Display as very snappy, the abject interface lag I encountered at Connect is gone in the shipping consumer model. The most noticeable remaining delay is in waking the display, which often takes a few seconds after the gesture.

Meta Neural Band size comparison with Fitbit Luxe.

Coming back to Meta Neural Band for a second, the only real problem with it is that it's something else you need to wear and charge (with a proprietary cable). I always wear a Fitbit on my left wrist, and now I wear the Meta Neural Band on my right too.

That's not to say that Meta Neural Band is a comfort burden. I find it no more or less comfortable than I did the Pixel Watch I used to own, and having tried a Whoop it feels similar to that too. And while it does leave a minor mark on my wrist, so do those other devices.

But it's another thing to remember to put on charge at night, another proprietary cable to remember to bring when traveling, and another USB-C port needed at my bedside.

Ideally, I should only have to wear and charge one wrist device. But today, Meta Neural Band is solely an sEMG input device, and nothing more.

Traveling means bringing the proprietary charging cable with you.

The band already has an accelerometer inside, so in theory, a software update could let it track my daily step count. And if a future version could add a heart rate sensor for fitness, health, and sleep tracking, I wouldn't need my Fitbit at all. But we're not there yet. And wearing a second wrist device is a big ask for any product.

Features & Use Cases

There are six primary use cases of Meta Ray-Ban Display. It's a camera, a communications device, a GPS navigator, an assistive captions and translations tool, a personal audio player, and an AI assistant that can (if you want) see what you see.

So how well does it do each of these things?

Capturing Photos & Videos

When the regular Ray-Ban Meta glasses first launched, they were primarily pitched as camera glasses, like their predecessor the Ray-Ban Stories, and this remains one of the biggest use cases even as Meta now calls the product category "AI glasses". But without a display, you didn't get a preview of what you were capturing, nor could you check whether the result was any good until it synced to the phone app. Sometimes you got lucky; other times you failed to even frame your subjects, an issue not helped by the camera on almost all smart glasses sitting on a temple rather than centered.


Meta depiction of photography.

Meta Ray-Ban Display also has 32GB of storage for media, and syncs everything to your phone via Wi-Fi 6 when you open the Meta AI app. But the experience of capturing media is fundamentally better, because you get a live visual preview of exactly what's in frame. And with the Meta Neural Band, you can capture without raising your arm, and adjust the zoom on the fly by pinching your index finger to your thumb and twisting, the same gesture used to adjust the volume.

The camera quality isn't as good as your phone's, given that the sensor has to fit into the temple of a pair of glasses. It appears to be the same camera as the regular Ray-Ban Meta glasses, and while it can produce great results in daytime, in low-light environments or with heavy zoom you'll get relatively grainy output. You can see some sample shots and clips in my colleague Ian Hamilton's launch-week impressions piece.

Hands-On With Meta Ray-Ban Display & Meta Neural Band Across New York City
UploadVR’s Ian Hamilton bought the Meta Ray-Ban Display glasses on launch day and tested them across Manhattan.
UploadVR · Ian Hamilton

Regardless, the ability to capture precisely framed shots without needing to hold up a smartphone is Meta Ray-Ban Display at its best. It lets you stay in the moment and capture memories at the same time, without wondering whether the shot missed what you wanted to include. And it works completely standalone, even if your phone is out of battery.

With a couple of swipes and taps you can also easily send your captured media to a friend, something that required extensive voice commands with the displayless glasses. And this brings us on to messaging.

Messaging

While Meta doesn't have an existing device ecosystem like Apple and Google, what it does have is the most popular messaging platform and the two most popular social networks in the world, all three of which are integrated into Meta Ray-Ban Display.

You can opt to have incoming WhatsApp, Messenger, Instagram, and text messages pop up on the display, and so without needing to look down at a smartwatch or take your phone out of your pocket you can "screen" them to decide which are important enough to respond to immediately. Tap your thumb to your middle finger to dismiss a message, or to your index finger to open it.

The notification appears at the very bottom of the display area, which is already slightly below your sightline, so it doesn't block your view of what's in front of you. And of course, unlike with a phone, no one around you can see it. There's also a setting to automatically detect when you're in a moving vehicle, so if you start driving a car you won't be interrupted.


Meta depiction of messaging.

If you want to respond to a message, there are 4 options in the interface: dictate, voice note, suggested emoji, or suggested reply.

I don't send voice notes, and I don't find the suggested emojis or replies useful, but I do love the dictation. It's powered by an on-device speech recognition model with relatively low latency and surprisingly great accuracy, in my experience. Even with my Northern Irish accent that some other systems (and people) can find difficult to understand, I'm able to dictate full responses without needing to type. And what's most impressive is that it works even when you speak quietly, thanks to the six-microphone array that includes a dedicated contact mic positioned just a few inches above your lips. That's yet another advantage of the glasses form factor.

Still, there are plenty of messages that I wouldn't want to dictate even softly in public, and situations where I want to use combinations of punctuation and letters that don't have a spoken form. Meta plans to release a software update in December that will let you enter text by finger-tracing letters on a physical surface, such as your leg, bringing some of its more advanced sEMG research out of the lab and into the product. It sounds straight out of science fiction, and we'll bring you impressions of it when the update rolls out. But it's not there today.

What About iMessage?

If you're an iPhone user, you're probably wondering whether all this works with iMessage.

I use an Android phone, from which Meta Ray-Ban Display can receive and send both 1-on-1 and group text messages, including both SMS and RCS.

For iPhone, I'm told, you can receive and send only 1-on-1 iMessages, with no support for group threads. And this support is limited to pop-up notifications: you won't see a 'Messages' app in the list.

These limitations, to be clear, are imposed by Apple, and Meta would gladly support the missing features if Apple let it.

As well as receiving new messages as pop-up notifications, you can also access your 10 most recent threads on each messaging platform at any time by swiping to it on the apps list. In the apps, you can scroll through past messages and send new ones, including your captured photos and videos stored on the glasses.

The problem with accessing your past message threads, and with viewing photos and videos you've been sent, is that they're incredibly slow to load. Open WhatsApp on your phone and you'll typically see your messages update within a fraction of a second. On Meta Ray-Ban Display, you'll see everything as it was the last time the applet was opened for upwards of 10 seconds, while media can take over a minute, or sometimes never load at all. And it's badly missing a loading progress bar.

So while in theory you can use Meta Ray-Ban Display to catch up on the dozens of Reels that one unemployed friend sends you all day, assuming your eyes can put up with the monocular display for that long, in practice, as with waiting for fast-moving group chats to load, I often found it faster to take the phone out of my pocket and open the real app.

The cause of this slowness seems to be that Meta Ray-Ban Display is entirely reliant on your phone's Bluetooth for internet connectivity. This connection speed issue is something I ran into repeatedly on Meta Ray-Ban Display, and made me wish it had its own cellular connection. In fact, I'm increasingly convinced that cellular will be a hard requirement for successful HUD and AR glasses.
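Some back-of-envelope arithmetic suggests why the Bluetooth routing hurts so much. The throughput figures below are illustrative assumptions, not measurements:

```python
def transfer_seconds(size_mb: float, throughput_mbps: float) -> float:
    """Seconds to move a payload of size_mb megabytes at a given
    usable throughput in megabits per second."""
    return size_mb * 8 / throughput_mbps

photo_mb = 5.0  # assumed size of a typical high-resolution photo
print(transfer_seconds(photo_mb, 2.0))    # 20.0 s at ~2 Mbps usable Bluetooth
print(transfer_seconds(photo_mb, 200.0))  # 0.2 s at ~200 Mbps over Wi-Fi
```

Two orders of magnitude of difference is consistent with media that loads instantly in a phone app taking a minute or more on the glasses.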

Audio & Video Calls

For over two years now, I've made and taken almost every phone call, both personal and professional, on smart glasses. I hate in-ear and over-ear audio devices for calls because it feels unnatural to not hear my own voice clearly, and the people on the other end love the output of the optimally-positioned microphone array.

With Meta Ray-Ban Display, the addition of the HUD and wristband lets you see how long a call has gone on for, and end it without having to raise your hand. You can also view and call recently called numbers in the Calls applet.

Meta depiction of video calling.

On the regular Ray-Ban Meta glasses you can also share your first-person view in a video call, and the big new calling feature of Meta Ray-Ban Display is that you can also see the other person.

But again, this great-in-theory video calling feature is ruined by the fact that the internet connection is routed to Meta Ray-Ban Display via your phone's Bluetooth. Even with my phone and the person I'm calling on very strong internet connections, the view on both sides was laggy and pixelated. Bluetooth just isn't meant for this.

On-Foot Navigation

One of the features I've most wanted HUD glasses (and eventually AR glasses) for is pedestrian navigation. There are few things I hate more in this world than arriving in a new city and having to constantly look down at my phone or watch while walking, almost bumping into people and poles. Worse, in cities with dense skyscrapers, the GPS accuracy degrades to dozens of meters, and I hopelessly watch the little blue dot on my phone bounce around the neighborhood.

In theory, Meta Ray-Ban Display solves at least the first problem, but with a massive caveat you absolutely must be aware of if you're thinking of buying it for this use case.


Meta depiction of navigation.

You can open the Maps applet anywhere, and you'll see a minimap (powered by OpenStreetMap and Overture) with nearby venues and landmarks. You can zoom all the way in to the street level, or out to the city level, using the same pinch-and-twist gesture used for volume control. You can also search for places using speech recognition, and scroll through the results by swiping your fingers.

The problem is that outside the 28 cities Meta explicitly supports, you won't be able to initiate navigation. Instead, you just get an option to send the route to your phone, where you can tap to open it in Google Maps, defeating the purpose. It's a rare product whose core functionality is geofenced to a handful of cities.

I have been able to use Meta's navigation feature multiple times when visiting London, and found it genuinely very useful when in New York for the Samsung Galaxy XR launch event. I love the idea here, and where it works, the implementation isn't bad. Not having to look down to navigate is exactly what I wanted. But it works in so few places, relative to my life at least, that I just can't stop wishing it had Google Maps. And it's hard to imagine not switching to whatever HUD glasses have Google Maps first.

The Supported Cities

USA
• Atlanta, Georgia
• Austin, Texas
• Boston, Massachusetts
• Chicago, Illinois
• Dallas, Texas
• Fort Worth, Texas
• Houston, Texas
• Los Angeles, California
• Miami, Florida
• New York City, New York
• Orlando, Florida
• Philadelphia, Pennsylvania
• Phoenix, Arizona
• San Antonio, Texas
• San Diego, California
• San Francisco, California
• San Jose, California
• Seattle, Washington
• Washington D.C.

Canada
• Toronto, Canada
• Montreal, Canada
• Vancouver, Canada

UK
• London, UK
• Manchester, UK

France
• Paris, France

Italy
• Rome, Italy
• Milan, Italy
• Naples, Italy 

It's baffling that Meta decided to roll its own navigation system. It may be the right bet in the long term, but in the short term it compromises the product and leaves the goal wide open for Google to deliver a significantly better experience. Meta already has a wide-ranging partnership with Microsoft – why didn't it license Bing Maps for navigation? Or why not acquire a provider like TomTom?

It somewhat reminds me of the Apple Maps launch debacle, except in a parallel universe where it arrived alongside an iPhone that didn't have Google Maps.

Meta AI & Its Dedicated Gesture

Meta has been marketing its smart glasses as "AI glasses" for some time now, riding the wave of hype that has for better and for worse almost entirely taken over the tech industry in recent years.

I didn't use Meta AI often on the regular Ray-Ban Meta glasses because I hate saying "Hey Meta", just as I hate saying "Hey Google", especially in public. With Meta Ray-Ban Display, you can still invoke the AI that way if you want, but there's also a dedicated gesture: just slightly curl your fingers inward and double-tap the side of your index finger with your thumb.

There's something very satisfying about it. It feels more natural than the pinch gestures used for everything else. And it has me using Meta AI far more often than I would if I had to keep saying "Hey Meta".

With the display, you also get visual aids in responses. Ask about the weather in a place, for example, and you'll see a five-day forecast appear; for most queries, you'll see the response as text. In some situations I wish I could disable the voice response and just read the text, and while temporarily reducing the volume to zero works, it isn't a very elegant solution.

The real problem with Meta AI is that it's still Meta AI. It just isn't as advanced as OpenAI's GPT-5 or Google's Gemini 2.5, sometimes failing at queries where it needs to make a cognitive leap based on context, and fundamentally lacking the ability to think before it responds for complex requests.

Occasionally, I've run into situations where I wanted the convenience of asking advanced AI about something without taking out my phone, but ended up doing so after Meta AI just couldn't get the answer right. Ideally, I'd be able to say "Hey Meta, ask [ChatGPT/Gemini]". But that's not supported.

Audio Playback & Control

The primary way I use the regular Ray-Ban Meta glasses is for listening to podcasts and audiobooks. Smart glasses speakers don't do music justice, but they're great for spoken word content, and having your ears fully open to the real world is ideal.

What's different on Meta Ray-Ban Display is that you can far more easily and precisely adjust the volume. Rather than needing to raise your arm up to your head and awkwardly swipe your finger along the temple, you can just wake the display, pinch and hold your index finger to your thumb, and twist. It's a satisfying gesture that feels natural and precise.

The HUD also shows a thumbnail for the content you're listening to, as well as how far into it you are and how long is left. It's a nice addition, and one less reason to take out my phone.

Live Captions & Translation

For accessibility, one of the biggest marketed features of Meta Ray-Ban Display is Live Captions.

The real-time speech transcription has decent accuracy, with the exception of niche proper nouns, and fairly low latency, always giving you at least the gist of what the person you're looking at is saying. And yes, I do just mean the person you're looking at. It's remarkable how well the system ignores any audio not in front of you, leveraging the microphone array to cancel it out. Look away from someone and the captions will stop. Look back and they'll continue. It really does work.


Meta depiction of live captions.

Just one swipe over from the virtual button that starts Live Captions is the one that starts Live Translation, and this feature blew me away. Having a friend speak Spanish and seeing an English translation of what they're saying feels like magic, and I could see it being immensely useful when traveling abroad. For the other person to understand you, by the way, you just hand them your phone with the Meta AI app open, and it shows them what you're saying, in their language. Brilliant.

Yes, you can do live translation with just a phone, or with audio-only devices like AirPods and Pixel Buds, but seeing the translation as text in your view, hands-free, is a different experience.

Unfortunately, it only supports English, French, Spanish, and Italian. This is another example of where Google's services, namely Google Translate, would be immensely valuable.

The bigger problem with both Live Captions and Live Translation is that because the display is slightly below and to the right of your sightline, you can't look directly at someone while using these features. It's far better than having to look down at a phone, but ideally I'd want the text to appear centered and much higher, so that I could appear to keep eye contact while seemingly magically understanding what they're saying. That would require different hardware, though.

The Big Missing Feature

While I'm glad that Meta Ray-Ban Display lets me decide whether it's worth taking my phone out of my pocket for inbound personal messages, what I want the most out of HUD glasses is the ability to screen my emails and Slack messages.

There is no app store on Meta Ray-Ban Display, and Meta's upcoming SDK for phone apps to access its smart glasses won't support sending imagery to the HUD. For the foreseeable future, any new "apps" will have to come directly from Meta, and I call them "applets" because each really only does one thing and is essentially part of the OS.

(The company says it plans to add two new apps soon, a Teleprompter and a dedicated IG Reels app.)

So a Slack app won't be happening anytime soon. But there's a far easier way I could get what I want here.

Many smartwatches (yes, including third-party ones on iPhone) already let you view your phone notifications. I asked Meta why it doesn't do this for Meta Ray-Ban Display, and the company told me that it came down to not overwhelming the user. I don't understand this answer, though: as with smartwatches, you could select exactly which apps you do and don't want notifications from, just as you already can for WhatsApp, Messenger, Instagram, and texts.

Another interesting approach here could be to just show the icon for the app sending a notification, and require a pinch to open the full thing. If I had this, and could screen Slack notifications and emails, I would take my phone out of my pocket significantly less often.

The Case Folds Down & Has Huge Potential

Just like with AirPods, for smart glasses the included battery case is almost as important as the device itself. Meta Ray-Ban Display has an official battery life of 6 hours, which I've found to be accurate, while the case provides 4 full charges for a total of 30 hours of use between needing to charge it at the wall.
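The endurance figure works out as one charge in the glasses plus four more from the case; a quick check of the arithmetic using the numbers quoted above:

```python
# Battery math from the review's quoted figures.
glasses_hours = 6      # official battery life per charge
case_recharges = 4     # full recharges the case holds

total_hours = glasses_hours * (1 + case_recharges)
print(total_hours)  # 30 hours of use between wall charges
```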

The problem with the regular Ray-Ban and Oakley Meta glasses cases is that they're far too bulky for almost any jacket pocket. The Meta Ray-Ban Display case has an elegant solution to this, likely inspired by what Snap did for the 2019 Spectacles 3.


How Meta Ray-Ban Display's case folds when you're wearing the glasses.

When containing the glasses, it's a triangular prism. But when empty, it folds down into something with dimensions very similar to a typical smartphone's (just slightly taller and narrower). This means it not only fits in most jacket pockets, but even in my jeans pockets. So when I'm dressed relatively light and using Meta Ray-Ban Display as my sunglasses, for example, I can keep the case in my pocket on the move and return the glasses to it when in the shade. The glasses, case, and wristband together are the product, and the folding case makes the whole system significantly more portable.

In fact, the glasses and case make such a great pair that I'm eager for the case to do more than just act as a battery and container.

When I need to take the glasses off, for example to wash my face, take a shower, or just let them charge, docking them in the case should turn the duo into a smart speaker, letting me continue to listen to audio, take calls, and prompt Meta AI as I would with Alexa on an Amazon Echo. This could be achieved with no extra hardware, though ideally Meta would add a speaker to the case with quality similar to a smartphone's.

When folded, the case (center) is flat enough to fit in a pocket.

It also seems like the battery case could be the perfect way to bring cellular connectivity to a future version of Meta Ray-Ban Display without nuking the battery life. The case would send and receive 5G and LTE signals, and relay data to the glasses via Bluetooth for low-bandwidth tasks and Wi-Fi for high-bandwidth.
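The relay scheme imagined above amounts to the case holding the modem and choosing a link to the glasses based on bandwidth need. A minimal sketch of that selection logic; the threshold and task examples are purely my assumptions, not any announced Meta design:

```python
# Illustrative link selection for a hypothetical cellular battery case:
# Bluetooth for low-bandwidth traffic (notifications, audio), Wi-Fi for
# high-bandwidth traffic (photo sync, video). Threshold is an assumption.
LOW_BANDWIDTH_KBPS = 500


def pick_link(required_kbps: int) -> str:
    """Choose the case-to-glasses transport for a given bandwidth need."""
    return "bluetooth" if required_kbps <= LOW_BANDWIDTH_KBPS else "wifi"


print(pick_link(64))    # e.g. a message notification -> bluetooth
print(pick_link(8000))  # e.g. syncing captured video -> wifi
```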

Interestingly, Meta has a partnership with Verizon to sell Meta Ray-Ban Display in stores. Could this evolve into a cellular partnership for the next generation?

Conclusions: Is It Worth $800?

Meta Ray-Ban Display is very much a first-generation product, an early attempt at a category that, after a month of testing, I'm convinced could eventually become a key part of the lives of at least hundreds of millions of people.

In its current form, it's an amazing way to capture photos and videos without leaving the moment, and a great way to decide whether WhatsApp messages are worth your time without taking out your phone. In the cities where it's supported, the on-foot navigation is genuinely useful, and for the languages supported, the live translation feature could change what it means to travel. Like displayless smart glasses, it's also a great way to listen to spoken audio and take calls.

But the monocular display just doesn't feel right, too much of what Meta is trying to do is hampered by the device's lack of cellular connectivity, and the lack of established services like Gmail, Google Maps and Google Translate makes Meta Ray-Ban Display so much less useful than HUD glasses theoretically could be.

Further, while the Meta Neural Band works incredibly well, future versions need to replicate the functionality of smartwatches, instead of asking buyers to justify wearing yet another device on their wrist.

If you're an early adopter who loves having the latest technology and you don't mind looking odd in public, can live with the flaws I've outlined, and are fine with wearing a dedicated input device on your wrist, there's nothing else quite like Meta Ray-Ban Display, and the novelty could make up for the issues.

For everyone else, I recommend waiting for future generations of HUD glasses, ideally with binocular displays and either cellular connectivity or a seamless automatic phone Wi-Fi sharing system that I suspect only Apple and Google, the makers of your phone's OS, can pull off.

Just like with its Quest headsets, Meta is set to see fierce competition from the mobile platform incumbents in this space, and it'll be fascinating to see how the company responds and evolves its products through the rest of this decade and beyond.

Appreciate our reporting? Consider becoming an UploadVR Member or Patron.

  •  

Gigabyte X870E Aorus Pro X3D Ice Review

It wowed at Computex and now it's finally here! Gigabyte's new X870E motherboard has a beautiful white color scheme, X3D CPU boosting modes, a Wi-Fi driver integrated into the BIOS and other innovative new features. But do they justify its asking price?

  •  

From 1976 To Today, Dhrystone Benchmarks Reveal How Far CPUs Have Come

Possibly the most absurd truth of modern computing is that, as far as the technology has evolved, we're fundamentally still doing the exact same thing we were doing decades ago: twiddling bits. The question, of course, has always been how fast we can twiddle those bits, and how many we can do at once. Because it's still the same work, though,
  •  

Alarming runC Flaws Enable Hackers To Exploit Docker Containers For Root Access

Security researchers have found several alarming security flaws in tooling used by the containerization tool Docker that allow attackers to attack the host machine. The flaws specifically relate to runC, which Docker describes as the “infrastructure plumbing” that makes it such a useful tool for developers. RunC is a universal container runtime
  •  

Escape from Hadrian’s Wall Is A 5th Century Puzzler Out Now On Quest & PC VR

Escape from Hadrian’s Wall is a 5th-century VR puzzle game where you navigate the titular location with magical abilities.

Developed by Jim Gray Productions, Escape from Hadrian’s Wall is a historical fantasy puzzler set in 402 A.D. Britannia during Roman occupation and dives into Celtic legends. As a nameless prisoner held captive inside one of its forts, you become a witch's apprentice and use magical artifacts to solve the puzzles within. That's out today on Quest and PC VR.

Exploring the dungeons beneath Hadrian's Wall, this campaign sees you using magical cards and tools as you manipulate elements like earth, air, fire, and water to solve these puzzles. Jim Gray Productions states the full game features 38 puzzles in total, which include 18 Elemental Golem fights carried out through card battles.

Recently featured in the last Steam Next Fest, a free PC VR demo remains available to download; it has seen several updates since its initial launch, introducing additional accessibility features and a French localization. That same demo is also available on Quest, and further language support is promised at a later date.

Escape from Hadrian's Wall launches today on PC VR and Quest.

  •  

XGODY Gimbal N6 PRO Review: Okay But Could Have Been Better

An XGODY projector with its lens and power button visible on a wooden surface.

Anyone looking for an immersive option for both watching movies and playing games can turn to a projector, since even the most affordable projectors these days support both. XGODY recently launched its budget-tier Gimbal N6 Pro projector, which promises native 1080p content support, making it an intriguing choice for those who want to get the job done without spending hundreds of dollars. XGODY isn't alone: countless projector manufacturers compete in the $100-$200 price range with similar specifications. There are always some limitations when you are settling for […]

Read full article at https://wccftech.com/review/xgody-gimbal-n6-pro-review/

  •  

Visual Instruments reveals transparent PC monitor with impressive peak brightness

Transparent display technology is not new, having been featured in various concept laptops and signage for years. However, a new company called Visual Instruments is attempting to bring the technology to the mainstream desktop with its new Phantom monitor, a display it claims is the first practical “transparent computer monitor”.

Unlike standard transparent OLEDs, the Phantom (via Tom's Hardware) uses technology akin to a vehicle's heads-up display. The image is not generated on the glass itself but is projected onto it using mirrors, as Visual Instruments claims this method is currently the most sophisticated way to achieve a see-through effect.

The monitor also features what Visual Instruments calls “dynamic opacity”, allowing users to adjust the transparency level. This feature is said to offer three distinct settings that range from completely see-through to fully opaque, mimicking a traditional monitor when standard viewing is required.

According to the company, the Phantom is a 24-inch, 16:9 display with a native 4K resolution. Most surprisingly, the marketing materials boast an “Ultra HDR” feature capable of hitting a peak brightness of 5000 nits. It also claims 100% sRGB gamut coverage and includes standard connectivity via USB-C and HDMI.

Currently, the monitor appears to be in an extremely limited “early access” phase. Only ten units are being produced initially, with three already reserved. Each one is also being made-to-order, meaning certain specs will vary, along with pricing.

KitGuru says: Do you think transparent monitors will ever become mainstream?

The post Visual Instruments reveals transparent PC monitor with impressive peak brightness first appeared on KitGuru.
  •  

G-Wolves Fenrir Max 8K Review

G-Wolves expands their fingertip grip lineup with the Fenrir Max 8K. Weighing no more than 23 g, the ambidextrous Fenrir Max comes with PixArt's PAW3950 sensor, Huano main button switches, and 8000 Hz wireless polling, configurable through a web driver.

  •  

AMD Confirms Zen 6 CPUs Will Support AVX512 And These Other Instruction Sets

The idea that AMD's Zen 6 would support AVX-512 in some fashion has never really been in question, to tell the truth. With native 512-bit vector datapaths and a nearly-complete AVX-512 implementation, AMD's Zen 5 already has the strongest AVX-512 support on client systems, but it does seem like Zen 6 is poised to expand that support considerably.
  •  

Minisforum Unveils World's First Arm-Based Mini Workstation With UEFI Boot

The compact PC world may just have been given a good shake up with Minisforum's new MS-R1, a mini workstation with server-grade Arm performance in an expandable desktop form factor. But hold up, the machine is also being hailed as the first of its kind in the world to support UEFI boot. With UEFI, the MS-R1 makes it as simple to install and
  •  

AirPods 4 Hits New Low Price, iPhones Starting At $119 In Early Black Friday Sale

Sleigh bells ring, are you listening? In the lane, snow is glistening. A beautiful sight, we're happy tonight, walking in a Black Friday wonderland. Wait, that's not how the song goes. However, it IS how the season goes this time of year as retailers hurl deals at a pace faster than Elf (Will Ferrell version) can chuck snowballs. And right now,
  •  

Lamborghini Temerario Super Trofeo Racer Drops Hybrid For Pure V8 Power

At the season-ending Lamborghini World Finals at the Misano World Circuit Marco Simoncelli, Lamborghini’s Squadra Corse division unveiled the design concept for the Temerario Super Trofeo, the new track weapon set to take over the Super Trofeo championships in 2027. This sixth model in the series marks a historic transition, as it replaces
  •  

3 Alternatives To YouTube TV As Standoff With Disney & ESPN Drags On

Google and Disney remain embroiled in a contract dispute that is preventing YouTube TV subscribers from accessing a litany of Disney-owned channels, including ABC, ESPN, FX, National Geographic, and several others. In the wake of the ongoing dispute, YouTube TV has begun sending out emails promising a $20 credit within the next few days. But
  •  

Apple iPhones To Get A Massive Satellite Upgrade With 5 New Features

Apple is reportedly planning to amp up the iPhone’s satellite features from solely an emergency-only lifeline into a connected device that keeps key features in certain major apps running: think Starlink-like connectivity for Apple Maps or WhatsApp, but with everything built into your next iPhone. If all goes as planned, this future expansion
  •  

ID-Cooling updates air and liquid cooler lineups with digital displays

ID-Cooling is refreshing its cooling portfolio with new product lines, heavily embracing the trend of integrated displays in its cooling solutions. These releases include standard tower air cooling, low-profile ITX solutions, and a new series of 360mm AIOs.

Starting things off, the Frozn A410 TD is a new compact single-tower air cooler designed for mainstream users who want at-a-glance system monitoring. It features a top-mounted digital display that provides real-time CPU temperature readings. The cooler uses a 50mm-thick black-coated fin stack connected via four direct-touch heat pipes. Despite its small stature, ID-Cooling rates it for a 220W TDP. It ships with the updated X25 mounting kit and pre-applied Frost X45 thermal paste, carrying a price tag of $34.99.

For SFF builds, the company has introduced the IS-53-XT. This is an ultra-low-profile top-flow cooler standing just 53mm tall and 94mm wide, specifically designed to avoid interference with tall RAM or GPU backplates in tight Mini-ITX motherboards. Unlike the Frozn model, this uses a CNC-machined pure copper base soldered directly to the heat pipes for improved thermal transfer. It is equipped with a slim 92mm PWM fan that can spin up to 3000 RPM, rated to handle thermal loads up to 120W. It is also priced at $34.99.

Rounding out the launch is the new FX360 series of all-in-one liquid coolers, which introduces three new 360mm models powered by the company's Gen 7 Pro pump. The CPU block design and fan choices differentiate the lineup.

The FX360 LCD (also available with a 240mm radiator) features a 1.48-inch screen on the pump block, which is fully customisable to display system statistics or custom images. The “Performance Edition” named FX360 LCD PE upgrades the standard AS-120 V2 ARGB fans to higher-performance AP-120-K fans. For those who prefer utility over customisation, the FX360 TD variant replaces the LCD with a simpler, dedicated digital temperature display that requires only an open USB 2.0 header to function.

KitGuru says: Are you planning to upgrade your current CPU cooling solution? Did any of ID-Cooling's new products pique your interest?

The post ID-Cooling updates air and liquid cooler lineups with digital displays first appeared on KitGuru.
  •  

State of Play returns tomorrow

Last month, Sony held a State of Play stream, bringing us new PS5 game announcements, along with the latest trailer for Insomniac's long-awaited Wolverine game. This month, another State of Play will take place, this time focusing on games made in Japan and other regions of Asia.

State of Play returns tomorrow at 10PM GMT, or 7AM for Japanese viewers, an odd time considering that this is a Japanese-focused State of Play. The stream will be roughly 40 minutes long, making it similar in length to last month's showcase.

State of Play February

There is not much to know about the stream so far, outside of the fact that it will focus on games made in Asia. That likely means we'll see appearances from the likes of Capcom, Konami and Square Enix, alongside smaller studios.

KitGuru Says: What are you hoping to see from this State of Play? 

The post State of Play returns tomorrow first appeared on KitGuru.
  •  

Arc Raiders continues to be a huge hit on Steam

Arc Raiders got off to a great start earlier this month, achieving around 300K concurrent players over its launch weekend. Since then, the numbers have only grown, with the game now surpassing Helldivers 2's all-time concurrent player record. 

Helldivers 2 and Arc Raiders are both extraction shooters, but each handles the genre differently. Helldivers 2 is more PvE focused, while Arc Raiders has a huge PvP element to its matches. Helldivers 2 topped out at just under 459,000 peak concurrent players on Steam, but Arc Raiders has managed to surpass this with a peak concurrent player count of 462,000 players, as measured over the weekend.

Arc Raiders has continued to do very well in the weeks since launch. However, Embark Studios will have to keep it going in the months ahead. The Finals, the previous game from Embark, also had a great launch, but things tapered off afterwards. Fortunately, it does look like there is a solid plan in place for Arc Raiders in the coming months, with a full roadmap already available to players.

Unlike The Finals, Arc Raiders is not a free-to-play game, instead opting for a $40 price point. Other extraction shooters over the years have been priced similarly. It is expected that Bungie's take on the genre, Marathon, will also launch with a $40 price tag.

KitGuru Says: Have you played Arc Raiders already? Have you been enjoying the game so far? 

The post Arc Raiders continues to be a huge hit on Steam first appeared on KitGuru.
  •  

Samsung teases next-gen LPDDR6 memory for CES 2026

Samsung is gearing up for a major showing at CES 2026 in Las Vegas, with plans to unveil advancements in memory technology tailored for the AI era. The company has confirmed it will showcase its next-generation LPDDR6 RAM and a compact, AI-optimised PCIe Gen5 SSD, the PM9E1.

The new LPDDR6 memory, based on Samsung's 12nm process, is rated for speeds of up to 10.7Gbps. This represents an 11% speed increase over current LPDDR5X chips. More importantly for mobile and edge devices, it also boasts a 21% improvement in power efficiency thanks to a new dynamic power management system.
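The quoted 11% uplift checks out if the baseline is assumed to be 9.6Gbps LPDDR5X, Samsung's fastest current grade (the article doesn't name its baseline):

```python
# Sanity check of the claimed LPDDR6 speed uplift.
# Baseline of 9.6 Gbps LPDDR5X is my assumption, not stated in the article.
lpddr6_gbps = 10.7
lpddr5x_gbps = 9.6

uplift_pct = (lpddr6_gbps / lpddr5x_gbps - 1) * 100
print(round(uplift_pct))  # ~11 (%)
```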

Alongside the new RAM, Samsung will introduce the PM9E1, a compact M.2 2242 NVMe SSD. The new 2242 SSD is claimed to be the “world's first AI-optimised PCIe Gen5 SSD”, offering read speeds as high as 14.8GB/s and write speeds of up to 13.4GB/s. The drive uses Samsung's own Presto controller and V8 TLC V-NAND, offering capacities of up to 4TB and a reported 50% improvement in power efficiency over its predecessor.

Both products have already been recognised as CES 2026 Innovation Award honorees. Samsung is expected to provide full details on pricing and availability during the event in early January.

KitGuru says: Samsung's move to LPDDR6 appears to be in line with SK Hynix's roadmap, which also includes LPDDR6 starting in 2026. Of the two, which one do you think will be the first to reach consumer devices?

The post Samsung teases next-gen LPDDR6 memory for CES 2026 first appeared on KitGuru.
  •  

New Mass Effect 5 art teases a Krogan civil war

While BioWare didn't reveal much about the next Mass Effect game during N7 Day, the company did leave a series of hints in its blogpost for fans to decode. Now, those fans have succeeded, uncovering a link to a new piece of Mass Effect 5 concept art.

The art, titled “Civil War”, shows a group of Krogan in the middle of a shootout, with the main focus on an obviously older-looking Krogan in the centre:

There isn't a ton to derive from this artwork, but the older-looking Krogan, combined with our earlier look at an older version of Liara, suggests that the next Mass Effect game will take place many, many years after the events of the original trilogy. In the Mass Effect universe, both Krogan and Asari have incredibly long lifespans, so while the next game could take place centuries after the original trilogy, some characters, like Liara, could return.

As part of its N7 Day blog post, BioWare confirmed that it is still committed to Mass Effect 5, alongside publisher EA. Unfortunately, there is still no word on when we might see the game properly announced, let alone released.

KitGuru Says: The next Mass Effect is still likely years away. Perhaps we'll get some more details at next year's N7 day. 

The post New Mass Effect 5 art teases a Krogan civil war first appeared on KitGuru.
  •  

Halo Infinite is getting one last update before devs move on to other projects

When Halo Infinite was first planned, 343 Industries had ambitions of keeping the game going for a decade. When the game ultimately failed to win over a significant number of fans at launch, those plans changed. Now, 343 Industries has been broken down and rebuilt as Halo Studios, and the team is looking to move on to other projects. As a result, support for Halo Infinite is coming to an end. 

Announced during the November Community Livestream, Operation Infinite is scheduled to launch on November 18th. It will arrive with a 100-tier Operation Pass and a separate 100-tier Premium Pass, offering a range of new weapon models and armor sets. To help players finish their remaining progression, the update will also permanently increase the earn rates for both Career Rank and Spartan Points.

While internal content creation is winding down, the update will introduce new community-created maps. These include Yuletide, a holiday-themed map joining the Husky Raid queues, and Vacancy, an urban map for ranked play.

Following this update, Halo Infinite will effectively enter maintenance mode. Halo Studios has pledged to keep the servers running and will continue to rotate daily and weekly challenges, as well as the ranked playlists, throughout 2026 and beyond. However, players should no longer expect new seasons or substantial content drops.

KitGuru says: Four years after its rocky launch, Halo Infinite is winding down, and resources are now being redirected to other projects.

The post Halo Infinite is getting one last update before devs move on to other projects first appeared on KitGuru.
  •  

Gigabyte Gaming A16 Pro Laptop Review (Core 7 240H + RTX 5080)

The Gigabyte Gaming A16 Pro laptop is aiming to come in cheaper than any other laptop on the market with a GeForce RTX 5080. Coming in at around $2200, it’s certainly priced very competitively. Accompanying that RTX 5080 is an Intel Core 7 240H, 32GB of LPDDR5X memory running at 5600MT/s and a 16” IPS 2560×1600 165Hz display. Can the A16 Pro deliver a solid experience while aiming to keep its price tag as low as possible?

Timestamps:

00:00 Start
00:52 Pricing
01:44 Hardware Specs
03:01 Memory Speeds
03:22 Drive setup
04:04 Drive speeds / slots
05:02 The Display
05:53 The Keyboard
06:34 The Trackpad
06:52 Ports and I/O / Connectivity
08:14 The Battery / Longevity
09:18 Webcam
09:30 Audio / Speakers
09:59 The Design / Appearance
10:50 Build Quality
11:17 Cooling System
11:38 Testing Methodology
12:02 GI Mate Software
13:25 Power Mode setup for Gaming
14:03 Cyberpunk 2077
14:33 Kingdom Come Deliverance 2
15:07 A Plague Tale Requiem
15:32 Counter Strike 2
15:51 Forza Horizon 5
16:32 Hogwarts Legacy
17:06 Marvel Rivals
18:01 CPU Real World Operation
20:38 Thermal Camera Footage / Noise output
21:15 Closing Thoughts

Specifications:

  • CPU – Intel® Core™ 7 processor 240H 
  • GPU – NVIDIA® GeForce RTX™ 5080 Laptop GPU
  • Display – 16″ 16:10 – IPS WQXGA (2560×1600) 165Hz
  • RAM – 32GB LPDDR5X 5600MT/s 
  • SSD – 2 x Kingston 1TB M.2 NVMe SSD
  • Keyboard – 1-zone RGB Backlit Keyboard, Up to 1.7mm Key-travel
  • I/O 
    • Left Side:
      • 1 x DC-in
      • 1 x RJ-45
      • 1 x HDMI 2.1
      • 1 x Type-A support USB3.2 Gen1
      • 1 x Type-C USB3.2 Gen1 support DisplayPort™ 1.4 and Power Delivery 3.0
    • Right Side:
      • 1 x Type-A support USB3.2 Gen1
      • 1 x Type-A support USB2.0
      • 1 x 3.5mm Audio Jack support mic / headphone combo
  • Audio – 2x 2W speakers
  • Wireless Networking – Wi-Fi 6E (802.11ax 2×2)
  • Wired Networking – LAN: 1G
  • Bluetooth v5.2
  • FHD (1080p) IR Webcam
  • Li Polymer 76Wh
  • 240W AC Adapter
  • 358.3 x 262.5 x 19.45~22.99 mm
  • Weight – ~2.3kg
  • Colour – Titanium Black

The core specs are confirmed in the following CPU-Z and GPU-Z screenshots:

We put the A16 Pro through a series of varying gaming tests, some of which are detailed in the screenshots below. All titles were tested at the laptop's native resolution of 2560×1600 using maximum graphics settings. Overall, the machine showed capable gaming performance in most titles, managing to maintain 60+ FPS most of the time, with occasional dips in the mid 50s. We also tested the gains on offer when enabling both DLSS at the performance preset and frame generation where possible. For full details and a more in-depth look at the gaming performance of the A16 Pro Laptop head over to YouTube and watch the video review in full.

Hogwarts Legacy (2560×1600 – Max Settings)

Hogwarts Legacy (2560×1600 – Max Settings, DLSS Performance, 4X Multi Frame Generation)

Forza Horizon 5 (2560×1600 – Max Settings)

Forza Horizon 5 (2560×1600 – Max Settings, DLSS Performance, Frame Generation On)

Kingdom Come: Deliverance II (2560×1600 – Max Settings)

Kingdom Come: Deliverance II (2560×1600 – Max Settings, DLSS Performance)

Closing Thoughts

The Gigabyte Gaming A16 Pro is overall a decent laptop. The combination of Intel’s Core 7 240H and an RTX 5080 produces mostly good gaming results, and the 32GB of LPDDR5X memory keeps everything running smoothly, though it is sadly not upgradeable as it's soldered. The dual 1TB SSDs offer plenty of space, but one of the M.2 slots runs at slower speeds, which feels like a missed opportunity in an otherwise solid setup.

The 16-inch, 2560×1600 IPS panel display is decent. It offers good sharpness and smooth motion, though it lacks the contrast of pricier OLED alternatives. The keyboard and trackpad are serviceable: comfortable enough for daily use but nothing exceptional, with cramped arrow keys and a slightly hollow trackpad click. Connectivity is practical, though the left-side charging port can get in the way and an SD card reader would’ve been a welcome inclusion.

Build quality is solid but unremarkable, and battery life is respectable when used sensibly, though heavy gaming quickly drains the 76Wh battery. The cooling system does its best to keep thermals in check, but like many other high-power laptops, CPU temperatures can get to concerning levels when putting the processor under heavy load.

Overall, the A16 Pro offers genuine RTX 5080 mobile performance at a lower price than most competitors. It's not without its quirks, but is a strong contender for those who want flagship power without a flagship price.

We don't currently have information regarding UK stock for the A16 Pro Laptop; we will update the page when we know more. The US RRP is $2199.99.

Pros:

  • One of the most affordable gaming laptops equipped with an RTX 5080 GPU.
  • 2TB SSD storage in the reviewed specification.
  • Good battery life for general use (non-gaming).
  • Decent native gaming performance.
  • Doesn’t overdo the ‘gamer’ aesthetic.

Cons:

  • One M.2 slot runs at roughly half speed with no way to adjust settings.
  • Soldered memory means no upgrade path.
  • High CPU temperatures when under intense workloads.
  • Cramped keyboard design.
  • Side DC-in port can get in the way when not using the laptop on a desk.

KitGuru says: The Gigabyte Gaming A16 Pro delivers genuine RTX 5080 mobile performance at a lower price than some of its rivals. However, it does trade a few quality-of-life features and premium touches to keep that price down. 

The post Gigabyte Gaming A16 Pro Laptop Review (Core 7 240H + RTX 5080) first appeared on KitGuru.

UploadVR's Winter Showcase 2025 Announcement

Every June and December, UploadVR connects with dozens of developers and publishers within the XR industry to highlight the best that virtual reality has to offer. It’s showcase season once again, and we’re excited to announce:

The Showcase will premiere December 5th @10am PT on the IGN and UploadVR YouTube channels.

Kudos to everyone who participated in this past summer’s event - none of this would be possible without the support of the VR community and the trust of developers and publishers who give us their secret showcase announcements. A huge thank you also goes out to last season’s sponsors: Elsewhere Electric, Dixotomia, Fruit Golf, Nightclub Simulator VR, and Virtual Skate. You have helped UploadVR continue to bring you the latest and greatest in all things VR, XR, AR, KR... ZR... (what acronyms are we supposed to use nowadays?) and supported your fellow developers by giving us the means to make the showcase. Thank you!

As always, we’re searching for exclusive content, reveals, and announcements. If you have a new game or exclusive news and you want to submit a video for this season, fill out the application!

Here’s some additional information about the UploadVR Showcase - Winter 2025: 

How Do I Watch the Showcase?

Subscribe to our YouTube channel to receive notifications once the showcase goes live. You can also follow us on X, Bluesky, and Instagram for the latest updates.

How Do I Submit My Game to the Showcase?

To submit a game or sign up as a sponsor, please fill out this form. Applying to the showcase tells us what you intend to announce in your video; you do not need to have a video created when you apply. We will respond with our level of interest in the project and tell you the next steps for submitting your video.

How Does UploadVR Select What is in The Show?

We’re looking for originality, oddity, interest, and impact. The projects we highlight are a mix of large and smaller-scale works. While some submissions may not fit this season’s showcase, we’re always open to future submissions.

Content must be kept under an embargo so that announcements are exclusive to the premiere. 

When Will I Know If My Application Was Accepted?

The UploadVR team reviews applications as they come in, and submitters can expect a reply stating our level of interest in what they've described.

The ideal deadline for videos is Thursday, November 20th, 2025. However, we will accept final video submissions until Thursday, November 27th, 2025.

Videos should be in 1080p or 4K and 30-60fps.

There is no deadline for submitting an application, although you probably shouldn't apply the morning of the show.

If your project won’t be ready by the end of November, we encourage you to submit for the next season, or chat with us about coverage and collaboration by emailing tips@uploadvr.com

When Are Selections Made?

If your project has been accepted, we’ll contact you as soon as we review your application. Please only contact us regarding your status if you haven’t heard from us within 7 days of submitting.

How Do I Sponsor the Event?

Want to be a sponsor? Please fill out the application or book a meeting with Beck.

PS - check out past UploadVR showcases here.


Modular Samsung AutoSSD Has A Detachable Controller & NAND For Easy Upgrades

Samsung is showing off a couple of new storage products that it plans to showcase at the Consumer Electronics Show (CES) in January 2026, one of which is a cleverly designed solid state drive (SSD) with a modular aspect to it. Called the Detachable AutoSSD, the drive takes aim at automobiles and sports a modular makeup to make upgrades a little easier.

Inu Atsume VR Is A Puppy Collecting Sim Coming To Quest Next Week

Inu Atsume VR is a virtual pet simulator by Hit-Point, creators of the popular cat-collecting game Neko Atsume Purrfect, and it's launching on Quest soon.

Similar in style to that feline-filled experience, Inu Atsume VR offers puppy-loving players the chance to complete a Dog Encyclopedia and compete with their newfound companions across three competitions. It's playable in both VR and MR, with the mixed reality mode allowing pups to roam freely around your living space without running the risk of a mess.

Official trailer

To find new pets, you can visit a park called the “Square,” where you can throw frisbees to earn the attention of your desired pet and play together. By continuously showering a pup with praise and attention, it will gradually inch closer and eventually become your new friend.

From here, the canine companions can be trained up and taught tricks like Shake, all while receiving gifts that fill out the virtual space. Those with a keen eye for interior design can also customize their in-game home by tweaking the colors of walls and doors.

While the store page lists a 'November 2025' release window, the official website confirms Inu Atsume VR will make its Quest debut on November 20 for $14.99.
