
Project Aura is a new XR glasses system born from a collaboration between Google and XREAL, and it takes a fresh approach to how extended reality hardware is designed. Rather than packing all the computing power into the glasses themselves, Aura tethers them by wire to a separate, smartphone-shaped compute pack that handles processing and battery power, keeping the headset itself lightweight and comfortable.
Inside this tethered puck is Qualcomm’s Snapdragon XR2 Gen 2 chip, the same processor found in the Galaxy XR headset, along with the main battery. Offloading these components removes heat, weight, and bulk from the glasses, resulting in a prototype that weighs roughly 90 grams.
The system also includes a flat touch surface on the compute pack, which acts as an additional input method. It isn’t a display, but rather a touchpad that complements hand tracking. When you put the glasses on, you immediately see the real world through transparent lenses. This is known as optical see-through technology, meaning you view your surroundings directly with your own eyes while digital content is projected into your field of view.
With a 70-degree field of view, Project Aura overlays virtual elements onto the physical world in a way that feels natural and immersive. The result is an XR experience that blends digital content seamlessly with reality, without isolating you from what’s around you.
Today’s XR landscape is split between two extremes. Traditional VR headsets are powerful and immersive, but they’re also bulky and fully block out the real world. On the other end, AI glasses are lightweight and wearable all day, but they’re limited by small displays and reduced processing power. Project Aura sits squarely between these two categories.
Aura offers full six-degrees-of-freedom tracking, allowing virtual objects to stay locked in place wherever you position them. It uses high-quality micro-OLED displays, hand tracking powered by two onboard cameras, and an additional temple-mounted camera for capturing photos and video. Interaction is handled entirely through hand gestures, similar to other Android XR headsets, making it intuitive and easy to pick up.
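XREAL hasn’t published a developer SDK for Aura yet, so the snippet below is only a minimal sketch of how world-locked content typically works on Android devices using Google’s ARCore APIs (com.google.ar.core), which Android XR’s spatial tracking follows conceptually: a hit test places an anchor on a real surface, and 6DoF tracking keeps it fixed there as the headset moves. The function and parameter names here are illustrative, not part of any announced Aura API.

```kotlin
// Illustrative sketch using ARCore-style anchors (com.google.ar.core).
// Assumption: Aura exposes a comparable anchor model via Android XR;
// nothing here is confirmed Aura API.
import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.TrackingState

// Place an anchor where the user "taps" (e.g., via a hand-gesture ray).
fun placeAnchorAtTap(frame: Frame, tapX: Float, tapY: Float): Anchor? {
    // hitTest casts a ray from the screen point against detected surfaces.
    val hit = frame.hitTest(tapX, tapY).firstOrNull() ?: return null
    // createAnchor world-locks that pose; the runtime keeps it fixed in
    // space as 6DoF tracking updates the device's position and orientation.
    return hit.createAnchor()
}

// Each frame, render content only for anchors that are still being tracked.
fun isWorldLocked(anchor: Anchor): Boolean =
    anchor.trackingState == TrackingState.TRACKING
```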
Because the main computing hardware is offloaded to the tethered puck, Aura avoids the need for a heavy headset strapped to your face. This form factor makes it more portable and comfortable while still delivering advanced XR capabilities. It’s a practical balance between immersion, performance, and wearability: one that could appeal to users who find current VR headsets too cumbersome.
While Project Aura shows a lot of promise, there are still some important unanswered questions. XREAL hasn’t shared a complete set of technical specifications yet, and it’s unclear whether the system will support dedicated controllers. If controller support is missing, that could limit compatibility with many immersive VR titles.
We also don’t know the final price or an exact release date. However, XREAL has confirmed that Project Aura is planned to launch in 2026, giving the company time to refine the hardware and define its place within the Android XR ecosystem.
For now, Aura stands as an intriguing glimpse into a future where XR is powerful, lightweight, and seamlessly integrated into everyday life.