Lightweight, but with a limited field of view

The biggest reveal of Google I/O is that the company has officially returned to the mixed reality game. It's been many years since we've seen anything significant from the search giant on the AR/VR/XR front, but it seems that's finally changing, with a swath of hardware partners signing on to its Android XR platform.
After the keynote, Google gave short demos of the prototype device we saw on stage. I only got a few minutes with the device, so my impressions are unfortunately limited, but I was immediately struck by how light the glasses were compared with Meta's and Snap's augmented reality glasses. While both of those are quite chunky, Google's prototype is lightweight and feels much like a normal pair of glasses. The frames are slightly thicker than what I typically wear, but not by much.
At the same time, there are some notable differences between Google's XR glasses and what Meta and Snap have shown. Google's device has a display on only one side — in the right lens, which you can see in the photo at the top of this article — so the visuals are more glanceable than fully immersive. The field of view, from what I saw in Google's demo at I/O, seemed narrow, and I can confirm it's noticeably more limited than the 46-degree field of view of Snap's latest Spectacles. (Google declined to share specifics on how wide the field of view is on its prototype.)
Instead, the display struck me as a bit like the front display of a foldable phone. You can quickly glance at notifications and short snippets of information from your apps, like the music you're listening to.
Unsurprisingly, Gemini plays a major role in the Android XR ecosystem, and Google walked me through a few demos of the assistant working on the smart glasses. I could look at a display of books or some art on the wall and ask Gemini questions about what I was seeing. It felt very similar to the multimodal capabilities we've seen with Project Astra and elsewhere.
There were some bugs, though, even in the carefully orchestrated demo. Before I had finished my question, Gemini started telling me about what I was looking at, after which we both paused and interrupted each other.
One of the more interesting use cases Google showed was Google Maps in the glasses. Much like Google's augmented reality walking directions, you can get a heads-up view of your next turn, and look down to see a small section of the map on the floor. However, when I asked Gemini how long it would take to drive from my location to San Francisco, it wasn't able to answer. (It actually said something like "tool output," and my demo ended very quickly after that.)
Like most mixed reality demos I've seen, it's clearly still very early days. Google was careful to emphasize that this is prototype hardware meant to show off what Android XR is capable of, not a device it's planning to sell anytime soon. So any smart glasses from Google or its hardware partners could look very different. What the prototype did show, though, is how Google is thinking about bringing AI and mixed reality together. That's not so different from Meta, which sees smart glasses as crucial to the long-term adoption of its own AI assistant. But now that Gemini is coming to so many Google products, the company has a clear foundation for actually making that vision happen.
Developing…