Google closed its I/O presentation with one big surprise: a look at its latest AR glasses. The main feature Google showed was the ability to see languages translated before your eyes, which strikes me as a genuinely practical application for AR glasses. While much of Silicon Valley has invested heavily in making AR glasses a reality, no one has yet suggested a truly “killer” app compelling enough to outweigh the wide variety of privacy issues inherent in the technology. Live translation of the spoken word could be exactly that kind of feature.
The company didn’t share any details about when the glasses might be available, demonstrating them only in a recorded video that didn’t actually show the display or how you’d control it. But what the video did show painted a really cool picture of a potential AR future.
In one demo, a Google product manager tells someone wearing the glasses, “You should see what I’m saying, just transcribed for you in real time. A bit like subtitles for the world.” Later, the video shows what you might see while wearing the glasses: as the speaker stands in front of you, the translated language appears in your line of sight in real time.
Until these become a real product we can try, we won’t know how well they work in practice. And it’s unclear whether this is the Project Iris product we reported on in January or something completely different. But the vision Google showed at I/O would be incredibly helpful if it came true. You can watch the video for yourself here.