Of all the product announcements at Google I/O this year, the one that hit closest to home for us was a preview of a prototype pair of augmented reality glasses that puts real-time translation and transcription directly in the wearer’s line of sight. The announcement was accompanied by a video of a mother and daughter who speak different languages. The mother spoke Mandarin and the daughter English, and their relationship could only be strengthened if they truly had a way of communicating with each other.
There’s a moment in the video where a Google product manager hands his daughter a pair of glasses: not the Google Glass we knew, but a normal-looking pair. These glasses, however, were far from ordinary. They transcribed and translated what was being said in real time, right in front of the wearer’s eyes rather than on a small screen nearby. As the video put it, it was like having “subtitles for the world”.
The video hit close to home because I too come from a bilingual household. That is to say, I am bilingual, but my mother speaks only Spanish. Since we moved to this country, I have watched her struggle to communicate with others and have had to step in as her translator, even while I was still learning the language myself. For me it was easier: I was still very young, and the language centers in my brain were still flexible. It was a different story for her, though, having started to learn English when she was well over 30. If my mother had had this AR technology available back then, would things have been different?
I was delighted to read this week that Google plans to move forward with this project: starting next month, they will begin testing these AR glasses in real-world conditions. The tests will start with a few dozen selected Googlers and trusted testers, who will wear the AR glasses, equipped with an in-lens display and visual and audio sensors, during their normal day-to-day activities. Google hopes that testing this way, rather than in a lab, will help them better understand and develop useful features that would be difficult or impossible to recreate indoors. One example Google gives is AR navigation, which needs to account for factors such as weather and busy intersections.
We will begin small-scale testing in public places with AR prototypes worn by a few dozen Googlers and selected trusted testers. These prototypes will include in-lens displays, microphones and cameras, but will have strict limitations on what they can do. For example, our AR prototypes do not support photography and videography, although image data is used to enable experiences such as translating the menu in front of you or pointing to a cafe nearby.
Juston Payne, Group Product Manager
Those of us who remember Google Glass and its downfall over privacy concerns can rest easier knowing that although the prototypes will be equipped with cameras and microphones, they will not support photography or videography. Google will test a small number of augmented reality prototypes in parts of the United States, with strict limits on where testers can operate and what activities they can engage in. All testers must also undergo training on the devices, protocols, privacy, and security. Images captured by the device will be used strictly to enable the AR experiences built on it.
This time around, Google is making sure that the project’s usefulness doesn’t come at the expense of privacy. Although the AR prototypes look like normal glasses, they have an LED indicator that lights up when image data is being saved for analysis. So if you see someone wearing a pair, you can ask the tester to delete the image data, and Google says that if this happens, the data will be deleted from all logs. The prototype glasses also cannot be used while driving, operating heavy machinery, or playing sports.
Although Google seems to have taken all the necessary precautions, I can’t help feeling that the memory of Google Glass is still too fresh in the public’s collective mind and will create some bias. That’s a shame, because I want a product like this to be available to as many people as possible at a reasonable price, so it can reach those who need it most. Google, however, says they want to get it right this time around and will take their time to make sure this project is a success.