In some uses of a heads-up display, it can be useful to know what part of the scene a user is viewing. One way to accomplish this is through eye-tracking technology, but existing eye-tracking technologies have some disadvantages. This is exactly the technology that Google has claimed in US patent 9,001,030, published on April 7th, 2015. Google bashes the current eye-tracking technologies in this patent, saying that they cause unnecessary bulk.

The patent details three different "paths". The first is the display path, which demonstrates how the eye will see what is being projected (like the Google Glass display); the second is the ambient path, which shows where the ambient (background) light comes from; and the last is the built-in eye-tracking path. As you can see detailed above, the eye-tracking path uses a camera that's embedded in the device itself (124), and uses the reflective prisms (the same prisms that are used to show the display) to take a photo of the eye. The device would have two illumination sources that light up the eye in certain ways (136 above), and the image captured would then be used to determine where in the background the eye is looking. The current version of Glass can see very subtle eye gestures like winks, but this patent wants to take that tech to a whole different level.
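The patent doesn't spell out the math, but dual-illuminator setups like this typically rely on the classic pupil-center / corneal-reflection idea: the vector from the IR glint on the cornea to the pupil center changes as the eye rotates, and a short per-user calibration maps that vector to a point in the scene. Here's a minimal sketch of that mapping, assuming a simple per-axis linear calibration; all function names and numbers are illustrative, not from the patent:

```python
# Illustrative pupil-center / corneal-reflection (PCCR) gaze mapping.
# Assumption: gaze position varies roughly linearly with the
# pupil-minus-glint vector over a small field of view.

def pupil_glint_vector(pupil, glint):
    """Difference vector between pupil center and corneal glint, in image pixels."""
    return (pupil[0] - glint[0], pupil[1] - glint[1])

def calibrate(samples):
    """Fit an independent linear map per axis from
    (pupil_glint_vector, known_gaze_point) calibration samples.
    Returns (ax, bx, ay, by) so that gaze = (ax*vx + bx, ay*vy + by)."""
    def fit(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
        return a, my - a * mx
    vxs = [v[0] for v, _ in samples]
    vys = [v[1] for v, _ in samples]
    gxs = [g[0] for _, g in samples]
    gys = [g[1] for _, g in samples]
    ax, bx = fit(vxs, gxs)
    ay, by = fit(vys, gys)
    return ax, bx, ay, by

def estimate_gaze(vector, cal):
    """Map a pupil-glint vector to an estimated gaze point in the scene."""
    ax, bx, ay, by = cal
    return (ax * vector[0] + bx, ay * vector[1] + by)

# Toy calibration: the user looks at three known scene points while the
# camera records the corresponding pupil-glint vectors.
samples = [((1, 1), (15, 15)), ((2, 3), (25, 35)), ((4, 2), (45, 25))]
cal = calibrate(samples)
print(estimate_gaze((3, 3), cal))  # estimated gaze point for a new vector
```

Real trackers use a second glint (hence the two illumination sources) to cancel out head movement relative to the display, but the one-glint version above shows the basic shape of the computation.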
![eye tracking android](https://user-images.githubusercontent.com/41565823/53384554-9419f300-397b-11e9-84e6-c4e01c293ea9.gif)
Imagine being able to walk down the street, glance at a restaurant that you’re walking by, and have Glass immediately provide you with quick heads-up information about the location. Yelp reviews, phone numbers, and breakfast menus could be a glance away, and Glass eye-tracking could make it easier to get that information. You wouldn’t have to tap or speak; all you would have to do is take a look.
![eye tracking android](https://www.biopac.com/wp-content/uploads/ETVisionSystem.jpg)
The next iteration of Google Glass is already in the works, but not much information has surfaced thus far about what the device’s hardware will be like. Google has given much of its focus and attention to the Glass at Work program over the last couple of years, and it’s no secret that specific work applications have been where the device has found its best use cases, but what will that mean for the direction that Google takes with the device’s hardware in the future? A newly-published patent might give us an idea, and it might involve a new way to get information from the wearable display based on where you’re looking. There’s nothing overwhelmingly groundbreaking about it, but eye-tracking technology is definitely something that Google might be considering for the next version of Glass. It’s what I hoped for when Google first showed off Project Glass: having to control Glass with voice and tap gestures can be cumbersome for a device that’s supposed to get out of the way and make its wearers’ lives easier, and eye-tracking might be just what Glass needs to make a wearable heads-up display practical in many situations.