I can imagine a workable heads-up display for day-to-day use. It would take the form of ordinary glasses, overlaying real-time annotations, with localised information "offered" to the HUD. I am not talking about displaying it all the time - I think that would be too invasive. The user would of course need to be able to choose what annotations appear and when, in the same way I can choose when to look at my phone. For this to become a revolutionary step in technology, I think three things have to come together:
- HUD technology to display data and images onto transparent material in a form factor that is comfortable and cosmetically acceptable.
- Maps and databases that are indexed by GPS and image processing data.
- Powerful and intelligent video and image processing capabilities.
The real cleverness, though, is not the hardware but the software that can figure out where you are (GPS) and what you are looking at (compass + database) for static content like buildings. For this you only have to look at the brilliance of Google Maps and Google Street View as applied to a mobile phone, and combine that with image metadata searching and actual image searching such as TinEye.
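As a rough sketch of that "where + which way = what" lookup (the POI names, coordinates, and field of view below are all invented for illustration): take the GPS fix, compute the bearing to each nearby point of interest, and keep the ones that fall within the compass heading's field of view.

```python
import math

# Hypothetical sketch: given a GPS fix and a compass heading, pick the
# points of interest (POIs) the wearer is most likely looking at.
# The POI "database", coordinates, and field of view are all invented.

POI_DB = [
    {"name": "Town Hall",   "lat": 51.5034, "lon": -0.1276},
    {"name": "Coffee Shop", "lat": 51.5040, "lon": -0.1290},
    {"name": "Museum",      "lat": 51.5020, "lon": -0.1250},
]

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def looking_at(lat, lon, heading, fov=30.0):
    """Return POI names whose bearing is within +/- fov/2 of the heading."""
    hits = []
    for poi in POI_DB:
        b = bearing_deg(lat, lon, poi["lat"], poi["lon"])
        diff = (b - heading + 180) % 360 - 180  # signed angular difference
        if abs(diff) <= fov / 2:
            hits.append(poi["name"])
    return hits
```

With a 30-degree field of view only the POI roughly dead ahead gets "offered" to the HUD; a real implementation would fuse compass, gyroscope, and image data rather than trust raw GPS and heading alone.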
Then for the really hard (and cool) stuff you need image processing to analyse dynamic content like people. Moving licence plates can already be solved (automatic toll bridges), as can moving faces (automatic airport security).
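The toll-bridge case splits into two stages: camera plus OCR to read the plate, then matching the (noisy) read against a database. A minimal sketch of just the matching stage, assuming the OCR output is handed to us and using an invented watchlist, might look like this:

```python
from difflib import SequenceMatcher

# Hypothetical sketch of the matching stage behind an automatic toll
# bridge. The camera and OCR are assumed to exist upstream; this only
# matches a noisy OCR read against an invented registered-plate set.

REGISTERED = {"ABC123", "XYZ789", "KLM456"}

def normalise(plate):
    """Strip spaces/hyphens and fix common OCR confusions (O->0, I->1)."""
    plate = plate.upper().replace(" ", "").replace("-", "")
    return plate.replace("O", "0").replace("I", "1")

def match_plate(ocr_read, threshold=0.8):
    """Return the best-matching registered plate, or None if no good match."""
    candidate = normalise(ocr_read)
    best, best_score = None, 0.0
    for plate in REGISTERED:
        score = SequenceMatcher(None, candidate, normalise(plate)).ratio()
        if score > best_score:
            best, best_score = plate, score
    return best if best_score >= threshold else None
```

The fuzzy threshold is the interesting design choice: set it too tight and every smudged plate is missed, too loose and the wrong driver gets billed. Face matching for airport security is the same pipeline in spirit, with a far harder feature-extraction stage in place of OCR.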
It is little wonder, then, that Google are looking at things like dynamic real-time advertising on phones that knows where you are - to lure you into the adjacent shops. And online purchases - not to clip the ticket, but for the data. It is not much of a jump to get that phone to also relay the precious information to the glasses, which know which way you are looking. Ooh, scary thought: they might want to track where I was looking too!