Disruptive Technology: More than 20:20 vision for everyone

Imagine totally non-invasive spectacles that could give people more than perfect vision. A wire goes over the ear into an electronic box held in a pocket. Science fiction?

On February 10, 1996, Deep Blue became the first machine to win a chess game against a reigning world champion (Garry Kasparov) under regular time controls. This was once thought science fiction.

In 1998, the "electronic box" for these apparently magic spectacles was approximately one cubic metre. It is now over ten years later, and the old adage of "mainframe to desktop in ten years" applies. In 2010 or thereabouts, graphics cards in game-playing PCs equalled or exceeded the computational capacity of "Deep Blue". Top-of-the-range graphics cards cost of the order of £500, but a quarter of that will, I suspect, buy a graphics card that could get close to Deep Blue.

With regard to the adaptive optics spectacles, the original paper concluded in 1998:
In summary, we forecast major advances in spectacle lenses in the early part of the 21st century which will combine the power of adaptive optics technologies with the flexibility of electronic circuitry. The result will be a new generation of "smart spectacles" capable of adapting to the specific requirements of individual eyes to produce customized optical correction of unprecedented quality. However, to make this prediction come true will require an optometric community that is committed to building upon its traditional strength in visual optics by supporting research aimed at applying modern technologies to solve important optometric problems.
In 2013, this page had gone, but the paper can still be found amongst a collection on http://www.opt.indiana.edu/IndJOpt/PDF/ijospr99.pdf. (Scroll down and it appears.)
The following year came this prediction:
As affordable adaptive-optics systems move out of the laboratory and real-life demonstrations of their utility become more common, a host of new applications will emerge. No longer will only expensive telescopes and government-funded optical systems be wearing state-of-the-art spectacles.

By 2011, electronic spectacles are being advertised, but they are not adaptive optics: they are merely spectacles where the user can touch the frame to switch between near and distant focus.
This article, written in 2006, suggests that PixelOptics are making the real thing, where it says:
At the heart of PixelOptics' technology are tiny, electronically-controlled pixels embedded within a traditional eyeglass lens. Technicians scan the eyeball with an aberrometer -- a device that measures aberrations that can impede vision -- and then the pixels are programmed to correct the irregularities.
This may be true, but "real" adaptive optics, as the original article and the name suggest, scans the eye 25 times a second and adjusts the prescription accordingly. It isn't a technician that does it; it is the spectacles themselves. That is how an adaptive optics astronomical telescope works. Effectively, the wearer is having an eye exam, and receiving a new prescription, 25 times a second, based upon what he is looking at. The telescope version can even correct for atmospheric aberration due to heat fluctuations and get Hubble-like images from ground-based telescopes.
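To make the distinction concrete, here is a minimal sketch of the closed loop such a system would run at every frame: measure the residual error, nudge the corrector to shrink it, repeat 25 times a second. Everything here is invented for illustration (the aberration is just a random vector, not a real eye model), and no real AO product's API is being described.

```python
import numpy as np

# Hypothetical closed-loop adaptive optics simulation.
# The "eye" has a fixed, unknown aberration; the "lens" accumulates
# a correction via a simple integrator controller.

rng = np.random.default_rng(0)
eye_aberration = rng.normal(size=8)   # unknown to the controller
correction = np.zeros(8)              # state of the adaptive lens
gain = 0.5                            # integrator gain (0 < gain <= 1)

for frame in range(25):               # one second of operation at 25 Hz
    residual = eye_aberration + correction   # what the sensor measures
    correction -= gain * residual            # integrator update

final_residual = np.linalg.norm(eye_aberration + correction)
print(f"residual after 1 s: {final_residual:.2e}")
```

With a gain of 0.5 the residual halves every frame, so after a second of operation it is negligible. A real system must also cope with sensor noise and with an aberration that changes as the wearer refocuses, which is exactly why the loop has to run continuously rather than being set once by a technician.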

Of course, adaptive optics spectacles are a massively disruptive technology. The global turnover of high street opticians and spectacle manufacturers must run into billions. If their custom-made products can be replaced by one mass-produced product, the producer of that product may do as well as Apple, Intel or Microsoft - though nothing like as well as the sum total of today's opticians and spectacle manufacturers.
There is a patent for add-on lenses for optical instruments that use this technology.

Components to play with adaptive optics can be bought. But as a complete developer's kit, with SDK modules, costs between five and eight grand (probably plus VAT), this would be a very expensive experiment. You'd have to learn how to program massively parallel graphics cards such as the late-model AMD Radeon series, and interface the software you have written to the adaptive optics modules. Even then, your home-made AO spectacles could only be used near your PC - pretty pointless, given that simple reading glasses do just as well in that environment. And, of course, unless you are content with a monocle, you'd need a couple of kits!
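The "massively parallel" part is less exotic than it sounds. The per-frame number-crunching in an AO system - for instance, finding the spot centroid in each of a few hundred wavefront-sensor subapertures - is the same small, independent calculation repeated over many tiles, which is exactly what a graphics card is good at. A rough illustration, with vectorised NumPy standing in for an OpenCL kernel (the array shapes are invented, not taken from any real sensor):

```python
import numpy as np

# Invented example: 256 subaperture images of 16x16 pixels each,
# roughly as a Shack-Hartmann wavefront sensor might produce.
rng = np.random.default_rng(1)
frames = rng.random((256, 16, 16))

# Intensity-weighted centroid of every subaperture, all 256 at once.
# Each subaperture is independent, so on a GPU each would map to its
# own work-group; here NumPy broadcasting plays that role.
ys, xs = np.mgrid[0:16, 0:16]
total = frames.sum(axis=(1, 2))
cx = (frames * xs).sum(axis=(1, 2)) / total
cy = (frames * ys).sum(axis=(1, 2)) / total

print(cx.shape, cy.shape)  # one (x, y) centroid per subaperture
```

The point of the sketch is the shape of the problem, not the arithmetic: hundreds of identical, independent reductions per frame is the pattern GPU vendors' SDKs are built for.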

There is a proof of concept for an adaptive optics mirror system using a $200 or $300 neural network development board.

Development SDKs can be obtained (apparently free of money, but not of learning time). AMD have likely learned that once applications other than graphics come out for these cards, people other than game players will put them in their PCs. Intel have produced the Sandy Bridge range of motherboards that slash the cost of PCs by doing away with the graphics card. Applications that require the parallel processing of a graphics card are a way for graphics card manufacturers to fight back.

There are suggestions that the company Bausch and Lomb may be working on the subject. However, many companies seem more interested in supplying products for opticians to test their patients with than products for the patients themselves to use. I do note, though, that Bausch and Lomb have brought an eye care food supplement to the market similar to that produced by the Life Extension Foundation.
If it were possible to invest in a company making adaptive optics spectacles, then even if the spectacles are very expensive when they first appear on the market, a modest early investment should yield enough after-tax capital gain to buy a pair once they have caught the public investors' imagination, with some money left over.

As the reader who has got this far will see, I have been Googling the subject to see if there is any chance of getting adaptive optics spectacles now or in the near future, or of investing in a company developing them. It seems that the answer is "no", and it would probably be prudent to wait another decade before performing a similar search on whatever Google has become by then.

That is, of course, unless someone reading this knows differently.
[written August 2011]