Welcome to a website about a technology that doesn't exist, but is beginning to exist. Phased Array Optics is the use of Spatial Light Modulators built to nanometer specifications for general manipulation of light. The most compelling application of Phased Array Optics (PAO) is the production of three-dimensional scenery that is indistinguishable from real life. The display application has been compared to a Star Trek Holodeck.
How is Phased Array Optics different from Holography?
Phased Array Optics refers to hardware capable of altering the phase and amplitude of light on scales smaller than a wavelength over large areas of addressable control. The technology to do this doesn't exist yet, but spatial light modulator (SLM) and computer technology are moving in this direction. When the technology is achieved, artifacts traditionally associated with holographic image production will disappear. There will be no visible reconstruction beam, false images, or unwanted orders of diffraction.
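The point about unwanted orders of diffraction can be illustrated numerically. The sketch below is a toy 1-D model with assumed parameter values (not from the text): a uniformly phased array with half-wavelength element pitch produces a single beam, while a coarser pitch produces extra "grating lobes", the array analogue of the unwanted diffraction orders that sub-wavelength control would eliminate.

```python
import numpy as np

# Toy model (assumed parameters): effect of element pitch on a
# uniformly phased 1-D array. Pitch > wavelength/2 admits grating
# lobes (unwanted diffraction orders); half-wavelength pitch does not.
wavelength = 1.0   # arbitrary units
n_el = 64          # number of array elements

def lobe_count(pitch):
    """Count full-strength far-field lobes of a uniformly phased array."""
    angles = np.linspace(-np.pi / 2, np.pi / 2, 4001)
    n = np.arange(n_el)
    field = np.exp(2j * np.pi * np.outer(np.sin(angles), n * pitch) / wavelength)
    pattern = np.abs(field.sum(axis=1)) / n_el   # normalized far-field amplitude
    strong = pattern > 0.9                       # samples near full beam strength
    # Count contiguous runs of strong samples (one run per lobe)
    return int(np.sum(strong[1:] & ~strong[:-1]) + strong[0])

print(lobe_count(wavelength / 2))  # 1: a single beam, no extra orders
print(lobe_count(2 * wavelength))  # 5: main beam plus grating lobes
```

At half-wavelength pitch the grating equation sin θ = mλ/d has no solution in the visible range other than m = 0, which is why sub-wavelength addressability removes the spurious orders.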
Are Phased Arrays a form of Computer Generated Holography?
It's been argued that PAO is sufficiently distinct from holography that it shouldn't be considered holography. However, given the lack of success of other holography offshoots, such as the kinoform, at distinguishing themselves from holography, it must be admitted that the term "holography" is now used so expansively that PAO could be considered a type of holography. Phased Array Optics may therefore be seen as the physical limit of holography: real-time dynamic holograms displayed by spatial light modulators of sub-wavelength resolution. A multiplexed color phased optical array could act as a window into any 3D world that can be imagined.
Optical Phased Arrays vs. Phased Array Optics
The term "Phased Array Optics" can be confusing because "Optical Phased Arrays" usually means the specific use of spatial light modulators to steer laser beams, or aim sensors, by progressive phase modulation. This technology exists today. However, in the nanotechnology community, Phased Array Optics has come to mean all optical applications of nanoscale arrays of addressable phase and amplitude modulators. These applications are more general than beam steering, and include real-time diffractive optical elements (DOE) and dynamic holography.
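Beam steering by progressive phase modulation can be sketched in a few lines. The model below is a hypothetical 1-D array; all parameter values are assumptions chosen for illustration. Applying a linear phase ramp across the elements moves the far-field peak to the commanded angle.

```python
import numpy as np

# Illustrative sketch (assumed parameters): steering a 1-D optical
# phased array by a progressive (linear) phase ramp across its emitters.
wavelength = 1.55e-6          # metres (a common laser wavelength)
n_elements = 64
spacing = wavelength / 2      # half-wavelength pitch avoids grating lobes
steer_angle = np.deg2rad(10)  # commanded beam direction

n = np.arange(n_elements)
# Progressive phase shift applied by each modulator element
phase = -2 * np.pi * n * spacing * np.sin(steer_angle) / wavelength
weights = np.exp(1j * phase)

# Far-field pattern: coherent sum of element contributions per angle
angles = np.deg2rad(np.linspace(-30, 30, 6001))
steering = np.exp(1j * 2 * np.pi * np.outer(np.sin(angles), n * spacing) / wavelength)
pattern = np.abs(steering @ weights)

peak = np.rad2deg(angles[np.argmax(pattern)])
print(round(peak, 1))  # beam peaks at the commanded 10 degrees
```

The phase ramp cancels the path-length differences toward the steering direction, so the element contributions add in phase only along that direction.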
My own thinking about this subject was the result of taking an optics course while thinking about nanotechnology in 1990. It seemed clear that addressable nanoscale optical elements could be a real-life implementation of Huygens' Principle, allowing any possible pattern of light waves to be generated from a planar array of sources. The idea was proposed and accepted as a book chapter in 1991, although the book, Nanotechnology: Molecular Speculations on Global Abundance (MIT Press) didn't finally make it into print until 1996. I later learned that the idea of holographic television, or holovideo, is traceable at least as far back as the Nobel acceptance speech of Dennis Gabor in 1971. The first forms of electro-holography appeared in the 1980s, and the field continues to advance.
Conservatively, holographic television may develop in the 21st century at the same rate that 2D television did in the 20th century. However, that pace could accelerate considerably if Moore's Law takes hold for spatial light modulators. As one illustration of the difference time makes, in 1991 I wrote that a single computer would have difficulty generating a large diffraction-limited PAO image within the lifetime of the universe with non-FFT methods. In 2007, the fastest parallel computer in the world could actually do that calculation in only a few millennia. Naive extrapolation would predict large PAO displays operating at movie-frame rates by the mid-21st century.
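The FFT remark can be made concrete. In the far-field (Fraunhofer) limit, the pattern produced by a planar array of Huygens sources is a discrete Fourier transform of the complex field across the array, so an FFT computes in O(N log N) what direct summation computes in O(N²). A minimal 1-D sketch with illustrative sizes:

```python
import numpy as np

# Sketch: the far field of an array of Huygens sources is a DFT of the
# complex field across the aperture, so the FFT gives the same answer
# as direct summation at far lower cost. Sizes here are illustrative.
rng = np.random.default_rng(0)
n = 512
aperture = np.exp(1j * rng.uniform(0, 2 * np.pi, n))  # arbitrary phase pattern

# Direct Huygens-style summation over all source/direction pairs: O(N^2)
k = np.arange(n)
direct = np.array([np.sum(aperture * np.exp(-2j * np.pi * k * m / n))
                   for m in range(n)])

# Same far field via the FFT: O(N log N)
fast = np.fft.fft(aperture)

print(np.allclose(direct, fast))  # True: identical far fields
```

For a diffraction-limited display the aperture has on the order of one source per wavelength in each dimension, so N is enormous and the N²-to-N log N difference is exactly the gap between "lifetime of the universe" and "a few millennia" noted above.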