SIGGRAPH: WHERE CONFOCAL MICROSCOPY MEETS SYNTHETIC APERTURE RADAR
At SIGGRAPH this year, one of the more interesting papers to be given expands on the use of image arrays to create a single synthetic image with a wide aperture and shallow depth of field. The principles of the presentation are borrowed from both remote sensing and medical imaging; thus, we begin to see why I coined the term "connecting the dots" in May's editorial. In remote sensing, the same principle constitutes the basis of synthetic aperture radar (SAR); in medical imaging, it underlies X-ray tomosynthesis, in which the detector and source move laterally in opposite directions on either side of a common focal plane.
In fact, many of these principles have been described in earlier technical presentations. For incoherent visible light, the idea of averaging multiple views in a light field to simulate a synthetic aperture has already been proposed. The concept has also been demonstrated for seeing through foliage, in some cases using real imagery from a moving camera and in others using dense camera arrays, in a technique called synthetic aperture photography (SAP).
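The averaging step behind SAP can be illustrated with a short sketch. The idea is shift-and-add refocusing: each camera in the array sees a point on the chosen focal plane at a slightly different parallax offset, so shifting every view to undo that offset and averaging keeps the focal plane sharp while points off the plane smear out. The function below is a minimal, hypothetical model (the `offsets` parameterization as per-camera baseline in pixels at unit depth is an assumption for illustration, not the paper's actual formulation).

```python
import numpy as np

def synthetic_aperture_refocus(images, offsets, depth):
    """Shift-and-add refocusing over a camera array (toy model).

    images:  list of equally sized 2-D arrays, one per camera.
    offsets: per-camera baseline (dx, dy) in pixels at unit depth
             (hypothetical parameterization).
    depth:   depth of the desired focal plane.

    Points at `depth` align across views after the shift and stay
    sharp in the average; points off the focal plane are displaced
    differently in each view and blur out.
    """
    acc = np.zeros_like(images[0], dtype=float)
    for img, (dx, dy) in zip(images, offsets):
        # Parallax scales inversely with depth.
        sx = int(round(dx / depth))
        sy = int(round(dy / depth))
        acc += np.roll(np.roll(img, sy, axis=0), sx, axis=1)
    return acc / len(images)
```

In effect, the wide synthetic aperture is just the baseline spanned by the camera array, which is why the simulated depth of field can be made far shallower than any single lens would allow.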
This concept can now be applied to illumination. Until recently, physical systems for generating light fields were limited by available technology to a small number of image-producing sources. Now, with the size and cost of image projectors declining, a dense array of projectors can simulate a projector with a wide aperture. Such a system produces a real image with a depth of field so shallow that the image ceases to exist a short distance from the focal plane, a technique called synthetic aperture illumination (SAI).
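SAI is essentially the camera-array idea run in reverse: each projector in the array is fed a copy of the desired pattern pre-shifted to cancel its own parallax, so all copies coincide only on the chosen focal plane. The sketch below is a minimal illustration under the same hypothetical parameterization as above (per-projector baseline offsets in pixels at unit depth); it is not drawn from the paper's implementation.

```python
import numpy as np

def synthetic_aperture_illumination(pattern, offsets, depth):
    """Compute one frame per projector so the projected copies of
    `pattern` align at the chosen focal depth (toy model).

    Each projector's image lands on the focal plane displaced by its
    parallax, so we pre-shift each frame in the opposite direction.
    Off the focal plane the copies no longer coincide, and the
    projected image rapidly washes out.
    """
    frames = []
    for dx, dy in offsets:
        sx = int(round(dx / depth))
        sy = int(round(dy / depth))
        # Shift opposite to the parallax the projector will introduce.
        frames.append(np.roll(np.roll(pattern, -sy, axis=0), -sx, axis=1))
    return frames
```

This is what makes the depth of field "so shallow that it ceases to exist": away from the focal plane, the contributions from the many projectors diverge and average into a near-uniform wash.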
These techniques rely on proven technology to achieve computer-assisted optical effects. Confocal microscopy is a family of imaging techniques that relies on focused pattern illumination and synchronized imaging to create cross-sectional views of 3-D biological specimens. Recently, researchers at Stanford University and Fakespace Labs adapted confocal imaging principles to large-scale scenes by replacing the optical apertures with arrays of either real or virtual projectors and cameras. Their prototype implementation uses a video projector, a camera, and an array of mirrors to explore confocal imaging of partially occluded environments, such as foliage, and weakly scattering environments, such as murky water.
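The reason confocal gating rejects out-of-plane material (occluding leaves, scattering water) can be seen in a toy model: the recorded signal is the product of the illumination focus and the detection focus, both centered on the same plane, so defocused contributions are suppressed twice over. The Gaussian falloff below is a hypothetical stand-in for a real defocus response, used only to show the squaring effect.

```python
import numpy as np

def confocal_response(z, z_focus, na=0.5):
    """Toy axial response of a confocal system.

    Illumination and detection are each focused at z_focus; the
    recorded signal is their product, so the falloff away from the
    focal plane is the square of the single-pass falloff. The
    Gaussian defocus model and the `na` scale factor are assumptions
    for illustration only.
    """
    single_pass = np.exp(-((z - z_focus) * na) ** 2)  # one focused pass
    return single_pass * single_pass  # illumination x detection
```

Squaring the single-pass response is the essence of the confocal advantage: a plane that a conventional system would render at, say, 37% brightness drops to about 14% in the confocal geometry, which is why cross-sections come out clean.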