Advanced Imaging Magazine

Updated: January 12th, 2011 10:01 AM CDT

Component Integration: CCD vs. CMOS

CCD has been around longer, but CMOS is closing the gap
[Image of a star]
© NASA, ESA and Keith Noll (Space Telescope Science Institute)
This image of a star similar to our sun, 3,600 light-years away and ending its life, was taken on Feb. 6, 2007, by the Wide Field and Planetary Camera 2 (WFPC2) on the Hubble telescope. It is actually a composite of many separate exposures made by the CCD instrument.
[Image of a CMOS sensor]
© Sarnoff Corp.
There are fundamental differences in architecture and operations between this CMOS sensor and a CCD sensor.

By Barry Hochfelder

"CCD always has been a custom fabrication process," Janesick says. "CMOS, for a good five or six years, has been advertised as a standard commercial process. It's cheaper. There's nothing custom. It didn't go very far in terms of raw performance. CCD is textbook quality, a perfect device. That's why you can see the edge of the universe with it. That's the standard.

"The CMOS people said, 'silicon is silicon. It can do the same things.' That's far from the truth. CMOS is lacking in raw performance—the work we're doing is to get it up to CCD standards. There are fundamental differences in architecture and operations."

The choice really is application-dependent, ranging from cell phones to the Hubble Telescope and hundreds of applications in between.

"You look at the application, the economies and the performance," Janesick says. "Currently you would use CMOS for the camera phone because the detector has to be integrated, you need low power and the quality doesn't have to be that good. It's fundamentally difficult to take good pictures. The lenses can't do it. You need a very high-quality lens to work with 1.5 micron pixel cameras.

"The cell phone camera is based on economics, the Hubble costs millions of dollars to produce a camera system. You go with CCD because it's a textbook sensor. CMOS can perform the same way but it takes a lot of custom work. We start with a large pixel of 8 microns or larger. Then you have to 'stitch' the device. When you fabricate the sensor you have a limited field of view in the lithography to make it, so you stitch several fields together to form one view. You also need high-end, thin silicon. It's blind to IR because you only care about visible. In science, we go to very thick silicon, epithelial, silicon. It's expensive. And you want the device to be thin. That increases the cost. You may wind up quadrupling the cost."


