Thursday, November 13, 2008

THE IMAGE SENSOR

Unlike traditional cameras that use film to store an image, digital cameras use a solid-state device called an image sensor. Image sensors are silicon chips that have numerous photosensitive areas (or photosites) constructed with photodiodes and arranged in arrays within the CCD or CMOS chip structure. The photosites are referred to as pixels. The pixels react to the light striking them, creating electrical charges proportional to the incident light; the more light, the higher the charge. The brightness recorded by each photosite is then stored as a number that can later be used to set the color and brightness of dots on the screen, or ink on the printed page, to reconstruct the image. Here we'll look closely at this process because it's the foundation of everything that follows.
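The light-to-number step above can be sketched in a few lines of Python. This is an illustrative toy model, not a real sensor pipeline; the function name, the full-well capacity, and the bit depth are all invented for the example.

```python
def photosite_to_digital(incident_light, full_well=10000, bit_depth=8):
    """Map incident light (photon count) to a quantized brightness value.

    Toy model: charge accumulates in proportion to incident light,
    saturates at the photosite's full-well capacity, and is then
    quantized to a digital number.
    """
    charge = min(incident_light, full_well)   # photosite saturates at full well
    levels = 2 ** bit_depth - 1               # e.g. 0-255 for 8 bits
    return round(charge / full_well * levels)

# A tiny 2x2 "sensor": more light -> more charge -> higher stored number
scene = [[1200, 9800],
         [400, 5000]]
image = [[photosite_to_digital(p) for p in row] for row in scene]
```

The resulting grid of numbers is what the rest of the imaging chain (display, printing) works from.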


George Smith and Willard Boyle invented the CCD at Bell Laboratories in 1969. They were in fact trying to develop a new kind of semiconductor memory when the CCD principle emerged. In 1970, Bell Labs researchers built the first solid-state video camera based on a CCD rather than film.

Image sensor chips (the chips that capture the image) fall into three main camps: CCD (Charge-Coupled Device), CMOS (Complementary Metal Oxide Semiconductor) and Foveon's X3.

CCD (Charge-Coupled Device)
The most common type of sensor is the CCD (charge-coupled device). With a CCD, light is captured by individual photodiode sensors. The photons that strike the sensor are converted to a proportional number of electrons stored at individual sensor positions. Those electrons are then read electronically and stepped off the charge transfer register. Once off the CCD array, they are converted to their corresponding digital values. CCDs require a specialized chip construction process. Rather than having all the electronics on one chip, a separate chip set is required to handle support functions. There has been some progress in integrating other electronic functions into the CCD, but for the most part CCD digital cameras require a considerable amount of supporting electronics. Depending upon the camera design, sets of anywhere from three to eight chips are incorporated in the camera's image capture and conversion process.
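The "stepping off the register" readout described above can be sketched as a bucket-brigade loop. This is a toy sketch under made-up assumptions (the function names and quantization constants are invented); real CCD readout is an analog, clocked process.

```python
def quantize(charge, full_well=10000, levels=255):
    """Toy analog-to-digital conversion at the output of the register."""
    return round(min(charge, full_well) / full_well * levels)

def ccd_readout(row):
    """Shift charges off a 1-D transfer register, one position per step.

    Each charge is digitized only AFTER it leaves the array, at the
    single output node -- the key structural feature of CCD readout.
    """
    register = list(row)
    digitized = []
    while register:
        charge = register.pop(0)          # charge at the output node steps off
        digitized.append(quantize(charge))
        # remaining charges have each moved one cell toward the output
    return digitized
```

Note that every pixel's charge must pass through the same output stage in sequence, which is why a CCD cannot read an arbitrary pixel directly.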
CMOS (Complementary Metal Oxide Semiconductor)
There's another type of sensor besides CCD that's becoming popular in digital cameras, and that's the CMOS (complementary metal oxide semiconductor) sensor. Over the last few years, CMOS sensors have become increasingly common. They are being used in medium and large format digital backs, in professional digital SLRs, as well as in some consumer cameras. Both CMOS and CCD sensors are constructed from silicon. They have similar light sensitivity over the visible and near-IR spectrum. At the most basic level, both convert incident light into electronic charge by the same photo-conversion process. However, CMOS sensors can be made of the same silicon material as other computer chips. That means all the electronics can be incorporated onto one chip, reducing production costs, space requirements and power usage. With CMOS, it's possible to produce entire digital cameras on a single chip. CMOS sensors also have individual picture elements, but, unlike a CCD, the conversion of the electronic signal to a digital value is completed within the individual photo sensor. That makes it possible to read out the values of the individual sensors in a single step, rather than having to step the electronic signal off the register, as is the case with CCDs.
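To contrast with the CCD sketch, the per-pixel conversion of a CMOS sensor can be modeled as digitization happening at each pixel, with direct (row, column) addressing. Again, this is a hypothetical illustration; the class names and constants are invented.

```python
def quantize(charge, full_well=10000, levels=255):
    """Toy analog-to-digital conversion; here it lives AT the pixel."""
    return round(min(charge, full_well) / full_well * levels)

class CmosPixel:
    """Each pixel carries its own conversion circuitry."""
    def __init__(self, charge):
        self.value = quantize(charge)     # digitized within the photo sensor

class CmosSensor:
    def __init__(self, scene):
        self.pixels = [[CmosPixel(c) for c in row] for row in scene]

    def read(self, row, col):
        # Any pixel can be read directly in a single step --
        # no charge has to be shifted across a register first.
        return self.pixels[row][col].value
```

The design difference is the point: in the CCD sketch digitization happens once, at the register's output; here it happens at every pixel, which is what lets CMOS sensors integrate the supporting electronics on the same chip.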
The differences between CCD and CMOS sensors can be summarized as follows:
- CCD sensors, as mentioned above, create high-quality, low-noise images; CMOS sensors have traditionally been more susceptible to noise.
- Because each pixel on a CMOS sensor has several transistors located next to it, the light sensitivity of a CMOS chip tends to be lower: many of the photons hitting the chip strike the transistors instead of the photodiode.
- CMOS sensors consume little power, while the CCD process consumes a great deal; a CCD can draw as much as 100 times more power than an equivalent CMOS sensor.
- CMOS chips can be fabricated on just about any standard silicon production line, so they tend to be far less expensive than CCD sensors.
- CCD sensors have been mass-produced for longer, so the technology is more mature; they tend to offer higher image quality and more pixels.
Matsushita Electric Industrial Co., Ltd., has announced an image sensor architecture, called Maicovicon, that combines advantages of both CCD and CMOS sensors. Maicovicon consumes less power than a charge-coupled device and produces higher-quality images than a CMOS sensor.

NASA is also working on another type of sensor altogether. Under contract to the space agency, the Jet Propulsion Laboratory in Pasadena, California, is developing what's being called an SOI (silicon on insulator) sensor. SOI sensors are extremely thin, just 1 micron, and could be applied to just about any flat surface.

FOVEON X3
One of the companies on the cutting edge of CMOS development is Foveon, which developed the X3 sensor chip. In some respects, the X3 is revolutionary: it was the first full-color image sensor to capture red, green and blue light at each individual pixel position. Instead of using color filtration to capture RGB color values, the X3 captures all three primary colors simultaneously. It can do that because it has three photo-detectors at every sensor location, making it possible to capture full-color images without a color mosaic filter, and without the complexity and cost of some CCD systems. Foveon achieved this multi-color capture through a specific property of silicon: it absorbs different wavelengths of light at different depths. Each X3 sensor site consists of three photo-detectors located at different depths; each detects the red, green or blue light that has penetrated the silicon to that depth. Blue light is absorbed near the surface, green light farther down and red light deeper still. The individual photo-detectors convert the absorbed light into three signals. Those signals are converted to digital data, which is then optimized through software. According to the company, the X3 capture and optimization process results in higher-quality, sharper images, as well as better color. It also eliminates the color artifacts that can be a problem with CCD sensors.
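The depth-based capture idea can be sketched as three stacked detectors per pixel site. The depths, names and values below are made up purely for illustration; a real X3 sensor measures analog absorption, not clean per-channel values.

```python
# Shallow -> deep, matching the order in which silicon absorbs light:
# blue near the surface, green farther down, red deepest.
ABSORPTION_DEPTH = {"blue": 0, "green": 1, "red": 2}

def x3_pixel(light):
    """Toy model of one X3 sensor site.

    light: dict of incident intensities per primary color.
    Each of the three stacked detectors reads the primary absorbed
    at its depth, so a single site yields a full color triple --
    no mosaic filter, no interpolation from neighboring pixels.
    """
    stack = [0, 0, 0]
    for color, depth in ABSORPTION_DEPTH.items():
        stack[depth] = light.get(color, 0)
    return tuple(stack)   # (blue, green, red), surface to depth

# One pixel site captures all three primaries simultaneously, whereas
# a mosaic-filtered pixel would record only one of them at this spot.
triple = x3_pixel({"red": 200, "green": 150, "blue": 90})
```

By contrast, a mosaic sensor would have to estimate two of these three values by demosaicing from neighboring pixels, which is the source of the color artifacts mentioned above.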
HYBRID IMAGING TECHNOLOGY (HIT)
To further increase the quality of the images that these tiny CMOS-based cameras can capture, NASA is working on what’s called hybrid imaging technology (HIT). Theoretically, HIT merges the best of CCD and CMOS technology, in hopes of coming up with a new technology that’s better than either. Once implemented, the resulting technology should have higher resolution, better scalability and reduced power consumption.
