How Backside Illuminated Sensors Improve Picture Quality

By Wesley Fenlon

Did you know an array of teensy-tiny wires can block the light meant for the pixels on your camera sensor? It's true! Backside illumination provides a solution.

Today’s cameras are awash in so many megapixels that we sometimes forget more doesn’t always equal better. As we continue to shrink our sensors and fill them with millions of smaller pixels, capturing enough light to produce a clear picture becomes a challenge. The solution for cameras like the Pentax Q, which was designed to be incredibly small, is a technique called backside illumination.

Backside illumination lets light reach the sensor’s pixels through the back of the silicon instead of through the wiring-covered front, exposing them to more light and thus creating a better image. Most consumer cameras don’t use this process yet--it requires some delicate and expensive manufacturing--but as the Pentax Q demonstrates, backside illumination is becoming more common. Here’s how the next big thing in camera sensor design works.

Most cameras use frontside illumination technology, which starts with a thin layer of light-sensitive pixels arranged on a silicon wafer. Several sheets of glass-like material are arrayed on top of the pixels, each containing tiny metallic wires that connect all the pixels together and link them to other camera circuitry. On top of that come the color filter and microlens layers. This entire stack is still incredibly thin, but the wiring causes problems by blocking some of the light that needs to reach the pixels below. That degrades image quality, especially in low-light conditions where every bit of illumination counts.

Backside illuminated sensors don’t look so different at first. Pixels are arranged on a sheet of silicon--nothing different there. A complex array of metal wiring is arranged above the pixels to connect the entire array--nothing different there. Then, the difference: the silicon wafer is flipped over and ground away from the bottom (now the top) until only a very thin portion of the wafer (we’re talking microns) separates the pixels from the surface. The color filters and microlenses are then arrayed on top of the wafer--with all the wiring out of the way on the bottom, the pixels in a backside illuminated sensor can capture far more light.
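To put rough numbers on the difference, here’s a minimal Python sketch of the light path in each design. Every figure in it is an illustrative assumption rather than a measured value; the point is simply that taking the wiring out of the light path raises the share of incoming photons each pixel actually records.

def photons_captured(incident_photons, wiring_obstruction,
                     stack_transmission, quantum_efficiency):
    """Photons converted to charge after losses above and in the pixel.

    wiring_obstruction -- fraction of the pixel shadowed by metal wiring
    stack_transmission -- fraction of light surviving the layers above the pixel
    quantum_efficiency -- fraction of arriving photons the photodiode converts
    """
    unobstructed = incident_photons * (1.0 - wiring_obstruction)
    arriving = unobstructed * stack_transmission
    return arriving * quantum_efficiency

incident = 1000.0  # photons hitting one pixel during an exposure

# Frontside illumination: light must get past the wiring layers first.
fsi = photons_captured(incident, wiring_obstruction=0.4,
                       stack_transmission=0.9, quantum_efficiency=0.5)

# Backside illumination: the wiring sits behind the photodiode, so
# nearly the whole pixel area is open to incoming light.
bsi = photons_captured(incident, wiring_obstruction=0.0,
                       stack_transmission=0.95, quantum_efficiency=0.5)

print("FSI pixel signal: %.0f photons" % fsi)  # 270
print("BSI pixel signal: %.0f photons" % bsi)  # 475

With identical quantum efficiency on both sides, the backside design in this toy model collects roughly three-quarters more signal per pixel purely because nothing shades the light on its way in.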

Considering how fragile silicon wafers are, grinding away layer after layer without breaking the sensor is no easy task. That difficulty kept backside illuminated sensors confined to expensive specialty cameras until recently--the Pentax Q, iPhone 4 and EVO 4G are all examples of devices using them.

And they’re not perfect yet--Camera Technica notes that frontside illumination’s wire array provides a natural aperture that the backside illuminated design lacks. That means there’s a chance an off-angle light ray could strike the wrong pixel. Still, that's probably a worthwhile trade-off for improved low-light shooting. Smartphone makers will likely start bragging more about their backside illuminated sensors than their megapixel counts in the years ahead as falling manufacturing costs help this technology sweep the camera industry.
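To get a feel for that off-angle risk, here’s a back-of-the-envelope geometric sketch in Python. The pixel pitch and silicon depth are hypothetical round numbers, and the model ignores refraction at the silicon surface (which bends incoming rays toward vertical), but it shows how quickly a slanted ray can drift over a neighboring pixel when nothing confines it.

import math

# Rough geometry of the off-angle problem: a ray entering the sensor
# surface at an angle drifts sideways on its way down to the photodiode.
# With no wiring "aperture" above the pixel to clip it, a big enough
# drift lands the ray on a neighboring pixel. All dimensions here are
# hypothetical, and refraction at the surface is ignored for simplicity.

def lateral_drift_um(depth_um, angle_deg):
    """Sideways travel of a ray crossing depth_um at angle_deg from vertical."""
    return depth_um * math.tan(math.radians(angle_deg))

pixel_pitch_um = 1.4        # assumed pixel width
photodiode_depth_um = 3.0   # assumed depth of the light-sensitive region

for angle in (0, 10, 20, 30):
    drift = lateral_drift_um(photodiode_depth_um, angle)
    status = "crosstalk risk" if drift > pixel_pitch_um / 2 else "stays on pixel"
    print("%2d deg: drift %.2f um -> %s" % (angle, drift, status))

The exact threshold depends on wavelength and microlens design, but the trend holds: the steeper the ray and the deeper the photodiode, the greater the chance it registers on the wrong pixel.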

Images via The Phoblographer.com, CameraTechnica.com