Waid Observatory

The CCD Image

The method used by CCD cameras to record an image involves dividing the image, or frame, into a rectangular array of small areas known as pixels.  The brightness of each individual pixel is measured and recorded in a data file on a computer or other storage device.  The image is created by displaying each pixel, in its proper position, with a shade of gray proportional to its recorded signal strength, or brightness.  This ranges from "black," an absence of signal, to "white," saturation of the signal.  The resolution of the image is directly related to the number of pixel elements.  Early CCD cameras had very small arrays and produced pictures of low resolution.  The extreme case is a camera with a single pixel.  This type of camera is known as a photometer and records only the overall brightness of a given object.  Photometers are still used for specialized work where only high-precision brightness measurements are required.  The typical modern CCD camera uses a mega-pixel array of very small pixels.  This allows imaging with a resolution approaching that of photographic film, with the added advantages of instant computer access and enhancement of the image.
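The idea above can be sketched in a few lines of Python.  The frame values and saturation level here are hypothetical, chosen only to show how recorded signal strengths map to shades of gray from black to white:

```python
import numpy as np

# A hypothetical 4x4 CCD frame: each element is one pixel's recorded
# signal strength, in arbitrary counts (values invented for illustration).
frame = np.array([
    [   0,   10,   20,   30],
    [  40,   80,  120,  160],
    [ 200,  300,  400,  500],
    [ 600,  700,  800, 1000],
])

# Map each signal to a gray shade: 0 -> black (0), saturation -> white (255).
saturation = 1000  # assumed saturation level for this sketch
gray = (frame / saturation * 255).astype(np.uint8)

print(gray)
```

A real display program performs the same scaling for every pixel in a mega-pixel array; only the array size changes.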

The combination, or stacking, of many individual images is possible with a CCD camera.  These sub-frames, each of relatively short exposure duration, combine to create a final image roughly equivalent to one exposure of the combined duration.  For example, 12 images of 5 minutes each combine to produce one image of 1 hour's duration.  Many astronomical images are footnoted with the number of individual frames and the exposure times used in creating the final image.  Although the best image is created with sub-frames of as long a duration as possible, there is a practical limitation.  The light pollution inherent to most imaging locations in urban, and even rural, areas causes a "sky glow" to register in the image.  This is simply the camera recording the scattered light produced by artificial sources.  This sky glow accumulates with the length of exposure and obscures faint objects.  To a certain extent, increasing the number of sub-frames while shortening the individual exposure times can lessen this effect.  This ability is a significant advantage over film photography.
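A small simulation illustrates why stacking works.  The "star" signal and noise levels below are invented for the sketch: averaging twelve noisy sub-frames keeps the signal while the random noise shrinks, much as twelve 5-minute exposures approximate one hour-long exposure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 12 sub-frames of the same field: one faint "star" plus
# random noise in each frame (all values hypothetical).
signal = np.zeros((8, 8))
signal[4, 4] = 5.0                      # the faint object
subs = [signal + rng.normal(0, 2, signal.shape) for _ in range(12)]

# Averaging the sub-frames preserves the signal while the random noise
# drops by roughly the square root of the number of frames.
stacked = np.mean(subs, axis=0)
print(stacked[4, 4])                    # close to 5, with far less scatter
```

In practice dedicated stacking software also aligns the sub-frames before combining them, a step omitted here.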

Modern CCD cameras allow imaging at different resolutions.  This is somewhat like the different film speeds of photographic systems.  The increased sensitivity of high-speed film is "paid for" by increased grain size, which limits resolution.  By the same token, lower resolution settings increase the sensitivity of the CCD camera.  The resolution of the CCD camera is set by altering the camera's "bin" setting.  An image array is made up of a rectangle of pixels.  Changing the bin setting links pixels together to form a larger effective pixel.  This larger "effective" pixel records the combined signal of its sub-pixels and thus increases the effective sensitivity.  As an example, consider an array of sixteen pixels on a side, for a total of 256 pixels.  If the bin setting is 1x1, the image will consist of 256 pixels, and the brightness of each pixel will be proportional to the photons recorded by it.  If the bin setting is 2x2, the camera will add the signal from each group of pixels that is two on a side, a total of four pixels per group.  The final image will be made of 64 "effective" pixels, considerably less resolution, but each effective pixel will have four times the signal strength.
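The 16x16 example above can be worked through in software.  Binning is actually performed in the camera hardware during readout; this sketch simply reproduces the arithmetic of summing each 2x2 group of pixels:

```python
import numpy as np

# A hypothetical 16x16 frame (256 pixels), as in the example above.
frame = np.arange(256, dtype=np.int64).reshape(16, 16)

def bin_frame(frame, n):
    """Combine n x n groups of pixels into one 'effective' pixel by
    summing their signals, mimicking on-chip binning."""
    h, w = frame.shape
    return frame.reshape(h // n, n, w // n, n).sum(axis=(1, 3))

binned = bin_frame(frame, 2)
print(binned.shape)   # (8, 8): 64 effective pixels
print(binned[0, 0])   # sum of the top-left 2x2 group: 0 + 1 + 16 + 17 = 34
```

No signal is lost in the sum: the total counts of the binned frame equal those of the original, but they are spread over a quarter as many pixels.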

As stated above, the CCD camera records only the brightness of the pixels.  There is no inherent color information available directly from the CCD chip.  Color information is derived by placing filters in front of the imaging chip.  These filters are typically red, green, and blue; however, they can be cyan, magenta, and yellow.  Three separate grayscale images are taken, one through each filter.  These are then processed in a computer to produce an image that displays in color.  Even on so-called "one-shot" color cameras, the entire image array is covered with an RGB or CMY mask.  This allows each filtered layer to be recorded during a single exposure.  An on-board computer, or a standalone computer, still has to combine these filtered grayscale exposures to produce a color image.  There are pros and cons to using a one-shot color camera.  The main drawback is the reduced resolution and sensitivity that results from the incorporation of the colored mask.  Technology is growing by leaps and bounds, and one-shot color CCD cameras are being improved at a rapid pace.  I predict that in the relatively near future, the color CCD camera will become the camera of choice for "pretty picture" color astronomical imaging.  SBIG Astronomical Instruments and Starlight Xpress are now producing high-quality color cameras.
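The combining step can be sketched directly.  The three tiny grayscale frames below are hypothetical filtered exposures; stacking them as layers is the same operation the computer performs to turn three filtered images into one color image:

```python
import numpy as np

# Three hypothetical filtered grayscale exposures of the same 2x2 field,
# one each through the red, green, and blue filters (values 0-255).
red   = np.array([[255,   0], [  0, 128]], dtype=np.uint8)
green = np.array([[  0, 255], [  0, 128]], dtype=np.uint8)
blue  = np.array([[  0,   0], [255, 128]], dtype=np.uint8)

# Stacking the three layers yields a height x width x 3 color image.
color = np.dstack([red, green, blue])
print(color.shape)        # (2, 2, 3)
print(color[0, 0])        # [255 0 0]: a pure red pixel
```

Real processing also weights and color-balances the three layers; this sketch shows only the basic combination.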

High-quality color images produced with filtered CCD exposures combine a color (RGB) image with a high-resolution grayscale image known as the "luminance" image.  This luminance image is taken at the highest resolution that conditions allow.  It is then processed to produce as high a quality image as possible.  The series of filtered images used to make the color RGB image may be taken at a lower resolution, or higher bin setting, than the luminance.  This allows for a shorter exposure time at the expense of resolution.  The final image, referred to as an LRGB image, is combined in the computer and does not significantly suffer from the lower resolution of the RGB image.  The detail and sharpness come from the luminance image, with the RGB image providing only the color information.  Finally, the image will usually be post-processed in an image editing program such as Photoshop or Paint Shop Pro.  Brightness levels will be adjusted to bring out faint portions of the image, while bright, overexposed areas may be adjusted to reveal hidden details.  Final color adjustments may be made to correct for light pollution and other atmospheric effects.  The result can be an amazingly rich and detailed image.
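A minimal sketch of the LRGB idea, with invented values: a full-resolution luminance frame supplies the detail, while a binned (half-resolution) color frame is upsampled and supplies only the color.  Real LRGB software works in a perceptual color space and blends more carefully; this shows only the principle:

```python
import numpy as np

# Hypothetical data: a 4x4 luminance frame (values 0-1) and a 2x2
# binned color frame holding one uniform tint per effective pixel.
lum = np.linspace(0, 1, 16).reshape(4, 4)          # high-res detail
rgb_small = np.ones((2, 2, 3)) * [1.0, 0.5, 0.25]  # low-res color

# Upsample the color to full resolution (nearest neighbour), then let
# luminance carry the brightness detail and RGB carry only the tint.
rgb_full = rgb_small.repeat(2, axis=0).repeat(2, axis=1)
lrgb = rgb_full * lum[:, :, None]
print(lrgb.shape)   # (4, 4, 3)
```

Because the eye is far more sensitive to brightness detail than to color detail, the coarse color data is not noticeable in the combined result.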

Copyright Donald P. Waid