

When we take sub-frames, there is usually a little drift or variation between each frame, so a star or small detail will never land on the exact same pixel every time. We align the sub-frames, stack them, and finally process the result to bring out the details. However, we cannot stack RAW files from a colour camera directly — we must first apply a “Debayer Matrix” to the sub-frame images.
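To make the align-then-stack idea concrete, here is a minimal numpy sketch. It simulates a few drifted, noisy sub-frames of one star, shifts each back to a common reference, and averages them. The drifts are known here for simplicity; real stacking software measures them by registering stars between frames.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny synthetic "sky": one bright star on a dark background.
reference = np.zeros((64, 64))
reference[32, 32] = 100.0

# Simulate sub-frames: each one drifts by a few pixels and is noisy,
# so the star never lands on exactly the same pixel twice.
offsets = [(0, 0), (2, -1), (-3, 2), (1, 3)]
subs = [np.roll(reference, off, axis=(0, 1)) + rng.normal(0, 5, reference.shape)
        for off in offsets]

# Undo each frame's drift (real software measures the drift by
# registering stars), then average the aligned stack.
aligned = [np.roll(sub, (-dy, -dx), axis=(0, 1))
           for sub, (dy, dx) in zip(subs, offsets)]
stacked = np.mean(aligned, axis=0)

# The star stays sharp at its reference position, and the background
# noise is lower than in any single sub-frame.
print(stacked[32, 32] > 50)                # True: star survives alignment
print(np.std(stacked) < np.std(subs[0]))   # True: averaging reduced noise
```

Without the alignment step, averaging the drifted frames would smear the star across several pixels instead of sharpening it.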
Raw stacking software
Various stacking software is used: AutoStakkert! or RegiStax for stacking video, and for deepsky, software like PixInsight, DeepSkyStacker, AstroArt and so on. When using a dedicated astronomy camera we always capture images or video in RAW mode (for best signal purity), and we stack many sub-frames (or “subs”) to bring out more detail, reduce noise, and improve dynamic range, so we can then use all sorts of nice processing techniques to show amazing detail. OK, so at what stage do I debayer when processing my colour images?
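The noise-reduction claim above is easy to demonstrate: averaging N sub-frames cuts random noise by roughly a factor of √N. A small simulation of one pixel, assuming purely random per-frame noise (an idealisation — real frames also have fixed-pattern noise that stacking alone does not remove):

```python
import numpy as np

rng = np.random.default_rng(1)

signal = 50.0       # the pixel's true brightness
read_noise = 10.0   # per-frame random noise (standard deviation)

for n_subs in (1, 16, 64):
    # n_subs noisy measurements of the same pixel, averaged together.
    # (100,000 trials so the measured standard deviation is reliable.)
    subs = signal + rng.normal(0, read_noise, size=(n_subs, 100_000))
    stacked = subs.mean(axis=0)
    # Noise in the average falls as 1/sqrt(N): roughly 10, 2.5, 1.25.
    print(n_subs, round(stacked.std(), 2))
```

This is why deepsky imagers collect dozens or hundreds of subs: 64 frames give roughly an 8× cleaner background than a single exposure of the same length.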

Lucky for us, this information is described by the “Bayer Matrix” for that particular sensor, which is just a formula describing the pixel filter sequence, e.g. RGGB.
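A pattern string like RGGB fully determines which colour filter sits over every pixel: the 2×2 tile simply repeats across the sensor. A small sketch of that idea (the helper name `bayer_masks` is illustrative, not from any particular library):

```python
import numpy as np

def bayer_masks(shape, pattern="RGGB"):
    """Boolean masks saying which pixels sit under which colour filter.

    The 2x2 pattern string is read row by row: "RGGB" means
    row 0: R G, row 1: G B, tiled across the whole sensor.
    """
    rows, cols = np.indices(shape)
    tile = np.array(list(pattern)).reshape(2, 2)
    colours = tile[rows % 2, cols % 2]   # repeat the 2x2 tile everywhere
    return {c: colours == c for c in "RGB"}

masks = bayer_masks((4, 6), "RGGB")
print(masks["G"].sum())  # 12 -> half of all 24 pixels are green
```

Note that every Bayer pattern devotes half its pixels to green — a deliberate match to the human eye's peak sensitivity.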

This article applies to deepsky images (.FITS format) and to solar system/planetary videos (.SER format). Note: debayering is required for the output of any colour camera sensor when used in RAW mode. Colour CMOS camera sensors usually have four colour channels (two greens, one red and one blue).

The sensor has a sensitive layer of pixels which measures the “intensity” of light only and converts it to numbers, so this sensitive surface can be considered “monochrome”, or black and white only. However, just above this layer of pixels sits a grid of tiny filters, one for each pixel, which selectively allow red, green or blue light through. So a “green” pixel is the same as a “red” pixel – only the filter above it differs. The order or sequence in which these tiny filters appear is usually different for each sensor, for example RGGB, GRBG, GBRG, BGGR… This is referred to as the Bayer Matrix or Bayer Filter.

When the image comes from the sensor into the computer, it is one big monochrome image (white, black and all the levels of grey in between). The computer displaying the image (and your image processing software) does not know which pixels have, say, a red filter above them, and which have a blue or green one. Therefore, when imaging and creating an RGB image from a SER video file captured in RAW mode, we need to tell the computer which of these black, grey and white pixels have red, green and blue filters associated with them. Only then can it form a meaningful colour image with the usual RGB (Red Green Blue) colour channels.
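To show what “debayering” actually does to the numbers, here is a minimal sketch of the simplest method, the half-resolution “superpixel” approach, assuming an RGGB layout. Each 2×2 Bayer block collapses into one RGB pixel; real software normally interpolates instead to keep full resolution, so treat this as an illustration rather than what PixInsight or AutoStakkert! do internally.

```python
import numpy as np

def debayer_superpixel(raw):
    """Turn a monochrome RGGB RAW frame into an RGB image.

    "Superpixel" debayer sketch: each 2x2 block becomes one RGB pixel,
    so the output is half the resolution of the RAW frame.
    """
    r = raw[0::2, 0::2]                          # top-left of each block
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2  # average the two greens
    b = raw[1::2, 1::2]                          # bottom-right of each block
    return np.stack([r, g, b], axis=-1)

# A 2x2 RAW frame: exactly one Bayer block of grey intensity values.
raw = np.array([[10.0, 20.0],
                [30.0, 40.0]])
rgb = debayer_superpixel(raw)
print(rgb[0, 0])  # [10. 25. 40.] -> R=10, G=(20+30)/2, B=40
```

Notice the input is pure grey-level numbers; only the assumed filter layout turns them into colour — which is exactly why the software must be told the correct Bayer pattern.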
