Color depth

Color depth, or bit depth, is a computer graphics term describing the number of bits used to represent the color of a single pixel in a bitmapped image or video frame buffer. The concept is also known as bits per pixel (bpp), particularly when the value is specified along with the number of bits used. Higher color depth gives a broader range of distinct colors.

Color depth is only one aspect of color representation, expressing how finely levels of color can be expressed; the other aspect is how broad a range of colors can be expressed (formally, the gamut: which colors can be expressed at all). The RGB color model, as used below, cannot express many colors, notably saturated colors such as yellow. Thus, the issue of color representation is not simply "sufficient color depth" but also "broad enough gamut".

Indexed color
With relatively low color depth, the stored value is typically a number representing the index into a color map or palette. The colors available in the palette itself may be fixed by the hardware or modifiable within the limits of the hardware (for instance, both color Macintosh systems and VGA-equipped IBM-PCs typically ran at 8-bit due to limited VRAM, but while the best VGA systems only offered an 18-bit (262,144 color) palette from which colors could be chosen, all color Macintosh video hardware offered a 24-bit (16 million color) palette). Modifiable palettes are sometimes referred to as pseudocolor palettes.


 * 1-bit color (2¹ = 2 colors): monochrome, often black and white; compact Macintoshes.
 * 2-bit color (2² = 4 colors): CGA, gray-scale early NeXTstation, color Macintoshes.
 * 3-bit color (2³ = 8 colors): many early home computers with TV displays.
 * 4-bit color (2⁴ = 16 colors): as used by EGA and by the least-common-denominator VGA standard at higher resolution; color Macintoshes.
 * 5-bit color (2⁵ = 32 colors): original Amiga chipset.
 * 6-bit color (2⁶ = 64 colors): original Amiga chipset.
 * 8-bit color (2⁸ = 256 colors): most early color Unix workstations, VGA at low resolution, Super VGA, AGA, color Macintoshes.
 * 12-bit color (2¹² = 4,096 colors): some Silicon Graphics systems, Neo Geo, color NeXTstation systems, and Amiga systems in HAM mode.
 * 16-bit color (2¹⁶ = 65,536 colors): some color Macintoshes.
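The indexed scheme described above can be sketched in a few lines of Python. The palette and pixel values here are purely illustrative and not tied to any particular hardware:

```python
# A tiny indexed-color image: each pixel stores a palette index,
# not a color. Palette entries are (R, G, B) tuples.
# Hypothetical 2-bit (4-color) palette for illustration.
palette = [
    (0, 0, 0),        # index 0: black
    (255, 0, 0),      # index 1: red
    (0, 255, 0),      # index 2: green
    (255, 255, 255),  # index 3: white
]

# 2 bits per pixel means four pixels would fit in one stored byte.
pixels = [0, 1, 2, 3, 3, 2]  # indices into the palette

def resolve(pixels, palette):
    """Map stored indices to displayable RGB triples."""
    return [palette[i] for i in pixels]

print(resolve(pixels, palette)[1])  # (255, 0, 0)
```

Changing an entry in `palette` recolors every pixel that references it, which is why modifiable (pseudocolor) palettes were so useful on memory-constrained hardware.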

Old graphics chips, particularly those used in home computers and video game consoles, often feature an additional level of palette mapping in order to increase the maximum number of simultaneously displayed colors. For example, in the ZX Spectrum, the picture is stored in a two-color format, but these two colors can be separately defined for each rectangular block of 8×8 pixels.
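As a worked example of the memory savings, the ZX Spectrum layout described above can be computed directly, assuming the machine's standard 256 × 192 screen:

```python
# Memory cost of the ZX Spectrum scheme: a 1-bit-per-pixel bitmap,
# plus one attribute byte per 8x8 block holding that block's
# two colors.
WIDTH, HEIGHT = 256, 192                       # the Spectrum's screen size

bitmap_bytes = WIDTH * HEIGHT // 8             # 1 bit per pixel
attr_bytes = (WIDTH // 8) * (HEIGHT // 8)      # 1 byte per 8x8 block
total = bitmap_bytes + attr_bytes

print(bitmap_bytes, attr_bytes, total)  # 6144 768 6912
```

The 6,912-byte total is far smaller than the 12,288 bytes a flat 2-bit-per-pixel bitmap of the same resolution would need, at the cost of only two colors per block.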

Direct color
As the number of bits increases, the number of possible colors becomes impractically large for a color map. So in higher color depths, the color value typically directly encodes relative brightnesses of red, green, and blue to specify a color in the RGB color model.

8-bit direct color
A very limited but true direct-color system: there are 3 bits (8 possible levels) each for the R and G components, and the two remaining bits in the pixel's byte are assigned to the B component (4 levels), enabling 256 (8 × 8 × 4) different colors. The normal human eye is less sensitive to the blue component than to the red or green, so it is assigned one bit less than the others. Used, amongst others, in the MSX2 series of computers in the mid-to-late 1980s.

This is not to be confused with an indexed color depth of 8 bpp (although it can be simulated in such systems by loading a suitable palette).
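A minimal sketch of the 3-3-2 packing described above, in Python (the helper names are our own):

```python
def pack_rgb332(r, g, b):
    """Pack 8-bit-per-channel RGB into one 3-3-2 byte:
    top 3 bits of R and G, top 2 bits of B are kept."""
    return ((r >> 5) << 5) | ((g >> 5) << 2) | (b >> 6)

def unpack_rgb332(p):
    """Expand a 3-3-2 byte back to approximate 8-bit channels."""
    r = (p >> 5) & 0b111
    g = (p >> 2) & 0b111
    b = p & 0b11
    # Scale each field back up to the 0-255 range.
    return (r * 255 // 7, g * 255 // 7, b * 255 // 3)

white = pack_rgb332(255, 255, 255)
print(white, unpack_rgb332(white))  # 255 (255, 255, 255)
```

Note how much information is discarded: the blue channel keeps only its top two bits, reflecting the eye's lower sensitivity to blue.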

12-bit direct color
In 12-bit direct color, there are 4 bits (16 possible levels) for each of the R, G, and B components, enabling 4,096 (16 × 16 × 16) different colors. This color depth is sometimes used in small devices with a color display, such as mobile telephones.

HighColor
Highcolor or HiColor is considered sufficient to provide life-like colors, and is encoded using either 15 or 16 bits:


 * 15-bit color uses 5 bits to represent red, 5 for green, and 5 for blue. Since 2⁵ = 32, there are 32 levels of each color, which can therefore be combined to give a total of 32,768 (32 × 32 × 32) mixed colors.


 * Many 16-bit color schemes use 5 bits to represent red and 5 bits to represent blue, but (since the human eye is more sensitive to the color green) 6 bits to represent 64 levels of green; this is sometimes known as the 5650 format. These can therefore be combined to give 65,536 (32 × 64 × 32) mixed colors. Some formats, like the Macintosh 16-bit color scheme known as "Thousands of colors", use 5 bits for each of the colors and the last bit for a 1-bit alpha value. Another format uses 4 bits for each color and the alpha channel, known as the 4444 format.
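The 5-6-5 layout can be sketched similarly. The bit order here follows the common red-in-high-bits convention; actual hardware layouts vary:

```python
def pack_rgb565(r, g, b):
    """Pack 8-bit channels into a 16-bit 5-6-5 word:
    5 bits red, 6 bits green (the eye is most sensitive
    to green), 5 bits blue."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def unpack_rgb565(p):
    """Expand a 5-6-5 word back to approximate 8-bit channels."""
    r = (p >> 11) & 0x1F
    g = (p >> 5) & 0x3F
    b = p & 0x1F
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

print(hex(pack_rgb565(255, 255, 255)))  # 0xffff
```

The extra green bit means green gradients show half the banding of red or blue ones, a deliberate match to human vision.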

LCD displays

 * Almost all cheap LCD displays use dithered 18-bit color (64 × 64 × 64 = 262,144 combinations) to achieve faster transition times, without sacrificing truecolor display levels entirely.

Truecolor
Truecolor can mimic far more of the colors found in the real world, producing over 16.7 million distinct colors. This approaches the level at which megapixel monitors can display distinct colors for most photographic images, though image manipulation, monochromatic images (which are restricted to 256 levels, owing to their single channel), large images or “pure” generated images reveal banding and dithering artifacts.

However, Truecolor, like other RGB color models, cannot express colors outside of its RGB color space (generally sRGB), such as saturated yellow – this is a problem of limited gamut.


 * 24-bit truecolor uses 8 bits to represent red, 8 bits to represent green, and 8 bits to represent blue. 2⁸ = 256 levels of each of these three colors can therefore be combined to give a total of 16,777,216 (256 × 256 × 256) mixed colors. Twenty-four-bit color is referred to as "millions of colors" on Macintosh systems.
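A minimal sketch of 24-bit packing; the resulting integer matches the familiar 0xRRGGBB web-color notation:

```python
def pack_rgb888(r, g, b):
    """Pack three full 8-bit channels into one 24-bit value."""
    return (r << 16) | (g << 8) | b

def unpack_rgb888(p):
    """Recover the three 8-bit channels, losslessly."""
    return ((p >> 16) & 0xFF, (p >> 8) & 0xFF, p & 0xFF)

orange = pack_rgb888(255, 165, 0)
print(hex(orange))  # 0xffa500
```

Unlike the 3-3-2 and 5-6-5 schemes, this round trip is lossless: no channel bits are discarded.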

30-bit color
Video cards with 10 bits per color, or 30-bit color, started coming onto the market in the late 1990s. An early example was the Radius ThunderPower card for the Macintosh, which included extensions for QuickDraw and Photoshop plugins to support editing 30-bit images.

32-bit color
"32-bit color" is generally a misnomer in regard to display color depth. While actual 32-bit color at ten to eleven bits per channel produces over 4.2 billion distinct colors, the term “32-bit color” is most often a misuse referring to 24-bit color images with an additional eight bits of non-color data (I.E.: alpha, Z or bump data), or sometimes even to plain 24-bit data.

Systems using more than 24 bits in a 32-bit pixel for actual color data exist, but most of them opt for a 30-bit implementation with two bits of padding so that they can have an even 10 bits of color for each channel, similar to many HiColor systems.
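One possible 10-10-10 packing with two padding bits can be sketched as follows; the bit order shown is an assumption, and real formats (such as the various X2R10G10B10-style layouts) differ between vendors:

```python
def pack_rgb30(r, g, b):
    """Pack three 10-bit channels (0-1023 each) into the low
    30 bits of a 32-bit word, leaving the top 2 bits as padding."""
    assert all(0 <= c <= 1023 for c in (r, g, b))
    return (r << 20) | (g << 10) | b

print(hex(pack_rgb30(1023, 1023, 1023)))  # 0x3fffffff
```

Keeping each channel at a uniform 10 bits simplifies hardware, at the cost of two unused bits per pixel, much as 15-bit HighColor wastes one bit of a 16-bit word.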

Beyond truecolor
While some high-end graphics workstation systems and the accessories marketed toward use with such systems, as from SGI, have always used more than 8 bits per channel, such as 12 or 16 (36-bit or 48-bit color), such color depths have only worked their way into the general market more recently.

As bit depths climb above 8 bits per channel, some systems use the extra bits to store more intensity range than can be displayed all at once, as in high dynamic range imaging (HDRI). Floating-point numbers are used to describe values in excess of 'full' white and black. This allows an image to accurately describe the intensity of the sun and deep shadows in the same color space, for less distortion after intensive editing. Various models describe these ranges, many employing 32-bit accuracy per channel. A newer format is ILM's "half", which uses 16-bit floating-point numbers; this appears to be a much better use of 16 bits than 16-bit integers, and is likely to replace them entirely as hardware becomes fast enough to support it.
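Python's struct module implements the same IEEE 754 binary16 layout as ILM's "half" (format code 'e'), which makes the behavior easy to demonstrate:

```python
import struct

def to_half_and_back(x):
    """Round-trip a float through IEEE 754 binary16 ("half")."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# Values far above 1.0 ("brighter than full white") survive intact,
# which is the point of HDR storage formats:
print(to_half_and_back(1000.0))  # 1000.0

# Precision is limited to an 11-bit significand, so most values
# come back slightly rounded:
print(to_half_and_back(0.1))
```

This is the trade-off the text describes: a half trades fine precision for enormous dynamic range, whereas a 16-bit integer spends all its bits on precision within a fixed 0-to-full-white range.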

The ATI FireGL V7350 graphics card supports 40-bit and 64-bit color.

Television color
Most of today's TVs and computer screens form images by varying the intensity of just three primary colors: red, green, and blue. Bright yellow, for example, is composed of equal parts red and green, with no blue component. However, this is only an approximation, and is not as saturated as actual yellow light. For this reason, recent technologies such as Texas Instruments' BrilliantColor augment the typical red, green, and blue channels with up to three others: cyan, magenta, and yellow. Mitsubishi and Samsung, among others, use this technology in some TV sets. Assuming that 8 bits are used per color, such six-color images would have a color depth of 48 bits.

Analog TVs use continuous signals and therefore have a theoretically infinite color depth.