LCD Displays and Bit Color Depth

Explaining the Difference Between 6-Bit, 8-Bit, and 10-Bit Displays

ViewSonic VX4275SMhl 24-inch 4K monitor (image: ViewSonic)

The color range of a computer is defined by its color depth: the total number of colors the computer is capable of displaying given its hardware. The most common color depths you'll see are 8-bit (256 colors), 16-bit (65,536 colors), and 24-bit (16.7 million colors). True color (24-bit color) is the most frequently used mode now because modern hardware can easily work at this depth.
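
The relationship between a color depth and the number of colors it can represent is simply a power of two. Here is a minimal Python sketch of that arithmetic, using only the figures quoted above:

  # Total colors available at a given color depth: 2 raised to the bit count.
  def total_colors(bit_depth: int) -> int:
      return 2 ** bit_depth

  for bits in (8, 16, 24):
      print(f"{bits}-bit color: {total_colors(bits):,} colors")

  # Prints:
  # 8-bit color: 256 colors
  # 16-bit color: 65,536 colors
  # 24-bit color: 16,777,216 colors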

Some professional designers and photographers work at a 32-bit color depth, mainly as a way to pad the color data so that tones remain well defined when the project is rendered down to the 24-bit level.

Speed Versus Color

LCD monitors struggle to balance color and speed. Color on an LCD is produced by three colored sub-pixels that combine to form each final pixel. To display a given color, current must be applied to each sub-pixel to drive its liquid crystals to the desired intensity. The problem is that moving the crystals between their on and off states takes time. This transition is called the response time, and for most screens it is around 8 to 12 milliseconds.
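
To put those numbers in context, here is a rough back-of-the-envelope Python calculation (an illustration, not something from a monitor spec sheet) comparing a panel's response time with the time available to draw one frame at a common 60 Hz refresh rate:

  # Time available per frame at a given refresh rate, in milliseconds.
  def frame_time_ms(refresh_hz: float) -> float:
      return 1000.0 / refresh_hz

  for response_ms in (8, 12):
      share = response_ms / frame_time_ms(60)
      print(f"{response_ms} ms response is {share:.0%} of a "
            f"{frame_time_ms(60):.1f} ms frame at 60 Hz")

  # Prints roughly:
  # 8 ms response is 48% of a 16.7 ms frame at 60 Hz
  # 12 ms response is 72% of a 16.7 ms frame at 60 Hz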

The problem with response time becomes apparent when LCD monitors display motion or video. With a high response time, pixels that should have transitioned to the new color levels trail the signal, producing an effect called motion blur. This isn't a problem if the monitor is used with applications such as productivity software, but with high-speed video and certain video games it can be jarring.

Because consumers demanded faster screens, many manufacturers reduced the number of levels each color sub-pixel renders. This reduction in intensity levels lets response times drop, but it has the drawback of reducing the overall range of colors the panel supports.

6-Bit, 8-Bit, or 10-Bit Color

LCD TV pixels showing the RGB sub-pixels

Color depth was previously described by the total number of colors a screen can render, but when referring to LCD panels, the number of levels each color can render is used instead.

For example, 24-bit, or true color, is composed of three colors (red, green, and blue), each with eight bits per color. Mathematically, this is represented as:

  • 2^8 x 2^8 x 2^8 = 256 x 256 x 256 = 16,777,216

High-speed LCD monitors typically reduce the number of bits for each color to six instead of the standard eight. This 6-bit color generates far fewer colors than 8-bit, as the math shows:

  • 2^6 x 2^6 x 2^6 = 64 x 64 x 64 = 262,144
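
The same arithmetic extends to any per-channel depth. Here is a short Python sketch covering the 6-, 8-, and 10-bit cases discussed in this article:

  # With N bits per channel, each of the three RGB channels has 2**N levels,
  # so the total palette is (2**N) cubed.
  def total_palette(bits_per_channel: int) -> int:
      return (2 ** bits_per_channel) ** 3

  for bpc in (6, 8, 10):
      levels = 2 ** bpc
      print(f"{bpc}-bit per channel: {levels} levels each, "
            f"{total_palette(bpc):,} total colors")

  # Prints:
  # 6-bit per channel: 64 levels each, 262,144 total colors
  # 8-bit per channel: 256 levels each, 16,777,216 total colors
  # 10-bit per channel: 1024 levels each, 1,073,741,824 total colors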

This reduction is noticeable to the human eye. To get around the problem, device manufacturers employ a technique called dithering, in which nearby pixels use slightly varying shades of color that trick the eye into perceiving the desired color even though it isn't truly that color. A color newspaper photo is a good way to see this effect in practice; in print, the technique is called halftoning. Using this technique, manufacturers claim to achieve a color depth close to that of true color displays.
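
The article doesn't say which dithering method panel makers use, but error diffusion is one common approach. The simplified Python sketch below quantizes a row of 8-bit values to 6-bit levels and pushes the rounding error onto the next pixel, so the average brightness is preserved even though no single pixel shows the exact color:

  # Simplified one-dimensional error-diffusion dithering (an illustration,
  # not any manufacturer's actual algorithm): 8-bit input, 6-bit output.
  def dither_row(row, bits=6):
      levels = 2 ** bits - 1                    # 63 steps for 6-bit
      out, error = [], 0.0
      for pixel in row:
          target = pixel + error                # carry error from the left
          level = round(target / 255 * levels)  # nearest displayable level
          shown = level * 255 / levels          # what the panel can show
          error = target - shown                # brightness that was lost
          out.append(round(shown))
      return out

  # A flat mid-tone with no exact 6-bit equivalent comes out as a mix of the
  # two nearest displayable levels (about 97 and 101), which the eye averages
  # back to roughly 100 at a normal viewing distance.
  print(dither_row([100] * 10))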

Why multiply three numbers in the first place? Because computer displays use the RGB color space, which means that, for 8-bit color, the final image you see on the screen is a composite of one of 256 shades each of red, green, and blue.

There is another level of display used by professionals: the 10-bit display. In theory, it can show more than a billion colors, more than even the human eye can discern.

There are a number of drawbacks to these displays. First, the amount of data required for such high color demands a high-bandwidth connection; typically, these monitors and video cards use DisplayPort. Second, even though the graphics card can render upwards of a billion colors, the display's color gamut, or the range of colors it can actually reproduce, will be considerably smaller. Even the ultra-wide-gamut displays that support 10-bit color cannot render all of those colors. The result is displays that tend to be a bit slower and much more expensive, which is why they are not common among home consumers.
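
As a rough illustration of the bandwidth point (my own back-of-the-envelope math, not a figure from the article), here is the raw pixel data rate for a 4K image at 60 Hz at 8 versus 10 bits per channel, ignoring blanking intervals and link overhead:

  # Raw video data rate, ignoring blanking and protocol overhead.
  def raw_gbps(width, height, refresh_hz, bits_per_channel):
      bits_per_pixel = bits_per_channel * 3    # one value each for R, G, B
      return width * height * refresh_hz * bits_per_pixel / 1e9

  for bpc in (8, 10):
      print(f"{bpc}-bit per channel: {raw_gbps(3840, 2160, 60, bpc):.1f} Gbit/s")

  # Prints approximately:
  # 8-bit per channel: 11.9 Gbit/s
  # 10-bit per channel: 14.9 Gbit/s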

How to Tell How Many Bits a Display Uses

Professional displays are often quick to advertise 10-bit color support; even then, you have to look at the display's real color gamut. Most consumer displays don't say how many bits they actually use and instead list the number of colors they support. If the manufacturer lists the color as 16.7 million, assume the display is 8-bit per color. If the colors are listed as 16.2 million or 16 million, assume it uses a 6-bit per-color depth. If no color depth is listed at all, assume that monitors rated at 2 ms or faster are 6-bit and that most panels rated at 8 ms or slower are 8-bit.
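
That rule of thumb is easy to capture in code. A minimal Python sketch, assuming the marketing figures mentioned above are the only clue you have:

  # Map an advertised color count to a probable per-channel bit depth,
  # following the rule of thumb in the text.
  def likely_bits_per_channel(advertised: str) -> str:
      spec = advertised.lower().replace(" ", "")
      if "16.7million" in spec:
          return "likely true 8-bit per channel"
      if "16.2million" in spec or "16million" in spec:
          return "likely 6-bit per channel with dithering"
      return "unknown - check the panel's detailed specifications"

  for listing in ("16.7 million colors", "16.2 million colors"):
      print(f"{listing}: {likely_bits_per_channel(listing)}")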

Does it Really Matter?

The amount of color really matters to people who do professional graphics work. For them, the color displayed on the screen is very important. The average consumer does not really need this level of color from a monitor, so for most people it probably doesn't matter. People who use their displays for video games or watching videos will likely care less about the number of colors the LCD renders than about the speed at which it can display them. As a result, it is best to determine your needs and base your purchase on those criteria.