LCD Displays and Bit Color Depth

Explaining the Difference Between 6, 8 and 10-bit Displays

Color Depth

The color range of a computer is defined by the term color depth: the total number of colors that the computer can display to the user. The most common color depths users will encounter when dealing with PCs are 8-bit (256 colors), 16-bit (65,536 colors) and 24-bit (16.7 million colors). True color (or 24-bit color) is now the most frequently used mode, as computers have become powerful enough to work at this color depth with ease.

Some professionals use a 32-bit color depth, but this is mainly used to pad the color data so that more defined tones result when the image is rendered down to the 24-bit level.
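
To see why that padding helps, here is a minimal sketch, not any particular graphics pipeline: it darkens an 8-bit channel value and then brightens it again, once with the intermediate result forced back to an integer and once with the intermediate kept at higher precision.

  # Why extra working precision gives more defined tones: darken an
  # 8-bit value by 90 percent, then brighten it back.
  def roundtrip_8bit(v):
      darkened = int(v * 0.1)       # intermediate forced back to an integer
      return int(darkened / 0.1)

  def roundtrip_padded(v):
      darkened = v * 0.1            # intermediate kept at full precision
      return round(darkened / 0.1)  # a single rounding at the very end

  for v in (50, 53, 57):
      print(v, roundtrip_8bit(v), roundtrip_padded(v))
  # The 8-bit path collapses 50, 53 and 57 all to 50 (visible banding);
  # the padded path recovers each original value.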

Speed Versus Color

LCD monitors have encountered a bit of a problem when it comes to balancing color and speed. Color on an LCD is composed of three layers of colored dots that make up the final pixel. To display a given color, current must be applied to each color layer to produce the intensity that generates the final color. The problem is that to change colors, the current must drive the crystals between the on and off states to reach the desired intensity levels. This transition from the on to the off state is called the response time, and for most screens it was rated at around 8 to 12 ms.

The problem is that many LCD monitors are used to watch video or other motion on the screen. With a slow response time for transitions between the off and on states, pixels that should have shifted to the new color levels trail behind the signal, producing an effect known as motion blurring.
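
A little arithmetic shows why those transition times matter. The sketch below assumes 60 frames per second content, a figure chosen for illustration rather than taken from this article, and compares the quoted 8 to 12 ms response times against the length of a single frame:

  # How much of one video frame a pixel spends in transition, assuming
  # 60 fps content (an illustrative assumption; the 8-12 ms figures are
  # the response times quoted above).
  frame_time_ms = 1000 / 60          # about 16.7 ms per frame

  for response_ms in (8, 12):
      share = response_ms / frame_time_ms
      print(f"{response_ms} ms response = {share:.0%} of a frame")
  # 8 ms is ~48% of a frame and 12 ms is ~72%, so slow pixels are still
  # changing when the next frame arrives, which reads as blur.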

This isn't a problem if the monitor is being used with applications such as productivity software, but with video and motion it can be jarring.

Since consumers were demanding faster screens, something needed to be done to improve response times. To accomplish this, many manufacturers turned to reducing the number of intensity levels each color pixel renders.

This reduction in the number of intensity levels allows the response times to drop but has the drawback of reducing the overall number of colors that can be rendered.

6-Bit, 8-Bit or 10-Bit Color

Color depth was previously referred to by the total number of colors the screen can render, but when referring to LCD panels, the number of levels that each color can render is used instead. This can make things difficult to follow, so it helps to look at the mathematics. For example, 24-bit or true color is composed of three colors, each with 8 bits of color. Mathematically, this is represented as:

  • 2^8 x 2^8 x 2^8 = 256 x 256 x 256 = 16,777,216

High-speed LCD monitors typically reduce the number of bits for each color to 6 instead of the standard 8. This 6-bit color will generate far fewer colors than 8-bit as we see when we do the math:

  • 2^6 x 2^6 x 2^6 = 64 x 64 x 64 = 262,144
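
The same arithmetic extends to any bit depth. Here is a small sketch that computes the totals for the 6-bit, 8-bit and 10-bit panels discussed in this article:

  # Total colors = (levels per channel) ** 3, one factor each for the
  # red, green and blue layers, with levels = 2 ** bits_per_channel.
  def total_colors(bits_per_channel):
      levels = 2 ** bits_per_channel
      return levels ** 3

  for bits in (6, 8, 10):
      print(f"{bits}-bit per color: {total_colors(bits):,} colors")
  # 6-bit: 262,144   8-bit: 16,777,216   10-bit: 1,073,741,824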

This is so far below what a true color display produces that the difference would be noticeable to the human eye. To get around this problem, manufacturers employ a technique referred to as dithering, an effect where nearby pixels use slightly varying shades of color that trick the human eye into perceiving the desired color even though it isn't truly that color.

A color newspaper photo is a good way to see this effect in practice. In print, the effect is known as halftoning. By using this technique, manufacturers claim to achieve a color depth close to that of true color displays.
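
To make the idea concrete, here is a minimal sketch of one form of dithering, simple one-dimensional error diffusion, chosen for illustration; actual panels use their own, more elaborate schemes. It quantizes a row of 8-bit values down to the 64 levels a 6-bit panel can show, carrying the rounding error forward so the row still averages out to the requested shade:

  # Quantize 8-bit values (0-255) to 6-bit levels (0-63), diffusing the
  # rounding error into the next pixel. This is an illustrative scheme,
  # not any manufacturer's actual algorithm.
  def dither_row(values):
      out, error = [], 0.0
      for v in values:
          target = (v + error) * 63 / 255      # ideal fractional 6-bit level
          level = round(target)                # nearest level the panel has
          error = (target - level) * 255 / 63  # carry the shortfall forward
          out.append(level)
      return out

  row = [200] * 8                # a flat shade that falls between two levels
  levels = dither_row(row)
  print(levels)                  # a mix of the adjacent levels 49 and 50
  average = sum(l * 255 / 63 for l in levels) / len(levels)
  print(round(average))          # ~200: the eye averages it back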

There is another level of display used by professionals, called a 10-bit display. In theory, this can display over a billion colors, more than the human eye can discern. There are a number of drawbacks to these types of displays, which is why they are used only by professionals. First, the amount of data required for such high color requires a very high bandwidth data connector.

Typically, these monitors and video cards will use a DisplayPort connector. Second, even though the graphics card can render upwards of a billion colors, the display's color gamut, or the range of colors it can actually reproduce, will be less than this. Even the ultra-wide color gamut displays that support 10-bit color cannot truly render all of those colors. All of this generally means displays that tend to be a bit slower and much more expensive, which is why they are not common among consumers.
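
To give a sense of the bandwidth involved, here is a back-of-the-envelope sketch. The 3840 x 2160 resolution and 60 Hz refresh rate are assumptions for illustration, and overheads such as blanking intervals and link encoding are ignored:

  # Raw pixel bandwidth = width * height * refresh rate * bits per pixel.
  # Resolution and refresh rate are illustrative assumptions; real links
  # also carry blanking and encoding overhead.
  def raw_gbits_per_second(width, height, refresh_hz, bits_per_channel):
      bits_per_pixel = bits_per_channel * 3   # red, green and blue
      return width * height * refresh_hz * bits_per_pixel / 1e9

  for bpc in (8, 10):
      print(f"{bpc}-bit: {raw_gbits_per_second(3840, 2160, 60, bpc):.1f} Gbit/s")
  # 8-bit: 11.9 Gbit/s   10-bit: 14.9 Gbit/s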

How to Tell How Many Bits a Display Uses

This is the biggest problem for individuals looking to purchase an LCD monitor. Professional displays are often quick to advertise 10-bit color support, but once again, you have to look at the real color gamut of these displays. Most consumer displays will not say how many bits they actually use; instead, they tend to list the number of colors they support. A few rules of thumb (sketched in code after this list):

  • If the manufacturer lists the color as 16.7 million colors, assume the display is 8-bit per color.
  • If the colors are listed as 16.2 million or 16 million, assume a 6-bit per-color depth.
  • If no color depth is listed, assume that monitors rated at 2 ms or faster are 6-bit and that most 8 ms and slower panels are 8-bit.
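
Here is that heuristic as a small function. The spec strings and the function name are hypothetical, for illustration only; real product listings vary by manufacturer:

  # Guess a panel's per-color bit depth from its advertised specs.
  # The rules mirror the list above; the strings are hypothetical.
  def guess_bit_depth(advertised_colors, response_ms):
      if advertised_colors == "16.7 million":
          return 8
      if advertised_colors in ("16.2 million", "16 million"):
          return 6
      # No color count listed: fall back on the response-time heuristic.
      return 6 if response_ms <= 2 else 8

  print(guess_bit_depth("16.2 million", 5))   # 6
  print(guess_bit_depth(None, 8))             # 8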

Does it Really Matter?

This depends very much on the user and what the computer is used for. The amount of color really matters to those who do professional graphics work, for whom accurate on-screen color is very important.

The average consumer does not really need this level of color representation from their monitor, so it probably doesn't matter. People using their displays for video games or watching video will likely care less about the number of colors rendered by the LCD than about the speed at which they can be displayed. As a result, it is best to determine your needs and base your purchase on those criteria.