HDR vs. 4K: What's the Difference?

4K and HDR both improve image quality, but not in the same way

When shopping for a TV you may come across the terms "4K" and "HDR." Both of these technologies help improve image quality—but they do so in very different ways. Let's cut through the noise and learn what they mean.

HDR vs 4K

Overall Findings

4K

  • Refers to screen resolution—the number of pixels a screen can fit.

  • Used synonymously with "Ultra HD" (UHD). Refers to a horizontal screen resolution of roughly 4,000 pixels.

  • Requires UHD-compatible devices and components to avoid upscaling.

HDR

  • Stands for "High Dynamic Range."

  • Wider color gamut and contrast range than Standard Dynamic Range (SDR).

  • Bright tones made brighter without overexposing. Dark tones made darker without underexposing.

4K and HDR are not competing standards. 4K refers to screen resolution, or the number of pixels that can fit on a television screen or display; it's also sometimes referred to as "UHD" or "Ultra HD," although there is a slight difference. HDR, on the other hand, stands for "High Dynamic Range," and refers to the contrast or color range between the lightest and darkest tones in an image. HDR delivers a higher contrast—or larger color and brightness range—than Standard Dynamic Range (SDR), and is more visually impactful than 4K. That said, 4K delivers a sharper, more defined image.

Both standards are increasingly common among premium digital televisions, and both deliver stellar image quality. TV makers prioritize the application of HDR to 4K Ultra HD TVs over 1080p or 720p TVs, so there is little need to choose between the two standards.

4K resolution may also be referred to as Ultra HD, UHD, 2160p, Ultra High Definition, or 4K Ultra High Definition.

Resolution: 4K Is the Standard

4K

  • The 4K/UHD TV standard is 3840 x 2160 pixels. The 4K cinema standard is 4096 x 2160 pixels.

  • Four times the number of pixels of 1080p, which means four 1080p images can fit in the space of one 4K image.

HDR

  • Resolution-agnostic, although most HDR TVs are also 4K TVs.

4K refers to a specific screen resolution, and HDR has nothing to do with resolution. While HDR has competing standards, some of which specify a minimum 4K resolution, the term is generally used to describe any video or display with a higher contrast or dynamic range than standard dynamic range (SDR) content.

For digital televisions, 4K can mean one of two resolutions. The most common is the "Ultra HD" or "UHD" format of 3840 horizontal pixels by 2160 vertical pixels. The less common resolution, mostly reserved for cinema and movie projectors, is 4096 x 2160 pixels.

Each 4K resolution has four times the number of pixels (twice the lines and twice the columns) of a 1080p display—the next-highest resolution you'll find in a consumer television. That means you can fit four 1080p images in the space of one 4K image. With an aspect ratio of 16:9, the total number of pixels in a 4K image exceeds 8 megapixels.
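The pixel arithmetic above is easy to verify yourself. This short Python snippet uses the standard consumer resolutions (nothing assumed beyond the figures in the text):

```python
# Pixel counts for common consumer resolutions
uhd_w, uhd_h = 3840, 2160   # 4K UHD
fhd_w, fhd_h = 1920, 1080   # 1080p Full HD

uhd_pixels = uhd_w * uhd_h  # 8,294,400 pixels (~8.3 megapixels)
fhd_pixels = fhd_w * fhd_h  # 2,073,600 pixels

print(uhd_pixels / fhd_pixels)   # 4.0 -> four 1080p frames fit in one 4K frame
print(uhd_pixels / 1_000_000)    # 8.2944 megapixels
```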

4K (like every other TV resolution) remains constant regardless of screen size. However, the number of pixels per inch varies with the size of the screen. This means that as a TV screen grows, the pixels grow larger or are spaced further apart to deliver the same resolution.
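That relationship between screen size and pixel density follows the standard pixels-per-inch formula: the pixel count along the diagonal divided by the diagonal size in inches. The screen sizes below are just illustrative examples:

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_in):
    """Pixel density: diagonal pixel count divided by diagonal size in inches."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_in

# The same 4K resolution spread across different screen sizes
for size in (55, 65, 85):
    print(size, round(pixels_per_inch(3840, 2160, size), 1))
```

The larger the screen, the lower the density—which is why resolution differences matter more on big displays.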

4K Resolution Comparison Chart
Image courtesy of OPPO Digital

HDR televisions must meet a set of resolution, contrast, and color standards to be considered HDR. Those standards vary, but all HDR displays are defined as having a higher dynamic range than SDR, as well as a minimum 10-bit color depth. Because most HDR TVs are 4K TVs, most incidentally have a resolution of 3840 x 2160 pixels. HDR TVs also usually have a peak brightness higher than 1,000 nits for LED TVs and 540 nits for OLED TVs.
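The jump from SDR's typical 8-bit color to HDR's minimum 10-bit color is easy to quantify—this is plain arithmetic, not a claim about any particular TV:

```python
def color_counts(bits_per_channel):
    """Shades per channel and total displayable colors at a given bit depth."""
    shades = 2 ** bits_per_channel  # levels per red/green/blue channel
    total = shades ** 3             # all RGB combinations
    return shades, total

print(color_counts(8))   # (256, 16777216)   -> ~16.7 million colors (typical SDR)
print(color_counts(10))  # (1024, 1073741824) -> ~1.07 billion colors (HDR minimum)
```

Those extra shades per channel are what allow HDR's smoother gradients without visible banding.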

Color and Contrast: HDR Is More Visually Impactful

4K

  • As a resolution, 4K's impact on color comes mostly through higher definition.

HDR

  • Dramatically improved color reproduction and contrast. HDR has a bigger visual impact than 4K.

  • Greater visual impact than SDR: more accurate colors, smoother light and color shading, more detailed images.

Color reproduction improves dramatically in HDR televisions. As a resolution, 4K does not affect color much beyond providing added definition. This is why 4K and HDR often go hand in hand: together, they address the two most important aspects of picture quality—definition and color.

As a technology, HDR expands the distance between white and black, making the contrast more intense without overexposing bright colors or underexposing dark colors.

When high dynamic range images are captured, the extra information is used in post-production to "grade" the content so that the widest possible contrast range is obtained. The images are graded to produce a wide color gamut, which makes for deeper, more saturated colors, smoother shading, and more detailed images. Grading may be applied to each frame or scene, or as static reference points for an entire film or program.

When an HDR television detects HDR-encoded content, bright whites appear without blooming or washout, and deep blacks appear without muddiness or crushing. In short, the colors look more saturated.

For example, in a sunset scene, you should see the bright light of the sun and the darker portions of the image with similar clarity, along with all the brightness levels in between. Check out the example below.

Sony SDR and HDR Comparison

There are two ways for a TV to display HDR:

  • HDR Encoded Content: There are four primary HDR formats: HDR10/10+, Dolby Vision, HLG, and Technicolor HDR. The brand or model of HDR TV determines which format it is compatible with. If a TV can't detect a compatible HDR format, it will display the images in SDR.
  • SDR-to-HDR Processing: Similar to how TVs can upscale resolutions, an HDR TV with SDR-to-HDR upscaling can analyze the contrast and brightness information of an SDR signal and expand the dynamic range to approximate HDR quality.

Compatibility: End-to-End for the Full 4K HDR Experience

4K

  • Full 4K UHD resolution requires 4K-compatible equipment from source to display—including the set-top box or Blu-ray player, streaming device, HDMI cable, and TV.

HDR

  • Also requires end-to-end compatibility.

  • Available content more limited than 4K.

4K televisions require end-to-end compatibility among all components to enjoy authentic or "true" 4K resolution. The same is generally true of HDR: You need both an HDR TV and content that was produced using an HDR format. By some measures there is less content available in HDR than there is in 4K, but that's beginning to change.

To enjoy full 4K UHD resolution, you need 4K-compatible equipment down the line. That includes home theater receivers, media streamers, Ultra HD Blu-ray players, and video projectors, as well as the native resolution of the content you're watching. You'll also need a high-speed HDMI cable. 4K is more common among larger televisions because the difference between 4K and 1080p is less noticeable on screens smaller than 55 inches. However, the HDR effect may look different from TV to TV depending on the amount of light the display can emit.

Some 4K devices can upscale lower native resolutions to 4K, but the conversion is not always smooth. 4K has not yet been implemented in over-the-air TV broadcasting in the U.S., so over-the-air (OTA) content will need to be upscaled to view in 4K. Similarly, not all HDR TVs can upscale from SDR to HDR. When shopping for a TV with HDR capability, consider the TV's compatibility with the HDR10/10+, Dolby Vision, and HLG formats, as well as the TV's peak brightness capability, which is measured in nits.

How well an HDR-enabled TV displays HDR depends on how much light the TV can emit. This is referred to as peak brightness and is measured in nits. Content encoded in the Dolby Vision HDR format, for example, may provide a range of 4,000 nits between the blackest black and the whitest white. Few HDR TVs can emit that much light, but a growing number of displays can reach 1,000 nits; most HDR TVs will display less. OLED HDR TVs max out between 750 and 800 nits, while low-end LED/LCD HDR TVs may emit just 500 nits. However, because the pixels in an OLED TV are individually lit, they can display absolute black and thus have a higher perceived dynamic range.

When a TV detects an HDR signal but can't emit enough light to display its full dynamic potential, it employs tone mapping to best match the dynamic range of the HDR content to the TV's light output.
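Every manufacturer implements tone mapping differently, but the basic idea can be illustrated with a simple Reinhard-style curve that compresses content luminance (in nits) into a display's smaller brightness range. The function below is an illustrative sketch under assumed peak values—not any TV's actual algorithm:

```python
def tone_map(nits, content_peak=4000.0, display_peak=750.0):
    """Compress a luminance value mastered up to content_peak so the
    brightest content lands exactly at display_peak (Reinhard-style curve)."""
    x = nits / content_peak        # normalize content luminance to 0..1
    curve = x / (1.0 + x)          # compressive curve: never clips hard
    peak_curve = 0.5               # value of the curve at x = 1 (content peak)
    return display_peak * curve / peak_curve

# A 4,000-nit Dolby Vision master squeezed onto a ~750-nit OLED
for nits in (100, 1000, 4000):
    print(nits, "->", round(tone_map(nits)))
```

Note how highlights are compressed most aggressively: the 4,000-nit peak lands at 750 nits, while mid-range values lose proportionally less. Real TVs use far more sophisticated curves, often tuned per scene.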

4K vs HDR: Do You Have to Choose?

4K and HDR are not competing standards, so you don't need to choose between them. And because most premium TVs have both standards, you don't need to worry about focusing on one standard over the other, especially if you're buying a TV that's larger than 55 inches. If you are looking for a smaller TV than that, you may be happy with a 1080p display, as you probably won't notice the difference in resolution anyway.