What Is a High-Definition PC Monitor?

The Meaning of HD, UHD, 4K, and Other Terms for Monitors

Acer G247HYL monitor. Image credit: Amazon.

High definition (HD) refers to video quality above standard definition. A display is generally considered HD if it has more than 480 vertical lines of resolution, while Full HD specifically means 1080 vertical lines.

For PC monitors, high definition is used somewhat interchangeably with high resolution. High-resolution displays pack more pixels per inch than older standard-definition TV screens, which makes the image sharper and clearer because individual pixels are harder for the eye to pick out. A high-definition PC monitor therefore delivers a noticeably clearer picture than lower-definition, lower-resolution screens can.
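
To make pixel density concrete, here is a minimal Python sketch that computes pixels per inch (PPI) from a monitor's resolution and diagonal size. The function name and the 24-inch example panels are illustrative, not measurements of any particular product.

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_in: float) -> float:
    """PPI = diagonal resolution in pixels / diagonal size in inches."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

# Two illustrative 24-inch panels: same size, very different pixel density
print(f"24-inch 1920 x 1080: {pixels_per_inch(1920, 1080, 24):.0f} PPI")  # ~92 PPI
print(f"24-inch 3840 x 2160: {pixels_per_inch(3840, 2160, 24):.0f} PPI")  # ~184 PPI
```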

An HD monitor makes a significant difference in image quality when you play video games on your computer, watch movies, or stream HD online video. HD also means widescreen; for movies, that is generally how they were originally intended to be seen: the uncropped, full-width image shown in the theater. Since HDTV caught on, video game studios and online entertainment companies have focused more and more on HD programming designed for high-resolution screens.

High Definition

By now, everyone's heard of high-definition television (HDTV). It's a selling point for flat-panel plasma and LCD screens that makes sports, movies, and even the Weather Channel look amazing if they are broadcast in HD.

HD and Upscaling

Though a TV or monitor may support HD, the content being displayed must also be HD quality to look its best. If it isn't, the content may be upscaled to fill the display, but it won't be true HD.
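
As a rough illustration of why upscaling can't create detail, the sketch below uses nearest-neighbor scaling, just one of several methods a TV or player might actually use, chosen here only because it is easy to follow. The function and the tiny 2 x 2 "frame" are made up for the example: every output pixel is copied from an existing source pixel, so nothing new is added to the picture.

```python
def upscale_nearest(frame, out_w, out_h):
    """Nearest-neighbor upscale: each output pixel copies the closest source pixel."""
    in_h, in_w = len(frame), len(frame[0])
    return [
        [frame[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# A tiny 2 x 2 "frame" blown up to 4 x 4: four pixels become sixteen,
# but only the original four values ever appear.
small = [[1, 2],
         [3, 4]]
for row in upscale_nearest(small, 4, 4):
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```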

Most people have at least a vague idea of what high definition delivers for television: a beautiful, sharp picture with more vibrant colors than lower-definition displays.

Monitor Resolution and Evolving Video Standards

The standards for what counts as HD are clearer now than they were in the past. The following are the standard HD monitor resolutions, expressed as the number of pixels across the display horizontally by vertically (the short sketch after the list adds up what those figures mean in total pixels):

  • 1280 x 720 (aka 720p)
  • 1920 x 1080 interlaced (aka 1080i)
  • 1920 x 1080 progressive (aka 1080p)
  • 2560 x 1440 (aka 1440p), a resolution often found in gaming monitors
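
Because each entry above is simply a horizontal pixel count multiplied by a vertical one, the totals are easy to work out. This small Python sketch adds them up:

```python
# Total pixel counts for the HD resolutions listed above
resolutions = {
    "720p  (1280 x 720)": (1280, 720),
    "1080p (1920 x 1080)": (1920, 1080),
    "1440p (2560 x 1440)": (2560, 1440),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")
# 720p  (1280 x 720): 921,600 pixels
# 1080p (1920 x 1080): 2,073,600 pixels
# 1440p (2560 x 1440): 3,686,400 pixels
```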

The next step up from HD is Ultra High Definition, or UHD, which is also referred to as 4K in both TVs and monitors. Technically, 4K and UHD are slightly different standards, but on the consumer market the two terms are interchangeable and refer to the same type of product. These monitors have a resolution of 3840 x 2160 and are sometimes called 4K UHD monitors.

A small step up from 4K UHD is 5K. Monitors in this category have a resolution of 5120 x 2880, and 5K displays are generally used only as computer monitors.

The level beyond 4K UHD is known as 8K UHD. Again, the technical standards and the names can differ, and as this video definition becomes more common it may be assigned other marketing names. The resolution for an 8K UHD monitor is 7680 x 4320.
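
To put these UHD tiers in perspective, the short sketch below compares each one to Full HD (1920 x 1080) by raw pixel count; a 4K UHD frame, for example, holds exactly four times as many pixels as a 1080p frame.

```python
# How many Full HD (1920 x 1080) frames' worth of pixels each UHD tier holds
full_hd = 1920 * 1080

uhd_tiers = {
    "4K UHD (3840 x 2160)": 3840 * 2160,
    "5K (5120 x 2880)": 5120 * 2880,
    "8K UHD (7680 x 4320)": 7680 * 4320,
}

for name, pixels in uhd_tiers.items():
    print(f"{name}: {pixels:,} pixels = {pixels / full_hd:.1f}x Full HD")
# 4K UHD (3840 x 2160): 8,294,400 pixels = 4.0x Full HD
# 5K (5120 x 2880): 14,745,600 pixels = 7.1x Full HD
# 8K UHD (7680 x 4320): 33,177,600 pixels = 16.0x Full HD
```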

Availability of 4K Content

4K may be everywhere in TVs and monitors, but true 4K content that takes advantage of this resolution lags in availability. More 4K movies and other content become available all the time, but it is still not common.

Progressive vs. Interlaced Scanning

The "i" and "p" denote interlaced and progressive scanning, respectively. Interlaced scanning is the older technology of the two. A PC monitor that uses interlaced scanning refreshes half of the horizontal pixel rows in one cycle and takes another cycle to refresh the other half, while alternating rows. The upshot is that two scans are necessary to display each line, resulting in a slower, blurrier display with flickering. Progressive scanning, on the other hand, scans one complete row at a time, in sequence from top to bottom. The resulting display is smoother and more detailed — especially for text, a common element on screens used with PCs.