What Is a High-Definition PC Monitor?
The meaning of HD, UHD, 4K, and other terms for monitors
By Mark Casey | Updated on June 07, 2020

A high-definition PC monitor delivers a remarkably clearer picture than is possible with lower-definition, lower-resolution screens.
High-resolution displays generally pack more pixels per inch than older standard-definition TV screens, which makes the image sharper and clearer because the human eye can't make out the individual pixels as easily. When it comes to PC monitors, the term "high definition" is used somewhat interchangeably with "high resolution."

(Image courtesy of Acer, Inc.)

High Definition

By now, everyone has heard of high-definition television (HDTV). It's a selling point for flat-panel plasma and LCD screens: sports, movies, and even the Weather Channel look noticeably better when broadcast in HD. Keep in mind that even if a TV or monitor supports HD, the content itself must be produced in HD quality; if it isn't, it can be upscaled to fit the display, but it will not be true HD. Most people have at least a vague idea of what high definition delivers for television: a beautiful, sharp picture with more vibrant colors than lower-definition displays.

Monitor Resolution and Evolving Video Standards

Standards for what HD means have become clearer than they were in the past. The following are the standard HD monitor resolutions, expressed as the number of pixels horizontally by vertically:

1280 x 720 (aka 720p)
1920 x 1080 interlaced (aka 1080i)
1920 x 1080 progressive (aka 1080p)
2560 x 1440, a resolution often found in gaming monitors

The next step up from HD is Ultra High Definition, or UHD (4K quality), in both TVs and monitors. Technically, 4K and UHD are different, but on the market the two terms are used interchangeably and refer to the same type of product. These monitors have a resolution of around 3840 x 2160 and are sometimes called 4K UHD monitors. A small step up from 4K UHD is 5K; monitors in this category have resolutions around 5120 x 2880 and are usually computer monitors rather than TVs. The level beyond 4K UHD is known as 8K UHD.
Again, the technical standards and the names can differ, and as this video definition becomes more prevalent, it may pick up other marketing names. The resolution of an 8K UHD monitor is 7680 x 4320. 4K may be everywhere in TVs and monitors, but true 4K content that takes advantage of the resolution lags behind in availability. More 4K movies and other content become available all the time, but it is still not common.

Progressive vs. Interlaced Scanning

The "i" and "p" in names like 1080i and 1080p denote interlaced and progressive scanning, respectively. Interlaced scanning is the older of the two technologies. A display that uses interlaced scanning refreshes half of the horizontal pixel rows in one cycle and the other half in the next, alternating rows, so two passes are needed to refresh the complete frame. The result is a slower, blurrier picture that is prone to flicker. Progressive scanning, on the other hand, refreshes every row in sequence from top to bottom in a single pass. The resulting display is smoother and more detailed, which matters especially for text, a common element on screens used with PCs.
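Two of the ideas above, pixel density and scan order, can be made concrete with a short calculation. The sketch below is purely illustrative; the function names (ppi, scan_order) are hypothetical and not part of any standard or library:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

def scan_order(rows, interlaced=False):
    """Order in which pixel rows are refreshed.

    Progressive: every row top to bottom in one pass.
    Interlaced: one set of alternating rows first, then the other,
    so two passes are needed for a complete frame.
    """
    if interlaced:
        return list(range(0, rows, 2)) + list(range(1, rows, 2))
    return list(range(rows))

# A 27-inch 4K UHD panel is much denser than a 27-inch 1080p panel:
print(round(ppi(3840, 2160, 27)))   # roughly 163 PPI
print(round(ppi(1920, 1080, 27)))   # roughly 82 PPI

# Row refresh order for a tiny 6-row display:
print(scan_order(6))                   # progressive: [0, 1, 2, 3, 4, 5]
print(scan_order(6, interlaced=True))  # interlaced:  [0, 2, 4, 1, 3, 5]
```

This is why a 4K monitor looks so much sharper than a 1080p monitor of the same physical size: the same screen area holds roughly twice the pixels per inch.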