Nits, Lumens, and Brightness on TVs and Projectors
How bright is your TV or video projector?
By Robert Silva | Updated on December 30, 2020 | Reviewed by Jerrick Leger

If you are about to purchase a TV or video projector and haven't shopped for either in several years, things may be more confusing than ever. Whether you browse ads online or walk into your local dealer cold, so many tech terms get thrown around that many consumers end up pulling out their cash and hoping for the best. This information applies to TVs from a variety of manufacturers including, but not limited to, LG, Samsung, Panasonic, Sony, and Vizio, and to video projectors from manufacturers such as Epson, Optoma, BenQ, Sony, and JVC.

The HDR Factor

One "techie" term that has entered the TV mix is HDR (High Dynamic Range). HDR is all the rage among TV makers, and there is good reason for consumers to take notice. While 4K improved resolution, HDR tackles another important factor in both TVs and video projectors: light output (luminance).
The goal of HDR is to support increased light output capability so that displayed images have characteristics closer to the natural light conditions we experience in the real world. As a result of HDR implementation, two established technical terms have risen to prominence in TV and video projector promotion: Nits and Lumens. Although Lumens has been a mainstay of video projector marketing for years, TV shoppers are now being hit with the term Nits by TV makers and persuasive salespeople.

Before HDR, one brand or model may have looked "brighter" than another in the showroom, but that difference wasn't quantified; you just had to eyeball it. With HDR offered on an increasing number of TVs, light output (not brightness, which is discussed later) is quantified in Nits. More Nits means a TV can output more light, primarily in support of HDR, either with compatible content or a generic HDR effect generated by the TV's internal processing.

What Nits and Lumens Are

Here is how Nits and Lumens are defined.

Nits: Think of a TV as being like the Sun, which emits light directly. A Nit is a measurement of how much light the TV screen sends to your eyes (luminance) within a given area. On a more technical level, one Nit is the amount of light output equal to one candela per square meter (cd/m², a standardized measurement of luminance). To put this into perspective, an average TV may be able to output 100 to 200 Nits, while HDR-compatible TVs may be able to output 400 to 2,000 Nits.

Lumens: Lumens is a general term describing light output, but for video projectors the most accurate term is ANSI Lumens (ANSI stands for American National Standards Institute).
In relation to Nits, a lumen is the amount of light that falls on a one-square-meter area located one meter from a one-candela light source. Think of an image displayed on a projection screen or wall as being like the Moon, which reflects light back to the viewer. 1,000 ANSI Lumens is the minimum a projector should be able to output for home theater use, but most home theater projectors average 1,500 to 2,500 ANSI Lumens of light output. Multi-purpose video projectors (used in a variety of roles, which may include home entertainment, business, or educational use) may be able to output 3,000 or more ANSI Lumens.

Nits vs. Lumens

One Nit represents more light than one ANSI Lumen, and the exact mathematical relationship between the two is complex. However, for a consumer comparing a TV with a video projector, a useful rule of thumb is that 1 Nit is the approximate equivalent of 3.426 ANSI Lumens. Using that reference point, to find the approximate ANSI Lumen equivalent of a given number of Nits, multiply the number of Nits by 3.426. To do the reverse, divide the number of Lumens by 3.426. Here are some examples:

Nits vs. Lumens: Approximate Comparisons
Nits     ANSI Lumens
200      685
500      1,713
730      2,501
1,000    3,426
1,500    5,139
2,000    6,852

For a video projector to achieve light output equivalent to 1,000 Nits (assuming you are lighting up the same screen area under the same room lighting conditions), it needs to output as much as 3,426 ANSI Lumens, which is out of range for most dedicated home theater projectors. However, a projector that can output 1,713 ANSI Lumens, which is easily attainable, can approximately match a TV with a light output of 500 Nits. Getting more precise, other factors, such as TV screen size, also affect the Nits/Lumens relationship.
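The rule-of-thumb conversion above can be sketched in a few lines of Python. This is an illustration only; the 3.426 factor is the article's approximation, not an exact photometric conversion, and it ignores screen size and screen gain:

```python
# Approximate conversion between TV light output (Nits) and
# projector light output (ANSI Lumens) using the rule-of-thumb
# factor of 3.426. Results are rough comparisons, not exact values.

NITS_TO_LUMENS = 3.426

def nits_to_ansi_lumens(nits: float) -> float:
    """Rough ANSI Lumen equivalent of a given Nit rating."""
    return nits * NITS_TO_LUMENS

def ansi_lumens_to_nits(lumens: float) -> float:
    """Rough Nit equivalent of a given ANSI Lumen rating."""
    return lumens / NITS_TO_LUMENS

# Reproduce the comparison table from the text.
for nits in (200, 500, 730, 1000, 1500, 2000):
    print(f"{nits:>5} Nits = approx. {nits_to_ansi_lumens(nits):>5.0f} ANSI Lumens")
```

Running this reproduces the comparison table: for example, 500 Nits works out to about 1,713 ANSI Lumens, and 1,000 Nits to about 3,426.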
For example, a 65-inch TV that puts out 500 Nits will have approximately four times the lumen output of a 32-inch TV putting out 500 Nits. Taking that variation into account, when comparing Nits, screen size, and Lumens, the formula to use is Lumens = Nits x Screen Area x Pi (3.1416), where the screen area is found by multiplying screen width and height in meters. Using the 500-Nit 65-inch TV, which has a 1.167-square-meter screen area, the lumen equivalent would be about 1,833.

TV and Video Projector Light Output in the Real World

Although all the above "techie" info on Nits and Lumens provides a relative reference, in real-world applications the numbers are only part of the story. When a TV or video projector is touted as able to output 1,000 Nits or Lumens, that does not mean it outputs that much light all the time. Frames and scenes usually contain a range of bright and dark content, as well as a variation of colors, and all these variations require different levels of light output. If a scene shows the Sun in the sky, that portion of the image may require the TV or projector to output its maximum number of Nits or Lumens, while other portions of the image, such as buildings, landscape, and shadows, require far less, perhaps only 100 or 200 Nits or Lumens. The different colors displayed also contribute different light output levels within a frame or scene.

A key point is that the ratio between the brightest and darkest objects should be the same, or as close to the same as possible, to produce the same visual impact. This is especially important for HDR-enabled OLED TVs in relation to LED/LCD TVs. OLED TV technology cannot support as many Nits of light output as LED/LCD TV technology can; however, unlike an LED/LCD TV, an OLED TV can produce absolute black.
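The screen-size formula described earlier (Lumens = Nits x screen area x Pi) can be checked with a short script. This is a sketch under stated assumptions: the helper names are my own, and it assumes an ideal 16:9 screen shape and perfectly diffuse emission:

```python
import math

def screen_area_m2(diagonal_inches: float, aspect=(16, 9)) -> float:
    """Area in square meters of a screen with the given diagonal
    (16:9 aspect ratio by default)."""
    d_m = diagonal_inches * 0.0254          # inches -> meters
    w, h = aspect
    diag_units = math.hypot(w, h)           # diagonal in aspect units
    width = d_m * w / diag_units
    height = d_m * h / diag_units
    return width * height

def nits_to_lumens(nits: float, diagonal_inches: float) -> float:
    """Lumens = Nits x screen area x Pi (idealized diffuse emitter)."""
    return nits * screen_area_m2(diagonal_inches) * math.pi

print(round(nits_to_lumens(500, 65)))   # roughly 1,830 lumens for a 65-inch TV
print(round(nits_to_lumens(500, 32)))   # a 32-inch screen: about a quarter of that
```

The result for the 65-inch case differs slightly from the 1,833 figure in the text because of rounding in the screen-area value; the 65-inch vs. 32-inch ratio also confirms the "approximately four times" claim, since lumen output scales with the square of the diagonal.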
Even though the official optimum HDR standard for LED/LCD TVs is the ability to display at least 1,000 Nits, the official HDR standard for OLED TVs is only 540 Nits. Remember, however, that the standard applies to maximum Nit output, not average Nit output. Although a 1,000-Nit-capable LED/LCD TV will look brighter than an OLED TV when, say, both are displaying the Sun or a very bright sky, the OLED TV will do a better job of displaying the darkest portions of that same image, so the overall dynamic range (the distance between maximum white and maximum black) may be similar.

When comparing an HDR-enabled TV that can output 1,000 Nits with an HDR-enabled video projector that can output 2,500 ANSI Lumens, the HDR effect on the TV will be more dramatic in terms of perceived brightness.

For video projectors, there is also a difference in light output capability between projectors that use LCD and DLP technology. LCD projectors can deliver equal light output levels for both white and color, while DLP projectors that employ color wheels cannot produce equal levels of white and color light output.

Depending on factors such as viewing in a darkened room as opposed to a partially lit room, screen size, screen reflectivity (for projectors), and seating distance, more or less Nit or Lumen output may be required to achieve the same desired visual impact.

The Audio Analogy

One way to approach the HDR/Nits/Lumens issue is the same way you should approach amplifier power specifications in audio. Just because an amplifier or home theater receiver claims to deliver 100 watts per channel doesn't mean it outputs that much power all the time.
Although the capability to output 100 watts indicates what to expect on musical or movie soundtrack peaks, most of the time, for voices and most music and sound effects, that same receiver only needs to output 10 watts or so for you to hear what you need to hear.

Light Output vs. Brightness

For TVs and video projectors, Nits and ANSI Lumens are both measures of light output (luminance). So where does the term brightness fit in? Brightness is not the same as actual quantified luminance; it can be described as the ability to detect differences in luminance. Brightness may also be expressed as a percentage more or less bright relative to a subjective reference point (such as the brightness control of a TV or video projector, explained below). In other words, brightness is the subjective interpretation (more bright, less bright) of perceived luminance, not actual generated luminance.

A TV or video projector's brightness control works by adjusting the black level visible on the screen. Lowering the brightness makes dark portions of the image darker, resulting in decreased detail and a "muddy" look in darker areas. Raising the brightness makes the darker parts of the image brighter, which causes dark areas to appear more gray and the overall image to look washed out.

Although brightness is not the same as actual quantified luminance, TV and video projector makers, as well as product reviewers, have a habit of using the term as a catch-all for the more technical terms that describe light output, including Nits and Lumens. One example is Epson's use of the term "Color Brightness."
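The black-level behavior of a brightness control described above can be modeled with a toy calculation. This is a deliberately simplified sketch, not any manufacturer's actual video processing; the function name is hypothetical, and it assumes 8-bit (0 to 255) video levels:

```python
def apply_brightness(pixel: int, offset: int) -> int:
    """Toy model of a TV 'brightness' control: shift the signal's
    black level by an offset, then clamp to the 0-255 video range.
    Positive offsets lift blacks toward gray (a washed-out look);
    negative offsets crush distinct dark steps together ('muddy' shadows)."""
    return max(0, min(255, pixel + offset))

shadow_detail = [0, 8, 16, 24]  # four distinct near-black pixel values
print([apply_brightness(p, +20) for p in shadow_detail])  # all lifted toward gray
print([apply_brightness(p, -20) for p in shadow_detail])  # most crushed to 0, detail lost
```

Note how, with the negative offset, three of the four originally distinct shadow values collapse to the same black, which is exactly the loss of dark detail the text describes.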
TV and Projector Light Output Guidelines

Measuring light output in terms of the relationship between Nits and Lumens involves a lot of math and physics, and boiling it down into a brief explanation isn't easy. So when TV and video projector companies hit consumers with terms such as Nits and Lumens without context, things can get confusing. When considering light output, keep these guidelines in mind.

For 720p/1080p or non-HDR 4K Ultra HD TVs, a Nit figure is not usually promoted, but it typically falls between 200 and 300 Nits, which is bright enough for traditional source content and most room lighting conditions (although 3D will be noticeably dimmer). Where you need to consider the Nit rating more specifically is with 4K Ultra HD TVs that include HDR; the higher the light output, the better. For HDR-compatible 4K Ultra HD LED/LCD TVs, a rating of 500 Nits provides a modest HDR effect (look for labeling such as HDR Premium), and TVs that output 700 Nits will provide a better result with HDR content. If you are looking for the best possible result, 1,000 Nits is the official reference standard (look for labels such as HDR1000), and the top-end for the highest-end HDR LED/LCD TVs is about 2,000 Nits.

If you are shopping for an OLED TV, the light output high-water mark is about 600 Nits; currently, all HDR-capable OLED TVs are required to output at least 540 Nits. On the other side of the equation, as mentioned previously, OLED TVs can display absolute black, which LED/LCD TVs cannot, so a 540-to-600-Nit OLED TV can deliver a better result with HDR content than an LED/LCD TV rated at the same Nit level. Although a 600-Nit OLED TV and a 1,000-Nit LED/LCD TV can both look impressive, the 1,000-Nit LED/LCD TV will still produce a much more dramatic result, especially in a well-lit room.
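The guideline numbers above can be summarized in a small helper function. This is only an illustrative encoding of the rough tiers discussed in the text; the function name and tier labels are hypothetical, not part of any official HDR specification:

```python
def hdr_tier(nits: float, oled: bool = False) -> str:
    """Rough HDR-capability tier for a TV's peak light output,
    following the guideline numbers in the text. The cutoffs and
    labels are a simplification, not an official certification."""
    if oled:
        # HDR-capable OLED TVs must reach at least 540 Nits; their
        # absolute black lets them punch above their Nit rating.
        return "HDR-capable OLED" if nits >= 540 else "below OLED HDR minimum"
    if nits >= 1000:
        return "reference HDR (look for labels such as HDR1000)"
    if nits >= 700:
        return "good HDR"
    if nits >= 500:
        return "modest HDR (look for labels such as HDR Premium)"
    return "limited or no HDR benefit"

print(hdr_tier(1200))             # a high-end LED/LCD TV
print(hdr_tier(600, oled=True))   # a typical HDR OLED TV
print(hdr_tier(300))              # a non-HDR-class set
```

A helper like this is only a starting point for comparison shopping; as the rest of the article stresses, peak Nits is one spec among many.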
As mentioned previously, 2,000 Nits is currently the highest light output level found on a TV, and that may result in displayed images that are too intense for some viewers.

If you are shopping for a video projector, as mentioned above, a light output of 1,000 ANSI Lumens should be the minimum to consider, but most projectors can output 1,500 to 2,000 ANSI Lumens, which provides better performance in a room that cannot be made completely dark. Also, if you add 3D to the mix, consider a projector with 2,000 or more Lumens of output, as 3D images are naturally dimmer than their 2D counterparts.

HDR-enabled video projectors lack "point-to-point accuracy" with small bright objects against a dark background. For example, an HDR TV will display stars against a black night sky much more brightly than is possible on a consumer HDR projector; projectors have difficulty displaying high brightness in a very small area surrounded by a dark image. For the best HDR result available so far (which still falls short of the perceived brightness of a 1,000-Nit TV), consider a 4K HDR-enabled projector that can output at least 2,500 ANSI Lumens. Currently, there is no official HDR light output standard for consumer video projectors.

The Bottom Line

As with any specification or tech term thrown at you by a manufacturer or salesperson, don't obsess. Nits and Lumens are only one part of the equation when considering the purchase of a TV or video projector. Take the entire package into consideration, which includes not only stated light output but how the entire image looks to you in terms of:

Perceived brightness
Color
Contrast
Motion response
Viewing angle
Ease of setup and use
Sound quality (if you are not going to use an external audio system)
Additional convenience features (such as internet streaming on TVs)
Also keep in mind that if you want an HDR-equipped TV, you need to take the additional content access requirements into consideration (4K streaming and Ultra HD Blu-ray Disc).