HDR Gaming: What You Need To Know

Is it worth the investment?

[Image: Holding a PS4 controller. Credit: Miguel Sanz/Getty Images]

High Dynamic Range (HDR) is hitting TVs, and now computer monitors, with all the fury of 4K Ultra HD, and it promises to improve the viewing experience dramatically. Unfortunately, the feature is implemented differently at every turn, so not all HDR experiences are created equal. For HDR gaming, the story is especially tricky.

The short version is that HDR gaming on Xbox One S, Xbox One X, PS4, and PS4 Pro is worthwhile, but HDR gaming on PC is a far more fraught pursuit. If you're a console gamer interested in trying HDR, there's very little holding you back. PC gamers, on the other hand, face obstacle after obstacle that we can only hope will be removed as HDR becomes more commonplace.

HDR in PC Gaming and Console Gaming

There are a lot of pieces to the puzzle for a good HDR experience. There's the HDR content you're trying to view (in this case, games), the hardware sending that HDR content to a display (your console or PC graphics processor), the cable carrying that signal (HDMI or DisplayPort), the display receiving and processing the HDR content, and the format of HDR being used (Dolby Vision, HDR10, HLG, etc.). For a good HDR experience, every part has to work together.

[Image: Xbox One controller with a game in the background. Source: Xbox.com]

On consoles connected to a TV, it's a fair bit easier to get things right. The Xbox One S and Xbox One X both support HDR10, as do all PS4 models with system software 4.0 and later. The more advanced Dolby Vision has also come to Xbox One S and X.

With HDR10 being a common standard and Dolby Vision not impossible to find, it's easy enough for console gamers on Microsoft and Sony hardware to find a compatible TV to game on. From there, gaming in HDR is fairly straightforward, as long as the games you try to play support HDR. If you already have one of the consoles we've mentioned, there's nothing extra you need to do to make it HDR ready, and finding a TV that will work with it is no hard task.

PC users don't have it as easy, especially as monitors have lagged behind TVs for HDR adoption and standardization. While most recent PC graphics cards from Nvidia and AMD support HDR, gamers with older cards will need to upgrade. Rock Paper Shotgun has a list of HDR-ready GPUs and the cables needed to support HDR. But even with a capable graphics card and an HDR monitor, getting Windows and games to handle HDR properly is not always a smooth process. Plus, not all games will support HDR.

Input Lag

This is a small point, but it matters for a good gaming experience. TV input lag can make your system feel unresponsive, so it's best to keep it to a minimum.

Your PC or console may add slightly more input lag when outputting HDR content, and your TV or monitor may likewise add lag while processing the HDR signal it receives. With good hardware and a well-implemented HDR mode in your games, this increase should be negligible. But not all hardware and implementations are good.

You may find your display increasing the input lag significantly when switching to HDR. If your TV has a gaming mode that helps it achieve reduced input lag but you can't activate this mode and HDR at the same time, you may have to choose which is more important to you.

The trade-off between the high-end visuals HDR offers and the input lag it can introduce brings us to our next point.

Pretty Games vs. Competitive Games

The kind of gaming you want to do can help you decide whether to pursue HDR. While we think HDR is worthwhile for console gamers, there's one place it may be worth toggling off: competitive games. In competitive esports titles on both PC and console, high frame rates, low input lag, a good internet connection, and clear visuals are key. For all the beauty HDR can lend a game, it's unlikely to improve any of those areas (except possibly visual clarity, when developers implement it well).

Aside from the aforementioned input lag, enabling HDR has the potential to reduce your frame rates. ExtremeTech analyzed AMD and Nvidia graphics cards to compare performance with HDR enabled and disabled, and it found performance hits with HDR on. Driver updates can change how severe those hits are over time, but for highly competitive gamers, the chance of a performance hit is likely not worth it. That uncertainty makes pursuing HDR especially inadvisable for anyone who would have to invest in all-new hardware just to enable the feature.

[Image: Screen of Battlefield 1. Source: EA]

If you're focused on a handful of particular games, it may be worth searching for HDR performance analyses of them. If there's no performance hit, the next thing to check is whether HDR improves the game's visuals in a way that actually helps. In a HardwareCanucks video on HDR gaming, it was clear that HDR can sometimes make it easier to see in games, while in other cases it over-darkens shadows and blows out highlights, making certain areas harder to see. That's no good if an enemy or objective is in those parts of the screen.

For non-competitive games, the slight increase in input lag is less of a concern. Your tolerance for reduced frame rates will depend on your hardware and personal preference, but where HDR is implemented well, the boost in visual quality is likely to be worthwhile, and performance shouldn't suffer terribly. So, for single-player games, where a few extra milliseconds of delay won't give a player somewhere else a chance to beat you, HDR should improve your experience.

Not All HDR Is Created Equal

There are a lot of little pieces that need to work together to produce a great HDR experience for you. In the next few years, we're sure to see content developers figure out how to best implement HDR, gaming hardware manufacturers figure out how to best support HDR content, and display makers figure out how to best display HDR coming from a wide variety of sources and devices. But, right now, there's a lot of development happening, and it's uncertain just what direction things will go.

Anyone who bought an HDR-enabled TV when they were first coming out can likely see now how fraught early adoption often is. Different HDR media formats, from HDR10 and HLG to Dolby Vision and Technicolor HDR, are vying for wide support on displays and media. And, for you to get those HDR experiences, your whole multimedia setup needs to be ready for them. You won't get Dolby Vision HDR on a display that only supports HLG.

[Image: Xbox Dolby Vision media. Source: Xbox.com]

Even if you get a display that can process media in the various HDR formats, there's still the question of how well it can actually present the high-contrast imagery, increased color bit depth, and more. Standards for HDR displays, like VESA DisplayHDR, are helping identify which displays can really deliver a quality visual experience, but that standardization and industry adoption are an ongoing process.
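The "increased color bit depth" part is easy to quantify. HDR10 signals use 10 bits per color channel, versus the 8 bits typical of SDR; this small sketch just counts the distinct shades each depth can represent:

```python
# Comparing per-channel color precision: SDR is typically 8-bit,
# while HDR10 (as the name suggests) uses 10-bit color.

def shades_per_channel(bits: int) -> int:
    """Number of distinct values one color channel can take at a given bit depth."""
    return 2 ** bits

sdr = shades_per_channel(8)     # 256 shades per channel
hdr10 = shades_per_channel(10)  # 1,024 shades per channel

print(f"SDR 8-bit: {sdr} shades/channel, {sdr ** 3:,} total colors")
print(f"HDR10 10-bit: {hdr10} shades/channel, {hdr10 ** 3:,} total colors")
```

Going from 8-bit to 10-bit quadruples the shades per channel (256 to 1,024), which is what lets HDR content render smooth gradients in bright and dark regions without visible banding.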

Then there's still the matter of game developers making their HDR settings actually look good. We mentioned earlier how poor calibration can lead to blown-out bright spots and overly dark areas. Console game developers know the hardware they're working with and will likely target HDR10, so your display's handling of HDR10 content is really the only question mark in that situation.

But, for PC gaming, there are so many variables that ensuring a good HDR experience is likely going to be hard even when HDR is more established. And now, while it's still getting established, the difficulties are even greater.

Our Advice

If you're asking yourself whether HDR is worth it, you need to think about what your gaming setup still needs to achieve HDR. If you already own a TV that supports HDR10 and you've enjoyed great HDR video content, you'll likely find your money is well spent on a PS4, Xbox One S, or Xbox One X (note that the Nintendo Switch and original Xbox One don't support HDR). If you have a TV that supports Dolby Vision, then one of the Xbox One models will let you take advantage of that format.

For PC, there are fewer cases where HDR is going to be worth it for the moment. If you don't have a recent graphics card that supports modern HDR, it may not be worthwhile to upgrade just for HDR. A newer, more powerful card can still help improve your gaming experience, though.

If you already have the hardware in your PC needed to play in HDR, and you have it connected to a monitor or TV that supports HDR, you still may not want to try running games with HDR enabled if you're playing competitively. If you don't have an HDR display yet, it's a better idea to wait and see which HDR standards are most widely adopted before buying a new display solely for the purpose of HDR gaming.

On the other hand, if gaming is only part of your interest in HDR, you have a bit more of an excuse to go ahead and pick up an HDR display. Your best bet is a good 4K TV that supports multiple HDR formats, so it's more likely to work with whichever formats your media and games support.

TVs are not ideal for productivity on a computer, and are therefore not the best pairing with a PC. But, a good 4K TV can get you started in HDR gaming while you wait for HDR monitors to become more prevalent and standardized on the market.