What Is HDR?

High dynamic range (HDR) video is one of the biggest 4K TV feature bullet points. It can push video content past the limitations to which broadcast and other media standards have adhered for decades, limitations that newer displays have left behind. It's impressive to see on TVs that can handle it, but it's also a fairly confusing technical feature, with several variations whose differences aren't well established. That's why we're here to explain them to you.

Dynamic Range on TVs

A TV's contrast is the difference between how dark and how bright it can get. Dynamic range describes the extremes of that difference, and how much detail can be shown in between. Essentially, dynamic range is display contrast, and HDR represents a broadening of that contrast. However, simply expanding the range between bright and dark isn't enough to improve a picture's detail. Whether a panel can reach 200 cd/m^2 (relatively dim) or 2,000 cd/m^2 (incredibly bright), and whether its black level is 0.1 cd/m^2 (washed out, nearly gray) or 0.005 cd/m^2 (incredibly dark), it can ultimately only show as much information as the signal it receives contains.
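To put rough numbers on the figures above, contrast ratio is simply peak brightness divided by black level. A quick sketch, using illustrative values only:

```python
# Illustrative only: contrast ratio is peak luminance divided by black level,
# both measured in cd/m^2 (nits). The figures below are the ones quoted above.
def contrast_ratio(peak_nits, black_nits):
    """Return the contrast ratio as a single number (e.g. 2000 for 2,000:1)."""
    return peak_nits / black_nits

# A dim panel with washed-out blacks:
print(contrast_ratio(200, 0.1))      # 2,000:1
# A bright panel with very deep blacks:
print(contrast_ratio(2000, 0.005))   # 400,000:1
```

The second panel isn't just ten times brighter; because its blacks are so much deeper, its dynamic range is two hundred times wider.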

Many popular video formats, including broadcast television and Blu-ray discs, are limited by standards built around the physical boundaries of older technologies. Black is set to only so black, because as Christopher Guest eloquently wrote, "it could get none more black." Similarly, white could only get so bright within the limitations of display technology. Now, with organic LED (OLED) panels and local-dimming LED backlighting on newer LCD panels, that range is increasing. These displays can reach further extremes, but older video formats can't take advantage of it: only so much information is present in the signal, and a TV capable of reaching beyond those limits still has to stretch and work with the information it's given.

What Is HDR?

That’s where HDR video comes in. It removes the limitations presented by older video signals and provides information about brightness and color across a much wider range. HDR-capable displays can read that information and show an image built from a wider gamut of color and brightness. Besides the wider range, HDR video simply contains more data to describe more steps in between the extremes. This means that very bright objects and very dark objects on the same screen can be shown very bright and very dark if the display supports it, with all of the necessary steps in between described in the signal and not synthesized by the image processor.

To put it more simply, HDR content on HDR-compatible TVs can get brighter, darker, and show more shades of gray in between (assuming the TVs have panels that can get bright and dark enough to do the signal justice; some budget TVs accept HDR signals but won’t show much of an improvement over non-HDR signals). Similarly, they can produce deeper and more vivid reds, greens, and blues, and show more shades in between. Deep shadows aren’t simply black voids; more details can be seen in the darkness, while the picture stays very dark. Bright shots aren’t simply sunny, vivid pictures; fine details in the brightest surfaces remain clear. Vivid objects aren’t simply saturated; more shades of colors can be seen.
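Much of that "more shades in between" comes down to bit depth. As a rough illustration (SDR video is typically delivered at 8 bits per color channel, while HDR10 requires 10 bits and Dolby Vision supports up to 12):

```python
# Rough sketch of "more steps between the extremes": the number of distinct
# shades per color channel doubles with each extra bit of precision.
def shades_per_channel(bits):
    return 2 ** bits

for name, bits in [("SDR (8-bit)", 8), ("HDR10 (10-bit)", 10),
                   ("Dolby Vision (up to 12-bit)", 12)]:
    print(f"{name}: {shades_per_channel(bits)} shades per channel")
# 8-bit gives 256 shades per channel; 10-bit gives 1,024; 12-bit gives 4,096.
```

Four to sixteen times as many gradations per channel is why HDR gradients look smooth where SDR ones can show visible banding.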

This requires much more data than standard Blu-ray discs can handle, just as they can't handle ultra-high-definition video. Fortunately, we now have Ultra HD Blu-ray, a disc type (distinct from Blu-ray, despite the name) that can hold more data and is built to contain 4K video, HDR video, and even object-based surround sound like Dolby Atmos. Just be aware that you can't play these discs on regular Blu-ray players; you need a dedicated Ultra HD Blu-ray player or a relatively new game console.

Online streaming also offers HDR content, but you need a reliably fast connection to get it. Fortunately, if your bandwidth is high enough for 4K video, it's high enough for HDR; Amazon Video and Netflix recommend connection speeds of 15Mbps and 25Mbps, respectively, for 4K content, whether or not that content is in HDR.

What Is Color Gamut?

This is where HDR gets a bit more confusing. Wide color gamut is another feature high-end TVs have, and it’s even less defined than HDR. It’s also connected to HDR, but not directly. HDR deals with how much light a TV is told to put out, or luminance. The range and value of color, defined separately from light, is called chromaticity. They’re two separate values that interact with each other in several ways, but are still distinct.

Technically, HDR only addresses luminance, because that's what dynamic range is: the difference between light and dark on a screen. Color is a separate value based on absolute red, green, and blue levels, regardless of the format of the video. However, the two are tied together by how we perceive light, and a greater range of light means we'll perceive a greater range of color. Because of that, HDR-capable TVs can often show what's called "wide color gamut," a range of color beyond the standard color values used in broadcast TV (called Rec.709).

This doesn't mean HDR guarantees a wider range of colors, or that they'll be consistent. That's why we test every TV for both contrast and color. Most TVs today can hit Rec.709 values, but that leaves a lot of color that the eye can see and those TVs can't show. DCI-P3, a standard color space for digital cinema, is much wider. Rec.2020, the ideal color space for 4K TVs, is wider still (we've yet to see any consumer TV that can reach those levels). And here's the kicker: Rec.2020 applies to both SDR and HDR, because HDR doesn't directly address color levels.
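If you want a rough sense of how much bigger each color space is, you can compare the triangles formed by their published CIE 1931 xy primaries. This is only a crude gauge, since area in xy space isn't perceptually uniform, but it makes the progression concrete:

```python
# Compare the three color spaces mentioned above using their published
# CIE 1931 xy chromaticity primaries (red, green, blue corner points)
# and the shoelace formula for triangle area.
PRIMARIES = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    """Shoelace formula for the area of a triangle given three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

base = triangle_area(PRIMARIES["Rec.709"])
for name, pts in PRIMARIES.items():
    area = triangle_area(pts)
    print(f"{name}: area {area:.4f} ({area / base:.0%} of Rec.709)")
```

By this measure, each triangle covers noticeably more of the chart than the last, which matches what the chart below shows visually.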

The above chart shows the range of color the human eye can detect as an arch, and the three color spaces we mentioned as triangles. As you can see, each expands pretty significantly on the previous one.

All of this might seem confusing, but it boils down to this: HDR doesn’t guarantee that you’ll get more color. Many HDR TVs have wide color gamuts, but not all of them. Our TV reviews tell you whether a TV is HDR-capable and what its full range of color looks like.

Types of HDR

HDR isn't quite universal; it's currently split into two major formats, with a few others gaining momentum.

Dolby Vision

Dolby Vision is Dolby’s own HDR format. While Dolby requires certification for media and screens to say they’re Dolby Vision compatible, it isn’t quite as specific and absolute as HDR10. Dolby Vision content uses dynamic metadata. Static metadata maintains specific levels of brightness across whatever content you watch. Dynamic metadata adjusts those levels based on each scene or even each frame, preserving more detail between scenes that are very bright or very dark. By tweaking the maximum and minimum levels of light a TV is told to put out on the fly, the same amount of data that would be assigned across the full range of light an entire movie or show uses can be set across a much more specific, targeted span. Darker scenes can preserve more detail in shadows and lighter scenes can keep more detail in highlights, because they aren’t telling the TV to be ready to show opposite extremes that won’t even show up until the next scene.
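Here's a simplified, hypothetical illustration of the arithmetic behind that. It uses linear quantization for clarity (real HDR signals use a perceptual curve), but it shows why confining a fixed number of code values to the range a scene actually uses leaves finer steps:

```python
# Hypothetical sketch of why dynamic metadata helps: with a fixed number of
# code values, mapping them over only the range a scene actually uses leaves
# a finer step size (more detail) than mapping over the whole movie's range.
# Linear steps are used here for simplicity; real HDR signals use a
# perceptual curve, but the same principle applies.
CODE_VALUES = 1024  # a 10-bit signal

def step_size_nits(scene_max_nits):
    """Smallest brightness difference one code value can represent (linearized)."""
    return scene_max_nits / CODE_VALUES

movie_max = 4000.0      # static metadata: one maximum for the whole film
dark_scene_max = 100.0  # dynamic metadata: this scene never exceeds 100 nits

print(f"Static:  {step_size_nits(movie_max):.3f} nits per step")       # ~3.9
print(f"Dynamic: {step_size_nits(dark_scene_max):.3f} nits per step")  # ~0.098
```

In this toy example, the dark scene gets steps forty times finer when the signal isn't reserving headroom for bright moments elsewhere in the film.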


HDR10

HDR10 is the standard pushed by the UHD Alliance. It's a technical standard with specific, defined ranges and requirements that content and displays must meet to qualify. HDR10 uses static metadata that is consistent across all displays: HDR10 video sets light and color levels in absolute values, regardless of the screen it's being shown on. It's an open standard, so any content producer or distributor can use it freely.
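Those absolute values come from the SMPTE ST 2084 "PQ" transfer function, which maps each signal value to a specific brightness in nits no matter what display it lands on. Here's a sketch of the published curve:

```python
# A sketch of the SMPTE ST 2084 "PQ" EOTF that HDR10 uses to encode absolute
# brightness: a signal value maps to a specific luminance in cd/m^2 (nits),
# regardless of the display. The constants come from the published standard.
M1 = 2610 / 16384        # 0.1593...
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(signal):
    """Map a normalized PQ signal value (0.0-1.0) to luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_to_nits(0.0))  # 0.0 nits: absolute black
print(pq_to_nits(1.0))  # 10000.0 nits: the format's ceiling
```

No consumer TV reaches 10,000 nits today; displays tone-map the portion of the curve they can't reproduce, which is exactly the problem the dynamic-metadata formats try to manage per scene.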

HDR10+

HDR10+ is a standard developed by Samsung. It builds on HDR10 by adding dynamic metadata, like Dolby Vision. It doesn't use individualized metadata for each screen, but it still adjusts the range of light it tells the TV to display for each scene or frame. It can potentially add more detail to your picture over what HDR10 shows, and like HDR10, it's an open standard that doesn't require licensing or a very specific production workflow.

Hybrid Log-Gamma (HLG)

Hybrid Log-Gamma (HLG) isn’t as common as HDR10 or Dolby Vision, and there’s very little content for it yet outside of some BBC and DirecTV broadcasts, but it could make HDR much more widely available. That’s because it was developed by the BBC and Japan’s NHK to provide a video format that broadcasters could use to send HDR (and SDR; HLG is backwards-compatible). It’s technically much more universal because it doesn’t use metadata at all; instead it uses a combination of the gamma curve that TVs use to calculate brightness for SDR content and a logarithmic curve to calculate the much higher levels of brightness that HDR-capable TVs can produce (hence the name Hybrid Log-Gamma).
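For the curious, here's a sketch of the HLG curve as published in ITU-R BT.2100, which shows where the name comes from:

```python
# A sketch of the HLG transfer curve from ITU-R BT.2100, showing the "hybrid"
# in the name: the lower part of the signal uses a square-root (gamma-like)
# curve that SDR displays can interpret, and the upper part switches to a
# logarithmic curve that carries the extra HDR highlight range.
import math

A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e):
    """Map normalized scene light e (0.0-1.0) to an HLG signal value (0.0-1.0)."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)              # gamma-like branch (SDR-compatible)
    return A * math.log(12 * e - B) + C      # logarithmic branch (HDR highlights)

print(hlg_oetf(1 / 12))  # ~0.5: the two branches meet halfway up the signal range
print(hlg_oetf(1.0))     # ~1.0: full signal at peak scene light
```

Because the bottom half of the signal looks like an ordinary gamma curve, an SDR TV can display an HLG broadcast acceptably while an HDR TV interprets the log portion as extra highlight detail, with no metadata needed.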
