
What Is HDR TV and How It Works: HDR10, HDR10+, and Dolby Vision

Still don’t know what HDR is? Don’t worry; we explain everything you need to know about HDR in detail. Shopping for a new TV and unsure how important the term HDR TV (High Dynamic Range TV) really is?

One thing to start with: it is much more important than you might think. We will explain how many types there are and what differences exist between them. HDR is, in short, the part of your TV responsible for delivering scenes with more realistic colors and greater contrast.

What’s more, many industry experts believe that HDR represents a far more significant leap in image quality than the recent “UHD revolution.” But do you want to know how HDR works on your TV? Below, we explain everything.

1. The Basics of HDR (High Dynamic Range)

A TV’s contrast is measured by the difference between the brightest whites and darkest blacks it can display, expressed in candelas per square meter (cd/m²): the so-called nits. The ideal low end is complete black (zero nits), currently possible only on OLED displays.

At the high end, the story is different. Standard dynamic range televisions generally produce between 300 and 500 nits, but some HDR televisions aim much higher, reaching thousands of nits.
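The relationship between those numbers can be put simply: contrast is peak brightness divided by black level, which is why a true zero-nit black gives OLED its "infinite" contrast. A minimal sketch (the nit values below are hypothetical examples, not measurements of any specific TV):

```python
def contrast_ratio(peak_nits, black_nits):
    """Contrast ratio = brightest white / darkest black.

    A perfect black (0 nits) yields an 'infinite' contrast ratio,
    which is the OLED case described above.
    """
    if black_nits == 0:
        return float("inf")
    return peak_nits / black_nits

# Hypothetical SDR LCD: 500-nit peak, 0.1-nit black level.
print(contrast_ratio(500, 0.1))   # 5000:1
# Hypothetical OLED: 1,000-nit peak with true black.
print(contrast_ratio(1000, 0))    # infinite
```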

There are multiple HDR formats, but there are currently two leading players: the proprietary Dolby Vision format and the open standard HDR10. Dolby was the first to join the party with a prototype TV capable of displaying up to 4,000 nits of brightness.


For a short time, Dolby Vision was nearly synonymous with HDR, but not all manufacturers wanted to follow Dolby’s rules (or pay its fees), and many began to work on their own alternatives. Popular electronics manufacturers, including LG, Samsung, Sharp, Sony, Panasonic, and Vizio, agreed on an open standard called HDR10.

Electronics companies including Samsung, LG, Sony, Panasonic, Dolby, and many others announced the Ultra HD Premium certification for UHD Blu-ray players.

This certification sets benchmarks for HDR, such as displaying up to 1,000 nits of brightness and a minimum 10-bit color depth. Both HDR10 and Dolby Vision meet these standards, although in different ways.

2. HDR10 and HDR10+

While Dolby Vision may have been the first, it is currently not the most popular format. The best known is HDR10, which is supported by a good number of TV manufacturers. The latest HDR10 standard was codified by the Consumer Technology Association – the same group responsible for organizing CES.

The HDR10 specification currently uses a 10-bit color depth, while Dolby Vision uses 12-bit. Both offer over a billion colors per pixel, and the difference is difficult to detect with the naked eye.
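The arithmetic behind those color counts is straightforward: the bit depth sets how many shades each of the red, green, and blue channels can take, and the total palette is that number cubed. A quick sketch:

```python
def total_colors(bits_per_channel):
    """Total displayable colors: 2**bits shades per channel, cubed for RGB."""
    shades_per_channel = 2 ** bits_per_channel
    return shades_per_channel ** 3

print(total_colors(8))   # SDR:          16,777,216  (~16.8 million)
print(total_colors(10))  # HDR10:     1,073,741,824  (~1.07 billion)
print(total_colors(12))  # Dolby Vision: 68,719,476,736 (~68.7 billion)
```

This is why the jump from 8-bit SDR to 10-bit HDR is so visible (smoother gradients, less banding), while the jump from 10-bit to 12-bit is much harder to spot.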

The two major HDR formats use metadata that runs along with the video signal over an HDMI cable. This metadata allows the source video to “tell” a television how to display colors. HDR10 uses a relatively simple approach: it sends metadata all at once and at the beginning of a video, saying something like, “This video is encoded using HDR, and you should treat it this way.”
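The "all at once, at the beginning" approach can be pictured as a single record covering the whole video. The sketch below is illustrative only; the field names are loosely modeled on HDR10's content light level values, not an exact reproduction of the spec:

```python
from dataclasses import dataclass

@dataclass
class StaticHDRMetadata:
    """One record for the entire video -- the HDR10 approach (illustrative)."""
    max_content_light_level: int        # brightest pixel anywhere in the video (nits)
    max_frame_average_light_level: int  # brightest frame, averaged (nits)

# Sent once at the start of playback; the TV applies it to every scene.
movie_metadata = StaticHDRMetadata(
    max_content_light_level=1000,
    max_frame_average_light_level=400,
)
```

Because there is only one record, the TV must pick a single tone-mapping strategy that works for both the darkest and brightest scenes in the film.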


HDR10 has become the more popular of the two formats. Above all, it is an open standard: TV manufacturers can implement it for free. It is also recommended by the Ultra-HD Alliance, which generally prefers open standards to proprietary formats such as Dolby Vision.

Then there’s HDR10+, which Samsung and Amazon announced in April 2017. Unlike the original HDR10, the HDR10+ standard offers features similar to those of Dolby Vision, including dynamic metadata that allows TVs to adjust brightness scene by scene and even frame by frame.

Unlike Dolby Vision, however, HDR10+ keeps HDR10’s 10-bit color depth. Initially, it was only available on Samsung TVs, but it is also coming to Panasonic’s 2018 4K OLED line, recently unveiled at CES 2018.

3. Dolby Vision HDR

HDR10 is on more TVs, but this may not continue to be the case in the future. Dolby Vision has a distinct advantage in terms of sheer technological power, even with today’s televisions. Looking ahead, the gap between Dolby Vision and HDR10 could widen.

Dolby Vision supports a 12-bit color depth, as opposed to the 10-bit color depth supported by HDR10. It also has a higher theoretical brightness ceiling: HDR10 currently maxes out at 1,000 nits, while Dolby Vision can go up to 4,000 nits. (Dolby says it could allow up to 10,000 nits of peak brightness in the future.)


Color depth isn’t the only area where Dolby Vision has a theoretical advantage over HDR10. While HDR10 transmits only static metadata (sent once, when a video begins to play), Dolby Vision uses dynamic metadata that varies by scene or frame. The advantages here are mostly theoretical, but as content providers become more adept at mastering film and television for HDR, dynamic metadata could become a huge advantage.
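By contrast with the single static record, dynamic metadata can be pictured as one record per scene (or even per frame), letting the TV retune its tone mapping as the content changes. A hypothetical sketch; the scene names and nit values are invented for illustration:

```python
# Dynamic metadata, illustrated as one record per scene rather than
# one static record for the whole video.
dynamic_metadata = [
    {"scene": "night exterior", "peak_nits": 150},
    {"scene": "sunlit desert",  "peak_nits": 1800},
    {"scene": "indoor dialog",  "peak_nits": 400},
]

for scene in dynamic_metadata:
    # A real TV would adjust its tone-mapping curve here, scene by scene.
    print(f'{scene["scene"]}: map up to {scene["peak_nits"]} nits')
```

The dark night scene and the bright desert scene each get tone mapping tuned to their own range, instead of a single compromise setting chosen for the whole film.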

HDR10 may currently be better supported, both in terms of TVs and content, but Dolby is working hard to change that. Initially, Dolby Vision required specific hardware to function, meaning it could not be added later via a firmware update.

That changed in February 2017, and Blu-ray player and Ultra HD TV makers can now, theoretically, add support later. HDR10 was the first format supported by Ultra HD Blu-ray players, but LG and Philips announced UHD players with Dolby Vision at CES 2017.

4. Other HDR Standards

Dolby Vision and HDR10 are currently the two leading players in HDR, but other companies are working on their own HDR solutions, and two new formats are emerging.

Hybrid Log-Gamma (HLG) is a format developed by the BBC and the Japanese broadcaster NHK. HLG was designed with a focus on live broadcasts, although it can also be used for pre-recorded content. Unlike HDR10 and Dolby Vision, HLG doesn’t use metadata, which could be an advantage, depending on how TV manufacturers implement it.

Technicolor was one of the first companies to work on HDR, and at CES 2016 it announced that it and Philips had combined their HDR efforts and were working on a new format.

Like HLG, this format aims to be compatible with SDR screens, which, according to the companies’ press release, “will simplify HDR implementations for distributors, being able to send a signal to all their customers, regardless of the TV they have.” At CES 2018, Philips announced that its 2019 TVs would support Technicolor HDR and the ATSC 3.0 broadcast standard.

Is HDR Worth It?

High dynamic range (HDR) is much more complicated than the three little words it stands for. But it is also a fascinating technology, one that creates more realistic images than ever. If you’re wondering whether the next TV you buy should support HDR, our answer is a resounding yes.

HDR is the most significant upgrade to the home viewing experience since the jump to high definition, and it is definitely at the core of the future of television.
