Like HDR on televisions, Mobile HDR comes in two main standards: HDR10 and Dolby Vision. HDR10 is an open format that is currently the more popular of the two. In general, if your television or mobile device supports HDR, it can play HDR10 content.
Dolby Vision is another standard that is less common, but arguably offers better image quality. For comparison, HDR10 currently supports brightness up to 4,000 nits with a color depth of 10 bits, while Dolby Vision supports brightness up to 10,000 nits with a color depth of 12 bits. Dolby Vision also uses frame-by-frame metadata to ensure the best results. Although Dolby Vision is not as widespread as HDR10, both Amazon and Netflix, the two largest companies pushing Mobile HDR, support the format.
Looking back, one of the first smartphones with Dolby Vision support was the LG G6. However, keep in mind that Dolby Vision on mobile devices is actually a software solution rather than hardware-based. This means that, technically, HDR mobile devices could play Dolby Vision content after a software update.
HDR is the latest enhancement to make it to televisions, streaming devices and your favorite TV shows and movies. The good news is that it can deliver the best picture quality available in home video today. The bad news? It comes in multiple formats, adding one more potential point of confusion to the already overwhelming TV buying process.
We already have HDR10, Dolby Vision and HLG, but apparently that’s not enough. Now there’s a new one, HDR10+, the “+” being the key. Created by Samsung, HDR10+ has recently gotten some other big-name backers, like Panasonic, Philips, Amazon and 20th Century Fox.
So with the manufacturing and content sides on board (some companies on each side, anyway), it’s starting to look like HDR10+ could be the real deal.
So what makes it different from the others? I’m glad you asked.
To explain the difference between HDR10 and HDR10+, we need to talk about metadata. Metadata is additional info, beyond the video signal itself, that gets transmitted along with an HDR movie or TV show. It basically tells the TV how to show the high dynamic range content. It’s like secret Ikea instructions that turn your Billy bookcase into a library.
HDR10 has static metadata; HDR10+ and Dolby Vision have dynamic metadata. Since this is one of the biggest differences between HDR10 and DV, adding it to the license-free HDR10 is potentially a big deal.
With HDR10, the TV gets one set of instructions at the beginning of the show or movie. This single, static set says, “OK, when this show says jump, this is how high.” This is fine, but is a one-size-fits-all approach. If a movie, say, has a wide variety of scenes, this single piece of metadata might not allow for the best image.
Dolby Vision has dynamic metadata — and soon HDR10+ will, too. This allows for fine-tuning how the HDR looks not for the entire movie, but all the way down to per-scene or even a per-frame basis. Most content probably won’t go that far, but this extra level of control lets filmmakers decide exactly how everything shot in a movie should look on your TV. Potentially, this could mean better picture quality over vanilla HDR10. Now a movie can give a TV instructions on how high to jump essentially on a continuous basis. (Very bossy.)
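To make the "how high to jump" idea concrete, here's a deliberately simplified sketch (not the real SMPTE ST 2086/ST 2094 metadata formats, and the nit values are made up for illustration) contrasting the two approaches. With static metadata the TV derives one tone-mapping curve from a single brightness figure for the whole movie; with dynamic metadata it can re-derive the curve scene by scene:

```python
# Hypothetical, simplified model of HDR tone mapping on a 1,000-nit TV.
TV_PEAK_NITS = 1000


def tone_map(pixel_nits, declared_max_nits):
    """Naive tone mapping: compress the declared brightness range
    into what the panel can actually output."""
    scale = min(1.0, TV_PEAK_NITS / declared_max_nits)
    return pixel_nits * scale


# Static metadata (HDR10): one peak-brightness value for the whole title.
STATIC_MAX = 4000  # brightest pixel anywhere in the movie

# Dynamic metadata (HDR10+ / Dolby Vision): a value per scene.
scenes = [
    {"name": "dim interior", "scene_max": 300, "pixel": 250},
    {"name": "sunlit desert", "scene_max": 4000, "pixel": 250},
]

for scene in scenes:
    with_static = tone_map(scene["pixel"], STATIC_MAX)
    with_dynamic = tone_map(scene["pixel"], scene["scene_max"])
    print(f'{scene["name"]}: static={with_static}, dynamic={with_dynamic}')
```

In this toy model, a 250-nit pixel in the dim interior gets crushed to 62.5 nits under the movie-wide static figure (because the TV is bracing for the 4,000-nit desert scene), but stays at its intended 250 nits when the scene carries its own metadata. That's the kind of per-scene fine-tuning dynamic metadata enables.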
Here’s how Samsung describes it:
HDR10+ provides for scene-by-scene adjustments for the optimum representation of contrast from the HDR source content. Being an open format, it’s license/royalty free and therefore easily adoptable by manufacturers and content producers with quality maintained through a HDR10+ certification and logo program.
Yup, and just so there’s no confusion, HDR10+ has absolutely nothing to do with Google’s HDR+, an enhancement to camera phones. Similar names, totally unrelated. Well, they both have to do with HDR, but otherwise, not the same.
If you read all this and decided that HDR10+ exists because Samsung doesn’t want to pay Dolby licensing fees for HDR, well, you’d be right. That’s definitely the reason, though I’m sure they also just want HDR to succeed, too.
With the HDR10 ecosystem being a bit like the wild wild west, adding another layer of complexity could create additional problems. Will HDR10+ look the same, worse, or better than Dolby Vision? Impossible to say. Most likely it will come down to the specific transfers, content and so on.
Or to put it another way, it’s probable that HDR10+ and Dolby Vision will potentially look about the same. Dolby’s ace in the hole is, and will be, its hands-on involvement with the TVs themselves. A manufacturer pays Dolby not just for the ability to decode Dolby Vision content. Dolby will also show them how to make their TV look as good as possible with said DV content. There’s nothing like that on the HDR10 side — but there might be with HDR10+.
You may have noticed in the quote above mention of a “certification.” There are no details about this yet, but according to Samsung they hope to have something to announce by CES 2018 in January. They told CNET it will be a “quality-based certification and logo program for devices.” What level of performance TVs will have to meet to be certified, we’ll have to wait and see. This could be similar to what Dolby does, or it could be as simple as “yep, that’s an image.”
Lastly, the question you’ve probably wanted to ask this whole time: Will your TV work with it? Maybe yes, maybe no. Once again, Samsung:
our entire 2017 lineup has an HDR10+-capable engine, and we will consider them for certification when the program is announced. We are evaluating options for 2016 displays.
Whether other companies will join the HDR10+ bandwagon remains to be seen. LG has Dolby Vision, and it’s not like they jump at things created by Samsung. Other companies, we shall see. If they sign on, will they be able to firmware-update HDR TVs to work with HDR10+? I wouldn’t count on it, but I suppose it’s possible.