Communications Daily is a service of Warren Communications News.
Didn’t ‘Fall From the Sky’

HDR Standards Lacking, But Not for Lack of Effort at ITU-R and SMPTE

The lack of global standards was cited at last month’s NAB Show as the biggest “road block” inhibiting commercialization of next-gen high-dynamic-range (HDR) displays (CED April 10 p1). But it’s not lack of effort by the world’s standards developing organizations (SDOs) that’s to blame for HDR-enabling technologies’ not yet being ready for market introduction.

Of the world’s big SDOs, the ITU’s radiocommunication sector (ITU-R) and the Society of Motion Picture and TV Engineers (SMPTE) have been the most active in working to finalize standards on everything from signal capture to display that would make HDR-capable TVs a market reality, perhaps by 2017 or sooner. Observers who monitor ITU-R and SMPTE developments say those SDOs’ HDR standardization efforts, though late in getting started, have recently picked up steam.

Dolby Labs, one such observer and an HDR stakeholder for its efforts to commercialize Dolby Vision, has been working with the major studios “on next-generation imaging formats as outlined in a spec from MovieLabs,” the research and development organization run by the studios, said spokeswoman Abigail Holdaway. “Dolby supports the elements of the MovieLabs spec particularly related to extended dynamic range,” Holdaway said. “We are collaborating with the studios in standardizing various elements outlined in that specification in organizations such as SMPTE as well as ITU-R, but unfortunately are not at liberty to disclose the details of the work until the standards become public due to confidentiality policies.”

The MovieLabs spec, drafted last year, supports displays “with very high peak brightness and dynamic range/contrast ratio characteristics.” It defined those characteristics as peak brightness of 10,000 nits and black luminance of 0.005 nits, thus resulting in a “target” contrast ratio of 2,000,000:1. “The lower threshold of the encoding range shall be 0 nits,” while the “upper threshold” will be 10,000 nits, said the spec (http://bit.ly/1nlyGHq).
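The spec's stated contrast ratio follows directly from its two brightness figures — peak luminance divided by black luminance — as a quick check of the arithmetic shows:

```python
# Figures from the MovieLabs next-generation imaging spec
peak_nits = 10_000    # upper threshold of the encoding range
black_nits = 0.005    # black luminance

# Contrast ratio is simply peak luminance over black luminance
contrast = peak_nits / black_nits
print(f"{contrast:,.0f}:1")  # 2,000,000:1
```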

There’s a common belief that the existing electro-optical transfer function (EOTF) used for HDTV and Ultra HD will not support HDR images, said a SMPTE UHD ecosystem study group report released in late March (http://bit.ly/1inRMHP). An EOTF describes how to convert a video signal’s analog voltage, film density or digital code to optical energy, or visible light. Work is underway at the ITU-R and SMPTE on standards for a new “perceptual EOTF,” based not on the “gamma function” of a cathode ray tube, but directly on the “contrast sensitivity ratio of the human eye,” the report said. The technology behind gamma-based EOTFs dates back to the 1930s, but wasn’t officially written into ITU-R standards until 2011.
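For illustration, the legacy gamma curve the report refers to can be sketched in a few lines. This is a simplified form — a bare power function with a 2.4 exponent and a zero black level, the basic shape of ITU-R BT.1886 — not the full standard formula:

```python
def gamma_eotf(v, peak_nits=100.0, gamma=2.4):
    """Simplified CRT-legacy EOTF: signal value v in [0, 1] -> luminance in nits.

    Illustrative only; the full ITU-R BT.1886 formula also models the
    display's black level, which is omitted here for clarity.
    """
    return peak_nits * max(v, 0.0) ** gamma

# A mid-scale signal on a 100-nit reference display lands near 19 nits,
# because the curve allocates most of the signal range to darker tones
print(round(gamma_eotf(0.5), 1))
```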

The “fundamental basis for interpreting any visual signal is knowledge of that signal’s transfer function,” said a technical paper published in the SMPTE Journal by a team of Dolby engineers. In electronic displays for TV and film, that “critical information” is found in the EOTF, said the paper (http://bit.ly/1inSuET).

“Visible quantization artifacts” could easily result on HDR displays using today’s content delivery infrastructures, the paper said. “As the performance of electronic display systems continues to increase, the limitations of current signal coding methods become increasingly apparent. With bit-depth limitations set by industry standard interfaces, a more efficient coding system is desired to allow image quality to increase without requiring expansion of legacy infrastructure bandwidth. A good approach to this problem is to let the human visual system determine the quantization curve used to encode video signals. In this way, optimal efficiency is maintained across the luminance range of interest, and the visibility of quantization artifacts is kept to a uniformly small level.” Hence, the need for a perceptual-based EOTF, proposals on which have been submitted to the ITU-R not only by Dolby, but also by the BBC and by Philips.

Dolby calls its EOTF proposal a “perceptual quantization” (PQ)-based system that assumes peak brightness of 10,000 nits, in conformity with the MovieLabs spec. Other proposals are similar to Dolby’s in that all are perceptual in nature, perceptual curves being the most efficient coding systems available for accommodating HDR. “At 10,000 nits, although it seems a pretty extreme number from a video or cinema sense, it’s actually not too hard to get a real-world look at,” Dolby engineer Scott Miller said on a SMPTE webinar Tuesday. “It’s actually the typical brightness of a fluorescent tube. It’s bright, but it’s certainly not painful to look at in an indoor environment. So 10,000 is not as scary a number as it might seem at first.”

Dolby often fields questions about whether an EOTF optimized for 10,000 nits is “overly ambitious,” said Miller, one of the team of Dolby engineers who authored the SMPTE Journal technical paper. “The short answer is ‘no,’” he said. The system “gives you substantial gains in brightness” without wasting “code words,” he said. “So basically we can think of our 10,000-nit range as just headroom for future expansion. The headroom it takes in PQ to go from, let’s say, a 5,000-nit-peak system to a 10,000-nit-peak system is only about 7 percent of the code space, so we’re not taking up large numbers of codes to reserve for that room.”
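Miller's 7 percent figure can be sanity-checked against the PQ curve itself. The sketch below uses the constants from the published PQ formula, later standardized as SMPTE ST 2084 — an assumption on our part, since the article doesn't reproduce the math:

```python
# PQ inverse EOTF (luminance -> normalized code value). Constants are those
# of the published PQ formula (SMPTE ST 2084); assumed here, since the
# article itself doesn't give them.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Map absolute luminance (0 to 10,000 nits) to a code value in [0, 1]."""
    y = (nits / 10_000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# Code space consumed by raising peak luminance from 5,000 to 10,000 nits
headroom = pq_encode(10_000) - pq_encode(5_000)
print(f"{headroom:.1%}")  # roughly 7 percent, matching Miller's figure
```

Doubling the peak from 5,000 to 10,000 nits costs so little because the perceptual curve spends progressively fewer codes per nit as luminance rises.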

PQ is “the most efficient way to encode” HDR signals, and “I want to stress that a 10,000-nit system does make a lot of sense,” Miller said. “We’ve seen that the extra dynamic range is appreciated and preferred by viewers,” he said, citing consumer focus group studies Dolby has done and feedback in recent years from “lots of professional colorists” at the major studios. “The results are stunning, and it’s just a no-brainer preference by almost everyone who sees it.” As for the system’s future-proofing headroom, “even though we don’t have any 10,000-nit displays coming down the pike anytime soon, we may be surprised how quickly they come, given a system in place that can actually deliver the signals they can use.”

“For the longest time,” the maximum brightness of the average consumer display “hovered around 450 or 500 nits,” Joop Talstra, manager of intellectual property and standardization for Philips Consumer Lifestyle, told us. “So the content was created with this imaginary 100-nit display in mind that the consumer would have at home. All the TVs would accept this 100-nit signal and internally upconvert it to whatever their maximum brightness capability was.”

Recent advances in the brightness and cost performance of LED backlighting began spurring serious interest in HDR displays, Talstra said. Through Moore’s Law, LEDs have gotten “brighter a lot every year, and this gives you a new degree of freedom,” said Talstra, a member of the Philips HDR development team based in Eindhoven, the Netherlands. For example, “you can use this to lower the cost of the panels,” by including fewer LEDs to achieve the same amount of brightness as before, he said. “But what you’re seeing now is that display makers and panel makers are also experimenting with more brightness” in extended-dynamic-range products and prototypes, with very compelling results, he said.

Philips in its TV labs nearly 10 years ago made one of the world’s first high-brightness display prototypes, so the interest in HDR technology didn’t “just fall from the sky,” Talstra said. The prototype produced 4,000 nits of brightness, he recalled. “It was a huge beast, water-cooled, because LEDs back then were not nearly efficient enough,” he said. “But we were just trying to see what the future of displays would be like, and we were already convinced, ‘Wow, this is great.’”

The industry “really had to wait for LEDs to become efficient and cheap enough to make real products with it,” Talstra said. “And that’s now sort of starting to happen” in the form of displays with peak brightness “all the way up to 1,200 nits” and more, he said. “But what’s going wrong now is that the transmission standards are still designed for these imaginary 100-nit displays.” With upconversion of 100-nit signals for higher brightness, “extrapolating up to 500 nits may be OK, but at 1,000 nits, you’re really starting to make errors,” he said.

“If you’re going to have these high-brightness signals -- and marketing-wise, high brightness gets introduced with wide color gamut as well -- what you really need to transport that with fidelity is more bit depth,” Talstra said. But bill of materials “issues” abound “if you let bit depth grow in an unbridled way,” he said. “Right now, most interfaces are 8 bits, with internal processing maybe sometimes a little bit more. Increasing that from 8 to 10 is like a big deal in the silicon industry. Going to 12 bits is painful, and going beyond that is not going to happen anytime soon, not in the consumer domain, at least.”

Increasing a display’s dynamic range means “you really have to be careful how you’re going to spend those bits,” he said. “That means you have to have the right EOTF. The EOTF basically describes how you spend those quantization levels, those steps.” EOTF is “a big building block for HDR, to do it right,” though there are many other HDR-enabling technical standards that must also be finalized for HDR products to become even a high-end consumer reality, he said. Those standards typically will revolve around native HDR signal capture using new hardware and embedding the signal with the necessary metadata to control the display in an end-to-end system, said sources familiar with ITU-R development. “Extended” dynamic range describes those products that come to market before true HDR standards are complete, they said.
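Talstra's point about "spending" quantization levels can be made concrete. Under a perceptual curve such as PQ — constants again assumed from the published SMPTE ST 2084 formula, not taken from the article — a 10-bit code step near black covers a far smaller luminance increment than one near peak, roughly tracking the eye's contrast sensitivity:

```python
# PQ EOTF (code value -> luminance), using the constants of the published
# PQ formula (SMPTE ST 2084); an illustrative sketch, not a production decoder.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_decode(e):
    """Map a normalized PQ code value in [0, 1] to luminance in nits."""
    p = e ** (1 / M2)
    return 10_000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# Size of one 10-bit code step at the bottom, middle and top of the range:
# tiny fractions of a nit near black, tens of nits near the 10,000-nit peak
levels = 2 ** 10
for code in (1, 512, 1023):
    step = pq_decode(code / (levels - 1)) - pq_decode((code - 1) / (levels - 1))
    print(f"code {code:4d}: step ~ {step:.3g} nits")
```

The uneven spending is deliberate: the eye can detect far smaller absolute luminance differences in shadows than in highlights, so equal-looking steps require unequal nit increments.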

Meanwhile, CEA will await the progress that emerges from ITU-R, SMPTE and other SDOs before acting on its own HDR protocols, Brian Markwalter, senior vice president-research and standards, told us by email. “CEA standards projects adhere to a process in keeping with our ANSI accreditation and therefore have a well-defined approval step before opening a project,” Markwalter said of the American National Standards Institute, which oversees the development of voluntary product standards in the U.S. “It is correct that CEA does not have a current project addressing HDR,” which would be the first step in any HDR standardization effort, said Markwalter. “Given CEA’s role in creating standards for TVs and other CE devices, it is likely that a project incorporating HDR and perhaps other improvements to the video experience will be undertaken as these new approaches move through various organizations creating standards from capture to display.”