Luis Ochoa and Francisco Utray, March 29, 2016.
HDR: a new generation of displays
The objective of this guide on high dynamic range (HDR) for video is to present, in a didactic and informative format, the state of the art of HDR at this time (March 2016) and the challenge posed by the emergence of a new generation of screens and monitors with a greater ability to reproduce the brightness of images.
In this first part we focus on the characteristics of HDR displays and in the second installment we take a closer look at transfer functions (EOTF), gamma correction and the technical standards that the industry is adopting.
High dynamic range images have more detail in highlights and shadows: their maximum luminance values are brighter and their minimum values are darker. The aim of this technology is a new user experience, with an on-screen representation closer to the direct perception of reality.
HDR is linked to Ultra High Definition (Ultra HD or UHD). 4K TVs are already being marketed with good prospects, but to really appreciate the difference between Ultra HD and High Definition (HD), the 4K resolution must be accompanied by an increase in dynamic range (HDR), a wider color gamut (Wide Color Gamut, WCG) and a color depth (bit depth) of at least 10 bits. However, today's 4K TVs are a hybrid first generation, still anchored to the limitations of the BT.709 high-definition television standard.
For HDR there are different standards and proprietary solutions from different companies, and it is not yet known which one will prevail in the market.
- SMPTE published ST-2084, the first international HDR standard, in 2014.
- The BBC and NHK, for their part, have proposed 'Hybrid Log-Gamma' (HLG), which was published in 2015 as a standard by ARIB, the Japanese standardization body (ARIB STD-B67).
The International Telecommunication Union (ITU) is evaluating these two proposals to decide whether to adopt one of them or, more likely, to standardize both, since they cover different needs.
Industry players are also bringing their proprietary solutions to the table in an attempt to position themselves in this new market.
- Of particular note is the 'Dolby Vision' system, the pioneer of HDR. Dolby was the first company to offer a complete solution covering the entire production, distribution and exhibition chain with high dynamic range. Dolby's model, the 'Dolby Perceptual Quantizer' (PQ), is included in the SMPTE ST-2084 standard.
- Other companies such as Philips in conjunction with Thomson (SMPTE 2015: 35) or the Blu-ray Disc Association (2015) have also made progress in this area.
HDR is a technology for a new generation of monitors and screens with higher performance than the current ones and affects the entire production and distribution chain of audiovisual content. It has implications for film and television cameras, content mastering, distribution networks and finally the screens where the images are presented.
HDR in television and digital cinema
HDR in television and digital cinema is a technology that allows images to be presented with greater latitude, wider color gamut and higher contrast. The result is a more realistic visual representation.

SMPTE (2015: 4) proposes a definition for a high dynamic range system as one that “is specified and designed to capture, process and reproduce a scene, covering the entire perceivable range of detail in shadows and highlights, with sufficient accuracy and an acceptable level of artifacts, including sufficient separation between diffuse white and specular highlights.”
HDR display technology should not be confused with what in still photography is also called HDR: a technique in which different exposures are merged into a single image to achieve more detail in shadows and highlights (tone mapping). The end goal is similar, but the technique used for television and film is completely different.
“Although it’s using rather different techniques, HDR video is often likened to HDR photography as their aims are similar: to capture and reproduce scenes with a greater dynamic range than traditional technology can, in order to offer a truer-to-life experience. With HDR, more detail is visible in images that would otherwise look either overexposed, showing too little detail in bright areas, or underexposed, showing too little detail in dark areas.” (Geutskens 2016)
This technique of combining multiple exposures to maximize dynamic range has also been implemented by RED in its digital cinema cameras under the name HDRx.
But the film and television ecosystem is a complex scenario that requires international technical normalization (standardization) so that the various industry players – producers, distributors and screen manufacturers – can interoperate.
Technical standardization starts with the receiving equipment, i.e. professional monitors, televisions and cinema projectors. Once the standard for consumer devices is defined, camera manufacturers and content providers can prepare HDR material and its distribution for proper end-user reception.
It is also worth noting that the viewing conditions in a movie theater and on a television set in a home are very different. The dark room favors the rendering of shadows, but if the brightness of the highlights is increased too much, a problem arises with light reflection in the room – on walls, furniture and viewers’ faces – which affects the depth of blacks.
There are two key parameters that condition HDR production: the Electro-Optical Transfer Function (EOTF) of the displays and the maximum brightness level of the displays. To get into this subject, let’s start with the definition of the ‘contrast ratio’ of a monitor and the metrics used in this field.
Display contrast ratio
The unit used to measure the brightness of a display is the 'nit', or candela per square meter (cd/m²). Nits are used to specify both the maximum brightness level, which corresponds to pure white, and the minimum, which corresponds to black.
The 'contrast ratio' is the value that expresses the ability of a display to reproduce contrast. It is the ratio of the maximum to the minimum brightness level the display can reproduce.
For example, a TV set with a maximum brightness of 100 nits and a minimum brightness of 0.1 nits has a contrast ratio of 100/0.1, i.e. 1000:1. This is the current standard for HD televisions. If we increase the maximum brightness to 400 nits while keeping the black level at 0.1 nits, we get a contrast ratio of 4000:1. In other words, the contrast ratio, and therefore the dynamic range, has been increased.
In digital cinema projection, peak brightness is 48 nits (14 fL), but the minimum black levels are lower, achieving a contrast ratio of around 2000:1.
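As a rough illustration of this arithmetic, the following minimal Python sketch (our own example, not taken from any standard) computes the contrast ratio from the peak white and black levels in nits, and converts that ratio into an approximate number of f-stops, treating each stop as a doubling of luminance:

```python
import math

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Contrast ratio = peak luminance / black luminance (both in cd/m2)."""
    return peak_nits / black_nits

def f_stops(ratio: float) -> float:
    """Approximate dynamic range in f-stops: each stop doubles the luminance."""
    return math.log2(ratio)

# SDR television: 100 nits peak white, 0.1 nits black -> 1000:1 (about 10 stops).
print(contrast_ratio(100, 0.1), f_stops(contrast_ratio(100, 0.1)))
# Brighter panel with the same black level: 400 nits peak -> 4000:1 (about 12 stops).
print(contrast_ratio(400, 0.1), f_stops(contrast_ratio(400, 0.1)))
# Digital cinema projection: 48 nits peak; the 0.024 nits black level is simply
# inferred from the 2000:1 ratio quoted above, not a measured value.
print(contrast_ratio(48, 0.024), f_stops(contrast_ratio(48, 0.024)))
```

Note that this logarithmic conversion is only a simplified proxy: the usable dynamic range of a real display also depends on quantization, noise and viewing conditions, which is why the f-stop figures quoted elsewhere in this guide are not a straight logarithm of the contrast ratio.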
High dynamic range displays
As we noted in the introduction, most of the 4K TVs being marketed these days in department stores meet the spatial resolution requirements of Ultra HD but still operate with the BT.709 dynamic range and color space, i.e. with the conventional characteristics of high-definition (HD) displays.
The following graph compares the capability of human vision with that of conventional displays and the new high dynamic range displays.

Human vision is very capable of adjusting to different brightness levels in a scene because it has very complex and efficient micro-adaptation mechanisms, so it is very difficult to compare the human eye with a camera or a screen. Barten (2004) has proposed a formula for assessing the contrast sensitivity of the human eye that indicates a perceptual capability with a contrast ratio of 10,000:1, corresponding to about 15 f-stops of dynamic range (Borer 2014: 1)[1].
Traditional photochemical film cameras and the new digital cameras are perfectly capable of capturing this dynamic range. The problem lies in the displays, which so far have had very limited contrast and dynamic range, approximately 6 to 9 f-stops with a contrast ratio of 1000:1.
Therefore, the focus is currently on the production of a new generation of displays with higher contrast rendering performance.
The idea of producing HDR television displays came about four years ago, when manufacturers started using LEDs (Light-Emitting Diodes) with better color and brightness response. The development of OLED displays, built from organic semiconductors and requiring no backlight, has further improved this performance.
With this new technology, manufacturers have been able to create brighter displays while keeping shadow levels very low; in other words, displays with higher contrast. Many HDR displays are capable of reaching a maximum brightness level of 1,000 nits, and the standards foresee up to 10,000 nits in the future. The industry is thus laying the groundwork for a major step forward in image representation: the move from HD television with standard dynamic range (SDR) and a maximum brightness value of 100 nits to Ultra HD with high dynamic range.
HDR could also find its way into HD displays, although manufacturers may find it more commercially interesting to link it exclusively to the new Ultra HD displays.
For theatrical cinema, work is under way on laser projectors capable of keeping blacks at very low levels while increasing the contrast ratio and dynamic range and minimizing the impact of ambient light.
Dynamic range is also directly related to color spaces. The following graph shows the relationship between brightness level and color rendering. You can see how the extended color gamut of the Ultra HD standard (BT.2020) is meaningless unless it is accompanied by an increase in dynamic range: presenting the full range of color and brightness levels of Ultra HD requires an HDR monitor.

It should be noted that these dynamic range values cannot be achieved with a color depth of 8 bits per channel. It takes 10, 12 or 16 bits to record all this information.
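As a quick illustration of why (this snippet is our own, purely illustrative), the number of code values available per channel grows rapidly with bit depth; spreading a luminance range that may reach several thousand nits over only 256 levels would produce visible banding:

```python
# Distinct code values per channel for common bit depths.
# 8 bits give only 256 levels, which is too coarse to describe an HDR
# brightness range without visible banding; 10, 12 or 16 bits provide
# progressively finer steps for the extra brightness and color information.
for bits in (8, 10, 12, 16):
    print(f"{bits:>2} bits per channel -> {2 ** bits:>6} code values")
```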
In short, Ultra HD ‘high dynamic range’ (HDR) images contain much more brightness and color information for each pixel than HD with ‘standard dynamic range’ (SDR). This affects not only the maximum and minimum brightness values but also the color space.

It should be noted that the purpose of high dynamic range is not simply to create brighter images with more color, but to present more detail on the screen, especially in the highlights: information that is normally lost when it is blown out or 'clipped'. It is also intended to achieve more realistic, life-like colors, closer to the direct perception of reality.
In the second part of this guide we will look at gamma correction in HDR and the various technical solutions being discussed in international standardization organizations.
Luis Ochoa and Francisco Utray, March 29, 2016, 709 MediaRoom.
[1] (…) This dynamic range is similar to the simultaneous dynamic range of the human visual system, which is about 10,000:1. Humans can simultaneously see, in the same scene, brightness variations of this range, for example between shadows and highlights. (Borer 2014: 1)