[Updated 8/3/2019, with a corrected chart and explanation of how file size increases as bit-depth increases.]
Over the last few weeks, I’ve learned a lot about high dynamic range media. Let me share what I’ve learned.
Here are two definitions to get us started:
HDR actually means different things depending upon whether you are talking about stills or video. HDR stills combine multiple exposures into a single new image, while HDR video records a wider range of pixel values by capturing images with a broader dynamic range and saturation.
High-Dynamic Range (HDR) media is defined by three characteristics: greater pixel resolution, greater chroma saturation (a wider color gamut), and greater brightness. We often describe these as: more pixels, more colors, and brighter pixels.
NOTE: I should point out that it is entirely possible to use HDR media, or its close cousin, log media, in non-HDR projects. In fact, this is often done to provide more image flexibility during color grading.
As with all media, there are standards that govern HDR: Rec. 2020 and Rec. 2100.
However, though it encompasses more, we principally think of Rec. 2020 as governing the color space of HDR.
We principally think of Rec. 2100 as defining the ultimate goal of where we are going with HDR media. Rec. 2020 is where we are now, while Rec. 2100 is where we are headed.
Greater pixel resolution is the easiest of the three characteristics to meet. Most of us are familiar with using 4K or greater frame sizes. And, aside from needing more storage capacity and bandwidth, the process of editing 4K is no different than HD or, even, SD.
(Source data courtesy of Apple, Inc.)
This chart illustrates how storage capacity and bandwidth needs expand as frame sizes expand, given the same codec and frame rate. Notice, especially, that this curve is not linear: because frame size grows in both width and height, doubling the frame dimensions quadruples the pixel count and, with it, the storage required.
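To see why the curve bends upward, it helps to compare raw pixel counts directly. This sketch (the frame sizes are standard; the ratios follow from simple arithmetic, independent of any particular codec) shows how quickly pixel counts, and thus storage needs, multiply:

```python
# Pixels per frame for common frame sizes. Storage and bandwidth
# scale with pixel count when codec and frame rate are held constant.
FRAME_SIZES = {
    "SD (720x480)":       720 * 480,
    "HD (1920x1080)":     1920 * 1080,
    "UHD 4K (3840x2160)": 3840 * 2160,
    "8K (7680x4320)":     7680 * 4320,
}

hd_pixels = FRAME_SIZES["HD (1920x1080)"]
for name, pixels in FRAME_SIZES.items():
    print(f"{name}: {pixels / hd_pixels:.1f}x the pixels of HD")
```

Doubling both dimensions quadruples the pixel count, which is why UHD needs four times the storage of HD, and 8K needs sixteen times.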
(Source data courtesy of Apple, Inc.)
This chart illustrates how storage capacity and bandwidth needs expand as frame rates increase.
CORRECTED: And this chart illustrates how storage capacity and bandwidth needs expand as bit depth increases. As bit depth increases by 2, file size increases by 25%. (Remember, HDR requires a minimum of 10-bit video, while most RAW files are 12- or 14-bit depth.)
NOTE: Here’s an article I wrote that describes this in more detail.
In all cases, moving up to HDR media means using a LOT! more and faster storage to be able to access it for editing. In fact, total storage needs grow even faster than these charts, because we often combine larger frame sizes with faster frame rates and greater bit depth.
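The multiplicative effect is easy to verify with a rough uncompressed data-rate calculation. This is a back-of-the-envelope sketch, not a codec-accurate figure: it assumes three samples per pixel (RGB or 4:4:4), and real-world rates will be lower with chroma subsampling and compression. The ratios between configurations, however, hold regardless:

```python
def uncompressed_mbps(width, height, fps, bit_depth, samples_per_pixel=3):
    """Rough uncompressed data rate in megabits per second.

    Assumes samples_per_pixel=3 (RGB or 4:4:4); chroma subsampling
    (4:2:2, 4:2:0) and codec compression reduce real-world rates.
    """
    return width * height * fps * bit_depth * samples_per_pixel / 1e6

hd_rate  = uncompressed_mbps(1920, 1080, 30, 8)    # typical HD baseline
hdr_rate = uncompressed_mbps(3840, 2160, 60, 10)   # 4K, 60 fps, 10-bit

# 4x the pixels, 2x the frame rate, 1.25x the bits = 10x the data
print(f"HDR config needs {hdr_rate / hd_rate:.0f}x the bandwidth of HD")
```

Four times the pixels, twice the frame rate, and 25% more bits per sample compound to ten times the data rate, which is why storage needs grow faster than any single chart suggests.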
A bigger challenge than increasing resolution is increasing chroma saturation. We are all familiar with the Rec. 709 color space that we use every day in HD media. In fact, Rec. 709 and sRGB use the same color space. This explains why, when we create an image in Photoshop, it looks the same when we move it to an HD project in our NLE.
Unlike HD, HDR is a work in progress. Why? Because our current display technology can’t deliver all the colors specified in Rec. 2020, or the pixel brightness in Rec. 2100.
Before I illustrate this, let me provide some definitions.
(Image courtesy of Apple, Inc.)
Here’s an example of what I’m talking about. The big swatch of color represents all the colors a normal human eye can see.
The general consensus is that greater saturation, combined with greater brightness, makes our images look more “real.”
Of the three characteristics of HDR media, brightness is the most difficult to achieve. Current monitors just don’t have backlights bright enough to fully achieve Rec. 2100 specs. Well, not for a reasonable cost.
Here’s a good illustration, using video scopes from Final Cut Pro X. The image on the left is an HD waveform monitor with a white-to-black gradient in it. The image on the right is the same gradient in a full-range HDR project. There is 100 TIMES the brightness in an HDR image compared to an HD image!
Much of this additional brightness is designed for highlights, but the very top end is reserved for speculars, allowing them to glisten with a brightness that more nearly approaches reality.
Most “HDR” displays today range from 500 – 1000 nits. Digital Cinema uses 2,000 nits.
NOTE: In HD, we refer to these levels as IRE. In HDR, we refer to them as “nits.” One nit is the equivalent of one candela per square meter (cd/m2), where a candela is roughly the amount of light emitted by a common wax candle. Essentially, from the point of view of video scopes, nits and IRE are equivalent.
“As of 2018, high-end consumer-grade HDR displays can achieve 1,000 cd/m2 of luminance, at least for a short duration or over a small portion of the screen, compared to 250-300 cd/m2 for a typical SDR display.” (Wikipedia)
In editing, we work with two transforms of HDR: PQ and HLG. PQ provides the greatest flexibility, while HLG provides backwards-compatibility with SDR monitors and distribution hardware.
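The PQ transform mentioned above is standardized as SMPTE ST 2084, and its math is compact enough to sketch. The constants below are the published ST 2084 values; the functions map absolute luminance (0 to 10,000 nits) to a normalized signal value and back:

```python
# SMPTE ST 2084 (PQ) constants, as published in the standard
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Absolute luminance (0-10,000 nits) -> PQ signal value (0-1)."""
    y = max(nits, 0.0) / 10000.0
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

def pq_decode(signal):
    """PQ signal value (0-1) -> absolute luminance in nits."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)
```

Note that PQ allocates code values perceptually: SDR-range white (100 nits) already uses roughly half the signal range, leaving the rest for highlights and speculars. HLG takes a different approach, keeping its lower range compatible with an SDR gamma curve.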
For final distribution, there are three formats you need to know:
OTHER STUFF YOU NEED TO KNOW
While HDR is defined by more, richer and brighter pixels, when it comes to editing, there are other tips you need to know.
Digital Cinema Report wrote Jan. 16, 2019: “It is well-known that motion in high dynamic range [media] will reveal frame rate judder at 24 fps. So, when shooting HDR, use higher frame rates.” Yes, this will create bigger files, but it will also improve image quality.
In all cases, you cannot view HDR media on a computer screen. It is not bright enough, and most displays cannot show sufficient saturation. This means that you need to connect an external monitor to view HDR.
Currently, Apple advises using an AJA Io 4K which connects via Thunderbolt and provides both SDI and HDMI monitor outputs. However, the Io 4K is expensive: $1,995 (US).
HDMI began supporting some form of HDR with version 2.0a. The current 2.1 version supports PQ, HLG and Dynamic HDR. HLG is supported by video services such as the BBC iPlayer, DirecTV, Freeview Play, and YouTube.
NOTE: Currently, costs for a reference HDR monitor run between $10,000 and $20,000. They are not cheap. This is why the upcoming Apple Pro HDR Monitor is so exciting. It includes both a Thunderbolt interface and an HDR reference monitor for less than $5,000 (US).
Having HDR as a goal is not helpful unless we can clearly implement a workflow to achieve it.
Academy Color Encoding System (ACES) was created by the Academy of Motion Picture Arts and Sciences and released in December 2014. ACES is a complete color and file management system that works with almost any professional workflow and it supports both HDR and wide color gamut. More information can be found at https://ACESCentral.com.
Currently, the latest version of Avid Media Composer supports both HDR and ACES.
Apple Final Cut Pro X supports HDR Rec. 2020 PQ and HLG projects, but not the full ACES workflow.
Adobe Premiere supports HDR media in a Rec. 709 project, but provides only limited support for HDR media in an HDR project. It does not currently support any version of Rec. 2020.
COLOR GRADING TIPS
It is beyond the scope of this short article to describe grading HDR in any detail. But here are three tips from experts who are grading HDR today:
HDR is an evolving standard, both from a technology and a software viewpoint. But, since even HD projects will look better if given a color grade to add more brightness and color saturation, it’s time for the rest of us to start paying much closer attention.
Running parallel with HDR are terms like “raw,” “log,” and “LUTs.” Here’s an article that explains what these mean and illustrates how they are used.
Apple has an excellent white paper on HDR. While this focuses on FCP X, it provides an excellent background on how Macs handle color.
A second Apple white paper worth reading covers the new ProRes RAW format. This is an excellent discussion of what a RAW format is and how ProRes RAW differs from other RAW formats.
To work with RED files on a Mac, you’ll need to install the Red Apple Workflow Installer, available from https://www.red.com/download/red-apple-workflow-installer.
MediaInfo, from MediaArea, is an excellent utility for examining the metadata in HDR media. Available from the Mac App Store, it costs $0.99.