Dear Mama

7/12 - Update: “Dear Mama” Emmy nomination for Best Documentary or Nonfiction Series

Well, if you are my age and grew up in the ’90s, you listened to Tupac. Even all of us white boys from Camarillo knew every word to every song. His music really was the soundtrack to the last decade of the millennium and my youth.

Tonight is the final installment of “Dear Mama,” FX’s most-watched unscripted show.

Perfect for a Mother’s Day weekend! Please go check it out on FX tonight or streaming on Hulu.



Allen Hughes directed this insightful docuseries. Fitting, because Allen and his brother directed Tupac’s early music videos. Sure, there was a bit of drama, but that adds to the flavor of the story. That connection to the material made Hughes the quintessential choice for captaining this ship. Tupac wasn’t any one thing; he was more like an eclectic stew of many influences and identities. One thing is for sure: dude was thug life for real.

Cognac hues or Hughes as it were

Allen was clear on the look and vibe he wanted for the series. Cognac was the word. We spent a couple of weeks developing a look that feels like the light has been filtered through a fine liquor. We also used LiveGrain to achieve that perfect end-of-the-film-era Kodak grain structure of the ’90s.


Documentary grading is an entirely different beast. Here are a few tips for you to tackle your next interview-based production.

  1. Color management - I preach this a lot, but it’s even more critical when you’re juggling many different sources.

  2. Sounds basic, but group your interviews.

  3. Normalize the frame rate upfront.

  4. AI up-rez is like salt; a little is good, but too much ruins the dish. Don’t be afraid to let some pictures just look old.

  5. Build a KEM reel of all interview setups. Having the A and B cam shots together in the timeline will help you reference grades quickly.

The first step was look development. Allen had already shot some of the interviews, which we used to refine the look. I built an LMT that had the cognac golden vibe. I used that look and the ACES standard outputs to create a 709 LUT for Avid media creation. Eric DeAzevedo was the operator responsible for many terabytes of dailies. We also normalized all the archival footage to 23.98 during the dailies step. Cortex was used to make the MXF files and bins. We had to double-hop to render in LiveGrain since it wasn’t supported in Cortex at the time.

Early on, we were still in the late stages of the COVID lockdown. I built a reel of every interview setup and had a ClearView session with Hughes and Josh Garcia (producer). This session was super critical to our success going forward. It set the bible for the show’s look and ensured that Allen’s vision stayed consistent through the many days of shooting. At the start of each episode, I applied our base settings using a “Fuzzy” match (yes, that is a real Baselight thing). Basically, “Fuzzy” is a setting that lets the machine match grades presumed to be from the same camera roll rather than matching by timecode. This put all the interviews 90% of the way there from the get-go. The next step was to sort the timeline by clip name and time of day. I would then work through a pass where I tracked the shapes and balanced out any inconsistencies in lighting as the sun hung lower throughout the day.

The archival footage didn’t have as graceful a strategy. Each shot was its own battle, as the quality differed from source to source. My main goal was to ensure that it was cohesive and told the story Allen was crafting.

The first deliverable out of the gate was a theatrical version for the Toronto International Film Festival. I graded in ACEScc going out to PQ 1000 nits. That master was then run through the Dolby Vision analysis, and a P3 D65 48-nit version was trimmed. Finally, we applied a P3 D65-to-XYZ LUT on the output render to create the DCDM.

The biggest challenge of this show was keeping up with editorial. As you can imagine, documentary storytelling is honed in the edit bay. The edit was constantly being updated as shots were cleared or discovered. Back at my shop, Leo Ferrini would continually update my project to chase editorial. Multi-Paste (Remote Grades for our Resolve friends) was clutch in this situation. We took the old grades and copied them across. Leo would categorize the new material so I could sort the scene for the changes. The timelines evolved and took shape until we got Allen in for the final grade. Allen has a great eye and religiously kept us in the world he had envisioned. We paid particular attention to eye-trace, ensuring the information in each visual told a straightforward story without distraction. Next came a Dolby trim pass to take the approved PQ to 709. We would send that 709 file to Allen and get notes before creating the final IMF for delivery.

A super big thanks to Paul Lavoie for managing this one. There were many moving parts on this production, but thanks to him, I rarely felt it. It’s a blessing to have a partner who doesn’t mind getting his hands dirty even though he’s one of the suits😜.


Be sure to check out this killer doc about one of our generation’s most prolific artists, told through Hughes's equally unparalleled artistic voice. Allen is a true master of many formats but has solidified his place as one of the best documentarians. Thanks for taking the time to peek behind the curtain, and let me know what you think.

Here are some more before and afters. Mellow yella’ Dan Muscarella would have been proud.







How To - Dolby Vision

Dolby Vision - How To and Best Practices



What is Dolby Vision

Dolby Vision is a way to dynamically map HDR to different display targets. At its core, the system analyzes your media and transforms it to dynamic ranges below your mastering target, typically SDR at 100 nits.

Project Setup

The first step is to license your machine. Once that is in place, you need to set up your project. Go into settings and set your CMU (Content Mapping Unit) version. Back in the day, we used an external box, but nowadays the software does it internally. V4 is the current iteration, whereas v2.9 is a legacy version that some older TVs use. Finally, set your mastering display. In my case, that is a Sony X300 set up for PQ P3 D65 1000 nits.

Baselight Project Setup

Resolve Project Setup

It’s not what you said, it’s your tone

The goal is to make our HDR master look good on an SDR display. To do this, we need to tone map our HDR ranges to the corresponding SDR ranges. This is a nonlinear relationship, and our shadows, mid-tones, and highlights will land in the wrong areas if we don’t tone map them first. See below for an example of an SDR image that has not been tone mapped correctly; you can see the highlights are way too hot. Now, we could use a curve and shape our image into a discrete master for SDR, but most studios and streamers are requesting a Dolby delivery regardless of whether a separate SDR grade was made. Plus, Dolby does a pretty decent job of getting you there quickly since the v4 release.

The first step is to analyze your footage. This produces three values that set the tone curve: min, max, and average. These values inform the system how to shape the curve to get a reasonable rendering of your HDR master in SDR.

Image courtesy of Dolby

Tone mapping from HDR to SDR

What we are trying to do here is fit 10 pounds of chicken into an 8-pound bag. Something has to give (usually the bones), but the goal is to keep as much chicken as you can. Rather than toss data out, we compress it. The system calculates the min, max, and average light levels. The idea is to keep your average, the “meat and potatoes” of your shot, intact while compressing the top and bottom ranges. The end result is an SDR image that resembles your HDR master, only flatter.
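To make the min/max/average idea concrete, here is a toy Python sketch. To be clear, this is my own illustration, not Dolby’s actual curve (the real mapping is nonlinear and proprietary); it just shows the principle of anchoring the average while squeezing the ranges above and below it.

```python
# Toy illustration of min/max/average tone mapping. NOT Dolby's actual
# math; it only demonstrates the idea: pin the average, then compress
# the highlight and shadow ranges into the smaller SDR container.

def analyze(nits):
    """Return (min, max, avg) light levels for a list of pixel values in nits."""
    return min(nits), max(nits), sum(nits) / len(nits)

def tone_map(value, src, dst):
    """Map one HDR value (in nits) into the SDR range.

    src: (min, max, avg) from the HDR analysis
    dst: (min, max, avg) targets for the SDR display
    The average (the "meat and potatoes") stays anchored while
    everything above and below it is compressed into what is left.
    """
    s_min, s_max, s_avg = src
    d_min, d_max, d_avg = dst
    if value >= s_avg:  # compress highlights into the remaining headroom
        t = (value - s_avg) / (s_max - s_avg)
        return d_avg + t * (d_max - d_avg)
    t = (s_avg - value) / (s_avg - s_min)  # compress the shadow toe
    return d_avg - t * (d_avg - d_min)

hdr_pixels = [0.01, 5.0, 40.0, 120.0, 600.0, 1000.0]  # from a 1000-nit master
src = analyze(hdr_pixels)
sdr_pixels = [tone_map(v, src, (0.0, 100.0, 20.0)) for v in hdr_pixels]
```

The 10 pounds of chicken land inside the 8-pound bag: 1000 nits maps to 100, the blacks hold, and everything in between compresses around the average.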

How a colorist goes about the analysis is just as important as the analysis itself. This is going to get into a religious debate more than a technical one, and everything from this point on is my opinion based on my experiences with the tech. Probably not what Dolby would say.

The original design of the system wanted you to analyze every shot independently. The problem with this approach is that it can take a consistent grade and make it inconsistent depending on the content. Say you had two shots from the same scene.

One side of the coverage shoots the character against a blown-out window; the other side shoots into the darker part of the house. Even though you as a colorist have balanced them to taste, the Dolby analysis will produce two very different sets of values for these shots. To get around this, I find it is better to average the analysis for each scene versus analyzing shots independently. The first colorist I saw work this way was my good friend and mentor Walter Volpatto. He went toe to toe with Dolby because his work was getting QC rejections based on his method. He would analyze only a grey ramp, with the d-min and d-max values representing his media, and apply that to his entire timeline. His thought process: if it was one transform up to HDR, it should be one transform down.

Most studio QC operations now accept this approach as valid metadata (thank you, Wally!). While I agree with his thought process, I tend to work with one analysis per scene. Resolve has this functionality built in. When I’m working in Baselight, I set it up this way and copy the scene-averaged analysis to every shot in preparation for the trim.

Scene average analysis in Baselight.
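The bookkeeping behind scene averaging is simple enough to sketch. Here is a hypothetical Python version of the idea; the (min, max, avg) triplets below are made-up numbers, since in practice they come from the Dolby analysis inside Baselight or Resolve.

```python
# Sketch of scene-averaged Dolby analysis. The triplets are hypothetical
# stand-ins for real per-shot analysis values.

def scene_average(shot_analyses):
    """Collapse per-shot (min, max, avg) triplets into one scene-wide triplet."""
    n = len(shot_analyses)
    mins, maxes, avgs = zip(*shot_analyses)
    return (sum(mins) / n, sum(maxes) / n, sum(avgs) / n)

# Two balanced shots from the same scene: one into a blown-out window,
# one into the darker side of the house.
window_side = (0.005, 1000.0, 180.0)
dark_side = (0.001, 250.0, 12.0)

scene = scene_average([window_side, dark_side])
# Copy the scene value to every shot so one transform governs the scene,
# mirroring how I prep the Baselight timeline before the trim pass.
per_shot = {"window_side": scene, "dark_side": scene}
```

Both shots now carry identical metadata, so the downward transform treats them the way the grade does: as one scene.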

Setting the tone

Now that your analysis is complete, it’s time to trim. First, you need to set which display output your trim is targeting and the metadata flag for the intended distribution. You can also set any masking that was used so the analysis doesn’t calculate the black letterbox pixels. The most common targets are 709 100 nits, P3 48 nits, and PQ 108 nits. The 709 trim is for SDR home distribution, whereas the other two are for theatrical distribution. The reason we want to keep the home video and cinema trims separate is that displays falling between two trim targets are interpolated. Notice that the theatrical 108-nit trim is numerically very close to the home video 100-nit trim, yet the two are graded very differently: the theatrical version is intended for a dark theater, while home viewing assumes dim surround lighting. Luckily, Dolby recognized this, and that is why we now have separation of church and state. The process for completing these trims is the same, though; only the target changes.

Trim the fat

Saturation plus lift, gamma, and gain is the name of the game. You also have advanced tools for highlight clipping and mid-tone contrast. Additionally, you have very basic secondary controls to manipulate the hue and saturation of the six vectors.

Baselight Dolby trim controls.

Resolve Dolby trim controls.

These secondary controls are very useful when you have extremely saturated colors on the boundaries of your gamut. I hope Dolby releases a way to target only the very saturated color values instead of the whole range of a particular vector, but for now, these controls are all we have.

Mid-tone offset

Another tool that affects the analysis data, but could be considered a trim, is the mid-tone offset. A good way to think about this tool is as a manual shift of your average. It slides the curve up or down from the midpoint.
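Sticking with the (min, max, avg) mental model, here is a tiny hypothetical sketch of what that shift amounts to. This is my analogy, not the actual Dolby implementation: nudge the average, and the tone curve rebuilds around the new midpoint.

```python
# Rough stand-in for the mid-tone offset trim: shift the average of the
# analysis triplet and let the curve follow. Illustrative only; not the
# real Dolby math.

def midtone_offset(analysis, offset):
    """Shift the avg of a (min, max, avg) triplet, clamped between min and max."""
    lo, hi, avg = analysis
    shifted = min(hi, max(lo, avg + offset))
    return (lo, hi, shifted)

thick = (0.005, 1000.0, 180.0)          # a base analysis that feels "thick"
opened_up = midtone_offset(thick, -40)  # lower the average to thin it out
```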

I usually find the base analysis and the subsequent standard conversion a little thick for my taste. I start by finding a pleasing trim value that works for the majority of shots. Then I ripple that as a starting place and trim from there until I’m happy with the system’s output. The before and after below shows the standard analysis output versus where I ended up with the trim values engaged.

Once you are happy with the trims for all of your needed outputs, it’s time to export. This is done by exporting the XML recipes that, when paired with your PQ master, will create all the derivative versions.

XML

Here are two screenshots of where to find the XML export options in Baselight and Resolve.

Right-click on your timeline -> Timelines -> Export -> Dolby XML

Shots View -> Gear Icon -> Export Dolby Vision Metadata… This will open a menu that lets you choose your location and set primaries for the file.

The key here is to make sure that you are exporting an XML that reflects your deliverable, not your DSM. For example, I typically export PQ P3 D65 TIFFs as the graded master files. These are then taken into Transkoder, placed into a Rec2020 container, and married with the XML to create an IMF. It’s important to export a Rec2020 XML instead of a P3 one so that, when it is applied to your deliverable, it yields the intended results. You can always open your XML in a text editor if you are unsure of your declared primaries. I have included a screen grab of what the XML should look like with Rec2020 primaries on the left and P3 primaries on the right. Always go by the numbers, because filenames can lie.

Rec2020 XML vs P3 D65
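If you’d rather not eyeball the raw XML, a few lines of scripting can do the “go by the numbers” check for you. This sketch deliberately avoids assuming the exact Dolby schema; it just scans the file text for each gamut’s red-primary chromaticity coordinates (Rec2020 red sits at x 0.708, y 0.292; P3-D65 red at x 0.680, y 0.320). A real file may carry more decimal places, so treat this as a starting point.

```python
# Schema-agnostic sanity check of the primaries declared in a Dolby
# Vision XML: look for each gamut's red-primary chromaticity values in
# the raw text. Real exports may format these with more precision.

RED_PRIMARY = {
    "Rec2020": ("0.708", "0.292"),
    "P3-D65": ("0.680", "0.320"),
}

def declared_gamut(xml_text):
    """Return the first gamut whose red-primary coordinates appear in the text."""
    for name, (x, y) in RED_PRIMARY.items():
        if x in xml_text and y in xml_text:
            return name
    return None

# Illustrative snippet only; not the actual Dolby schema.
sample = "<Primaries><Red>0.708 0.292</Red></Primaries>"
gamut = declared_gamut(sample)
```

Run it over the exported sidecar before it goes to Transkoder, and a P3 XML masquerading under a Rec2020 filename gets caught early.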

There is beauty in the simplicity of this system. Studios and streamers love the fact that there is only one serviceable master. As a colorist, I love that when there is a QC fix, you only need to update one set of files and sometimes the XML. That’s a whole lot better than at the height of the 3D craze, when you could have up to 12 different masters, and that is not even counting the international versions. I remember finishing Katy Perry’s “Part of Me” in 36 different versions. So in retrospect, Dolby did us all a great service by transmuting all of those versions we used to painstakingly create into one manageable XML sidecar file.

Thanks for reading

I bet these trim passes eventually go the way of the 4x3 version, especially with the fantastic HDR displays available from Samsung, Sony, and LG at continually lower price points. Remember, the Dolby system only helps at home when the display is something other than what the media was mastered at. Until then, I hope this helps.

Check out this Dolby PDF for more information and a deeper dive into the definitions of the various XML levels. As always, thanks for reading.

-JD

DC League of Super Pets

Super excited for everyone to check out the latest from Warner Animation. DC League of Super Pets is a super fun romp expertly animated by the talented team at Animal Logic.

Toshi the Wonder Dog!

Anybody who has ever had a pet is going to love this movie.

Post services provided by Warner PPCS included an ACES HDR picture finish and sound. Truly a post-production one-stop shop.

The project was supervised by Randy Bol. The great thing about working with Randy is the level of trust we’ve built collaborating on many other projects. There is definitely a shorthand when both of us are in the suite. One of the best post sups you’ll ever work with, plus just a good dude too.

Color was supervised by co-director Sam Levine. This guy was cracking me up every session. Not only was he hilarious, but damn, what an eagle eye. I was sort of bummed when our time together ended.

A big thanks to Paul Lavoie and Leo Ferrini too for keeping the ship afloat. I would be drowning in a pile of pixels without these guys.

Now go see DC League of Super Pets only in theaters… Preferably a Dolby Cinema one.

Space Jam: A New Legacy - Multiple Worlds, Multiple Deliveries.

Hey everybody! Space Jam: A New Legacy, directed by Malcolm D. Lee, is out today. I wanted to take a second to highlight the super slick color workflow, which allowed us to work on multiple versions concurrently.

Capture

Space Jam: A New Legacy was masterfully lensed by Salvatore Totino. The two primary capture mediums were 35mm Kodak film and the entire lineup of Arri cameras, mainly the LF. The glass used was Zeiss Supremes and Master Primes. There were also a handful of scans from archival films which were used as plates for animation.

VFX

ILM ran point for the VFX on this show. Grady Cofer and his team were a dream to work with. There is a reason ILM continues to be best in class. The knowledge and expertise ILM employs is second to none. Early on, Grady connected me with their head of color science, Matthias Scharfenberg. I thought I knew what I was doing when it comes to color science until I saw what Matthias had going on with CTL and Nuke. I learned a lot from our chats. He was super gracious in sending over his Nuke scripts, which allowed me to build a Baselight transform that matched ILM’s pipeline. This ensured a one-to-one representation of their stellar work.

Two Worlds, One Grade

The show can basically be broken down into two looks. In “Space Jam: A New Legacy,” there is the real world and the Warner Bros Serververse.

We chose an analog celluloid vibe for the real world. The Serververse has a super clean, very 1s-and-0s look to it. Most of the real world was shot on film or on Arri Alexa with film emulation curves paired with a grain treatment. Some sequences mix the two. Let me know if you can tell which ones😉.

The look of the digital world changes depending on where the characters are in the Serververse. The base look of the Serververse is the vanilla ACES ODT with restricted primaries in the mid-tones, complemented by exaggerated saturation for highly saturated colors.

All the other looks are riffs off this base LMT with the exception of the library classics. These were graded to look like their existing masters and the new footage was matched in.

Multiple Deliverables, One Timeline

The challenge of this show, beyond the sheer number of VFX shots and moving parts, was the delivery schedule. Post supervisor Lisa Dennis asked to have the theatrical version and the HDR video versions delivered days apart. To hit the dates requested, I graded simultaneously in HDR and SDR. I did most of the heavy lifting in HDR PQ 1000 nits. Then I trimmed at 14FL to ensure the reel was ready for filmmaker review. Popping back and forth between outputs was made possible by two great tools. Firstly, I used ACES 1.1 color management to normalize all the different sources into one grading space.

Secondly, I used Baselight’s “Bypass Categories” functionality to if/then the timeline. Basically, I had one timeline that would represent itself differently depending on the output selected. Different layers were toggled for different sources and outputs. The LMTs used often had SDR and HDR versions, further multiplying the combinations. This was a critical hurdle to overcome, and Baselight gave me the tools to organize a very complicated timeline with ease.

Approvals

The color sessions were supervised by Malcolm, Sal, and Bob Ducsay. We used Nevion and ClearView for remote sessions, but most of the work was done in person on the lot here in Burbank. The animated sequences were supervised by Spike Brandt and Devin Crane. These guys are animation heavyweights, so it was very cool to be in such good company for an animation nerd like me.

Most of the tweaking on the animation was for continuity fixes. A few of the shots we composited for final in Baselight. This gave Devin and Spike a little more creative freedom than a baked shot would have allowed.

Reference for Tweety’s floor

After all the color decisions were made, Malcolm had his final pass, and the masters were created. All deliverables from that point were sub-masters of the hero PQ deliverable, such as the Dolby Vision theatrical version and the 709 SDR version derived from the Dolby XML metadata.

Go See It!

Thanks for reading how the look of this candy-colored revival came together. Working on Space Jam was a wild ride. I had to tap into my background in photochemical film processing and my knowledge of the latest digital grading techniques to create unique looks for all the different cinematic worlds visited. The film is a nostalgic love letter to the rich history and legacy of the Warner Bros. Studio. I couldn’t be more proud of the Warner color team, especially Leo Ferrini and Paul Lavoie. A big thanks to you guys! Putting this film together was a monumental task, and I am ecstatic with the end result. Check it out in theaters and on HBO Max today!