Post Perspective - Color Pipeline: Virtual Roundtable

Here is a Q&A I was recently included in. Check out the full article here at postperspective.com

Warner Bros. Post Creative Services Colorist John Daro

My mission control station at HQ

Warner Bros. Post Production Creative Services is a post house on the Warner Bros. lot in Burbank: “We specialize in feature films and high-end episodic projects, with picture and sound finishing under one roof. We also have editorial space and visual effects offices just one building over, so we truly are a one-stop shop for post.”

What does your setup look like tools-wise?


I have been a devotee of FilmLight’s Baselight for the past five years. It is the beating heart of my DI theater, where I project images with a 4K Christie projector and monitor them on two Sony X300s. For that “at-home” consumer experience, I also have a Sony A95K.

Although I spend 90% of my time on Baselight, there are a few other post-software necessities for my craft. I call my machine the “Swiss army box,” a Supermicro chassis with four Nvidia A6000s. I use this machine to run Resolve, Mistika, Photoshop, and Nuke. It also makes a fine dev box for my custom Python tools.

I always say, “It’s not the sword; it’s the samurai.” Use the right tool for the right job, but if you don’t have the right tool, then use what you’ve got.

Do you work in the cloud? If so, can you describe that workflow and the benefits?


Not really. For security reasons, our workstations are air-gapped and disconnected from the outside world. All media flows through our IO department. However, one cloud tool I do use is Frame.io, especially for exchanging notes. I really like how everything is integrated into the timeline. It’s a super-efficient way to collaborate. In addition to those media uploads, the IO team archives finished projects and raw scans to the cloud.

I do think cloud workflows are gaining steam, and I definitely have my eye on the space. I can envision a future where we send a calibrated Sony X3110 to a client and then use Baselight in the cloud to send JPEG XS straight to the display for remote approvals. It’s a pretty slick workflow, and it also gets us away from needing the big iron to live on-prem.

Working this way takes geography out of the equation too. I would love to work from anywhere on the planet. Bring on the Tiki drinks with the little umbrellas somewhere in the tropics with a laptop and a Mini Panel. All joking aside, it does open the talent pool to the entire world. You will be able to get the best artists regardless of their location. That’s an exciting prospect, and I can’t wait to see what the future holds for this new way of looking at post.

Do you often create LUTs for a project? How does that help?


I mostly work with curves and functions to do my transforms, but when on-set or editorial needs a preview of what the look will be in the room, I do bake LUTs out. They are especially critical for visual effects reviews and dailies creation.

There’s a film project that I’m working on right now. We’re doing a scan-once workflow on that show to avoid over-handling the negative. Once scanned, there is light CDL grading, and a show LUT is applied to the raw scans to make editorial media. The best looks are the ones that have been developed early and help to maintain consistency throughout the entire workflow. That way, you don’t get any surprises when you get into the final grade. Temp love is a thing… LUTs help you avoid loving the wrong thing.
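
If you are wiring up a similar dailies pass, the CDL half of that recipe is just the ASC slope/offset/power formula with a saturation trim on top. Here is a rough numpy sketch of that math with invented trim values; the show LUT gets applied downstream of this, and the real pipeline obviously runs in dedicated dailies software.

```python
import numpy as np

# ASC CDL math: out = (in * slope + offset) ** power, followed by a saturation
# trim using Rec.709 luma weights (the usual dailies order of operations).
def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    rgb = np.asarray(rgb, dtype=np.float32)
    out = np.clip(rgb * slope + offset, 0.0, None) ** power
    luma = out @ np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)
    return luma[..., None] + saturation * (out - luma[..., None])

# Example trim: the values are made up, purely for illustration.
graded = apply_cdl(
    np.random.rand(1080, 1920, 3).astype(np.float32),   # stand-in for a scan
    slope=np.array([1.02, 1.00, 0.98], dtype=np.float32),
    offset=np.array([0.01, 0.00, -0.01], dtype=np.float32),
    power=np.array([1.00, 1.00, 1.05], dtype=np.float32),
    saturation=0.95,
)
```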

Do you use AI as part of your daily job? In what way?

Superman II Restoration


I do use a bit of AI in my daily tasks, but it’s the AI that I’ve written myself. Originally, I started trying to make an automated dust-buster for film restoration. I failed miserably at that, but I did learn how to train a neural net, and that led to my first helpful tool.

I used an open-source image library to train an AI up-rezer. Although this is commonplace now, back then, it was scratching an itch that hadn’t been scratched yet. To this day, I do think my up-rezer is truer to the image and less “AI”-feeling than what’s available off the shelf.

After the up-rezer, I wrote Match Grader in 2020, which essentially takes the look and vibe from one shot and applies it to another. I don’t use it for final grading, but it can be very useful in the look-dev process.
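
To give a sense of the concept (this is not the Match Grader code), the simplest version of “take the vibe of shot A and push it onto shot B” is a textbook mean and standard-deviation transfer, shown here in RGB for brevity:

```python
import numpy as np

def transfer_look(source, reference):
    """Shift the source's per-channel mean/std toward the reference's.

    A classic statistics-based color transfer (Reinhard-style), included
    only to illustrate the idea of pushing one shot's look onto another.
    """
    src = np.asarray(source, dtype=np.float32)
    ref = np.asarray(reference, dtype=np.float32)
    out = np.empty_like(src)
    for c in range(3):  # per channel
        s_mean, s_std = src[..., c].mean(), src[..., c].std() + 1e-6
        r_mean, r_std = ref[..., c].mean(), ref[..., c].std()
        out[..., c] = (src[..., c] - s_mean) * (r_std / s_std) + r_mean
    return np.clip(out, 0.0, 1.0)
```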

Building on what I had learned coding Match Grader, I subsequently developed a process to use machine vision to create a depth channel. This turns your Power Windows from circles and squares into spheres and cubes. It is a very powerful tool for adding atmosphere to images. When these channels are available to me, one of my favorite moves is to desaturate the background while increasing the contrast in the foreground. This adds dimension to your image and helps to draw your eye to the characters where it was intended.
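
As a rough illustration of that move, assuming you already have a normalized depth channel from whatever estimator you like, the grade is just a depth-weighted mix of a desaturated background version and a higher-contrast foreground version:

```python
import numpy as np

def depth_grade(rgb, depth, bg_sat=0.6, fg_contrast=1.15, pivot=0.18):
    """Blend two simple grades by depth: desaturate the far plane and add
    contrast to the near plane.

    rgb   : float image, 0-1
    depth : float map, 0 = nearest, 1 = farthest (from any depth estimator)
    """
    rgb = np.asarray(rgb, dtype=np.float32)
    w_far = np.clip(depth, 0.0, 1.0)[..., None]      # 1.0 in the background

    luma = rgb @ np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)
    desat = luma[..., None] + bg_sat * (rgb - luma[..., None])   # background grade
    contrasty = pivot * (rgb / pivot) ** fg_contrast             # pivot contrast up front

    return np.clip(w_far * desat + (1.0 - w_far) * contrasty, 0.0, 1.0)
```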

These channels can also aid in stereo compositing, but it’s been a minute since I have had a 3D job cross my desk that wasn’t for VR.

Machine vision segmentation with YOLO, 16 fps at 4K

Lately, I have been tinkering with an open-source library called YOLO (You Only Look Once). This software was originally developed with applications like autonomous driving in mind, but I found it useful for what we do in color. Basically, it’s a very fast image segmenter. It returns a track and a matte for what it identifies in the frame. It doesn’t get everything right all the time, but it is very good with people, thankfully. You wouldn’t use these mattes for compositing, but they are great for color, especially when used as a garbage matte to key into.
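
For the curious, here is roughly what pulling person mattes looks like with the off-the-shelf Ultralytics segmentation build of YOLO. The file name is just a placeholder, and this is only the bare mechanics, not a finished tool:

```python
import numpy as np
import cv2
from ultralytics import YOLO  # pip install ultralytics

model = YOLO("yolov8n-seg.pt")   # small segmentation model; class 0 = person

# stream=True yields one result per frame; classes=[0] keeps only people.
for result in model.track("shot_0010.mov", classes=[0], stream=True):
    h, w = result.orig_shape
    matte = np.zeros((h, w), dtype=np.float32)
    if result.masks is not None:
        for m in result.masks.data:            # one low-res mask per person
            mask = cv2.resize(m.cpu().numpy(), (w, h))
            matte = np.maximum(matte, mask)
    # "matte" is a soft garbage matte you could hand off to the color corrector
```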

I have also recently refreshed my AI up-rezer. I built in some logic that is somewhat “intelligent” about the source coming in. This way, the process is not a one-size-fits-all operation.

SamurAI Image Restoration

It can now auto-detect interlacing and cadence and perform a general analysis of picture quality. That allows me to throttle the strength and end up with the perfect amount of enhancement on a case-by-case basis. The new tool is named SamurAI.
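
As one example of how that kind of detection can work (not necessarily how SamurAI does it), a simple comb metric compares adjacent-row differences, which cross fields, against rows two apart, which stay within a field:

```python
import numpy as np

def comb_score(frame_gray):
    """Rough interlace indicator. Interlaced frames with motion show much more
    energy between adjacent rows (opposite fields) than between rows two apart
    (same field)."""
    f = np.asarray(frame_gray, dtype=np.float32)
    adjacent = np.abs(f[1:, :] - f[:-1, :]).mean()    # crosses fields
    same_field = np.abs(f[2:, :] - f[:-2, :]).mean()  # stays within a field
    return adjacent / (same_field + 1e-6)

# Heuristic: scores well above 1.0 across many frames suggest an interlaced
# source; a repeating pattern over 5-frame windows hints at 3:2 pulldown.
```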

If given an example from another show or work of art, what is the best way to emulate that?


It’s good to be inspired, but you never want to be derivative. Often, we take many examples that all have a common theme or feeling and amalgamate them into something new.

That said, sometimes there are projects that do need a literal match. Think film emulation for a period effect. There are two ways to approach it. The first is conceptually the simplest, though logistically the more complicated: get hold of some of the stock you are emulating. Next, you expose it with color and density patches and then develop and measure the strip. If you read enough points, you can start to interpolate curves from the data.
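
To make that concrete, once you have patch readings, building smooth per-channel curves is a straightforward interpolation job. The numbers below are invented, but the mechanics look something like this:

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# Hypothetical density readings from an exposed-and-developed test strip:
# log exposure of each patch vs. measured density for one dye layer.
log_exposure = np.array([-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0])
density_red  = np.array([0.22, 0.30, 0.55, 1.00, 1.55, 2.05, 2.35])

# Monotone (PCHIP) interpolation keeps the characteristic curve from
# overshooting between sparse measurements the way a plain cubic spline can.
curve_r = PchipInterpolator(log_exposure, density_red)

dense_exposures = np.linspace(-2.0, 1.0, 256)
red_curve_lut = curve_r(dense_exposures)   # one channel of a 1D emulation curve
```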

FilmLight can help with this, and back in my lab days, that is exactly whose software we used. Truelight was essential back in the early days of DI, when the “I” was truly the intermediate digital step between two analog worlds.

The second way I approach this task would be to use my Match Grader software. I can push the look of our references to some of the production footage. Match Grader is a bit of a black box in that it returns a completed graded image but not the recipe for getting there. This means the next step would be to bring it into the color corrector and match it using curves, keys, and scopes. The advantage of doing it this way instead of just matching it to the references is that you are working with the same picture, which makes it easier to align all the values perfectly.

Oh, or you can just use your eyeballs. 😉

Do your workflows include remote monitoring?


Not only do they include it, but there was a time in the not-too-distant past when that was the only option. We use all the top solutions for remote sessions, including Streambox, Sohonet ClearView, Colorfront and T-VIPS. The choice really comes down to what the facility on the catching side has and the location of the client. At the moment, my preference is Streambox. It checks all the boxes, from 4K to HDR. For quick approvals, ClearView is great because all we need on the client side is a calibrated iPad Pro.

What film or show or spot resonates with you from a color perspective?


Going back to my formative years, I have always been drawn to the austere beauty of Gattaca. The film’s use of color is simply flawless. Cinematographer Sławomir Idziak is one of my favorites, and he has profoundly influenced my work. I love Gattaca’s early flashbacks, in particular. I have been gravitating in that direction ever since I saw the picture.

Gattaca

Magic Mike

The Sea Beast

You can see a bit of Gattaca‘s influence in my own work on Steven Soderbergh’s Magic Mike and even a little bit on the animated film The Sea Beast, directed by Chris Williams.

Gattaca

The Sea Beast

I am always looking for new ways to push the boundaries of visual storytelling, and there are a ton of other films that have inspired me, but perhaps that’s a conversation for another time. I am grateful for the opportunity to have worked on projects that I have, and I hope that my work will continue to evolve, inspire and be inspired in the years to come.

Dear Mama

7/12 - Update: “Dear Mama” Emmy nomination for Best Documentary or Nonfiction Series

Well, if you are my age and grew up in the ’90s, you listened to Tupac. Even all of us white boys from Camarillo knew every word to every song. His music really was the soundtrack to the last decade of the millennium and my youth.

Tonight is the final installment of “Dear Mama,” FX’s most-watched unscripted show.

Perfect for a Mother’s Day weekend! Please go check it out on FX tonight or streaming on Hulu.



Allen Hughes directed this insightful docuseries, which is fitting because he and his brother directed Tupac’s early music videos. Sure, there was a bit of drama, but that adds to the flavor of the story. That connection to the material made Hughes the quintessential choice for captaining this ship. Tupac wasn’t any one thing; he was more like an eclectic stew of many influences and identities. One thing is for sure: dude was thug life for real.

Cognac hues or Hughes as it were

Allen was clear on the look and vibe he wanted for the series. Cognac was the word. We spent a couple of weeks developing a look that feels like the light has been filtered through a fine liquor. We also used LiveGrain to achieve that perfect end-of-the-film-era Kodak grain structure of the ’90s.


Documentary grading is an entirely different beast. Here are a few tips for tackling your next interview-based production.

  1. Color management - I preach this a lot, but it’s even more critical with many different sources.

  2. Sounds basic, but group your interviews.

  3. Normalize the frame rate upfront.

  4. AI up-rez is like salt; a little is good, but too much ruins the dish. Don’t be afraid to let some pictures just look old.

  5. Build a KEM reel of all interview setups. Having the A and B cam shots together in the timeline will help you reference grades quickly.

The first step was look development. Allen had already shot some of the interviews, which we used to refine the look. I built an LMT that had the cognac golden vibe. I used that look and the ACES standard outputs to create a 709 LUT for Avid media creation. Eric DeAzevedo was the operator responsible for many terabytes of dailies. We also normalized all the archival footage to 23.98 during the dailies step. Cortex was used to make the MXF files and bins. We had to double-hop to render in LiveGrain since it wasn’t supported in Cortex at the time.
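
For anyone curious about the mechanics of baking a viewing LUT like that, the gist is sampling the look on a 3D grid and writing it out in .cube order. The sketch below uses a placeholder look function rather than the show’s actual LMT and output transform:

```python
import numpy as np

def bake_cube(look_fn, path, size=33):
    """Sample a look transform on a 3D grid and write a .cube file.

    look_fn: any function mapping (N, 3) RGB in 0-1 to (N, 3) RGB in 0-1.
    The real show LUT was an ACES LMT plus standard output transform; this
    stand-in only shows the baking mechanics."""
    grid = np.linspace(0.0, 1.0, size)
    # .cube ordering: red varies fastest, then green, then blue.
    b, g, r = np.meshgrid(grid, grid, grid, indexing="ij")
    rgb_in = np.stack([r, g, b], axis=-1).reshape(-1, 3)
    rgb_out = np.clip(look_fn(rgb_in), 0.0, 1.0)

    with open(path, "w") as f:
        f.write(f"LUT_3D_SIZE {size}\n")
        for r_, g_, b_ in rgb_out:
            f.write(f"{r_:.6f} {g_:.6f} {b_:.6f}\n")

# Placeholder "cognac-ish" look: warm the balance and add a touch of contrast.
bake_cube(lambda x: np.clip(x ** 1.1 * [1.04, 1.0, 0.94], 0, 1), "show_rec709.cube")
```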

Early on, we were still in the late stages of the COVID lockdown. I built a reel of every interview setup and had a ClearView session with Hughes and Josh Garcia (producer). This scene was super critical to our success going forward. It set the bible for the show’s look and ensured that Allen’s vision stayed consistent through the many days of shooting.

At the start of each episode, I applied our base settings using a “Fuzzy” match. (Yes, that is a real Baselight thing.) Basically, “Fuzzy” is a setting that lets the machine match grades presumed to be from the same camera roll rather than relying on timecode. This put all the interviews 90% of the way there from the get-go. The next step was to sort the timeline by clip name and time of day. I would then work through a pass where I would track the shapes and balance out any inconsistencies in lighting as the sun hung lower throughout the day.

The archival footage didn’t get as graceful a strategy. Each shot was its own battle, as the quality differed from source to source. My main goal was to ensure that it was cohesive and told the story Allen was crafting.

The first deliverable out of the gate was a theatrical version for the Toronto International Film Festival. I graded in ACEScc, going out to PQ at 1,000 nits. That was then run through the DoVi analysis, and a P3-D65 48-nit version was trimmed. Finally, we applied a P3-D65-to-XYZ LUT on the output render to create the DCDM.
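
The math under that last hop is standard: linear P3-D65 through the RGB-to-XYZ matrix, scaled against the 52.37 cd/m² DCI reference white, a 1/2.6 gamma, then 12-bit code values. A numpy illustration (not the actual delivery LUT):

```python
import numpy as np

# Standard P3-D65 RGB -> CIE XYZ matrix (linear light).
P3D65_TO_XYZ = np.array([
    [0.48657095, 0.26566769, 0.19821729],
    [0.22897456, 0.69173852, 0.07928691],
    [0.00000000, 0.04511338, 1.04394437],
])

def p3d65_to_dcdm(rgb_linear, peak_nits=48.0):
    """Convert linear P3-D65 RGB (1.0 = peak white) to 12-bit X'Y'Z' DCDM code
    values: scale to absolute luminance, normalize by the 52.37 cd/m^2 DCI
    reference, apply the 1/2.6 gamma, and quantize."""
    xyz = np.asarray(rgb_linear, dtype=np.float64) @ P3D65_TO_XYZ.T
    xyz_cdm2 = xyz * peak_nits
    encoded = np.clip(xyz_cdm2 / 52.37, 0.0, 1.0) ** (1 / 2.6)
    return np.round(encoded * 4095).astype(np.uint16)
```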

The biggest challenge of this show was keeping up with editorial. As you can imagine, documentary storytelling is honed in the edit bay. The edit was constantly being updated as shots were cleared or discovered. Back at my shop, Leo Ferrini would constantly update my project to chase editorial. Multi-Paste (Remote Grades for our Resolve friends) was clutch in this situation. We took the old grades and copied them across. Leo would categorize the new material so I could sort the scene for the changes.

The timelines constantly evolved and took shape until we got Allen in for the final grade. Allen has a great eye and religiously kept us in the world he had envisioned. We paid particular attention to eye-trace and ensured the information from each visual told a straightforward story without distraction. Next was a pass of Dolby trimming to take the approved PQ to 709. We would send that 709 file to Allen and get notes before creating the final IMF for delivery.

A super big thanks to Paul Lavoie for managing this one. There were many moving parts on this production, but thanks to him, I rarely felt it. It’s a blessing to have a partner who doesn’t mind getting his hands dirty even though he’s one of the suits 😜.


Be sure to check out this killer doc about one of our generation’s most prolific artists, told through Hughes’s equally unparalleled artistic voice. Allen is a true master of many formats and has solidified his place as one of the best documentarians. Thanks for taking the time to peek behind the curtain, and let me know what you think.

Here are some more before and afters. Mellow yella’ Dan Muscarella would have been proud.







Looking Back on 2021

I wanted to take a quick moment to look back on all the great work that the team and I accomplished this year. There were a ton of fantastic projects with amazing filmmakers. Paul Lavoie and I also got the opportunity to take a second crack at some of our earlier work by giving it a 4K HDR makeover. Have we really been at it that long?

I owe a huge debt of gratitude to Leo Ferrini and Paul Lavoie for their dedication to our clients and never compromising on quality. They keep me honest. I’m very grateful to have partners like this A-team. Our operation is strong here on the Warner lot. Looking forward to what will come in 2022!

2021 New Theatrical and Remasters

Happy New Year and happy grading everyone!

-JD