Post Perspective - Color Pipeline: Virtual Roundtable

Here is a Q&A I was recently included in. Check out the full article here at postperspective.com

Warner Bros. Post Creative Services Colorist John Daro

My mission control station at HQ

Warner Bros. Post Production Creative Services is a post house on the Warner Bros. lot in Burbank. “We specialize in feature films and high-end episodic projects, with picture and sound finishing under one roof. We also have editorial space and visual effects offices just one building over, so we truly are a one-stop shop for post.” 

What does your setup look like tools-wise?


I have been a devotee of FilmLight’s Baselight for the past five years. It is the beating heart of my DI theater, where I project images with a 4K Christie projector and monitor them on two Sony X300s. For that “at-home” consumer experience, I also have a Sony A95K.

Although I spend 90% of my time on Baselight, there are a few other post-software necessities for my craft. I call my machine the “Swiss army box,” a Supermicro chassis with four Nvidia A6000s. I use this machine to run Resolve, Mistika, Photoshop, and Nuke. It also makes a fine dev box for my custom Python tools.

I always say, “It’s not the sword; it’s the samurai.” Use the right tool for the right job, but if you don’t have the right tool, then use what you’ve got.

Do you work in the cloud? If so, can you describe that workflow and the benefits?


Not really. For security reasons, our workstations are air-gapped and disconnected from the outside world. All media flows through our IO department. However, one cloud tool I do use is Frame.io, especially for the exchange of notes back and forth. I really like how everything is integrated into the timeline. It’s a super-efficient way to collaborate. In addition to those media uploads, the IO team also archives finished projects and raw scans to the cloud.

I do think cloud workflows are gaining steam, and I definitely have my eye on the space. I can envision a future where we send a calibrated Sony X3110 to a client and then use Baselight in the cloud to send JPEG XS straight to the display for remote approvals. It’s a pretty slick workflow, and it also gets us away from needing the big iron to live on-prem.

Working this way takes geography out of the equation too. I would love to work from anywhere on the planet. Bring on the Tiki drinks with the little umbrellas somewhere in the tropics with a laptop and a Mini Panel. All joking aside, it does open the talent pool to the entire world. You will be able to get the best artists regardless of their location. That’s an exciting prospect, and I can’t wait to see what the future holds for this new way of looking at post.

Do you often create LUTs for a project? How does that help?


I mostly work with curves and functions to do my transforms, but when on-set or editorial needs a preview of what the look will be in the room, I do bake LUTs out. They are especially critical for visual effects reviews and dailies creation.

There’s a film project that I’m working on right now. We’re doing a scan-once workflow on that show to avoid overly handling the negative. Once scanned, there is light CDL grading, and a show LUT is applied to the raw scans to make editorial media. The best looks are the ones that have been developed early and help to maintain consistency throughout the entire workflow. That way, you don’t get any surprises when you get into the final grade. Temp love is a thing… LUTs help you avoid loving the wrong thing.
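That light CDL pass boils down to simple per-channel math. Here is a minimal sketch of the standard ASC CDL slope/offset/power operation (saturation term omitted); the values in the usage line are made-up illustrations, not numbers from the show:

```python
def apply_cdl(rgb, slope, offset, power):
    """ASC CDL per channel: out = clamp(in * slope + offset) ** power."""
    out = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        v = v * s + o
        v = max(v, 0.0)  # clamp negatives before the power stage
        out.append(v ** p)
    return out

# Hypothetical warm-up on 18% grey: lift red slope, nudge blue offset
graded = apply_cdl([0.18, 0.18, 0.18], (1.1, 1.0, 0.95), (0.0, 0.0, 0.01), (1.0, 1.0, 1.0))
```

Because the operation is just three numbers per channel, a CDL travels easily from set to editorial to the final grade, which is what makes it a good fit for a scan-once workflow.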

Do you use AI as part of your daily job? In what way?

Superman II Restoration


I do use a bit of AI in my daily tasks, but it’s the AI that I’ve written myself. Originally, I started trying to make an automated dust-buster for film restoration. I failed miserably at that, but I did learn how to train a neural net, and that led to my first helpful tool.

I used an open-source image library to train an AI up-rezer. Although this is commonplace now, back then, it was scratching an itch that hadn’t been scratched yet. To this day, I do think my up-rezer is truer to the image and less “AI”-feeling than what’s available off the shelf.

After the up-rezer, I wrote Match Grader in 2020, which essentially takes the look and vibe from one shot and applies it to another. I don’t use it for final grading, but it can be very useful in the look-dev process.

Building on what I had learned coding Match Grader, I subsequently developed a process to use machine vision to create a depth channel. This turns your Power Windows from circles and squares into spheres and cubes. It is a very powerful tool for adding atmosphere to images. When these channels are available to me, one of my favorite moves is to desaturate the background while increasing the contrast in the foreground. This adds dimension to your image and helps to draw your eye to the characters where it was intended.
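The background-desaturation move can be sketched in a few lines. This is my own illustration, not the author's tool: it assumes a normalized depth channel where larger values are farther away, and uses Rec.709 luma weights as the desaturation target.

```python
def desat_background(rgb, depth, far_start=0.6):
    """Blend a pixel toward its luma as depth increases past far_start.

    rgb   -- [r, g, b] in 0..1
    depth -- normalized depth, 0.0 (near) to 1.0 (far)
    """
    luma = 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]  # Rec.709 weights
    # 0 -> untouched foreground, 1 -> fully desaturated background
    k = min(max((depth - far_start) / (1.0 - far_start), 0.0), 1.0)
    return [c + (luma - c) * k for c in rgb]
```

Run per pixel, the depth channel effectively becomes a continuous qualifier: the foreground keeps its color while the background falls away toward grey.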

These channels can also aid in stereo compositing, but it’s been a minute since I have had a 3D job cross my desk that wasn’t for VR.

Machine vision segmentation with YOLO, 16 fps at 4K

Lately, I have been tinkering with an open-source library called YOLO (You Only Look Once). This software is best known for real-time applications like autonomous driving, but I found it useful for what we do in color. Basically, it’s a very fast image segmenter. It returns a track and a matte for whatever it identifies in the frame. It doesn’t get everything right all the time, but it is very good with people, thankfully. You wouldn’t use these mattes for compositing, but they are great for color, especially when used as a garbage matte to key into.
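Segmentation masks come back as hard-edged binaries, so before keying into one as a garbage matte, you typically soften it. A pure-Python sketch of that step (my own illustration, not any shipping tool), using a simple box blur over a binary mask:

```python
def soften_matte(mask, radius=1):
    """Box-blur a binary segmentation mask (2D list of 0/1) into a soft matte."""
    h, w = len(mask), len(mask[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = n = 0
            # average the (2*radius+1)^2 neighborhood, clipped at frame edges
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        total += mask[yy][xx]
                        n += 1
            out[y][x] = total / n
    return out
```

In practice you would do this with a GPU blur at frame rate, but the idea is the same: a feathered edge keeps the key from chattering where the segmenter is only roughly right.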

I have also recently refreshed my AI up-rezer. I built in some logic that is somewhat “intelligent” about the source coming in. This way, the process is not a one-size-fits-all operation.

SamurAI Image Restoration

It can now auto-detect interlacing and cadence and perform a general analysis of picture quality. That lets me throttle the strength and end up with the perfect amount of enhancement on a case-by-case basis. The new tool is named SamurAI.
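One common way to auto-detect interlacing, and not necessarily how SamurAI does it, is to compare differences between adjacent scan lines (which belong to opposite fields) against lines two apart (same field). Motion between fields produces combing, so the adjacent-line difference spikes. A sketch on a single luma plane:

```python
def comb_score(frame):
    """Ratio of adjacent-line difference to same-field difference.

    frame is a 2D list of luma values. High scores suggest interlaced
    content with motion between fields; progressive frames stay low.
    """
    adj = same = 0.0
    for y in range(len(frame) - 2):
        for a, b, c in zip(frame[y], frame[y + 1], frame[y + 2]):
            adj += abs(a - b)    # neighboring lines: opposite fields
            same += abs(a - c)   # lines two apart: same field
    return adj / same if same else 0.0
```

Scoring a run of frames and looking at the pattern of high scores is also one way to recover cadence, e.g., the repeating groups left behind by a 3:2 pulldown.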

If given an example from another show or work of art, what is the best way to emulate that?


It’s good to be inspired, but you never want to be derivative. Often, we take many examples that all have a common theme or feeling and amalgamate them into something new.

That said, sometimes there are projects that do need a literal match. Think film emulation for a period effect. People can approach it in two ways. The first, conceptually the simplest while also being the most involved, is to get hold of some of the stock you are emulating. Next, you expose it with color and density patches and then develop and measure the strip. If you read enough points, you can start to interpolate curves from the data.
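Once the patches are measured, interpolating curves from the data can start as simply as piecewise-linear interpolation between (exposure, density) samples. A sketch of that idea; real emulation work would use denser sampling and smoother fits:

```python
import bisect

def build_curve(exposures, densities):
    """Return a curve interpolating measured (exposure, density) patches.

    exposures must be sorted ascending; values outside the measured
    range clamp to the end points.
    """
    def curve(x):
        if x <= exposures[0]:
            return densities[0]
        if x >= exposures[-1]:
            return densities[-1]
        i = bisect.bisect_right(exposures, x)
        x0, x1 = exposures[i - 1], exposures[i]
        d0, d1 = densities[i - 1], densities[i]
        t = (x - x0) / (x1 - x0)
        return d0 + t * (d1 - d0)
    return curve
```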

FilmLight can help with this, and back in my lab days, that is exactly the software we used. Truelight was essential back in the early days of DI, when the “I” was truly the intermediate digital step between two analog worlds.

The second way I approach this task would be to use my Match Grader software. I can push the look of our references to some of the production footage. Match Grader is a bit of a black box in that it returns a completed graded image but not the recipe for getting there. This means the next step would be to bring it into the color corrector and match it using curves, keys, and scopes. The advantage of doing it this way instead of just matching it to the references is that you are working with the same picture, which makes it easier to align all the values perfectly.

Oh, or you can just use your eyeballs. 😉

Do your workflows include remote monitoring?


Not only do they include it, but there was a time in the not-too-distant past when that was the only option. We use all the top solutions for remote sessions, including Streambox, Sohonet ClearView, Colorfront and T-VIPS. The choice really comes down to what the facility on the catching side has and the location of the client. At the moment, my preference is Streambox. It checks all the boxes, from 4K to HDR. For quick approvals, ClearView is great because all we need on the client side is a calibrated iPad Pro.

What film or show or spot resonates with you from a color perspective?


Going back to my formative years, I have always been drawn to the austere beauty of Gattaca. The film’s use of color is simply flawless. Cinematographer Sławomir Idziak is one of my favorites, and he has profoundly influenced my work. I love Gattaca’s early flashbacks, in particular. I have been gravitating in that direction ever since I saw the picture.

Gattaca

Magic Mike

The Sea Beast

You can see a bit of Gattaca‘s influence in my own work on Steven Soderbergh’s Magic Mike and even a little bit on the animated film The Sea Beast, directed by Chris Williams.

Gattaca

The Sea Beast

I am always looking for new ways to push the boundaries of visual storytelling, and there are a ton of other films that have inspired me, but perhaps that’s a conversation for another time. I am grateful for the opportunity to have worked on projects that I have, and I hope that my work will continue to evolve, inspire and be inspired in the years to come.

Looking Back on 2021

I wanted to take a quick moment to look back on all the great work that the team and I accomplished this year. There were a ton of fantastic projects with amazing filmmakers. Paul Lavoie and I also got the opportunity to take a second crack at some of our earlier work by giving it a 4K HDR makeover. Have we really been at it that long?

I owe a huge debt of gratitude to Leo Ferrini and Paul Lavoie for their dedication to our clients and never compromising on quality. They keep me honest. I’m very grateful to have partners like this A-team. Our operation is strong here on the Warner lot. Looking forward to what will come in 2022!

2021 New Theatrical and Remasters

Happy New Year and happy grading everyone!

-JD

Finishing Scoob!

“Scoob!” is out today. Check out my work on the latest Warner Animation title. Here are some highlights and examples of how we pushed the envelope of what is possible in color today.


Building the Look

ReelFX was the animation house tasked with bringing “Scoob!” from the boards to the screen. I had previously worked with them on “Rock Dog,” so there was a bit of a shorthand already in place. I already had a working understanding of their pipeline and the capabilities of their team. When I came on board, production was quite far along with the show look. Michael Kurinsky (Production Designer) had already been iterating through versions addressing lighting notes from Tony Cervone (Director) through a LUT that ReelFX had created. This was different from “Smallfoot,” where I had been brought on during lighting and helped in the general look creation from a much earlier stage. The color pipeline for “Scoob!” was linear ARRI Wide Gamut EXR files -> LogC Wide Gamut working space -> show LUT -> sRGB/rec.709. Luckily for me, I would have recommended something very similar. One challenge was that the LUT was only a forward transform with no inverse and was built only for rec.709 primaries. We needed to recreate this look targeting P3 2.6 and, ultimately, rec.2020 PQ.
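For reference, the linear-to-LogC leg of that pipeline looks roughly like the sketch below. The constants are ARRI's published LogC3 (EI 800) values; worth verifying against ARRI's white paper before relying on them:

```python
import math

# ARRI LogC3 (EI 800) constants from ARRI's published encoding formula
CUT, A, B = 0.010591, 5.555556, 0.052272
C, D = 0.247190, 0.385537
E, F = 5.367655, 0.092809

def lin_to_logc3(x):
    """Encode a scene-linear value into LogC3 (EI 800)."""
    if x > CUT:
        return C * math.log10(A * x + B) + D
    return E * x + F

# 18% grey lands near the classic LogC value of ~0.391
```

Working in a log space like this is what makes the downstream show LUT and grading curves behave in a perceptually even way across the exposure range.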

Transform Generation

Those of you who know me know that I kind of hate LUTs. My preference is to use curves and functional math whenever possible. This is heavier on the GPUs, but with today’s crop of ultra-fast processing cards, it hardly matters. So my first step was to take ReelFX’s LUT and match the transform using curves. I went back and forth with Mike Fortner from ReelFX until we had an acceptable match.

My next task was to take our new functional forward transform and build an inverse. This is achieved by finding the delta from a 1.0 slope and multiplying that value by -1. Inverse transforms are very necessary in today’s deliverable climate. For starters, you will often receive graphics, logos, and titles in display-referred spaces such as P3 2.6 or rec.709. The inverse show LUT allows you to place these into your working space.
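Alongside the slope-delta trick described above, another general way to build an inverse when only the forward curve exists is plain numerical inversion; any monotonically increasing curve can be inverted to arbitrary precision by bisection. A sketch:

```python
def invert(f, y, lo=0.0, hi=1.0, iters=48):
    """Numerically invert a monotonically increasing curve f on [lo, hi].

    Returns x such that f(x) is (very nearly) y.
    """
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

Sampling this inverse at enough points gives you the reverse transform, ready to drop display-referred graphics back into the working space.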

Curve and its inverse function


After the inverse was built, I started to work on the additional color spaces I would be asked to deliver. This included the various forward transforms: P3 2.6 for theatrical, rec.2020 limited to P3 with a PQ curve for HDR, and rec.709 for web/marketing needs. I took all of these transforms and baked them into a family DRT. This is a feature in Baselight where the software automatically uses the correct transform based on your output. A lot of work up front, but a huge time saver on the back end; plus, there is less margin for error since it is set programmatically.
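Baselight handles the dispatch inside the family DRT, but conceptually it is just a mapping from output to transform. A toy sketch of that idea (the output names and gamma functions here are my own stand-ins, not Baselight internals):

```python
def make_family(transforms):
    """Map an output name to its display transform; fail loudly on unknown outputs."""
    def render(value, output):
        if output not in transforms:
            raise ValueError(f"no transform registered for {output!r}")
        return transforms[output](value)
    return render

# Toy stand-ins for real display transforms
family = make_family({
    "theatrical_p3": lambda v: v ** (1 / 2.6),
    "web_709":       lambda v: v ** (1 / 2.4),
})
```

The point of setting it programmatically is exactly the failure mode above: an output with no registered transform errors out instead of silently shipping in the wrong space.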

Trailers First

The first pieces that I colored with the team were the trailers. This was great since it afforded us the opportunity to start developing the workflows that we would use on the feature.

My friend in the creative marketing world once said to me, “I always feel like the trailer is used as the test.” That’s probably because the trailer is the first picture that anybody will see. You need to make sure it’s right before it’s released to the world.

Conform

Conform is one aspect of the project where we blazed new paths. It’s common to have 50 to 60 versions of a shot as it gets ever more refined and polished through the process. This doesn’t just happen in animation. Live-action shows with lots of VFX (read: photo-real animation) go through this same process.

We worked with FilmLight to develop a workflow where the version tracking was automated. In the past, you would need an editor re-conforming or hand-dropping in shots as new versions came in. On “Scoob!”, a database was queried, and the correct shot, if available, was automatically placed in the timeline. Otherwise, the machine would use the latest version delivered to keep us grading until the final arrived. This saves a huge amount of time (read: money).
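The selection logic boils down to something like this sketch; the data shape is my own illustration, not FilmLight's schema:

```python
def pick_version(shot, delivered):
    """Return the final version of a shot if delivered, else the latest temp.

    delivered maps shot name -> list of (version, is_final) tuples,
    oldest first. Returns None when nothing has been delivered yet,
    leaving whatever is already in the timeline.
    """
    versions = delivered.get(shot)
    if not versions:
        return None
    finals = [v for v, is_final in versions if is_final]
    return finals[-1] if finals else versions[-1][0]
```

Run against the timeline on every refresh, a rule like this keeps the grade pointed at the newest picture without anyone hand-dropping shots.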

Grading

Coloring for animation

I often hear, “It’s animation… doesn’t it come to you already correct?” Well, yes and no. What we do in the bay for animation shows is color enhancement, not color correction. Often, we are taking what was rendered and getting it that last mile to where the Director, Production Designer, and Art Director envisioned the image to be.

This includes windows and lighting tricks to direct the eye and enhance the story, as well as the use of secondaries to further stretch the distance between two complementary colors, effectively adding more color contrast. Speaking of contrast, it was very important to Tony that we were never too crunchy. He always wanted to see into the blacks.

These were the primary considerations when coloring “Scoob!” Take what is there and make it work the best it can to promote the story the director is telling. Which takes me to my next tool and technique that was used extensively.

Deep Pixels and Depth Mattes

I’ve always said, if you want to know what we will be doing in color five years from now, look at what VFX is doing today. Five years ago in VFX, deep pixels (or voxels, as they are sometimes called) were all the rage. Today, they are a standard part of any VFX or animation pipeline. Often, they are thrown away because color correctors either couldn’t use them or it was too cumbersome. FilmLight has recently developed tools that allow me to take color grading to a whole other dimension.


A standard pixel has five values: R, G, B and X, Y. A voxel has six: R, G, B and X, Y, Z. Basically, for each pixel in a frame, there is another value that describes where it is in space. This allows me to “select” a slice of the image to change or enhance.

This matte also works with my other 2D qualifiers, turning my circles and squares into spheres and cubes. This allows for corrections like “more contrast, but only in the foreground,” or “desaturate the character behind Scooby, but in front of Velma.”


Using the depth mattes along with my other traditional qualifiers all but eliminated the need for standard alpha-style mattes. This not only saves a ton of time in color, since I’m only dealing with one matte, but also generates savings in other departments. For example, with fewer mattes, your EXR file size is substantially smaller, saving on data management costs. Additionally, on the vendor side, ReelFX only had to render one additional pass for color instead of a matte per character. Again, a huge saving of resources.

I’m super proud of what we were able to accomplish on “Scoob!” using this technique and I can’t wait to see what comes next as this becomes standard for VFX deliveries. A big thank you to ReelFX for being so accommodating to my mad scientist requests.

Corona Time

Luckily, we were done with the theatrical grade before the pandemic hit. Unfortunately, we were far from finished. We were still owed the last stragglers from ReelFX and had yet to start the HDR grade.

Remote Work

We proceeded to set up a series of remote options. First, we set up a calibrated display at Kurinsky’s house. Next, I upgraded the connection to my home color system to allow for faster upload speeds. A streaming session would have been best, but we felt that would put too many folks in close contact since it does take a bit of setup. Instead, I rendered out high-quality ProRes XQ files. Kurinsky would then give notes on the reels over Zoom or email. I would make changes, rinse and repeat. For HDR, Kurinsky and I worked off a pair of X300s. One monitor was set for 1,000-nit rec.2020 PQ and the other for the 100-nit rec.709 Dolby trim pass. I also made H.265 files that would play off a thumb drive plugged into an LG E-series OLED. Finally, Tony approved the 1.78 pan-and-scan in the same way.

I’m very impressed with how the whole team managed to not only complete this film but finish it to the highest standards under incredibly trying times. An extra big thank you to my right-hand man Leo Ferrini who was nothing but exceptional during this whole project. Also, my partner in crime, Paul Lavoie, whom I have worked with for over 20 years. Even though he was at home, it felt like he was right there with me. Another big thanks.

Check Out the Work

Check out the movie at the link below and tell me what you think.

https://www.scoob.movie/



Thanks for reading!

-John Daro