Everything posted by Marc Wielage
-
Working on it! Note there are pros and cons with this approach, and there are occasions where you don't want a Fixed Node Tree. But for longform features & TV, they're ideal for working quickly and keeping a consistent look across many scenes.
-
There are a lot of potential variables. Bear in mind that the specific lighting and exposure on set, plus the art direction (particularly the set design and color), will limit the potential of any grading you do. For example, the teal & orange look isn't necessarily possible with all kinds of images: color contrast happens when the background and foreground lend themselves to this kind of "pushing and pulling" the image. Downloading an H.264 from the net is problematic, because it will (as you discovered) "fall apart" when you try to push it too hard. Try to do the grade with 12-bit or 16-bit Raw material, and I'd bet you can get there. Arri Alexa, Blackmagic, Red, and Sony all have free Raw files you can download and experiment with.
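A quick way to see why those net-rip H.264s fall apart is to count how much tonal information survives a hard push at different bit depths. This is a purely illustrative Python sketch (nothing Resolve-specific; the gain value and shadow threshold are invented for the demo):

```python
# Illustrative sketch: why 8-bit sources "fall apart" under a hard grade
# while 12-bit raw survives. We quantize a smooth gradient to each bit
# depth, push the shadows hard, and count the distinct levels left --
# fewer levels means visible banding.

def quantize(v, bits):
    """Round a 0.0-1.0 value to the nearest code at the given bit depth."""
    levels = (1 << bits) - 1
    return round(v * levels) / levels

def push_shadows(v, gain=4.0):
    """A crude 'aggressive grade': multiply the signal up and clip at 1.0."""
    return min(v * gain, 1.0)

def distinct_shadow_levels(bits, steps=4096):
    """Count unique output values in the bottom quarter of the range."""
    gradient = [i / (steps - 1) for i in range(steps)]
    graded = {push_shadows(quantize(v, bits)) for v in gradient if v < 0.25}
    return len(graded)

print("8-bit distinct shadow levels after push: ", distinct_shadow_levels(8))
print("12-bit distinct shadow levels after push:", distinct_shadow_levels(12))
```

The 8-bit source has only a few dozen codes to describe those shadows, so a 4x push spreads them far apart; the 12-bit source has roughly sixteen times as many, which is why the raw files hold up.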
-
Tales from the Color Suite, Eric Whipp
Marc Wielage commented on Lowepost's insider article in Color Grading
Very scary. Many, many years ago (about 1980), I threaded up a 16mm neg on a Rank-Cintel MkIIIB telecine for a music video at Modern Videofilm here in Hollywood. I went back to the control room, sat with the director, and color-corrected the image as we laid it down to tape (which was the standard-def workflow back then). After the first reel was safely on tape, the director said, "hey, could we do that again with a different look?" I said, sure, and went to rewind the film at the machine. Much to my shock, I saw shards of emulsion and plastic on the base of the scanner. In my haste and nervousness, I had wound the film around the 35mm guides... which were supposed to be bypassed for 16mm, due to its narrower width. The 35mm guides gouged into the film and put a half dozen deep scratches in the frame, all the way through! Just as I was staring at the film and wondering how I was going to explain it, the director walked in, saw what was happening and said, "oh, I guess there's some equipment problems?" I gulped and nodded and said, "totally my fault," and profusely apologized. He shrugged and said, "eh, what we've already recorded looks fine. Let's just move on to the next roll. But don't scratch the next one." The director was totally unfazed, was happy with what we did, we continued with the session, and it all ended up well. Needless to say, I was much, much, much more careful about loading 16mm on scanners after that.
-
RCM Workflow - White Balancing in Log space
Marc Wielage replied to Douglas Dutton's topic in DaVinci Resolve
One thing you can do: highlight the clips, then right-click and select "Bypass Color Management." Then, using a Color Space Transform OFX plug-in, you can drop it into the first node and tell it what kind of camera made the ProRes file. I can't guarantee this will work every time, since it's possible some adjustment was baked in when the ProRes was made, but it can work. Experiment with different settings and see if this gets you to a closer starting point. I should acknowledge Joey D'Anna of MixingLight for coming up with this "Roll Your Own" color management idea, which I think is very clever. I hadn't used CST nodes until his suggestion, but they've proven to be very useful.
-
We have a tutorial on Fixed Node Trees now being worked on (literally as we speak), and I hope to have it done within the month. No promises, but it will give you about a dozen examples of different node trees you can use, explain the thought process behind each one, and also show how to build a Fixed Node Tree from scratch. One key is to understand the Image Processing Order of Operations. This is such an important topic that it's given its own chapter in the Resolve manual: Chapter 141, starting on p. 2806. The point is to understand how one node will affect the nodes coming after it, and how it's possible to damage the image if you (say) deliberately crush it in one node, then try to bring everything back in a subsequent node. This is particularly crucial if you use LUTs, which can have a destructive effect on the image. I'm also a stickler for labeling every node, so that you can understand the signal flow and what each node is doing. This can be very important if you come back to a project six months or a year later and need to understand how and why certain shots were changed. Splitting up different kinds of functions can also help you demonstrate to the client how a shot was changed and (hopefully) improved, a node at a time.
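The crush-then-recover problem is easy to demonstrate outside Resolve. A minimal Python sketch (hypothetical values, not Resolve's actual math) showing why a later node cannot restore what an earlier node destroyed:

```python
# Illustrative sketch of node order of operations. Node 1 crushes the
# blacks (clips everything below 0.2 to 0.0); Node 2 then tries to
# "bring it back" with a lift. The detail that lived below 0.2 is gone.

def node_crush(v, threshold=0.2):
    """Node 1: crush shadows -- everything below threshold becomes 0."""
    return 0.0 if v < threshold else v

def node_lift(v, amount=0.2):
    """Node 2: naive lift, raising the whole signal and clipping at 1.0."""
    return min(v + amount, 1.0)

shadow_detail = [0.05, 0.10, 0.15]          # three distinct shadow tones

crushed_then_lifted = [node_lift(node_crush(v)) for v in shadow_detail]
print(crushed_then_lifted)                  # all identical: detail is gone

lifted_then_crushed = [node_crush(node_lift(v)) for v in shadow_detail]
print(lifted_then_crushed)                  # still three distinct tones
```

Same two operations, opposite order, completely different result: that is the whole argument for thinking about what each node hands to the next one.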
-
Chapter 8 of the v17 manual, "Data Levels, Color Management, and ACES" (starting on p. 181), does at least mention OOTF. There's a course here on Lowepost that covers Resolve Color management in detail:
-
Davinci Wide Gamut Non Color Managed
Marc Wielage replied to Douglas Dutton's topic in DaVinci Resolve
My suggestion is to try to keep things simple. I don't necessarily think working in a wide-gamut world will help you unless you plan some serious HDR deliveries. Having said that, Alexis Van Hurkman has an excellent 3-hour tutorial on Resolve Color Management, and it specifically covers wide gamut as well as the advantages and disadvantages of ACES vs. RCM: https://www.rippletraining.com/products/davinci-resolve/color-management-in-davinci-resolve-17/
One thing I think is helpful is that he shows how to take a project completely corrected in SDR and then do a trim pass for HDR. I think this will be useful for certain situations where, long after the fact, the client decides to spend the extra money to have the colorist provide an additional HDR version. As far as matching different cameras goes, that's something you can already get with Color Space Transform nodes, which actually work independently of color management or even LUTs, for that matter. As long as you have a calibrated display and a color-managed output, you can work just fine in a display-managed environment and get the whole project done. Knowing scopes and the peculiarities of specific cameras will help a lot.
-
Beautifully said, Bruno. I always say, "the beauty of Resolve is that there's often at least 4 or 5 different ways to get good results. The key is to use the one with which you're comfortable, and the one that works the fastest (for you)." I never tell another colorist how to work, because if they get good results, if the client is happy, and if the check clears... then there is no problem. It is possible to NOT work in ACES, but still deliver an ACES-compatible archival file at the end of the process if the client wants one. That's covered in the manual.
-
"Most" movies is in the eye of the beholder. There's lots and lots of different ways to work nowadays. I think even Netflix will allow facilities to use other kinds of color management as long as you deliver ACES in the end. And you can deliver ACES-compatible files with Red Color Management 2. I do a lot of stuff manually, but much of what I do is just for Rec709. We are using RCM2, so I have the ability to change the pipeline if we wind up in HDR/Dolby Vision, but that still requires a trim pass. We've proven it works, so I'm confident it's a good way for us to handle sessions. I often say, "the power of Resolve is that it gives you multiple ways to do the same thing." You have to make the decision which method is best for you. As long as the final color is right and the files are acceptable, everything is fine.
-
Purely my opinion: I'm not a fan of ACES because I don't like the feel of the controls. I feel like the tone-mapping is fighting me too much. I think RCM2 can work to a point, but there's also value in using CST nodes to "normalize" the camera source material instead, basically doing it all manually. (And I credit Joey D'Anna from MixingLight for this idea.) I'm the "NO LUTS" guy who would prefer to come up with a look with PowerGrades, since it gives you more control over any image problems. Of course, a Technical LUT or a Show LUT can work, and if the client insists on it, we'll use it. Custom LUTs can also work under the right conditions.
-
Timeline Rendering in Resolve - what's the deal?
Marc Wielage replied to Craig Melville's topic in DaVinci Resolve
I'm actually OK with Neat Video, but I generally only use it as a "second pass" technique. I'll render a timeline to a mezzanine format like ProRes 444 (or XQ), then take the flattened file, apply Neat Video on a scene-by-scene basis, and render it again. For our purposes, 444 is what I'd call "visually lossless," and nothing is lost going down one more generation. We do make sure the Neat settings are optimized per sequence, and I'm not afraid to bypass it when a shot looks OK without any NR. We also optimize Neat for our specific GPUs, and performance is actually reasonable, somewhere around 5-6fps without caching. There is a school of thought where you can run the flattened Neat Video pass alongside the regular timeline and do a composite or a blend based on screen content: in other words, not use 100% of the NR pass, but just a piece of it. My philosophy is to lean towards "less is more" processing with NR whenever possible. I agree that you have to do some strategizing with plug-ins and figure out what's going to drag the system down, what needs to be cached, what a reasonable node structure should be for a specific project, and so on. Every project has different challenges. Difficult formats, like H.264 and so on, do present problems, and we try to transcode those in advance whenever possible. "Render in Place" is a valuable tool and a welcome change in Resolve 17.
-
Hey, Stefan. I agree with you to a point, but from my point of view (and Kodak's), halation and vibration are actually flaws, not always a positive creative look. The halation is a conditional thing: I've been using Glow, Scatter, and sometimes BorisFX tools to add selective diffusion when the scene needs it. But it's not something I'd want all the time, and sometimes I only want it over a specific part of the frame, and I need to eliminate it from some shots entirely. I concede it's a creative choice. 90% of my work these days is color-correcting 1980s/1990s films for reissue, and because I started in telecine more than 40 years ago, I'm extremely aware of what actual film looks like. The trick is that different stocks have different looks, negative looks different than print, both look different than IP and IN, and not all these looks are desirable. When we were doing film D.I.'s in the 2000s, we actually calibrated the LUTs so that we knew how the image would be interpreted in the film recorder, and by doing tests with the lab, we knew how an answer print struck off the digital negative would look. That's a case where we had a "real" print look (created by Kodak) that would resemble a projected print right next to it in the D.I. theater. Making film-outs was a huge headache, and I was not sorry to see that part of the business (mostly) go away. But as the Filmbox people have said: "this is not exactly film, but it kind of resembles film to a point." I can accept that with no problem. It's an interesting look that might be beneficial for certain projects. BTW, @Walter Volpatto agrees with you: he's actually using ResolveFX Grain more than other tools these days at CO3, and it's very telling that a colorist at his level can get great results out of off-the-shelf tools.
-
BTW, for yet another approach to film print emulation, check out Stefan Ringelschwandtner's blog and link here: https://mononodes.com/photochemical-film-look/ He's giving away a Resolve correction (4 or 5 nodes) that's actually pretty effective from what I see. (You can tip him a coffee if you like it.) His approach is fairly complicated, but it's certainly cheaper than buying a film emulation plug-in, and there's a lot of good thought behind it. Having said that, if I had to do film emulation, I'd just use Filmbox and call it a day: it's one node and actually looks pretty good. But it's not free.
-
Well, Resolve is not Baselight, but there are always workarounds. One method: use a Fixed Node Structure so that every single shot in the entire show has the same number of nodes. Make one of the nodes this specific adjustment -- say, Node #7. Now, deselect this node, then highlight all clips and choose Color -> Ripple Node Changes to Selected Clips, and just that one node will be turned off. If you need to turn it back on, do the opposite. Another method: use a Shared Node for all clips. Place that specific correction in the Shared Node. Then, bypass the node or delete the correction. A third method: in the Timeline Node Graph, add a node for that specific function for the entire show, then bypass it when you don't need it, or use a keyframe to turn it on or off.
-
Yes to both. Dado Valentic is a real character, but he's passionate about what he does, and I'm actually impressed with the ideas and interface of Look Designer and GrainLab. ColourLab isn't compatible with the way I work -- I'm not a fan of the idea of having one different LUT and a CDL (or whatever it is) per shot for 1500-2000 shots per feature reel, created in a different program, and then brought over to Resolve with a script -- and as far as I'm concerned, I match perfectly quickly on my own with Gallery stills, Memories, and scopes. But Look Designer has some fascinating ideas. I would say Look Designer and Filmbox are lightyears beyond something like FilmConvert or a mere LUT. BTW, I'm also impressed with the same company's Scatter, which is the best OFX diffusion plug-in I've ever seen. Remarkable tool.
-
Yes, a friend of mine (formerly from CO3/Santa Monica) bristles when I say "Orange & Teal," because he insists it's "Flesh Tone & Teal." Eh, either way, it's two opposites on a straight line on the vectorscope, which provides huge color contrast. I just do the overall grade as a normalizing grade, then a teal-ish node at the end, then add a layer node and take the flesh tone from the original grade before the teal. Absolutely normal flesh in a teal "world" looks too weird, so generally you need to back it off, with something like a key output gain adjustment, down to 50% or 60%. It's very subjective. It's not like you're pushing orange/flesh into it... it's actually the original skin as a key, in a world of teal/green or teal/blue. There are other ways to do it, but of course it helps if the DP actually lights it that way on set and you have an art director who arranges the set accordingly. You get a bold DP like Lawrence Sher on Joker who's not afraid to use colored gels on set, and you can get some spectacular results.
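The layer-node math described above boils down to a weighted blend. A hedged Python sketch (the pixel values, key strength, and 60% gain are invented for illustration; Resolve's layer mixer internals may differ):

```python
# Illustrative sketch of mixing the pre-teal skin back into the teal
# grade via a key, at reduced strength so flesh stays believable.

def mix(graded, original, key, key_gain=0.6):
    """Per-channel blend: where the skin key is strong, pull back toward
    the pre-teal original. key_gain ~0.5-0.6 avoids the pasted-on look."""
    k = key * key_gain
    return tuple(g * (1.0 - k) + o * k for g, o in zip(graded, original))

skin_original = (0.85, 0.60, 0.45)   # hypothetical flesh tone before the teal node
skin_tealed   = (0.55, 0.65, 0.70)   # same pixel after the teal push
result = mix(skin_tealed, skin_original, key=1.0)
print(result)  # partway back toward natural skin, still tinted teal
```

With the key at full strength and the gain at 0.6, the result sits 60% of the way back toward the original skin: warm enough to read as flesh, cool enough to live in the teal world.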
-
Yeah, I'd say the reason I'd go with a layer node is that it works better for me. Use what makes the most sense to you. I always say, "only the results matter." (Well, that plus whether the check clears.)
-
If it's overall cold, but you're trying to preserve somewhat natural fleshtones, the classic solution (in the last 10 years) has been a layer mixer pulling the skin as a key from an earlier source node. Usually the trick is to lower opacity so it's not quite perfect normal skin, but at least halfway there. That way, you don't wind up with blue people. One thing you often have to consider is the music video you're using as a reference may have been lit that way on set. The problem with trying to create that look is you may be forcing a look that's never quite going to be equalled. It'll be tough, especially if it involves relighting and situations where there's not enough color separation (or light separation) between actors and background, or foreground and background.
-
I have actually worked in workflows where the LUT comes first. If the end result is correct, then it IS correct. In truth, I think if you're just trying to get the "look" of 2383 print stock -- which is more contrasty than I think most people know (particularly people who've never done a film-out and struck a print from digital files) -- you can actually fake it pretty well. I think a lot of the so-called Print LUTs and plug-ins out there are a lot of smoke and mirrors with very little real science or usefulness behind them. Having said that, I've been experimenting with Video Village's Filmbox lately, and I really like their philosophy and approach. They say this:

Filmbox does not represent pure empiricism. We certainly tried to gather good data and stay close to that data, but our methods are not perfect, and there were subjective decisions made about how to tune and implement the data into a functional system that produces creatively satisfying results. We encourage you to try it and see if our model of film lives up to your mental model of film. It's worth noting that both film and Filmbox can be made to have many looks, and what people think "film" looks like is a bit of a moving target. This is especially the case now that almost nothing is actually printed to film, and many people's memory of "film" is actually of some hybrid film/digital processes. Some might say film looks like Vision3 negative stock scanned at their favorite post house and graded by their favorite colorist or processed by their favorite LUT. That process may or may not look anything like printing that same negative to a print stock. And even that might not look like an older film that used a different photochemical process. Ultimately our intent is that a high-end cinema camera processed with Filmbox faithfully reproduces the characteristics that have been hard to achieve since the advent of digital cinema.
As skeptical as I am, and as outspoken as I've been in the past on the borderline-fraudulent claims made by some companies on their film emulation products, I have to say I agree with what they say. If your description is, "hey, it's not exactly like film, it's not like all film, but it's kind of in the ballpark of 5219," I can buy that because it's a film that still exists and you can actually shoot it and test it today. Video Village goes into the methods and testing process they went through to create Filmbox in this FAQ file: https://videovillage.co/images/filmbox/features/FAQ.pdf Two caveats: the software is not cheap (about $395 a year), and it's Mac-only at the moment. I have zero connection with the company except as a customer. But I'm impressed with what I see so far.
-
The ultimate guide to color grading monitors
Marc Wielage commented on Lowepost's insider article in Color Grading
This is a great resource! I agree with a lot of choices on the list. If you can find a non-defective panel with good uniformity, the LG OLED CX can be excellent for Rec709. But so far, there isn't much of a selection for HDR grading below $25,000. I have taken SDR sessions timed on an OLED CX and brought them over to a Dolby Vision-certified room with a Sony BVM-X300, and I was thrilled that the grades translated as well as they did. Switching to Dolby Vision, I mainly had to use the Dolby trims to bring up the highs and mid-highs, while keeping the blacks solid black. The end results were great, but it did take a couple of days of work. As has been discussed on Jason's LiftGammaGain, the Apple XDR is not yet happening for HDR grading. The lack of calibration capability and low number of backlit zones make it a poor choice for grading right now.
-
Mixing Light, FXPHD, Ripple Training, and Lowepost are the main ones I recommend. Some books I'd recommend (some on color theory, some on color correction in general):
- "The Art & Technique of Digital Color Correction" by Steve Hullfish https://www.amazon.com/Art-Technique-Digital-Color-Correction/dp/024081715X
- "Color & Mastering for Digital Cinema" by Glenn Kennel https://www.amazon.com/Mastering-Digital-Cinema-Industry-Handbook/dp/0240808746
- "Digital Cinematography: Fundamentals, Tools, Techniques, and Workflows" by David Stump https://www.amazon.com/Digital-Cinematography-Fundamentals-Techniques-Workflows/dp/0240817915
- "Colour Reproduction in Electronic Imaging Systems" by Michael Tooms https://www.amazon.com/Colour-Reproduction-Electronic-Imaging-Systems/dp/1119021766
- "Cinematic Color" (which is free) https://raw.githubusercontent.com/jeremyselan/cinematiccolor/master/ves/Cinematic_Color_VES.pdf
- "Digital Video and HD: Algorithms and Interfaces" by Charles Poynton https://www.amazon.com/Digital-Video-HD-Algorithms-Interfaces/dp/0123919266
- "The Reproduction of Colour" by Dr. R.W.G. Hunt https://www.amazon.com/Reproduction-Colour-R-W-Hunt/dp/0470024259
- "Color Mania: The Material of Color in Photography and Film" by Barbara Flückiger https://www.amazon.com/Color-Mania-Material-Photography-Film/dp/3037786078
- "Colour Cinematography" by Adrian Cornwell-Clyne (Chapman & Hall) https://www.amazon.com/COLOUR-CINEMATOGRAPHE-Third-Revised-enlarged/dp/B000GU30WE
Some books on workflow are good to know:
- "Modern Post: Workflow & Techniques for Digital Filmmakers" by Scott Arundale https://www.amazon.com/Modern-Post-Workflows-Techniques-Filmmakers/dp/0415747023
- "The Guide to Managing Postproduction for Film, TV, and Digital Distribution" by Susan Spohr & Barbara Clark https://www.amazon.com/Guide-Managing-Postproduction-Digital-Distribution/dp/1138482811
And the best book on film-lab color correction I've ever read is this one:
- "Film Technology in Post Production" by Dominic Case https://www.amazon.com/Film-Technology-Production-Media-Manuals/dp/0240516508
The latter explains how film was color graded in the laboratory prior to television and digital. Some of the basic principles still apply today.
-
We use the Chromatic Adaptation OFX plug-in to change baked-in 6500K color temperatures on material that really should be 3200K. In Resolve 17, the Temperature & Tint controls in the Primary palette were altered so they more or less reproduce what the Chromatic Adaptation plug-in does. You can also get some extreme looks with it, but you have to watch out for unexpected results and out-of-gamut issues.
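For the curious, the math behind a chromatic-adaptation transform looks roughly like this. A Python sketch using the published Bradford matrix (an illustration of the general technique, not Blackmagic's actual implementation; Illuminant A at ~2856K stands in for the 3200K tungsten case):

```python
# Bradford chromatic adaptation: convert XYZ to cone-like responses,
# scale by the ratio of destination to source white, convert back.
# Matrices are the standard published Bradford values.

BRADFORD = [[ 0.8951,  0.2664, -0.1614],
            [-0.7502,  1.7135,  0.0367],
            [ 0.0389, -0.0685,  1.0296]]
BRADFORD_INV = [[ 0.9869929, -0.1470543,  0.1599627],
                [ 0.4323053,  0.5183603,  0.0492912],
                [-0.0085287,  0.0400428,  0.9684867]]

D65   = (0.95047, 1.00000, 1.08883)  # source white point, ~6500K daylight (XYZ)
ILL_A = (1.09850, 1.00000, 0.35585)  # destination white point, ~2856K tungsten (XYZ)

def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

def adapt(xyz, src=D65, dst=ILL_A):
    """M_inv * diag(dst_cones / src_cones) * M * xyz."""
    cones = mat_vec(BRADFORD, xyz)
    src_c, dst_c = mat_vec(BRADFORD, src), mat_vec(BRADFORD, dst)
    scaled = tuple(c * (d / s) for c, s, d in zip(cones, src_c, dst_c))
    return mat_vec(BRADFORD_INV, scaled)

print(adapt(D65))  # the source white lands on the destination white
```

By construction, anything that was neutral under the source illuminant becomes neutral under the destination illuminant, which is exactly what you want when re-balancing a baked-in color temperature; the "extreme looks" come from feeding it white points far from the material's actual illuminant.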
-
Why you need a color control surface
Marc Wielage commented on Lowepost's insider article in Color Grading
I agree 100% with Stig: a control surface makes a huge difference in the speed of your work, particularly with DaVinci Resolve. I've had the opinion for a long time that the program is really optimized for the big Advanced Resolve panels, which tap into a lot more features than the Mini, Micro, or Tangent panels. I would say I save about an hour a day because of the Advanced panels, and while that may not seem like much, if you're working 6-7 days a week, it's almost like getting an additional day of productivity. What's interesting to note is that there are a few colorists (notably Steve Scott on Lustre and Peter Doyle on Baselight) who work almost entirely with a mouse and keyboard and no panel. They have the luxury of a support staff, assistants, and lots of time for their projects, which is not always possible for the rest of us. I've used almost every panel on the market -- Avid Artist, Tangent Wave, Tangent Elements, Resolve Mini, Resolve Advanced -- and they're all useful. The trick is to develop the muscle memory to reach for a knob without having to look down to find it. I'm struggling at the moment with the new Advanced Panel 2 keycaps for Resolve 17, but I'm getting better day-by-day. Still, it's a bit like starting over with a whole new thing, like jumping into a speedboat after driving a truck for five years. BTW, we still have a Mini Panel and a Wave2 at our facility, "just in case" anything fails. If your livelihood depends on it, you always need backups in an emergency.
-
Freelance Colorist Master Class?
Marc Wielage replied to Steven Crowley's topic in General Discussions
Ripple Training, MixingLight, and FXPHD are also all certified Blackmagic training classes, so there is that to consider as well.