Posts posted by dermot.shane

  1. one disadvantage of using the numpad UI is not being able to adjust, say, red down and blue up interactively; you can adjust one -or- the other, and unless you have near-endless time, you will never know how much better you could have made the image... that interactivity thing

    a worthwhile tradeoff for precision? i'd say yes. i use those tools early in the stack / layer / node tree; the aim is to normalise the image cleanly there

    For me precision comes in later; the race is not won in the first corner, and the grade is not finished in the first stack / layer / node

    on all the surfaces i use when working in Resolve (Advanced / Mini / Elements / Color) they are mapped already, some better than others; not sure about Wave / Micro / Ripple tho

    under Baselight, Color & Elements both offer the functional equivalent, as does Slate/Blackboard

    under Nucoda, Elements offers the functional equivalent, as does Precision

    no idea about Lustre and Scratch; i'd be surprised if exposure is not available on a knob / ring / ball of some sort..

    • Like 2
  2. my thoughts ran to why someone would prefer using the numpad in 2018 to adjust exposure, rather than using a knob to get to the same place much faster with more interactive feedback?

    as to the tools themselves, i turn to them first on any system, no matter what they are called....

     

    • Like 2
  3. 10 hours ago, Amada Daro said:

    Old tech but the only primary controls with full tonal range control. 

    care to expand on that thought?

    aren't printer lights in Resolve nothing more than a different UI for Offset?

    Baselight's "Exposure", Nucoda's "Brightness", and Resolve's "Offset" are all the same thing?

    here's a rip from Nucoda's manual on the tool:

    Density is Printer Lights ganged together and a density reset makes each of the RGB Printer Light values neutral. Density works the same as the Brightness tool in Brightness/ Contrast and is also known as offset in some other systems.

    This control affects all pixels equally, regardless of luminance, color or position. Its effect is similar to camera exposure and it is useful for adjusting the source image dynamic range to a pleasing level. Since it does not squeeze or stretch the dynamic range it causes no artifacts and is a great control to start grading raw or flat scanned images.

    Some use Density to set black levels and others use it to assign the mid range. Both methods are correct.
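    that "affects all pixels equally" behaviour is easy to sketch in a few lines (a toy illustration, not any vendor's actual maths): a density/offset move adds the same amount to every log-encoded value, so the distances between shadows, mids and highlights are untouched.

```python
# Sketch: density / offset adds the same constant to every log-encoded
# code value, regardless of the pixel's starting level (made-up values).

def apply_offset(log_pixels, offset):
    """Add the same amount to every value -- no squeeze or stretch."""
    return [v + offset for v in log_pixels]

shadows, mids, highlights = 0.10, 0.45, 0.85   # log-encoded samples
graded = apply_offset([shadows, mids, highlights], 0.05)

# every value moves by exactly 0.05; the spacing between them is unchanged,
# which is why it behaves like camera exposure and causes no artifacts
print(graded)
```

    contrast that with lift or gain, which scale values differently depending on where they sit in the range.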

     

    i tend to use the same controls; in Resolve, the Nucoda/Baselight method of normalising using Exposure/Contrast/Sat is pretty rudimentary, or needs workarounds... but still vastly preferable to LGG for my working methods

    the only difference is i often map *a and *b, rather than G and B, to the controls; the maths are the same. the color science is substantially different tho, subtractive rather than additive, but the controls remain the same. i could type in printer lights to adjust *a, or turn a knob... same tool, different UI

     

     

    • Like 5
  4. Print stock filled the role of an output LUT/RRT; we called printer lights while viewing an answer print, looking at something through an effective RRT/LUT

    try looking at something through a 2383 LUT, then a 2393 LUT; the print stock is your LUT in a lab finish

    that said, i prefer subtractive color and tend to use the a & b channels of L*a*b to balance an image

    i've used printer lights for real, and see no reason to turn back to 1941 tech

    • Like 3
  5. there is no "precisely matching" a film stock; too many variables in lab tolerance for any match to be more than lab X on day X at best

    shoot a test, cut the camera roll, send it to three labs, and the results are three densities; send it again a week later and there will be three more densities.... rinse and repeat....

    and they are all going to be within lab tolerance....

     

     

    • Like 2
    • Thanks 1
  6. about the same as Alex for a feature, 10-20 days depending on budget, although i've not seen a feature with only 800 cuts in it for a decade or more (if ever); 1800-2200 is more common in 2018 with my client base

    but for TV i can go a lot faster once we have looks in place, and that usually happens when grading the pilot or next season's sizzle reel and trailers

    typically 10-12 hrs per 48 min ep, more at the beginning of a season and less at the end; by ep 12 there's more cut-n-paste of looks from previous shows in the same set, and we are really only looking at new sets and QC issues, and that goes fast

    • Like 1
  7. i often run it in groups, but being from the film era, i know the grain of Fuji 500T has next to zero to do with the grain of Kodak 50D... so i often group by 50 / 160 / 500 to match the scene

     

    usually i'll run one of many grain passes i made about two decades ago in Flame, using its match grain to match the grain in scans of a grey card

    • Like 1
  8. from a user viewpoint, BLE's tools are an awesome complement to MC,

    but not being able to navigate the timeline so one can play a sequence for review in context (BLE is locked to one clip at a time) is problematic,

    the workaround of closing the software, playing a seq in MC's UI, then relaunching the software is also problematic

    Unless / Until that is sorted, my hopes are for Daylight to get ported to Linux... i'm not going to buy a trashcan, and last time i looked that's the only machine it will run on


    • Like 1
  9. On 1/4/2018 at 9:56 AM, Mike Leisegang said:

    Dermot, in the old days when you pushed film transparency  to a radical point, 3 to 4 stops you would - could start to see the DMax of the film showing.

    prolly around the same time (early 2000s), Technicolor had a process called Dmin/Dmax that one-lighted a roll with a cineon-ish encoding based on density between the sprockets... all from memory, and i never transferred a reel this way, but i have graded a few features (on Cyborg2k) from rushes transferred to DPX with this method. it worked well for the time: pricing for 2k scans was still by the frame, mainly for VFX, and it was rare for anyone to be doing a full DI back then; the Dmin/Dmax made it affordable

     

    • Like 1
  10. i think Jussi is correct - it's the gain ball mapped to two axes, or, as Resolve has those controls implemented, they can be replicated with the gain hue offset ball

    another option that i use a lot in Resolve is setting a node to L*a*b and bypassing channel 1 so you have only *a*b; for me that's the simplest, cleanest workflow, but be aware that the surface will react more like it's in 1998 mode

    Baselight's tint/temp controls are somewhat different btw, and really feel closer to L*a*b than to Resolve's gain offset method

    true color temp is more a horseshoe-shaped curve than a straight line from blue to yellow
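    the node trick above can be sketched outside Resolve: convert to Lab, leave L* alone (the "bypassed channel 1"), nudge a*/b*, convert back. sRGB/D65 maths stands in here for the grading system's actual working space, and `balance_ab` is a hypothetical helper name, not a Resolve control.

```python
# Sketch of the L*a*b node trick: RGB -> Lab, keep L* untouched,
# shift a* / b*, convert back. sRGB / D65 is a stand-in working space.

def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    c = min(max(c, 0.0), 1.0)
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

WHITE = (0.95047, 1.0, 1.08883)   # D65 reference white

def rgb_to_lab(r, g, b):
    r, g, b = (srgb_to_linear(v) for v in (r, g, b))
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = (f(v / w) for v, w in zip((x, y, z), WHITE))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def lab_to_rgb(L, a, b):
    fy = (L + 16) / 116
    fx, fz = fy + a / 500, fy - b / 200
    def finv(t):
        return t ** 3 if t ** 3 > 0.008856 else (t - 16 / 116) / 7.787
    x, y, z = (finv(f) * w for f, w in zip((fx, fy, fz), WHITE))
    r = 3.2406 * x - 1.5372 * y - 0.4986 * z
    g = -0.9689 * x + 1.8758 * y + 0.0415 * z
    bl = 0.0557 * x - 0.2040 * y + 1.0570 * z
    return tuple(linear_to_srgb(v) for v in (r, g, bl))

def balance_ab(rgb, da, db):
    """Shift a*/b* only -- luminance (L*) stays where it was."""
    L, a, b = rgb_to_lab(*rgb)
    return lab_to_rgb(L, a + da, b + db)

# warm a neutral grey: push b* toward yellow, a* slightly toward magenta
print(balance_ab((0.5, 0.5, 0.5), 2.0, 8.0))
```

    the point of the exercise: the colour move never touches L*, which is why balancing this way feels cleaner than dragging a gain ball that shifts luminance along with hue.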

     

    • Like 3
  11. i remember going through this years ago, and it was nothing like the known 2383 emulation LUTs. i compared it to two LUTs that are unencumbered by active IP, and both of them are solid... i also compared a number of other 2383 LUTs from Resolve, Nucoda and Baselight, all very similar, as were the Luthier and CineByte LUTs

    from memory, there was some wonky maths in the shadows, the red channel doubling back over top of itself, and some jaggy steps in the skintone area
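    "doubling back" is easy to test for mechanically: sample the channel's transfer curve and look for any downward step. the curve below is made up for illustration; it is not Koji's actual data.

```python
# Sketch: detect a channel "doubling back over itself" -- i.e. the
# transfer curve is not monotonically non-decreasing.

def find_reversals(samples):
    """Return indices where the curve steps downward (non-monotonic)."""
    return [i for i in range(1, len(samples))
            if samples[i] < samples[i - 1]]

# a made-up red-channel 1D curve with a dip in the shadows
red_curve = [0.00, 0.02, 0.05, 0.04, 0.06, 0.10, 0.18, 0.30]
print(find_reversals(red_curve))  # [3] -- the dip at index 3
```

    a clean LUT channel returns an empty list; any index reported is a spot where two input levels map to out-of-order output levels.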

    i rolled my own when i had them presented by a producer; i was not crazy about dealing with the fallout of the maths in Koji

    i'm rendering overnight (feature film DCP has to be in LA tomorrow, the mix arrived at 4.30 pm... sigh); will try to squeeze in a test before the client arrives tomorrow

     

    • Like 1
    • Thanks 2
  12. i have had log-encoded DPX arrive at my doorstep, no clue how they got there; it's what the DiT created with DiT magic

    no issues with the files or the footage: nice-looking highspeed food shots, flying peppers and dropping / bouncing mushrooms in a studio. no issues with exposure or highlight retention, but there was a grownup behind the camera.

     

    • Like 2
  13. awesome article Andy!

    i've been using L*a*b in Resolve for a while now, and more recently in BLE - both my clients and i find the results bring us close to our happy place quickly

    a question for you if you have the time:

    what are the differences between using R-Lab and Basegrade?

    mainly the 4 zones and interactive surface mapping?

    or are the maths substantially different under the hood?

    next, a question about the article itself: when you talked about Filmgrade + exposure, you mentioned exposure not being truly linear due to input log curves; when in AP1 with raw sources, is exposure also non-linear?
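    for what it's worth, the linear-vs-log distinction behind that question can be shown in a couple of lines: in true scene-linear data (which is what linear AP1 holds), a one-stop exposure change is an exact multiply by 2, and it only becomes an additive offset once the data is log encoded. a toy sketch:

```python
import math

# Exposure in scene-linear vs log-encoded data (illustrative).
# In linear light, +1 stop is an exact multiply by 2.
# In log space, the same +1 stop becomes a constant additive offset.

def expose_linear(value, stops):
    """Scene-linear exposure: scale by 2^stops."""
    return value * (2.0 ** stops)

def lin_to_log2(value):
    """A bare log2 encode, standing in for a camera log curve."""
    return math.log2(value)

v = 0.18                        # scene-linear mid grey
up = expose_linear(v, 1.0)      # one stop up: exact doubling to 0.36

# in the log encoding, that same move is a fixed +1.0 offset
offset = lin_to_log2(up) - lin_to_log2(v)
print(up, offset)
```

    real camera log curves add toe and shoulder shaping on top of the pure log, which is why exposure applied through them stops being truly linear.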

    • Like 2