Lee Lanier

Contributors
  • Content Count

    22
Community Reputation

4 Neutral

2 Followers

About Lee Lanier

  • Rank
    Contributor
  • Birthday 02/07/1966

Personal Information

  • Gender
    Male

  1. With a Loader, you can click the Browse button and navigate to the correct file location.
  2. Looks like there was a naming mix-up. My apologies. 11 and 12 are supposed to be using what's currently called Beauty_D. I exported 11 and 12.drp files, just in case, and placed them on my Adobe Cloud storage: https://adobe.ly/2VgIJIv https://adobe.ly/2Y7l2Pw
  3. I'm sure there's a way to emulate more standard dodge and burn techniques - the question is: is it efficient and worth the time to set it up? I come from a VFX background, so I tend to think in terms of VFX solutions.
  4. I haven't compared Photoshop blending modes directly to Fusion apply modes, but here is information on the math used: Fusion: http://www.designimage.co.uk/merge-tool-maths/ Photoshop: http://www.deepskycolors.com/archive/2010/04/21/formulas-for-Photoshop-blending-modes.html https://www.adobe.com/content/dam/acom/en/devnet/pdf/pdf_reference_archive/blend_modes.pdf If you want to upload an example project, I can take a look when I get some time.
  5. Using "Import with Output LUT" when importing a still might give you the chance to convert the still to the correct color space in Resolve. It requires a matching LUT file. I have not had a chance to test this. Although a bit of a workaround, I suppose you could also use a Loader to bring the still into Fusion and apply a Gamut or OCIO Colorspace color transform to it.
  6. Thanks for all the positive feedback, everyone.
  7. Does this occur during playback, or just when rendering? Does this happen without the Crop? I've seen this behavior before - if I remember correctly, I deleted the tool to which the motion tracking data was applied and recreated that section of the network.
  8. Thanks for watching. You could apply the tracking data with the Unsteady Position option via the Center XY parameter of an additional Transform tool. Steady Position is an inverted Unsteady Position, where Unsteady Position moves the input as if shot by the original camera. That might be a good approach if you wish to replicate the original camera motion and not generate new motion through keyframing.
  9. You can try using the Clean Plate tool to generate a clean plate with the averaged background color and then use it with the Delta Keyer to key the background. Alternatively, the Difference Keyer will create a key after comparing a clean plate and the footage with the model. Noise always makes difference keying more difficult. Using the Subtract blending mode may remove too much background color from the foreground to make it viable.
  10. That is a good question and I don't think there is one answer. One way is to switch to Resolve Color Management (RCM) through the Color Science menu in Project Settings and assign the correct Input Color Space (based on the camera) to the clip in the Media tab. Fusion, by default, should show you the same interpretation seen by the other tabs with RCM. Make the VFX fixes in Fusion. Last, color grade in Color. You can choose a Timeline Output Color Space and an Output Color Space through the Color Management section if you need to wind up in a different space, like Rec.709. Another approach is not to use RCM, but to assign a LUT to the clip in the Media tab so the log footage is interpreted correctly. Either way, you are adding the VFX to the _ungraded_ version. I would avoid color grading in Fusion, unless you need to match elements to each other (like matching a 3D render to a background video). The Alpha channel from Fusion is passed to the other tabs. I haven't roundtripped from standalone Fusion to Resolve. The integrated Fusion is definitely more buggy than the standalone and is missing some features.
  11. I'm currently working on a different series, so no immediate plans to expand the beauty series. However, we will keep that in mind for the future. When warping an actor over a background, you really have to separate the actor via rotoscoping or some form of masking. If you make the actor skinnier or taller, there is the danger that part of the set will be missing; hence, I would say it's only viable with some shots, else it requires a lot of extra work. Dodge and burn would fall into the color correction category. In Fusion, add a Color Corrector tool and draw a mask to limit the area where it's operating (connect the mask to the Color Corrector's Effect Mask input). In the Color tab, you can also mask, although I do most of my work in Fusion as I am a VFX guy.
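
Post 4 above links to pages describing blend-mode math. As a minimal sketch of what a few of the common separable modes compute (my own illustration on normalized [0, 1] channel values, not code taken from those pages or from Fusion/Photoshop):

```python
# Common blend-mode formulas on a single channel value in [0, 1].
# a = base layer, b = blend layer.

def multiply(a, b):
    # Darkens: result is never brighter than either input.
    return a * b

def screen(a, b):
    # Brightens: the inverse of Multiply on inverted inputs.
    return 1.0 - (1.0 - a) * (1.0 - b)

def overlay(a, b):
    # Multiply in the shadows of the base, Screen in the highlights.
    if a < 0.5:
        return 2.0 * a * b
    return 1.0 - 2.0 * (1.0 - a) * (1.0 - b)
```

Note that apps can differ in where they apply these (before or after alpha premultiplication, in linear or gamma-encoded space), which is often why Fusion and Photoshop results don't match exactly.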
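
Post 8 above notes that Steady Position is an inverted Unsteady Position. As a minimal sketch of that relationship (function names and the simple translate-only model are my own, not Fusion's API; real trackers also handle rotation and scale):

```python
# track_offset is the tracker's measured camera drift at a given frame,
# relative to the reference frame, as an (x, y) pair.

def unsteady(point, track_offset):
    # Re-applies the original camera motion to a stabilized element.
    return (point[0] + track_offset[0], point[1] + track_offset[1])

def steady(point, track_offset):
    # Removes the camera motion (stabilizes) - the inverse of unsteady.
    return (point[0] - track_offset[0], point[1] - track_offset[1])
```

Because the two are inverses, applying steady after unsteady (or vice versa) with the same offset returns the original point, which is why stabilize-then-unstabilize roundtrips are lossless apart from filtering.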
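
Post 9 above mentions difference keying against a clean plate. As a minimal sketch of the core idea (my own simplified model with hypothetical threshold/softness parameters, not the Difference Keyer's actual controls or math):

```python
def difference_key(fg, clean, threshold=0.1, softness=0.05):
    """Per-pixel alpha from the difference between footage and a clean plate.

    fg and clean are (r, g, b) tuples in [0, 1]. Pixels that match the
    clean plate get alpha 0 (background); large differences get alpha 1.
    """
    diff = max(abs(f - c) for f, c in zip(fg, clean))
    # Ramp alpha from 0 to 1 across [threshold, threshold + softness].
    alpha = (diff - threshold) / softness
    return min(1.0, max(0.0, alpha))
```

This also shows why noise hurts: noise raises the per-pixel difference in background areas, forcing a higher threshold that then eats into the foreground edges.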