Marvin Nuecklaus

Premium
  • Posts: 16
Everything posted by Marvin Nuecklaus

  1. Amazing course, thank you as always, Lee and Lowepost! I would love to see more advanced tutorials like this and beyond. One question: what prevented you from connecting your first Camera node to Merge3D1_1 (the Merge 3D of the second scene)? Wouldn't that make things easier, since you wouldn't have to worry about changing a parameter while fine-tuning Camera 1 and then forgetting to duplicate the camera again each time? Also, is there a more accurate way to avoid grabbing a still of frame 88 (the phone screen before flying through)? Matching the frames by hand seems very cumbersome, especially once things get a bit more complex.
  2. Not sure where else to ask, but I was wondering whether there are any new courses coming soon, especially now that the full version of Resolve 18 is out?! Personally, I'd love to see more advanced courses that pick up where the existing ones left off. Think of editors, VFX artists, and colorists with 5-10+ years of experience who want to keep educating themselves and follow along with other professionals' workflows. One course in particular would be amazing: professional HDR workflows, from the right gear and setup through color management to outputting SDR and HDR with trim passes, the different types of HDR, etc.
  3. I might be wrong, but I don't think the Atomos monitors let you control the camera. SmallHD partnered with RED and offers proprietary software (for an upcharge) that lets you access the camera settings via the touchscreen.
  5. Thank you for this thorough course, easy to follow as always! I have two questions that I believe haven't been answered. 1) When creating multiple busses, how do I decide which one I'm monitoring? In Lesson 14 you send the Dialogue bus to the Main bus, but at what point did you designate it as the Main bus? Is there an option, or does it simply go chronologically down the list? 2) When I create both a stereo and a 5.1 bus and start panning individual tracks in the 5.1 space, how does that translate to stereo? Will it fold down automatically, i.e. will L and Ls simply collapse to L and lose that separation?
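(For context on question 2: a common approach, per the ITU-R BS.775 fold-down recommendation, is that the surrounds aren't discarded but summed into the front channels at roughly -3 dB. A minimal Python sketch of that idea, with illustrative channel names; actual mixer behavior depends on the software's fold-down settings:)

```python
# Illustrative ITU-R BS.775-style 5.1 -> stereo fold-down.
# Coefficient 0.7071 is the commonly used -3 dB default; real
# mixers may apply different coefficients or drop the LFE entirely.
COEF = 0.7071

def downmix_51_to_stereo(L, R, C, LFE, Ls, Rs):
    """Fold one 5.1 sample into (Lo, Ro); the LFE is typically omitted."""
    Lo = L + COEF * C + COEF * Ls
    Ro = R + COEF * C + COEF * Rs
    return Lo, Ro
```

So a signal panned to Ls doesn't vanish in the stereo version; it gets folded into the left channel at a reduced level.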
  6. Hey Christian, I have the LG CX series here, and while these displays are amazing, I wouldn't recommend one as your reference display, for several reasons. First, and probably biggest, these monitors fluctuate in color accuracy and brightness. Take a look at the Calman Home forum and the results people get in HDR: my display measured 678 nits, while others report up to 750 or as low as 600 nits. The same goes for color-space coverage; you see anything from 82% to 95% (those are just numbers I came across in my research, and I'm sure there are other results out there). I had mine professionally calibrated here in LA and I'm incredibly happy with the results. I use my CX as my client monitor, and I know CO3 has a few hanging around for that purpose as well. These monitors won't get you Dolby Vision approved, and depending on who you're delivering to, that can become a crucial factor. Now, if your goal is to grade HDR for the web, go for it! I do that all day long now and clients are losing their minds. I normally hand them an iPad, which goes up to around 600 nits and holds its colors pretty well. Most people consume their content on those Retina Apple displays anyway. If that's your final delivery, I'd say the CX is more than enough, as long as you keep an eye on your scopes.
  7. Hey everyone, I'm delivering a feature and the following audio configuration is required for digital video masters: CH 1: Left, CH 2: Right, CH 3: Center, CH 4: LFE/Subwoofer, CH 5: Left Surround, CH 6: Right Surround, CH 7: Stereo L (or Lt), CH 8: Stereo R (or Rt). Are there any specific settings you choose when exporting a ProRes 422, or can you leave them at default? I have all my tracks lined up the way it's required.
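(For anyone sanity-checking their track order against this spec, the required 8-channel layout as a simple lookup table, taken directly from the configuration quoted above; the helper name is just illustrative:)

```python
# Channel order required by the delivery spec above (1-indexed).
DELIVERY_LAYOUT = {
    1: "Left",
    2: "Right",
    3: "Center",
    4: "LFE/Subwoofer",
    5: "Left Surround",
    6: "Right Surround",
    7: "Stereo L (Lt)",
    8: "Stereo R (Rt)",
}

def describe(channel: int) -> str:
    """Return the role of a given channel number, or 'undefined'."""
    return DELIVERY_LAYOUT.get(channel, "undefined")
```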
  8. Hey there, I've gone through all the Fusion courses here on Lowepost, particularly the sky-replacement and rotoscoping ones. Unfortunately, I couldn't find a technique that quite applies to the attached shot. My goal is to replace the background with a desert shot, i.e. everything from where the fence starts. Do you have any advice on how to approach this? Luckily it was shot in 6K R3D, so there's lots of information to work with.
  9. Hey everyone, I'm curious whether anyone here has built a color suite for both SDR and HDR workflows? I'd love to learn about your hardware setup and signal chain. Currently I'm using: iMac Pro -> TB3 to UltraStudio 4K Mini -> 1. HDMI to LG CX OLED, 2. SDI to Teranex 8K HDR -> HDMI to reference monitor. I've read about the HD Fury converters and HDMI splitters that can pass HDR metadata. Perhaps one of those could replace the Teranex 8K HDR?
  10. Hey everyone, I'd like to ask for your advice on how to set up the following hardware: - iMac Pro (primary monitor) - Asus ProArt PA32UCX - LG OLED CX series - Blackmagic UltraStudio 4K Mini - DaVinci Resolve 17.08 Beta - DisplayCAL - X-Rite i1. My current color environment is Rec 709 and P3; however, I'd ultimately like to get into HDR finishing as well. Currently I have my UltraStudio 4K Mini connected over HDMI to the Asus ProArt, while the LG gets a clean feed from Resolve with the calibration LUT loaded internally; the Asus gets its calibration LUT through Resolve's Display LUT. My question is whether this is the smartest setup with the hardware available, or whether you'd recommend a different arrangement. Also, I've noticed that in Resolve 17 I'm no longer able to apply the Display LUT to the I/O box only; it affects the GUI as well.
  11. Hey everyone, I'm currently running Resolve 17 on an iMac Pro connected via Thunderbolt 3 to an UltraStudio 4K, which feeds the signal over HDMI to an Asus ProArt PA32UCX 32. For calibration I'm using the X-Rite i1 Display Pro with DisplayCAL. My question is: how do I calibrate the monitor so that the changes apply to the display connected to my video I/O box independently of my GUI monitor (the iMac)? When I apply my calibration LUT for the Asus under "Video monitor lookup table", it also affects my iMac. I believe in Resolve 16 you could apply a LUT to each separately, but Resolve 17 doesn't seem to have that option anymore. Any tips would be appreciated!
  12. @Lee Lanier Thank you for your prompt answer. To simplify my question: while you can technically edit in AE and handle a lot of sequence-composition work that way, does Fusion within Resolve allow the same thing? E.g. if I load multiple clips as MediaIn nodes, can I adjust the timing of those clips or have them overlap to add a transition as simple as a cross dissolve? Or would I need to overlap the clips on the edit page, load them into Fusion individually, animate the transparency of clip 1 at the tail end and of clip 2 at the beginning, and only ever see the result on the edit page, since only one clip at a time is loaded into Fusion?
  13. Hey there, I just finished the Fusion Fundamentals class. I previously worked in After Effects at a semi-professional level, so that workflow is pretty much ingrained, and now I'm trying to wrap my head around Fusion. My question is how Fusion's workflow compares to After Effects, specifically when it comes to multiple clips in the form of a timeline. In AE, when doing a GFX sequence, I would create a composition with an x-amount of frames and import my graphics or clips into it; the composition basically acts as a timeline. That way I can easily transition between graphics, manipulate timing, and so on. Fusion seems to take much more of a clip-by-clip approach, where you bring in clips from your timeline one by one. But what if you have complex GFX sequences that build on one another? I understand you can add a Loader or a second, third, ... MediaIn, but how do you tell Fusion that a given Loader or MediaIn should only appear after frame X? I hope this makes sense; I'm trying to figure out how to translate my AE workflow to Fusion, if that's even possible. Cheers, Marvin!
  14. You're the man! Thank you, Jussi. When reconforming from a bin that contains the same clips (say, with updated VFX but identical runtime), and I want to match the same in and out points of the clips in the timeline, would I need to manually set the in point of each clip in the bin I'm reconforming from?
  15. Hey everyone, I'm transitioning from Premiere, and there's one key function I used constantly: holding the Alt/Option key while dragging a clip from the media pool onto a clip (or multiple pre-selected clips) to simply replace them. That way, all effects and attributes stay intact. I understand Resolve's equivalent is to select a clip in the media pool, select the clip in the timeline I'd like to replace, and choose Conform Lock with Media Pool Clip. However, that option is greyed out when multiple clips are selected. I'm looking for functionality like Premiere's that works for a single clip AND for multiple clips. To clarify why this matters, here's a scenario: I received a new export from the VFX house, a sequence that includes edits, with identical timecode. I'd like to replace all the clips in my timeline with that new export file. However, I don't want to relink to the new file, since I need to keep the older version intact for reference.
  16. Hey Lowepost team, thank you for all the great courses you offer. Having just finished the Fairlight basics course, I have a good understanding of how the main tools work. However, when it comes to working with audio creatively, I'm still green, and I normally send it off to an audio post house. There are situations, though, where the budget doesn't allow me to bring in external folks, and I would love to learn creative ways to work with audio, such as improving VO or OCD, using the effects library, more advanced approaches to noise reduction, etc. Much like you go into specifics on skin retouching for color, or sky replacements for Fusion. Even some content on your Insider page where we could watch audio mixers/sound designers at work would be amazing! Thank you in advance, and I hope you'll consider my idea.
  17. I have a question about Lesson 12. 1. What's the reasoning behind exporting a mono music track in the first place? Is this just to show that you can easily convert stereo to mono, or is there a practical real-life reason? 2. Why did you create an M&E track but not include it in the export?