
  1. Today
  2. Hi folks, I'm having trouble with Premiere and Resolve not understanding the scene and take metadata contained inside R3D files. On a shoot I diligently logged the takes and scenes into the camera metadata on the RED, and REDCINE-X shows the correct information as shot; this is simply the Scene and Take numbers we created as we shot the material (see below). Resolve sees the Scene as Description and the Take as Take, which is good enough to make it somewhat useful. By default, however, Premiere sees neither of these bits of metadata. People have suggested exporting an XML from REDCINE-X, but that only imports the scene number, not the take. Surely this information is super useful, so why the heck is it so hard to get an editing system to actually read it? Premiere and Resolve also don't seem to understand the metadata when I move between the apps via XML. And when I export an XML out of REDCINE-X it mangles the embedded audio tracks: they appear offline and want to reconnect to non-existent files. So the only way I can get any scene metadata into Premiere causes the timeline to become essentially corrupted. Any clues here? I'm trying to find a way to make this process streamlined and organised, but it's proving to be very messy. Love to hear any theories. REDCINE-X: Resolve sees the info in a slightly wrong way, but at least it's there; Scene is mapped to Description and Take appears as Take:
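     A possible workaround, sketched under stated assumptions: if REDCINE-X can export the clip list with Scene/Take as a CSV, a small script can rewrite it as an ALE, and the NLE can merge that against the clips by name instead of going through XML. The column names ("Clip Name", "Scene", "Take") and the assumption that your app will accept an ALE merge are guesses; check them against your actual export before relying on this.

        #!/usr/bin/env python3
        # Sketch: pull Scene/Take out of a REDCINE-X CSV metadata export and
        # write a minimal ALE that an NLE can merge against clips by name.
        # The column names "Clip Name", "Scene", "Take" are assumptions about
        # the export; adjust them to whatever your version actually writes.
        import csv
        import sys

        def csv_to_ale(csv_path, ale_path):
            with open(csv_path, newline="") as f:
                rows = [(r.get("Clip Name", ""), r.get("Scene", ""), r.get("Take", ""))
                        for r in csv.DictReader(f)]
            with open(ale_path, "w") as out:
                # Minimal ALE skeleton: Heading, Column and Data sections, tab-delimited.
                out.write("Heading\nFIELD_DELIM\tTABS\nVIDEO_FORMAT\t1080\nFPS\t24\n\n")
                out.write("Column\nName\tScene\tTake\n\n")
                out.write("Data\n")
                for name, scene, take in rows:
                    out.write(f"{name}\t{scene}\t{take}\n")

        if __name__ == "__main__":
            csv_to_ale(sys.argv[1], sys.argv[2])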
  3. Keep Smart Render on, and wait for the red line to turn blue.
  4. Yeah, I could save it out with a different extension, but it would be faster and better if Resolve could read PNGs natively without any issues.
  5. Yesterday
  6. Netflix recommends using ACEScct instead of ACEScc. I like the idea of ACEScc's true logarithmic transfer curve, but unfortunately it has noticeable artifacts in the shadows. It's less noticeable with Alexa, but with RED cameras I often get bright pixels in the shadows, as Margus mentioned. So I stick with ACEScct.
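     For anyone wondering where the shadow difference comes from, here is a small sketch of the two encodings as published in the ACES specs (S-2014-003 for ACEScc, S-2016-001 for ACEScct). ACEScct replaces the log segment with a linear toe below 0.0078125, so noisy near-black (or slightly negative) sensor values are not pulled far apart the way a pure log curve pulls them.

        import math

        def acescc_encode(lin):
            # ACEScc (S-2014-003): pure log2 encoding, with special cases at and below zero.
            if lin <= 0.0:
                return (math.log2(2.0 ** -16) + 9.72) / 17.52
            if lin < 2.0 ** -15:
                return (math.log2(2.0 ** -16 + lin * 0.5) + 9.72) / 17.52
            return (math.log2(lin) + 9.72) / 17.52

        def acescct_encode(lin):
            # ACEScct (S-2016-001): same log segment, but a linear "toe" below 0.0078125.
            if lin <= 0.0078125:
                return 10.5402377416545 * lin + 0.0729055341958355
            return (math.log2(lin) + 9.72) / 17.52

        # Two shadow samples with noise straddling zero end up far apart in ACEScc,
        # which is what shows up as bright pixels in the shadows.
        a, b = 0.0001, -0.0001
        print("ACEScc delta :", acescc_encode(a) - acescc_encode(b))   # ~0.15 code value
        print("ACEScct delta:", acescct_encode(a) - acescct_encode(b)) # ~0.002 code value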
  7. You don't mention what software you want to grade with, but assuming DaVinci Resolve, according to the configuration guide: "MacBook Pro: As all laptop systems are designed for portability and low power, for use with DaVinci Resolve we recommend selecting the fastest CPU, 16GB of system memory and the GPU with the most memory that is currently available."
  8. Thanks for the information. Dolby said you don't need a licence if you only want L1 metadata, and that is mostly taken care of automatically; but if you need a trim pass, then a licence is required, and we can work with the iCMU built into Resolve for Dolby Vision analysis and delivery. The only thing I don't have is an HDR monitor, but I inquired and, lucky me, I can get one on rent. I already have an FSI SDR monitor, so I can output two signals, HDR and SDR, from Resolve with my DeckLink and work accordingly. Am I correct on this one? Thanks for all the information, Oscar.
  9. Last week
  10. Considering buying a new MacBook Pro to do some on-the-go grading. Would 16GB of memory be enough?
  11. A quick summary: you need to get certified by Dolby in order to get a Dolby Vision licence for your facility. You need a grade 1 HDR monitor, such as the Sony X300, Sony X310, Flanders XM310K or Canon DP-V3120, in order to be able to monitor the PQ gamma curve and P3 or Rec.2020 color space. Once you have your HDR version, you will need to make a trim-pass version for the Rec.1886 display transfer function. You then extract an XML of the trim-pass metadata and apply that XML to the Dolby Vision IMF package in a proper system such as Cortex, Transkoder or DVS Clipster.
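     As a rough illustration of two pieces mentioned in this thread, the PQ signal you monitor and the L1 analysis, here is a small sketch. The PQ curve is SMPTE ST 2084; the "analysis" below is just a per-shot min/average/max of PQ-encoded luminance, which is the general idea behind L1 metadata and not Dolby's actual implementation (that lives in the iCMU/eCMU inside Resolve).

        import math

        # SMPTE ST 2084 (PQ) constants
        M1, M2 = 0.1593017578125, 78.84375
        C1, C2, C3 = 0.8359375, 18.8515625, 18.6875

        def pq_encode(nits):
            # Absolute luminance in cd/m^2 -> PQ code value in [0, 1].
            y = max(nits, 0.0) / 10000.0
            return ((C1 + C2 * y ** M1) / (1.0 + C3 * y ** M1)) ** M2

        def l1_style_analysis(shot_luminance_samples):
            # Per-shot min / average / max in PQ, the rough shape of L1 metadata.
            pq = [pq_encode(n) for n in shot_luminance_samples]
            return min(pq), sum(pq) / len(pq), max(pq)

        # e.g. a shot whose pixels range from deep shadow to a 980-nit highlight
        print(l1_style_analysis([0.005, 12.0, 48.0, 250.0, 980.0]))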
  12. Any other suggestions that I might try? I'm facing the same annoying problem.
  13. I'm having the same problem. It constantly shows error code 209 and the preview is broken. How do I disable GPU acceleration in the current version?
  14. ACES offers a simplified workflow. RCM gives you a bit more control by allowing you to apply the camera LUTs. I would test both ACES and RCM and see which one suits your footage the best. There are so many permutations with color management, it is hard to predict which one might give you the best result.
  16. Hi everyone, In this online session, Leandro Marini of Imperial Creative will go over the problems of modern DI workflows and how to solve them by minimizing the back and forth with clients, utilizing SCRATCH's biggest strengths: realtime color & comp, remote & live streaming, and integration of other tools right into the workflow. Leandro will show a recent project, on which he approached color and vfx finishing simultaneously in realtime via remote, from concept art through the finished sequence. This webinar will not be SCRATCH-only: Leandro will also show how to integrate After Effects efficiently into the workflow to support SCRATCH in advanced vfx workflows, bringing the strengths of both tools together into the finished film. Cheers, Mazze
  17. And the recording is online, friends: Cheers, Mazze
  18. Lee, what type of workflow do you recommend? I often work with Arri & Blackmagic cameras. //Jesper
  20. Hello all, I did the grading of a regional feature film. Now it has been given to OTT platforms, and they are asking for a Dolby Vision version of it. Could someone tell me the proper workflow to obtain a proper result in DaVinci Resolve 16? Thanks
  21. I need to get access to the 5220-2 LUT from the ARRI Look Library. Do any of you have a version that is readable in DaVinci Resolve?
  22. Worth saying the BVM X300 maybe, and not the PVM LCD 300! I very nearly overpaid for a PVM a year or so ago. Maybe worth specifying the BVM X300 mark 2 as well.
  23. Hi friends, here's a quick video on how to set up Nobe OmniScope with SCRATCH (and all other Assimilate products) and also a quick overview of its features and how to use it 🙂 . Also, you can get 15% OFF by using coupon code SCRATCHOSCOPE at the checkout over at www.timeinpixels.com! Cheers, Mazze
  24. Hi everyone, I'm selling an Eizo CG319X 4K monitor. The studio already has two and we don't need this one anymore. Pristine condition (like new). Email: Contact@groovybabystudio.com. The monitor is located in France. If shipping, please note that this is a huge screen and shipping charges will be expensive. Thanks
  25. Thanks @Christian Berg-Nielsen, I will try "render cache color output". That's new to me. I guess, with an NLE background, I thought a render in-to-out option was available.
  26. You can combine clips and animate their opacity in Fusion. You can combine clips in pairs with a Merge tool and adjust the FG input's opacity by animating the Blend parameter. However, for general editing, where you arrange, overlap, and add transitions to clips, I would use the Edit tab. Fusion will build a separate comp for each clip within the Edit/Timeline tab.
  27. @Lee Lanier Thank you for your prompt answer. To simplify my question: while you're technically able to edit in AE and approach a lot of sequence composition work that way, does Fusion within Resolve allow you the same thing? For example, if I load in multiple clips as MediaIn nodes, can I adjust the timing of those clips, or have them overlap to add a transition as simple as a cross dissolve? Or would I need to have the clips overlap in the Edit page, load them into Fusion individually, and animate transparency for clip 1 on the tail end and for clip 2 at the beginning, while really only seeing the results in the Edit page, since only one clip at a time is loaded in?
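     For what the Merge/Blend approach described above boils down to, here is a toy sketch (plain Python, not Fusion's scripting API): a cross dissolve is just a per-frame weighted mix of two clips, with the weight ramped over the length of the transition, which is roughly what keyframing Blend on a Merge does.

        def cross_dissolve(fg_pixel, bg_pixel, frame, start, length):
            # Blend ramps from 0 to 1 over `length` frames starting at `start`.
            t = min(max((frame - start) / float(length), 0.0), 1.0)
            return bg_pixel * (1.0 - t) + fg_pixel * t

        # Halfway through a 24-frame dissolve each pixel is a 50/50 mix of the two clips.
        print(cross_dissolve(fg_pixel=0.8, bg_pixel=0.2, frame=12, start=0, length=24))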