Bruno Mansi

Everything posted by Bruno Mansi

  1. Have you looked at the Kodak site? https://www.kodak.com/en/motion/product/camera-films/250d-5207-7207
  2. Are these two shots meant to match? Two different locations with completely different lighting. You can see from the windows of the interior shot, the daylight is somewhat cool, so it's obvious the artist is being lit by artificial light. Music videos often have completely different lighting between locations. What was the Director's/DOP's intention?
  3. Have a look at this tutorial by Darren Mostyn, where he does a sky replacement. Although he generates his matte using qualifiers, you should be able to adapt it for your external matte.
  4. Did the previous comments in this thread not give you the info you wanted? A description of your problem might help.
  5. Not sure about the chart, but if you want to see it being used in look development, I'd recommend you delve into some of Lowepost's courses. The one I'm thinking of specifically is Look Development & Workflow in DaVinci Resolve by Jason Bowdach. Take a look at lesson 7.
  6. I agree that it's still a mystery to me as to what it actually does! I've found that it seems to turn on and off depending on various settings, and I haven't really worked out the logic of it. As an example, it turns on the forward OOTF when you set the CST Input Gamma to Arri Log C, but if you set the Output Gamma to Cineon Film Log it turns off again. I guess it's doing what it should (?), but it would be nice to know exactly what that is.
  7. You might want to take a look at the Look Development & Workflow course (Jason Bowdach) on Lowepost. In lesson 13 (Complex Saturation Workflows) he uses LAB and HSL/HSV workflows to reproduce the sort of colour depth that's being discussed in this thread.
  8. Going back to my still photography days, density was a term used to describe the amount of silver (and hence the amount of dye in colour stock) in a negative. Underexposed negatives were called 'thin', whilst adding one or two stops of light above the optimum exposure would produce a denser negative. Since film had a good exposure latitude, it wasn't uncommon to overexpose by a stop or two. As long as you weren't blowing out highlights, this technique could capture more detail in the blacks and produce richer images due to the increase in silver/dye. However, this wasn't without its own problems, as colour film could produce hue shifts when subjected to overexposure. Fujifilm, for example, was known to lean towards magenta when overexposed. As others have stated, this characteristic of film density is being translated to digital images as a way of mimicking the film look.
  9. Not seen the course myself, but I've heard good things about the Tom Cross Masterclass. See these links... https://www.mzed.com/educators/tom-cross
  10. Mixing Light has some good material on HDR. Most of it is behind a membership paywall, but they offer a seven-day trial. In fact, I found a free one on their site which I think is the sort of thing you're looking for. Admittedly, it's dealing with Dolby Vision, but it does talk about making SDR trims. https://mixinglight.com/color-grading-tutorials/getting-know-dolby-vision-hdr-part-2/
  11. My understanding is that the LG OLEDs can be decently calibrated for SDR work, but are unsuitable for any sort of accurate HDR grading. It's due to the fact that they're WOLEDs, and at higher light outputs the saturation drops off to an unacceptable level. I believe this topic has come up before in this forum and on the LGG forums. I seem to remember reading that the likes of Amazon and Netflix won't sign off on an HDR grade done on these monitors.
  12. You need to think about whether speed or space is your primary concern. The OWC, with its USB 3.1 interface, is quoting around 400MB/s, but I'd be surprised if you got this with two spinning hard drives. You're more likely to achieve it with SSDs. The G-Technology is only USB 3.0 and quotes specs around half the speed of the OWC. If you're simply working at HD, then these interfaces should suffice, but if you want to work with 4K material, you're likely to need more speed. Things like Resolve caches/renders really benefit from fast storage (like NVMe storage). You might be able to use your internal SSD for these caches, which should achieve read/write speeds in excess of 2000MB/s. Thunderbolt 3 storage with SSDs is going to achieve the best speeds. If you only need a TB of storage, then something like the Samsung X5 (with its Thunderbolt 3 interface) would give you speeds of around 2500MB/s. Puget Systems have an article about hardware recommendations for Resolve, including a section about storage. Have a look at... https://www.pugetsystems.com/recommended/Recommended-Systems-for-DaVinci-Resolve-187/Hardware-Recommendations
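As a quick sanity check on those figures, here's a back-of-envelope sketch of how interface speed translates into copy times. The speeds are the rough quotes from the post above, not benchmarks, and the helper function is my own:

```python
def transfer_time_seconds(size_gb: float, speed_mb_s: float) -> float:
    """Time to move size_gb gigabytes at a sustained speed_mb_s megabytes/sec."""
    return (size_gb * 1000) / speed_mb_s

# Hypothetical sustained speeds (MB/s) for the options discussed above,
# copying a 100 GB folder of footage
for name, speed in [("USB 3.0 2-bay (HDD)", 200),
                    ("USB 3.1 2-bay (SSD)", 400),
                    ("Internal NVMe SSD", 2000),
                    ("Thunderbolt 3 SSD", 2500)]:
    minutes = transfer_time_seconds(100, speed) / 60
    print(f"{name}: {minutes:.1f} minutes")
```

Real-world sustained throughput will land below the quoted interface speed, especially with spinning drives, so treat these as best-case numbers.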
  13. All depends on your budget. Your Thunderbolt port will allow you to connect anything from a simple external SSD (1 to 4TB) right up to an external RAID enclosure with 6 or 8 drive slots. The cheaper SSDs will be SATA, so will give you around 500-1000MB/s. A proper Thunderbolt SSD should give you speeds similar to your internal drive. A Thunderbolt RAID system will give much better storage capacity and performance, especially if you populate it with SSDs, although this starts to become quite expensive. RAID systems can also offer protection against a drive failure if you're happy to trade some performance and storage capacity. Look at manufacturers such as Promise or OWC.
  14. It's the Artist Color that is EOL. The Artist Mix panels and Eucon are still supported. You're asking a lot of software that was never designed to run on an M1. I'm assuming the Eucon software is using Rosetta to function on your machine, which may well be introducing problems. Since Avid are still supporting Artist Mix panels (especially with Pro Tools), it's likely they'll get around to improving the Eucon software. One thing I would do is give your panel a static IP address. It's set to DHCP by default, so it could change its IP address if switched off for a period. Although Eucon should still identify the panel after an IP address change, it's one less thing to complicate matters. The whole M1/Avid issue is constantly being discussed on the Avid forums. There's some info Avid is publishing on the current state of support for M1/Big Sur, which might be useful reading. Link to article... https://avid.secure.force.com/pkb/articles/en_US/Compatibility/macOS-Big-Sur-Support
  15. The way I read it is that an OOTF is essentially a combination of an OETF and an EOTF. I don't see why you would use such a transfer in normal situations. It seems to suggest that HLG is an encoding that has both a scene and display referred relationships, and therefore seems to require an OOTF. Maybe a forward OOTF is when you're going to HLG and a reverse OOTF is when coming from HLG? I'm really clutching at straws here!
  16. There does seem to be a lack of information about these options. I wasn't even sure what the acronym stood for, but it seems to be 'Optical to Optical Transfer Function'. Still looking, but I found this in a technical article on colour science... "Output-referred spaces have only an Electro-Optical Transfer Function (EOTF), which defines the relationship between code values (CV) and display light. This may be defined relative to display peak luminance, or as an absolute encoding in terms of display cd/m2. Scene-referred spaces have only an Opto-Electrical Transfer Function (OETF), which defines the relationship between relative scene light and the encoded value. Some encodings, such as Hybrid Log-Gamma (HLG), define a relationship to both scene and display light, so have both an OETF and an EOTF. An EOTF is not necessarily the inverse of the corresponding OETF. The combination of the two is referred to as an Optical to Optical Transfer Function, or OOTF." Also found this... "Before we proceed, it must be reiterated that there are two aspects of Hybrid Log Gamma – a Scene Referred aspect, which is encoded directly from a camera, and a Display Referred aspect, which is relevant for displaying on a TV/monitor. The key differentiator is that in a Display Referred context, HLG normally includes an opto-optical transfer function (OOTF) which transforms Scene Referred content into Display Referred content, such as an HLG 1000 nit output from an HLG recording. Scene Referred content, on the other hand, is as encoded directly from a camera. When we import footage recorded from a camera, this is almost always Scene Referred." Not sure this makes it much clearer, but is it suggesting it would need to be used when creating HLG deliverables? I know the BBC were experimenting with HLG as a solution for terrestrial broadcasting, but I don't know if anyone on these forums has actually worked with this HDR standard.
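To make the relationships in those quotes a bit more concrete, here's a minimal Python sketch of the HLG camera-side OETF, its inverse, and the system-gamma OOTF that maps scene light to display light. The constants are the published Rec. 2100 values, but the function names are my own and this is an illustration of how the pieces fit together, not a production-grade transform:

```python
import math

# HLG constants from ITU-R BT.2100
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # ~0.55991073

def hlg_oetf(e: float) -> float:
    """Relative scene light (0..1) -> HLG signal: the camera-side OETF."""
    return math.sqrt(3 * e) if e <= 1 / 12 else A * math.log(12 * e - B) + C

def hlg_oetf_inverse(ep: float) -> float:
    """HLG signal -> relative scene light (inverse OETF)."""
    return ep * ep / 3 if ep <= 0.5 else (math.exp((ep - C) / A) + B) / 12

def hlg_ootf(scene_luma: float, peak_nits: float = 1000.0) -> float:
    """Scene-referred luminance (0..1) -> display luminance in nits.

    This system gamma is what the quoted articles call the (forward) OOTF;
    the gamma formula is BT.2100's adjustment for display peak luminance.
    """
    gamma = 1.2 + 0.42 * math.log10(peak_nits / 1000.0)
    return peak_nits * scene_luma ** gamma
```

So a display decodes the signal back to scene light with the inverse OETF, then applies the OOTF to get display light - which is why, for HLG, the EOTF is not simply the inverse of the OETF.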
  17. Sohonet seems popular with high-end post houses here in London UK. One of the issues with any online storage is decent upload speeds. Many ADSL broadband connections have abysmal upload speeds. Sending 300 GB when your upload speed tops out at a couple of Mbps is going to be painful!
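The arithmetic behind that "painful" is easy to sketch. The 300 GB figure and the couple-of-Mbps uplink come from the post above; the helper function is hypothetical:

```python
def upload_days(size_gb: float, upload_mbps: float) -> float:
    """Days to push size_gb gigabytes through an uplink of upload_mbps megabits/sec."""
    bits = size_gb * 1e9 * 8          # gigabytes -> bits
    seconds = bits / (upload_mbps * 1e6)
    return seconds / 86400

# 300 GB over a 2 Mbps ADSL uplink - roughly two weeks of continuous uploading
print(f"{upload_days(300, 2):.1f} days")
```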
  18. I wouldn't have thought Blackmagic would have supplied training footage that only worked with Resolve Studio. It's likely that most people doing the training would be using the free version. I've seen the footage from the beginner's guide and it mostly seems to be ProRes. Is it possible that the footage is Avid DNxHD? I know that you used to have to get those codecs directly from Avid. They're free of charge - just search for LE codecs on the Avid website.
  19. If you're talking about the film emulation LUTS included in Resolve, Darren Mostyn covered this recently on YouTube.
  20. There are lots of tutorials on YouTube with titles like "Five Mistakes Amateur Colourists Make" and "The Pro Colourist Secret to Beautiful Cinematic Images". They're trying to sell the idea that there's a right way and all other ways are inferior/wrong. I appreciate there's a certain amount of click-bait going on here, but looking at the number of 'likes' and general comments, a lot of people are being sucked into this way of thinking. One of these 'secret-sauce' ideas is that by working in ACES, you'll automatically achieve the so-called cinematic look. It seems to me that every pro colourist who takes the time to comment on their workflow will have a different way of achieving their goal. It's about developing your own style and methods to achieve the results that clients want. Unfortunately, many aspiring colourists don't want to hear this, as it involves patience and a few years of 'skill-honing' (is that a word?). They want the industry secrets they think top colourists jealously guard. I'm not suggesting that you can just grade in any haphazard manner. There's much that professional colourists can teach about being methodical and efficient in your approach, but that doesn't have the wow factor of pulling out some fancy LUT that will do all the work for you.
  21. I wouldn't have thought there are looks that are simply reserved for sci fi, but I do see the same sort of colour palettes used in many films of this genre. There are plenty of tutorials in the 'courses' and 'insider' sections that will help you achieve the look you want. It's probably true that you can push the boundaries and go more extreme - especially with things like foliage and skies on alien planets! I do think that some sci fi does tend to gravitate (get it?) towards certain clichés, such as harsh & cold inside spaceships because they're often lit with flickery, fluorescent lighting, especially on the lower decks. Even though by the time we're all flying in space, discharge tubes would be consigned to the museum. Dystopian nearly always means desaturated/bleach bypass or yellow-orange tints to denote major pollution. I remember in the Blade Runner sequel, it was very, very orange out in the 'badlands'. Laser beams are often (boringly) red, although I've been told that blue lasers have better spectral power. It's difficult to break away from the mold when it comes to audience expectations for sci fi. Since none of us have any experience of interstellar travel, we're all just making it up as we go along!
  22. Just been watching Vincent Teoh's review of the Sony A90J OLED monitor, which supposedly has the latest LG OLED panel. He rates it quite highly - maybe a contender to the latest LG and Panasonic models often discussed.
  23. My vote would be to use Mocha Pro, available from Boris FX. It's a planar tracking tool that's used a lot for this sort of work. It's available as a standalone version, but there are also plugin versions for OFX, Avid and Adobe After Effects. I've used it a lot in the past to solve roto work like this. Never used the OFX version, but the standalone works well, as does the AE version. I seem to remember there was a free, cut-down version included in AE - I don't know if that's still the case. The latest version has added mesh warping tools - especially useful for cloth etc. If you're using it under AE, I found a very useful add-on called Mocha Import+, which allows you to work on stabilised precomps of your roto'd area. Great for working on those tricky areas where a bit of additional blur/touch-up is required.
  24. Use ColorTrace. Of course, this requires the original grade to be available, which should be the case in your example. ColorTrace is accessible from the Edit page. Import the AAF, relink media as necessary, then right-click on your timeline in the bin and select Timelines/ColorTrace. There are numerous tutorials about this feature on YouTube.