Bruno Mansi

Everything posted by Bruno Mansi

  1. The native format in Avid is, to be exact, MXF OP-Atom - the variant where each video and audio track is a separate file. The good news is that Premiere Pro reads it happily. If you export an AAF from Avid with linked media, it will import into Premiere without problems. Premiere will handle both video and audio, although Avid effects won't be translated. Audio rubber-banding does come through, although if you've set any clip gain beyond +6dB, it will get clipped back to +6, which tends to upset your sound mix.

     As far as other formats are concerned, Premiere Pro users generally expect to just link to their files, and I know that Adobe supports a lot of formats. The reason Avid users traditionally imported/transcoded their media to OP-Atom was playback performance, as Media Composer struggled with lots of AMA (linked) clips. Things have improved somewhat with today's modern, multi-core processors, and Premiere uses the graphics card's CUDA technology to give you real-time playback of your clips. Mac users might also be interested to know that the latest versions of Premiere now have Apple Metal GPU support. Premiere is still likely to struggle to play back in real time on older computers, or with formats that require lots of 'muscle' to decode (4K or highly compressed long-GOP), so recent versions of Premiere have included proxy workflows, where you transcode to a friendlier format.
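     As an aside, if you ever need to confirm whether a folder of MXFs is OP-Atom or OP1a, a quick ffprobe scan will tell you, since OP-Atom files carry a single essence track each. A minimal Python sketch, assuming ffprobe is installed and on your PATH - the media folder path is hypothetical, so adjust it to your setup:

     ```python
     import json
     import subprocess
     from pathlib import Path

     # Hypothetical Avid media folder - adjust to your own setup.
     MEDIA_DIR = Path("D:/Avid MediaFiles/MXF/1")

     def stream_count(mxf_path):
         """Ask ffprobe how many essence streams the MXF contains."""
         result = subprocess.run(
             ["ffprobe", "-v", "quiet", "-print_format", "json",
              "-show_streams", str(mxf_path)],
             capture_output=True, text=True, check=True)
         return len(json.loads(result.stdout).get("streams", []))

     for mxf in sorted(MEDIA_DIR.glob("*.mxf")):
         n = stream_count(mxf)
         # OP-Atom wraps one video OR audio track per file;
         # OP1a muxes everything together, so it has several streams.
         flavour = "likely OP-Atom" if n == 1 else "likely OP1a (or other)"
         print(f"{mxf.name}: {n} stream(s) -> {flavour}")
     ```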
  2. I don't really understand Bizon's answers to Dimitril's question. If you're talking about pure Gigaflops, graphics cards are over ten times faster than a modern CPU. If they're saying the processor must have similar performance to the graphics card, are they implying that your GPU will only run at the performance of your CPU? The whole point of offloading processing to the GPU is that it has thousands of cores all running in parallel, which is great for rendering complex graphics. A great visual example of this is at... https://www.youtube.com/watch?v=-P28LKWTzrI

     In a typical example, a CPU will instruct the GPU to process a particular rendering task and leave it to do its work. The GPU then accesses main memory using DMA (direct memory access) to get the data, performs the computations, and tells the CPU when it has finished the task. It can only do this as fast as the weakest link, so the PCIe slot and DMA need to be able to keep up with the demands of the parallel processing that goes on in a modern GPU - i.e. around 5,000 Gflops.

     Now, the question is this... when this transfer of data is happening over Thunderbolt, how involved is the CPU? Ideally, it would hand over the task to something like a Thunderbolt controller chip, which would handle the access to memory directly. Even in this 'best case' scenario, the interface can only transfer data at a maximum of about a third of what a PCIe bus can. However, if the CPU is more directly involved, you are only going to be able to transfer data as fast as the CPU can run (around 500 Gflops). The 'Processor performance must be comparable with graphics card' answer seems to imply that the CPU is more directly involved. This could be the case if additional driver software were required to control the Bizon box. The second answer doesn't make much sense to me. 97 percent performance of what? - the CPU?

     Finally, it's worth remembering that if you also intend to use your Thunderbolt 3 port to attach your media drive, this will add to the bandwidth requirements, so now your disk and graphics data are all going up and down this one cable. And don't even think about trying to piggy-back a second monitor! As I said, get hold of a demo unit and try-before-you-buy!
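     To put rough numbers on the bandwidth point, here's the back-of-the-envelope maths (nominal link rates, before protocol overhead - real-world throughput is lower):

     ```python
     # Nominal link rates - protocol overhead reduces both in practice.
     THUNDERBOLT3_GBITS = 40            # Thunderbolt 3: 40 Gbit/s
     PCIE3_X16_GBITS = 16 * 8           # PCIe 3.0: ~8 Gbit/s per lane x 16 lanes

     tb3_gbytes = THUNDERBOLT3_GBITS / 8    # ~5 GB/s
     pcie_gbytes = PCIE3_X16_GBITS / 8      # ~16 GB/s

     print(f"Thunderbolt 3 : ~{tb3_gbytes:.0f} GB/s")
     print(f"PCIe 3.0 x16  : ~{pcie_gbytes:.0f} GB/s")
     print(f"TB3 is roughly 1/{pcie_gbytes / tb3_gbytes:.0f} "
           f"of a 16-lane PCIe 3.0 slot")
     ```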
  3. Hi Tom, thanks for your kind words. It's just that I hate seeing people getting ripped off. Your comment about Apple's limitations is spot-on. Are we the only people who see the questionable decisions that tech companies (like Apple) are making when they sacrifice functionality for style? I have visions of users with new MacBooks hooked up to external ports, external PCIe boxes, external hard drives etc., telling me how wonderful the new Mac laptops are. Sure they are, but you've had to add over $1,000 of extras! How is this still a laptop - i.e. something you can put on your lap? Luckily, there are companies (like HP) who still understand how to design a laptop for professionals.
  4. I would seriously test this box out before parting with $650, which seems a ridiculous price to pay for a case with a PSU, a PCIe slot and a bit of electronics. Did you read the comments below the review? Some very good points were made about how well this would work with GPU-hungry tasks. The Thunderbolt 3 interface has a theoretical maximum throughput of only a third of what a modern PCIe (16-lane) slot can deliver. This may make a big difference if you're using a graphics card that can really push the data through.
  5. Any more information about what these issues are, and where one can read up about them?
  6. The info on Blackmagic's web site for version 12.5 states...'RED Camera R3D files including +5K, monochrome and HDRx'. There's no mention of any additional support for R3D files in any of the point releases (currently 12.5.3), so I'm not sure upgrading will fix the problem. Have you tried bringing these files into something like Premiere or Avid? You could also try a test file from Red's web site at... http://www.red.com/sample-r3d-files This would identify if any of the named file formats are a problem with your setup.
  7. Looks nice, but I'm always worried about how these sorts of devices will perform. Looking at the specs, you can only have a maximum of 4GB GPU memory and 32GB main memory, plus a maximum 2TB hybrid drive, which isn't a lot these days. It also looks like there's no Thunderbolt or USB 3.1, which I think is a big mistake. If you go with these specs and an i7 processor, the price is over $4,000! For that price you can buy a decent HP workstation and a Wacom Cintiq 24HD display tablet, which will do the same thing. It won't look as pretty, but you'll have a more powerful system with lots of upgrade potential.
  8. The lack of professional features found in broadcast monitors, and the lack of SDI inputs, may be a problem for some post houses, but these displays should make great client monitors.
  9. Forbid them from using 'look' LUTs that promise to make you into a top colourist with just a few mouse clicks!
  10. If you're considering online courses, I'd recommend FXPHD. For $79 a month, you can access all their courses and get access to VPN versions of some of the best software out there. They are fairly VFX biased, but they do have production courses that deal with things like lighting and camera techniques, as well as their 'background fundamentals' series. They also have a couple of courses specifically on colour grading.
  11. I assume you're talking about certain issues that have been doing the rounds concerning the factory calibration accuracy of the BVM-X300 and the inability for users to properly re-calibrate the displays. For those who want more information about this, head over to... http://www.liftgammagain.com/forum/index.php?threads/sony-bvm-x300-factory-calibration.7195/ However, the original post wasn't really about the Sony display, but more about the quality of the new LG OLED panels, which would seem to rival some of the best reference monitors out there, at a much lower cost.
  12. No, you don't - that's the beauty of the Baselight method. Assuming that both the full Baselight and the Avid workstation have access to the original rushes (such as on a shared storage system), you send your cut from Avid to Baselight via AAF in the normal way. Once you've graded on the full system, you then export an AAF of the metadata - that is, all the colour correction decisions that were carried out during the grade. This could (of course) be sent to another full Baselight system to replicate the grade, but the AAF can also be used by the Baselight Editions AVX plugin installed in a separate Avid workstation to add the exact same grade to your Avid timeline. There's no need to export any media to Avid or do any sort of media re-linking. You simply add the Editions plugin on a spare track above your media, point it to the AAF file, and within seconds you have a fully graded timeline!

      Filmlight provide the AVX plugin for free, and it can be installed on as many Avid workstations as you wish. They also have a paid version of the plugin which allows you to do Baselight grading within Avid, or even a preliminary grade that can be sent to (and read by) the full Baselight. The paid version also lets you alter any of the grades you received from the full Baselight. If the director and colourist do a second grade, you simply point the AVX plugin at the new version. If they just want to alter one or two shots, they can send you a BLG file - one for each shot - which you can use to update those shots in the Avid timeline. It's a very neat system.
  13. The beauty of Baselight is that if you have the Editions AVX plugin installed in your Avid, there's no need to do any media exporting and re-linking from the full Baselight system - just a small AAF metadata file with all the colour correction information. Additionally, if you need to make a change to a colour correction, you can do that from the plugin, without having to go back to the full Baselight.
  14. While price is an important factor for small facilities or independent film-makers, I don't think the cost of the software is going to be of much concern to a professional post house. Given that the cost of installing a top-end grading suite (and the associated hardware) is going to reach a six-figure sum, it's issues around workflow and support that are going to affect the decision about which software they choose. I've talked to a few colourists in the UK who like the DaVinci software but have opted for Baselight. When asked why, they say that the support from Filmlight is much better than what Blackmagic offer, and that the Avid-Baselight-Avid workflow is vastly superior. Given that the majority of high-end productions are still cut on Avid, it makes perfect sense for them to invest in Baselight products.

      I don't think one system is 'better' than another; they both have their strengths and weaknesses. In the world of compositing, it's true that nodal systems like Nuke seem better suited to complex work, but for simpler jobs you can often knock something out in After Effects much quicker than in Nuke. In the world of colour grading, colourists find the system that suits their style/workflow best. Whether you use nodal or layer-based software will not affect the speed or quality of your end product - it's the toolset and the craft of the operator that will determine the final result.
  15. For those of you who dream of owning a Sony BVM X300, the new LG OLED TVs may be the alternative. The 2016 models are making quite a stir, with claims that they are matching studio-grade, 4K colour-critical monitors. They don't have SDI inputs of course, but reviewers are saying that the panels are visually indistinguishable from perfect in areas such as colour and luminance accuracy. Other major bullet points are:

      - Perfect black levels (0 cd/m²)
      - The highest peak brightness for an OLED TV (630 to 730 cd/m² for HDR)
      - The smallest brightness variation with viewing angle, up to 45 degrees
      - 98% DCI-P3 gamut at 4K
      - Excellent average screen reflections (1.1%)

      There's an in-depth article for those interested at... http://www.displaymate.com/OLED_TV2016_ShootOut_1.htm
  16. Looks like the Lian Li D8000 chassis. Is this just for your RAID, or are you intending to install a motherboard, PCIE cards etc?
  17. The ACES workflow in Resolve is pretty straightforward - as it should be. My reason for using it in preference to other colour management systems would be in situations where I'm sending out shots for (say) VFX work. My understanding is that if I gave a Nuke compositor a graded shot, and he/she was using ACES within Nuke, I would get back a comped shot that would slot back in without any problem. Given that VFX work may involve accessing original log camera plates, 3D animation, stills and various effects loops (fire, smoke, particle effects etc.), it's important that they see the graded shot correctly so that they can match the other elements to it. Nuke uses the OpenColorIO (OCIO) system, which is compatible with ACES, so you can set up the OCIO configuration to emulate ACES.

      All good so far, but did you know that there are some different ACES standards that Nuke compositors may use as their default settings? Have you heard of ACES 1.01, ACES 2065-1, ACEScc or ACEScg? I'm in the process of trying to find out more about some of these various flavours of the ACES standard, but one quote (on Nukepedia) got me a little worried... "ACES2065-1 is very large gamut encoding which makes it very hard to work with. ACEScg primaries are much smaller making it an ideal working space for compositing and working with cg images." I'm not sure what this means, but it implies that VFX artists may be working in a reduced colourspace before the final ACES ODT back to what I originally sent them. I'm no colour scientist, but this doesn't sound good for my grade. I'd be interested to hear from anyone on these forums who has any experience of this sort of workflow and can shed any light on the subject.
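      For anyone who wants to poke at this themselves, the OCIO Python bindings will convert values between these spaces, so you can see what the smaller ACEScg (AP1) gamut does to a saturated value. A minimal sketch, assuming the OCIO v2 Python bindings and a stock ACES 1.x OCIO config on disk (the config path is hypothetical; the colourspace names are the ones used in the standard ACES configs):

      ```python
      import PyOpenColorIO as OCIO

      # Hypothetical path to a standard ACES OCIO config.
      config = OCIO.Config.CreateFromFile("/path/to/aces_1.0.3/config.ocio")

      # Processor from the interchange space (AP0 primaries) to the
      # compositing working space (AP1 primaries).
      processor = config.getProcessor("ACES - ACES2065-1", "ACES - ACEScg")
      cpu = processor.getDefaultCPUProcessor()

      # A saturated green that sits inside AP0 but outside AP1 ends up
      # with a negative component once expressed in ACEScg.
      rgb = cpu.applyRGB([0.0, 0.9, 0.0])
      print(rgb)  # expect out-of-range values - the colour is outside AP1
      ```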
  18. OK, I had a go at looking at the output of my Decklink card. To get a file to play through the card from the desktop, I had to use Blackmagic's Media Express software. I used a test file from http://www.sync-one2.co.uk/support/test-files/ - this is a series of 1-frame bleep/flashes.

      I fed the HD-SDI output from the card through a Leader waveform monitor to both a CRT monitor and a Dreamcolor LED display. The audio was de-embedded from the SDI by the Leader and fed as AES digital audio to a TSL monitoring unit and speakers. This is done to make sure there's as little difference as possible in the audio and video cable lengths, and everything's kept digital for as long as possible. Running the test file and watching the CRT monitor, the sync looked spot-on to me. As I previously mentioned, the Dreamcolor display showed the expected video delay (compared with the CRT) of a frame or two.

      I did notice that I'm running quite an old version of the Blackmagic Desktop Video software (10.5.4). I never did upgrade, as I remember there were problems when 10.6 was released, and I'm a firm believer that if everything works, leave well alone!
  19. I've been to a couple of HDR demos run jointly by Filmlight and Sony at Filmlight's London office, where they demonstrated SDR and HDR side-by-side on standard monitors and the Sony X300/X550. The prevailing wisdom I took from these sessions was that the intention is not to make everything super-bright, but to keep the majority of the scene at light levels similar to the past while allowing the specular highlights to exploit the display's increased brightness capabilities. Similarly, the increased colour gamut means more of the colours we see in real life, but again, that doesn't mean we should be 'screaming' colour at the audience.

      Some of the reasons for this are technical limitations. Current display technology will only allow high brightness levels over a relatively small portion of the picture, to stop the panels overheating. Sony have built in protection to clamp the brightness of the pixels if the circuitry detects conditions where this might happen - so no large swathes of sky at 800 nits! Another issue that might affect things is colour fatigue. It's not something that's really been discussed, but with all these higher-brightness displays coming along, what is that going to do to a colourist's perception of hue and saturation? This was the reason some grading suites used to set their monitors to 80 nits.
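      For context on what 800 nits means in signal terms, here's the PQ (SMPTE ST 2084) inverse EOTF with the constants straight from the standard - a quick sketch showing where various luminance levels sit in the 0-1 signal range:

      ```python
      # SMPTE ST 2084 (PQ): absolute luminance in nits -> signal level 0..1
      M1 = 2610 / 16384
      M2 = 2523 / 4096 * 128
      C1 = 3424 / 4096
      C2 = 2413 / 4096 * 32
      C3 = 2392 / 4096 * 32

      def pq_encode(nits):
          """Map absolute luminance (0-10000 cd/m^2) to a PQ signal value."""
          y = nits / 10000.0
          return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

      for nits in (0.1, 100, 800, 1000, 4000, 10000):
          print(f"{nits:>7} nits -> PQ signal {pq_encode(nits):.3f}")
      ```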
  20. Wearing sunglasses will make you look cool during a grading session, but won't impress the client. :-) Seriously though, I did once try grading with a pair of sunglasses, just to see what happens. Apart from the monitor being a bit dim for my liking, the results weren't as horrendous as I'd thought. It was mainly all just tinted away from neutral white, and some hues were a bit too strong - you could almost call it a look!

      I did listen to the podcast and would agree that the cardiovascular system is one of the most important mechanisms for maintaining the health of all our organs. It's the only way we receive the nutrients and oxygen required to nourish our eyes. One thing that wasn't really covered was how our eyes' rods and cones are affected in older age. The lens's yellowing effect on the blue end of the spectrum was discussed, but little else about failing colour acuity was mentioned. Another thing I felt was skipped over was the eye muscles themselves. Of course, any muscle strain in the body is to be avoided, but she didn't answer the question about exercising the eye muscles. Every athlete knows the importance of building and maintaining muscle strength through exercise, followed by eating lots of carbs and protein. Can the same philosophy not be applied to the eye muscles, or is simply following the 20/20/20 rule the best we can achieve?
  21. OK, so this is when you're watching using QuickTime Player or VLC, for example. I'm trying to remember if the Blackmagic Desktop Video software has any settings that would control the video or audio timing to your Decklink - I don't remember seeing any. I have a Decklink card in one of my workstations and will run a test to see if I have the same problem. Are you using the latest version of this software? Looking at the Blackmagic site, I see that it's up to version 10.8. I know that VLC has the ability to alter the timing of the audio signal - it's under Tools/Track Synchronization - but this isn't going to help you with YouTube videos.
  22. I'm not sure quite what you mean... When you say "playback on Avid and Resolve", do you mean when you're looking at the viewers inside the software on the computer monitor(s)? And when you say "desktop audio is out of sync", are you referring to the output of your Decklink to an external grading/client monitor and speakers?

      If so, then the probable reason is the external monitor. Virtually all modern LCD, LED and OLED monitors have a delay: because the picture is processed through frame buffers, there's typically a one to two frame delay before it's displayed. CRT displays, being essentially analogue in nature, didn't have this problem. So in a typical setup with an external display and external loudspeakers, you would need to introduce a delay box in the audio signal chain. Manufacturers of domestic LCD/LED televisions build this into their products so that the sound coming from the TV speakers is in sync.

      What's slightly confusing about the title of your post is that you say you're having an audio lag. In the situation I've described, it would be the picture that lagged behind the audio.
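      If you do end up putting a delay box in the audio chain, the delay needed is just the display's latency in frames converted to milliseconds - a trivial calculation:

      ```python
      # Convert a display's video delay (in frames) into the audio delay
      # you'd dial into a delay box to bring sound back into sync.
      def audio_delay_ms(frames_late, fps):
          return frames_late / fps * 1000.0

      for fps in (23.976, 25, 29.97, 50):
          for frames in (1, 2):
              print(f"{frames} frame(s) at {fps} fps -> "
                    f"{audio_delay_ms(frames, fps):.1f} ms")
      ```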
  23. Intermittent faults like this can be difficult to pin down. The only time I've seen something like this was when I was editing in Avid Symphony. One of my linked/AMA'd clips would occasionally not display and cause the software to hang for around 20 seconds, after which I'd get an error message. It turned out that there was a corrupt frame in the middle of the clip. If you loaded the clip from the start, all was fine. Once I'd played across the corrupt frame, my problems started.
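      One cheap way to hunt for a corrupt frame like this is to let ffmpeg decode the whole clip to a null output and log any decoder errors. A sketch, assuming ffmpeg is on your PATH - the clip name is hypothetical:

      ```python
      import subprocess

      CLIP = "suspect_clip.mxf"  # hypothetical file name

      # Decode everything, discard the output, and surface decoder errors.
      # (ffmpeg's -xerror flag would abort on the first error; here we
      # want the full list of complaints instead.)
      result = subprocess.run(
          ["ffmpeg", "-v", "error", "-i", CLIP, "-f", "null", "-"],
          capture_output=True, text=True)

      if result.stderr.strip():
          print("Decoder reported problems:")
          print(result.stderr)
      else:
          print("Clip decoded cleanly - the fault may lie elsewhere.")
      ```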
  24. Playing with colour spaces will obviously alter the look of your material, but you still have to select an output colour space to match the intended destination (e.g. P3, Rec. 709). The point of ACES is to make sure that your grade will look correct wherever it goes.