Bruno Mansi

Posts posted by Bruno Mansi

  1. 2 hours ago, Filip Zamorsky said:

    I would prefer the old good tower too. But what can we do? I am really sorry, but I can't work with Windows. 

    "Thunderbolt 1 is up to 85-percent performance and Thunderbolt 2 is up to 97-percent performance." (GPU performance - not bad, I would say...)

    This is a screenshot from their FAQ page: [Screen Shot 2016-11-04 at 07.25.25.png]

    I don't really understand Bizon's answers to Dimitril's question.

    If you're talking about pure Gigaflops, graphics cards are over ten times faster than a modern CPU. If they're saying the processor must have similar performance to the graphics card, are they implying that your GPU will only run at the performance of your CPU? The whole point of offloading processing to the GPU is that it has thousands of cores all running in parallel, which is great for rendering complex graphics. A great visual example of this is at... https://www.youtube.com/watch?v=-P28LKWTzrI

    In a typical example, a CPU will instruct the GPU to process a particular rendering task and leave it to do its work. The GPU would then access the main memory using DMA (direct memory access) to get the data, perform the computations, and then tell the CPU it was finished with the task. It can only do this as fast as the weakest link, so the PCIe slot and DMA need to be able to keep up with the demands of the parallel processing that goes on in a modern GPU - i.e. around 5000 Gflops.

     

    Now, the question is this... when this transfer of data is happening over Thunderbolt, how involved is the CPU? Ideally, it would hand over the task to something like a Thunderbolt controller chip, which would handle the access to memory directly. In this 'best case' scenario, the interface can only transfer data at a maximum of 1/3 that of a PCIe bus. However, if the CPU is more directly involved, you are only going to be able to transfer data as fast as the CPU can run (around 500 Gflops). The 'Processor performance must be comparable with graphics card' answer seems to imply that the CPU is more directly involved. This could be the case if there was additional driver software required to control the Bizon box.
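To put some rough numbers on the weakest-link argument, here's a quick sketch (my own ballpark figures, using nominal link rates - PCIe 3.0 x16 at roughly 126 Gbit/s usable and Thunderbolt 3 at its headline 40 Gbit/s, which in practice is reduced further by display and protocol overhead):

```python
# Rough "weakest link" model of CPU -> GPU data transfer.
# Figures are nominal link rates, not measured real-world throughput.

LINKS_GBIT_S = {
    "pcie3_x16": 126.0,    # ~15.75 GB/s usable
    "thunderbolt3": 40.0,  # headline rate; usable PCIe data is lower
}

def transfer_time_s(payload_gbit: float, *links: str) -> float:
    """Time to move a payload through a chain of links,
    limited by the slowest one."""
    bottleneck = min(LINKS_GBIT_S[link] for link in links)
    return payload_gbit / bottleneck

# Moving 8 GB (64 Gbit) of texture/frame data:
direct = transfer_time_s(64, "pcie3_x16")
via_tb = transfer_time_s(64, "pcie3_x16", "thunderbolt3")
print(f"direct PCIe: {direct:.2f} s, via Thunderbolt 3: {via_tb:.2f} s")
```

Same payload, same GPU - the chain is simply throttled to the slowest link, which is where the roughly 3:1 gap comes from.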

    The second answer doesn't make much sense to me. 97 percent performance of what - the CPU?

    Finally, it's worth remembering that if you also intend to use your Thunderbolt 3 port to attach your media drive, this will also add to the bandwidth requirements, so now your disk and graphics data are all going up & down this one cable. And don't even think about trying to piggy-back a second monitor!

    As I said, get hold of a demo unit and try-before-you-buy!

     

    • Like 2
  2. 1 hour ago, Tom Evans said:

    Bruno, you are a lexicon of technical knowledge. Great to have you here!

    Hi Tom, thanks for your kind words.

    It's just that I hate seeing people getting ripped off.

    Your comment about Apple's limitations is spot-on. Are we the only people who see the questionable decisions that tech companies (like Apple) are making when they sacrifice functionality for style? I have visions of users with new MacBooks hooked up to external ports, external PCIe boxes, external hard drives etc., telling me how wonderful the new Mac laptops are. Sure they are, but you've had to add over $1,000 of extras! How is this still a laptop - i.e. something you can put on your lap? Luckily, there are companies (like HP) who still understand how to design a laptop for professionals.

    • Like 1
  3. On 02/11/2016 at 9:13 PM, Filip Zamorsky said:

    I would seriously test this box out before parting with $650, which seems a ridiculous price to pay for a case with a PSU, a PCIE slot and a bit of electronics.

    Did you read the comments below the review? Some very good points are made about how well this would work with GPU-hungry tasks. The Thunderbolt 3 interface has a theoretical maximum throughput of only a third of what a modern PCIe (16-lane) slot can deliver. This may make a big difference if you're using a graphics card that can really push the data through.

    • Like 1
  4. The info on Blackmagic's web site for version 12.5 states...'RED Camera R3D files including +5K, monochrome and HDRx'.

    There's no mention of any additional support for R3D files in any of the point releases (currently 12.5.3), so I'm not sure upgrading will fix the problem.

    Have you tried bringing these files into something like Premiere or Avid?

    You could also try a test file from Red's web site at... http://www.red.com/sample-r3d-files

    This would identify if any of the named file formats are a problem with your setup.

     

    • Like 4
  5. On 26/10/2016 at 7:24 PM, Tom Evans said:

    What do you think about this new slick Creative workhorse? 

    Looks nice, but I'm always worried about how this sort of device will perform. Looking at the specs, you can only have a maximum of 4GB GPU memory and 32GB main memory. Also a maximum of 2TB hybrid drive, which isn't a lot these days. Looks like no Thunderbolt or USB 3.1, which I think is a big mistake. If you go with these specs and an i7 processor the price is over $4000!

    For that price you can buy a decent HP workstation and a Wacom Cintiq 24HD display tablet, which will do the same thing. It won't look as pretty, but you'll have a more powerful system with lots of upgrade potential.

    • Like 5
  6. If you're considering online courses, I'd recommend FXPHD. For $79 a month, you can access all their courses and get access to VPN versions of some of the best software out there. They are fairly VFX biased, but they do have production courses that deal with things like lighting and camera techniques, as well as their 'background fundamentals' series. They also have a couple of courses specifically on colour grading.

    • Like 4
  7. 4 hours ago, Margus Voll said:

    Sony monitors calibration is somewhat limited and does not make it super exiting as we can not expect ultimate precision.

    I assume you're talking about certain issues that have been going around about the factory calibration accuracy of the BVM-X300 and the inability to allow users to properly re-calibrate the displays. For those who want more information about this, head over to...

    http://www.liftgammagain.com/forum/index.php?threads/sony-bvm-x300-factory-calibration.7195/

    However, the original post wasn't really about the Sony display, but more on the quality of the new LG OLED panels, which would seem to rival some of the best reference monitors out there, at a much lower cost.

    • Like 3
  8. 1 minute ago, Thomas Singh said:

    Then you need to export files. Or am I missing something?

    No, you don't, that's the beauty of the Baselight method.

    Assuming that both the full Baselight and the Avid workstation have access to the original rushes (such as in a shared storage system), you send your cut from Avid to Baselight via AAF in the normal way. Once you've graded on the full system, you then export an AAF of the metadata - that is, all the colour correction decisions that were carried out during the grade. This could (of course) be sent to another full Baselight system to replicate the grade, but the AAF can also be used by the Baselight Editions AVX plugin installed in a separate Avid workstation to add the exact same grade to your Avid timeline. No need to export any media to Avid or do any sort of media re-linking. You simply add the Editions plugin on a spare track above your media, point it to the AAF file and within seconds you have a fully graded timeline!

    Filmlight provide the AVX plugin for free, and it can be installed on as many Avid workstations as you wish. They also have a paid version of the plugin which allows you to do Baselight grading within Avid, or even do a preliminary grade that can be sent to (and read by) the full Baselight. This plugin also lets you alter any of the grades you received from the full Baselight. If the director and colourist do a second grade, then you simply point the AVX plugin to the new version. If they just want to alter one or two shots, they can send you a BLG file - one for each shot - which you can use to update those shots in the Avid timeline.

    It's a very neat system.

    • Like 4
  9. 12 minutes ago, Thomas Singh said:

    as long as your color corrector can export MXF, you can simply set the destination to the media folder and Avid will read the files natively.

    The beauty of Baselight is that if you have the Editions AVX plugin installed in your Avid, there's no need to do any media exporting and re-linking from the full Baselight system - just a small AAF metadata file with all the colour correction information. Additionally, if you need to make a change to a colour correction, you can do that from the plugin, without having to go back to the full Baselight.

    • Like 2
  10. On 14/10/2016 at 0:16 PM, Alex Prohorushkin said:

    When I talk to colleagues in the majority believe that the priority and main advantage is the price

    While price is an important factor for small facilities or independent film-makers, I don't think the cost of the software is going to be of much concern to a professional post house. Given that the cost of installing a top-end grading suite (and the associated hardware) is going to reach a six-figure sum, it's issues around workflow and support that are going to affect the decision about which software they choose.

    I've talked to a few colourists in the UK who like the Davinci software, but have opted for Baselight. When asked why, they say that the support from Filmlight is much better than what Blackmagic offer, and that the Avid-Baselight-Avid workflow is vastly superior. Given that the majority of high-end productions are still cut on Avid, it makes perfect sense for them to invest in Baselight products.

    On 14/10/2016 at 3:35 AM, Ildus gabidullin said:

    But I think the nodal system is better for compositing, for color grading,  layer system is much better

    I don't think one system is 'better' than another - they both have their strengths and weaknesses. In the world of compositing, it's true that for complex work, nodal systems like Nuke seem to be better suited, but for simpler work, you can often knock something out in After Effects much quicker than in Nuke. In the world of colour grading, colourists find the system that suits their style/workflow best. Whether you use nodal or layer-based software will not affect the speed or quality of your end product - it's the toolset and the craft of the operator that will determine the final result.

    • Like 4
  11. For those of you who dream of owning a Sony BVM X300, the new LG OLED TVs may be the alternative.

    The 2016 models are making quite a stir, with claims that they are matching studio grade, 4K colour critical monitors. They don't have SDI inputs of course, but reviewers are saying that the panels are visually indistinguishable from perfect in areas such as colour and luminance accuracy.

    Other major bullet points are...

    • Perfect black levels (0 cd/m2)
    • The highest peak brightness for an OLED TV (630 to 730 cd/m² for HDR)
    • The smallest brightness variation with viewing angle up through 45 degrees
    • 98% DCI-P3 gamut at 4K
    • Excellent average screen reflections (1.1%)

    There's an in-depth article for those interested at...

    http://www.displaymate.com/OLED_TV2016_ShootOut_1.htm

    • Like 4
  12. The ACES workflow in Resolve is pretty straightforward - as it should be. My reasons for using it in preference to other colour management systems would be in situations where I'm sending out shots for (say) VFX work. My understanding is that if I gave a Nuke compositor a graded shot, and he/she was using ACES within Nuke, I would get back a comped shot that would slot back in without any problem.

    Now, given that VFX work may involve accessing original log camera plates, 3D animation, stills and various effect loops (fire, smoke, particle effects etc.), it's important that they see the graded shot correctly so that they can match the other elements to it. Nuke uses the OpenColorIO (OCIO) system, which is compatible with ACES, so you can set up the OCIO configuration to emulate ACES.

    All good so far, but did you know that there are some different ACES standards that Nuke compositors may use as their default settings? Have you heard of ACES 1.01, ACES 2065-1, ACEScc or ACEScg? I'm in the process of trying to find out more about some of these various flavours of the ACES standard, however one quote (on Nukepedia) got me a little worried...

    "ACES2605-1 is very large gamut encoding which makes it very hard to work with. ACEScg primaries are much smaller making it an ideal working space for compositing and working with cg images."

    I'm not sure what this means, but it implies that VFX artists may be working in a reduced colourspace before the final ACES ODT back to what I originally sent them. I'm no colour scientist, but this doesn't sound good for my grade.
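To make the gamut point a bit more concrete, here's a quick sketch using the AP0-to-AP1 conversion matrix from the ACES reference transforms (matrix values as published in the reference CTL; the demonstration is my own). A fully saturated ACES2065-1 colour simply has no in-gamut representation in the smaller ACEScg space:

```python
# ACES2065-1 (AP0 primaries) -> ACEScg (AP1 primaries), linear RGB.
# 3x3 matrix as published in the ACES reference transforms.
AP0_TO_AP1 = [
    [ 1.4514393161, -0.2365107469, -0.2149285693],
    [-0.0765537734,  1.1762296998, -0.0996759264],
    [ 0.0083161484, -0.0060324498,  0.9977163014],
]

def ap0_to_ap1(rgb):
    """Convert a linear AP0 triple to AP1."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in AP0_TO_AP1]

# A pure AP0 'red' falls outside the AP1 gamut:
r, g, b = ap0_to_ap1([1.0, 0.0, 0.0])
print(r, g, b)  # the green channel goes negative, i.e. out of ACEScg gamut
```

Neutral greys survive the conversion unchanged (each matrix row sums to one), which is presumably why compositors are happy to work in AP1 - but anything near the AP0 gamut boundary has to be clipped or remapped, which is where my worry about the grade comes in.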

    I'd be interested to hear from anyone on these forums who has any experience of this sort of workflow and can shed any light on the subject.

     

    • Like 4
  13. OK, I had a go at looking at the output of my Decklink card.

    To get a file to play through the card from the desktop I had to use Blackmagic's Media Express software.

    I used a test file from http://www.sync-one2.co.uk/support/test-files/

    This is a series of 1 frame bleep/flashes. I fed the HDSDI output from the card through a Leader waveform monitor to both a CRT monitor and a Dreamcolor LED display. The audio was de-embedded from the SDI by the Leader and fed as AES digital audio to a TSL monitoring unit & speakers. This is done to make sure there's as little difference as possible in the audio & video cable lengths, and everything's kept digital for as long as possible.
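For anyone who wants to roll their own test material, here's a minimal sketch of how the audio half of such a bleep/flash file could be generated (my own illustration, assuming 25 fps, 48 kHz, 16-bit mono - the sync-one2 files linked above are professionally produced, and this is not how they are made):

```python
# Generate one frame's worth of 1 kHz tone at a chosen frame,
# silence elsewhere -- the audio half of a bleep/flash sync test.
import math
import struct
import wave

FPS = 25
RATE = 48000
SAMPLES_PER_FRAME = RATE // FPS  # 1920 samples = 1 frame at 25 fps

def bleep_track(total_frames: int, bleep_frame: int, freq: float = 1000.0):
    """Return float samples: a single-frame tone burst at bleep_frame."""
    samples = []
    for n in range(total_frames * SAMPLES_PER_FRAME):
        if n // SAMPLES_PER_FRAME == bleep_frame:
            samples.append(0.8 * math.sin(2 * math.pi * freq * n / RATE))
        else:
            samples.append(0.0)
    return samples

def write_wav(path: str, samples):
    """Write mono 16-bit PCM."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(RATE)
        w.writeframes(b"".join(
            struct.pack("<h", int(s * 32767)) for s in samples))

write_wav("sync_bleep.wav", bleep_track(total_frames=50, bleep_frame=25))
```

Rendering a matching one-frame white flash into the video at the same frame number then gives you a simple A/V offset test you can feed through the SDI chain.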

    Running the test file and watching the CRT monitor, the sync looked spot-on to me. As I previously mentioned, the Dreamcolor display showed the expected video delay (compared with the CRT) of a frame or two.

    I did notice that I'm running quite an old version of the Blackmagic Desktop video software (10.5.4). Never did upgrade as I remember there were problems when 10.6 was released, and I'm a firm believer that if everything works, leave well alone!

  14. On 29/09/2016 at 1:12 PM, Andy Minuth said:

    Tips for the health of your eyes:

    Drink Plenty of Water
    Good Diet (same advice as you'd get from a cardiologist)
    Wear Sunglasses
    Don't Smoke
    30 Minutes Physical Exercise, Minimum 3 times a week

    Wearing sunglasses will make you look cool during a grading session, but won't impress the client. :-)

    Seriously though, I did once try grading with a pair of sunglasses, just to see what happens. Apart from the monitor being a bit dim for my liking, the results weren't as horrendous as I'd thought. It was mainly all just tinted away from neutral white and some hues were a bit too strong - you could almost call it a look!

    I did listen to the podcast and would agree that the cardiovascular system is one of the most important mechanisms to maintaining the health of all our organs. It's the only way we receive the nutrients and oxygen required to nourish our eyes.

    One thing that wasn't really covered was how our eyes' rods and cones are affected in older age. The lens yellowing effect on the blue end of the spectrum was discussed, but little else about failing colour acuity was mentioned.

    One other thing that I felt was skipped over was the eye muscles themselves. Of course, any muscle strain in the body is to be avoided, but she didn't answer the question about exercising the eye muscles. Every athlete knows the importance of building and maintaining muscle strength by exercise, followed by eating lots of carbs and protein. Can the same philosophy not be applied to the eye muscles, or is simply following the 20/20/20 rule the best we can achieve?

     

    • Like 5
  15. 12 hours ago, Nicolas Hanson said:

    #Output on the Decklink when looking at YouTube or playing a video on desktop. 

    OK, so when you're watching using QuickTime Player or VLC, for example.

    I'm trying to remember if the Blackmagic Desktop Video software has any settings that would control the video or audio timing to your Decklink - I don't remember seeing any. I have a Decklink card in one of my workstations and will try a test to see if I have the same problem. Are you using the latest version of this software? Looking at the Blackmagic site, I see that it's up to version 10.8. I know that VLC has the ability to alter the timing of the audio signal - it's under Tools/Track Synchronization - but this isn't going to help you with YouTube videos.

    • Like 1
  16. I'm not sure quite what you mean...

    When you say "playback on Avid and Resolve" do you mean when you're looking at the viewers inside the software with the computer monitor(s)?

    And when you say " desktop audio is out of sync" are you referring to the output of your Decklink to an external grading/client monitor and speakers?

    If the above is true, then the probable reason is the external monitor. Virtually all modern LCD, LED and OLED monitors have a delay: because the picture is processed through frame buffers, there's typically a 1 to 2 frame delay in the picture being displayed. CRT displays (being essentially analogue in nature) didn't have this problem. So in a typical setup with an external display and external loudspeakers, you would need to introduce a delay box in the audio signal chain. Manufacturers of domestic LCD/LED televisions build this into their products so that the sound coming from the TV speakers is in sync.
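As a rule of thumb, the figure you'd dial into that audio delay box is just the display's frame lag converted to milliseconds:

```python
# Audio delay implied by a display's frame-buffer latency.
def audio_delay_ms(frames_delay: int, fps: float) -> float:
    """Milliseconds of audio delay needed to match a display
    that lags by frames_delay frames at a given frame rate."""
    return frames_delay * 1000.0 / fps

# A typical 2-frame lag:
print(audio_delay_ms(2, 25))     # 80.0 ms at 25 fps
print(audio_delay_ms(2, 29.97))  # roughly 67 ms at 29.97 fps
```

So a couple of frames of display lag already needs tens of milliseconds of audio delay - well above the threshold where lip-sync errors become noticeable.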

    What's slightly confusing about the title of your post is that you say you're having an audio lag. In the situation I've described, it would be the picture that lagged behind the audio.

    • Like 1
  17. Intermittent faults like this can be difficult to pin down. The only time I've seen something like this was when I was editing in Avid Symphony. One of my linked/AMA'd clips would occasionally not display and cause the software to hang for around 20 seconds, after which I'd get an error message. 
    It turned out that there was a corrupt frame in the middle of the clip. If you loaded the clip from the start, all was fine. Once I'd played across the corrupt frame, my problems started.

    A photochrome print was created by imprinting up to 15 tinted litho stones onto paper. The photochrome prints I've seen often have brownish tints, which sort of reminded me of my youth, when I mixed too many colours together from my paint tin!
     Maybe printing that many colours on top of each other is the reason for the muddy tints in the photochrome process.

    Anyway, I've heard that using 3D LUT Creator does a good job creating photochrome-style effects without introducing problems such as colour banding.

    • Like 5