Orash Rahnema


Posts posted by Orash Rahnema

  1. @Joseph Owens 

    Yes, you are totally right, sorry, I overlooked that while writing.

    Although I have to say that it has happened to me many times that, even with the supposedly correct IDT, I had trouble with some footage; most of the time it was matte paintings.

  2. Just like Margus, I use ACES on quite a lot of my jobs, for the exact same reasons he describes.

    Once rendered, it's the same as every other workflow.

    Doing advertising most of the time, I work in ACES, render to ProRes or DPX, and then online and comp everything with no problem.

  3. If it's a VFX job, it's better if everyone uses the same colour science, to avoid problems, artefacts, or anything else down the pipe.

    For example: a compositor comps Alexa footage with some Photoshop elements, stock elements, or 3D elements under a Rec709 display LUT. In Nuke it looks good, so no problem.

    Then he exports a flat comp for you to grade.

    You start working in ACES using an Alexa IDT; the Alexa footage works fine, but the comped elements go crazy, because they get transformed with the Alexa IDT as well.

    So you end up going crazy doing lots of keys or masks to get things back where they belong.

    If you both work in ACES, the compositor sees the same thing you see (the beauty of ACES), he can take care of the problems in comp, and he knows exactly how you are going to see his comp as well.

    Or you both work in YRGB, and the compositor knows roughly how you are going to see the result at the end; but it's a safe route that everyone knows.

    Hope that makes sense.
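    To put a rough number on "goes crazy", here is a minimal numpy sketch of what happens when a display-referred graphic element gets pushed through a camera log decode. The LogC (v3, EI 800) constants are quoted from memory, so treat this as illustrative only and double-check them against ARRI's white paper.

    import numpy as np

    # ARRI LogC (v3, EI 800) decode constants -- quoted from memory,
    # verify against ARRI's documentation before relying on them.
    A, B, C, D = 5.555556, 0.052272, 0.247190, 0.385537

    def logc_to_linear(t):
        # Decode a LogC code value to scene-linear (above the linear toe).
        return (10.0 ** ((t - D) / C) - B) / A

    # A graphic element comped in display space: a near-white pixel at 0.95.
    print(logc_to_linear(0.95))  # ~35 in scene-linear, i.e. roughly 5 stops too hot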

     

     

  4. You can output to whatever you like, and the ODT will determine the look of your output.

    So if your ODT is set to Rec709, then your files, whatever they are (TIFF, DPX, EXR, MOV, MXF...), will be translated into Rec709 plus your grade (if you grade it).

  5. I use a Color Transform OFX node at the timeline level, keeping the colorspace as Rec709 and just changing the gamma.

    I know it's not "right", but I figured: DaVinci still transforms the linear ACES data into a working gamma space, and that OFX works with the same math, so it can't be too much of a problem.

    In the end it might not be a perfect gamma transform, but it has done the trick so far.

     

    In DaVinci it's possible to add a DCTL, which is a custom CTL.

    But that works before the node corrections, I believe.

    I expect the ODT to work after.

    Wouldn't changing the gamma before that create problems for what comes after?
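    For what it's worth, here is a minimal sketch of what I understand that gamma change to be doing, assuming pure power functions (no sRGB toe, no ACES tone mapping) -- a sketch, not what the OFX node actually computes internally:

    import numpy as np

    def regamma(v, src=2.4, dst=2.2):
        # Linearize with the source gamma, re-encode with the target gamma.
        # For pure power functions this is equivalent to v ** (src / dst).
        v = np.clip(v, 0.0, 1.0)
        return (v ** src) ** (1.0 / dst)

    code = np.array([0.05, 0.18, 0.5, 0.9])
    print(regamma(code))  # shadows drop a little, highlights barely move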

     

     

    • Like 1
  6. Resolve 12.5.3

    ACEScct 1.0.2

    I have just done a quick test to show what I mean:

    Same scene: Rec709, Rec709 with a gamma transform from 2.4 to 2.2, and the sRGB ODT.

    I have done it at normal exposure and underexposed.
    I noticed that in a normally exposed scene the transform is acceptable.
    The real problems appear in the blacks, so in an underexposed scene I don't think it really works.

    As you can see, the sRGB one is a lot more contrasty, with the blacks completely crushed.

    I would not expect the 2.2 and sRGB images to look the same, as I know ACES takes into account how the colours are perceived and not just the math behind them.
    But still, I would not expect something like this, and I don't think it's right.
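    Part of the shadow difference may come from the transfer functions themselves (the ACES tone mapping differences come on top of this). A minimal numpy sketch comparing a pure 2.2 power encode with the piecewise sRGB encode near black:

    import numpy as np

    def encode_gamma22(L):
        # Pure power-law encode, exponent 1/2.2.
        return np.clip(L, 0.0, 1.0) ** (1.0 / 2.2)

    def encode_srgb(L):
        # Piecewise sRGB encode (IEC 61966-2-1): linear segment near black.
        L = np.clip(L, 0.0, 1.0)
        return np.where(L <= 0.0031308,
                        12.92 * L,
                        1.055 * L ** (1.0 / 2.4) - 0.055)

    shadows = np.array([0.0005, 0.001, 0.005, 0.02])
    print(encode_gamma22(shadows))  # 0.001 -> ~0.043
    print(encode_srgb(shadows))     # 0.001 -> ~0.013: noticeably darker blacks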

     

    REC709

    ODT_RC709.jpg

    REC709 GAMMA 2.2

    ODT_RC709_G22.jpg

    sRGB

    ODT_sRGB.jpg

    REC709

    ODT_RC709_DARK.jpg

    REC709 GAMMA 2.2

    ODT_RC709_G22_DARK.jpg

    sRGB

    ODT_sRGB_DARK.jpg

     

    • Like 2
  7. Hi all!

    I would say that by now half of my jobs are ACES; not that anyone asked me to do it, but I quite like it.

    Many times I need to output for the web, so once the job is done (graded on an FSI at gamma 2.4, using Resolve) I have tried changing the ODT from Rec709 to sRGB.

    Unfortunately it has never worked as expected: the gamma change is very strong and the signal gets crushed down really hard.

    So my workflow so far has been not to change the ODT and simply to do a gamma transform from 2.4 to 2.2 at the timeline level.

    Has anyone experienced this problem as well?

    Am I doing something wrong, or have I misunderstood the principle of the ODT?

    Do other systems, like Baselight, behave differently?

    cheers,

    Orash

    • Like 2
  8. Be careful: it has happened to me quite a few times that when the proxy name or format is shown, it often keeps other metadata as well, like resolution and such.

    You might want to double-check it.

    Anyway, something I often do when it takes too long or gives problems, like importing proxies or proxy metadata, is to load all the raw footage and untick "import files".

    It goes a lot faster and usually doesn't give any problems.

     

    • Like 1
  9. Hi all,

    I have been asked to possibly do a remote grading session online (in DaVinci, I believe).

    I was wondering if anyone has any experience with it, and if there are any particular requirements.

    Does DaVinci's built-in remote grading behave well?

    Would I need any kind of special hardware, and how fast must the internet connection be on my end and on the other side?

    I'm wondering if I could do this easily from my "home office", or if I have to find a grading suite that has some special setup.

    thanks!

    Orash

     

  10. 2 hours ago, Thomas Singh said:

    The color spaces available in Nuke are many, so if you want to be 100% sure you need to find a way to extract the data, if you can't have them tell you what was applied. nukecolorspace.png

    Exactly.

    EXR defaults to linear, and most VFX pipelines read and write linear, partly to avoid confusion over color space.

    Reading Abby's description of the image, I would assume this is the case.

     

    But then again, it could have been rendered with DaVinci or something else, and be an image with some weird color transform applied.

    If it doesn't work with a lin2log or lin2rec709 transform, then the best solution is to ask whoever provided the files.

     

    • Like 1
  11. Hi Abby, 

    Where did you get the EXR sequence from?

    If it's from a VFX department, I would expect it to be in linear light, rendered out from Nuke. Try to normalize it with a linear-to-LogC or linear-to-Rec709 LUT and see if it works; at least you'll know if that's the case, and then you can take the best approach to it. There's a quick sketch of the Rec709 curve below.

    This is my first thought.
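    For reference, the standard Rec709 camera curve (the OETF from ITU-R BT.709) is simple enough to sketch in numpy; if this roughly normalizes the EXR, it was probably scene-linear:

    import numpy as np

    def linear_to_rec709(L):
        # BT.709 OETF: linear segment below 0.018, power segment above.
        L = np.clip(L, 0.0, 1.0)
        return np.where(L < 0.018, 4.5 * L, 1.099 * L ** 0.45 - 0.099)

    print(linear_to_rec709(np.array([0.0, 0.01, 0.18, 1.0])))
    # 0.18 (mid grey) lands around 0.41 -- a "normal" looking value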

    • Like 1
  12. Hi Margus!

    I was hoping you would come to the rescue :)

    I will definitely try 1000 and 4000 patches.

    In the meantime I wanted to ask: could this be an issue with data vs video levels? Maybe I have set up my DaVinci project wrong.

    I have done a pre-calibration with ColorNavigator NX, but the only things I set there were colorspace, gamma and max luminance (I set it to 110 nits), and if I remember correctly I have set my project to data levels.

    The other thing is in the options: I saw in your tutorial that you have some kind of matrix set under the probe section, and somewhere I read that in order for this to work a backlight-level matrix has to be set.

    I didn't set any of those, as I have no idea where to find them.

    Last thing: I have set a gamma of 2.2. The manual says a higher gamma is needed to avoid clipping; does that mean that if I want a 2.2 gamma I have to set it to 2.1 or to 2.3? In my mind a higher gamma means a lower value, but maybe it's the number itself that is meant.

    Anyway, I will try the patch thing, and if you have any answers I'd appreciate them! :)

    thanks,

    Orash

     

     

     

    • Like 2
  13. Hi everyone!
    I need your help, as I am sure I am doing many things wrong.

    I am trying to figure out how to calibrate my Eizo CG246 with LightSpace.

    I'm trying to save some money, so I want to push this display as long as I can before changing it.

    I have followed the LightSpace guides (as well as I could understand them), the Mixing Light LightSpace guide, and @Margus Voll's tutorial:
    http://www.liftgammagain.com/forum/...on-workflow-with-lightspace-and-resolve.5001/

    To make a long story short, I did a quick profile and the values looked very strange.
    So I tried calibrating using Margus's tutorial with the 9000+ patches; the dialog box said I hit 98% of the target, then I uploaded the LUT and re-profiled.

    This is the result I got, and to me it doesn't look too good (as far as I understood from the various guides and tutorials):

    https://www.dropbox.com/s/dpmsretg2rzqupi/EIZO Profile 28-05-2017-Post.pdf?dl=0

    What I don't get is, first, what have I done wrong? And then: I'm currently using this display on many jobs and I haven't had problems with it; if it were so far off, I believe I would have noticed.

    Can anyone help me out with it?

    cheers,
    Orash

    • Like 1
  14. I was also looking at the TVLogic LEM 205A:

    http://www.tvlogic.tv/Monitors/M_Spec.asp?sidx=52

    In the last few months I worked quite a bit with both the FSI DM250 and the TVLogic.

    And the TVLogic is definitely not bad at all. The price range is similar, if not a little cheaper than the FSI DM250.

     

    As a personal monitor for my home color room and freelance jobs, I was thinking of getting the FSI DM240; it's LED, not OLED.

    Does anyone have any experience with it? Would you suggest it?

    • Like 2
  15. 25 minutes ago, Bruno Mansi said:

    Wow, a bit harsh!

    I've played a little with Fusion and to be honest, I don't find it any slower than Nuke. Both suffer from a lack of any real-time playback, but given what these programs are trying to achieve, they're always going to require a lot of computing horsepower.

    Hi Bruno!

    I'm not talking about real-time playback; I would not expect that. Compositing that is not done online or in front of a client doesn't really require it; I mean, not even Flame and Smoke manage it once you start painting and doing actual comp work.

    I'm talking about how the software works: Nuke's paint tool is known to be slow, but used the right way, optimising the number of points on the splines, it handles a huge amount of paint strokes without real issues.

    Fusion, on the other hand, is painfully slow, and unfortunately not just at paint but at lots of different things.

    32 minutes ago, Bruno Mansi said:

    I think your comment highlights the problem companies like Blackmagic and Adobe have when trying to 'please all of the people all of the time'...

    I understand that; then again, I'm not saying Resolve has to include Fusion. I think the direct link is the best approach, and I think they should have done the same with Fairlight, that's it.

    What bothers me is that there are issues with the color tools that have not been resolved, and at this point I don't think they will be. I would much rather see those things fixed, or get tools that are useful to the colorist, than have the watercolor effect or a full audio suite at my disposal.

     

    • Like 1
  16. 10 hours ago, Abby Bader said:

    It looks like they're heading in that direction, and they will probably try to integrate Fusion or parts of its functionality. I guess speed and real-time issues are the reason they haven't done so already.

    Unfortunately, Fusion is a complete disaster; it would have to be rewritten top to bottom to make it fast and usable.

    Still, having some compositing and paint abilities in DVR would be a nice touch, and as far as I'm concerned a smarter move than the audio stuff.

  17. Oh... next time I will get someone to read my posts first; I think we would all save a lot of trouble! :)

    There is no difference; it's a word I got used to at work, and it stuck with me in those explanations, I believe.

    Both are some kind of transformation/translation, or whatever you want to call it.

    When a LUT is used, so where math is involved, I tend to use the word "transform"; log-to-linear or log-to-rec709 are examples, and another one could be going from 2.4 gamma to 2.2.

    But normalizing an image doesn't have to be precise if there is no need for it to be.

    It's just expanding the contrast and making the image look normal, maybe without following any rules.

    At least, these are my thoughts about it.

    What I wanted to say in the very first post is that there is a massive difference between saying "make an image linear" and "make it rec709".

    • Like 1
  18. Sorry!

    As I said previously, English isn't my first language; I completely misunderstood your previous question. :)

    Well, hopefully it will be useful to someone else!

    To me, normalization really is whatever takes log footage to a pleasing, "normal" contrast; it makes no difference whether it's a rec709 LUT transform or a simple contrast expansion.

  19. These are 3 LogC images: the flat one is the log, as you know; the "normal" one is transformed with a LogC-to-Rec709 LUT; and the third one, the darkest, is the linear image.

    Basically, the light hits the sensor and the camera sees a linear image: light with 0 energy returns 0 as a value from the sensor, a light value of 0.1 returns 0.1, 0.5 returns 0.5, and so on up to 1.

    Mathematically this is perfect, as the math is easy. BUT the physics of light says that to gain a stop you have to double the power, so if we keep the straight linear output from the camera, we end up with an image where the lower stops are crammed into a tiny slice of the range, down where the sensor noise lives.

    To avoid wasting that range, the camera manufacturers transform the linear values the sensor sees with a logarithmic function, so the energy collected by the sensor is distributed more efficiently, creating images that are flat and evenly distributed in exposure.

    This way we get images that hold detail and information across the whole range, but they are flat and ugly. (There's a toy sketch of the idea just below.)
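    A toy numpy sketch of the idea: this is not any manufacturer's actual curve, just a made-up log encode that gives every stop around 18% grey an equal slice of the code range.

    import numpy as np

    def toy_log_encode(x, stops_below=6, stops_above=6):
        # Made-up log curve (NOT a real camera curve): one equal slice of
        # the 0-1 code range per stop of scene light around 18% grey.
        stops = np.log2(np.maximum(x, 1e-6) / 0.18)
        return np.clip((stops + stops_below) / (stops_below + stops_above), 0.0, 1.0)

    # Linear values one stop apart come out evenly spaced in code values,
    # instead of being squeezed towards zero as in straight linear.
    print(toy_log_encode(np.array([0.045, 0.09, 0.18, 0.36, 0.72])))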

    Rec709 (and others like P3 or Rec2020 and so on) is a gamma function that tells the display how to bend the signal to make it pleasing, while staying within broadcast standards.

    It's only a display gamma, though, not a camera gamma.

    Rec. stands for Recommendation; it's not actually a law (only the colour gamut part used to be), but the gamma curve was agreed by the broadcast associations around the world, to make something that worked for everybody once broadcast signals went worldwide, and to get all displays looking similar.

    So the TV manufacturers took this recommendation and built their displays using this gamma function (or something close to it).

    Broadcast cameras used to record using this recommendation as well, simply because they were going live, straight to broadcast.

     

    So, getting to today: we have cameras that record log to give us the best possible dynamic range, and broadcast displays with a "standard" gamma function.

    With a LUT we can make the log image look like it was shot with a Rec709 camera.

    But no one stops us from sending the log image to a TV.

    It really all comes down to this: what I see on a Rec709 monitor is what I will see on TV, so it's up to the colorist to make the image he wants.

     

    OK, I'm done, and I'm not even sure this is clear; writing about this kind of stuff in a language that's not my native one is not easy at all, so I'm sorry if you can't follow. If you need, I can try to make it clearer :)

    Orash

     

    logtransform_1.1.1.jpg

    logtransform_1.1.2.jpg

    logtransform_1.1.3.jpg

    • Like 1
  20. 13 hours ago, Nicolas Hanson said:

    I just want to know if applying this LUT will put the footage back to the linear condition that was defined by the DOP on set.

    Sorry to jump in, but there's something not clear to me here.

    You want to "normalize" your log footage, not transform it to linear.

    By normalize I mean transforming a log gamma image into a display gamma image (e.g. Rec709).

    Transforming it into a linear state is something different: DaVinci does it automatically under the hood so the math works, or you can do it yourself and render out linear (without the working gamma applied), so that whoever needs it for comp, or for mixing with other linear material, won't have trouble with it. (Rough sketch of the two kinds of transform below.)
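    To make the two transforms concrete, here is a minimal numpy sketch; the LogC (v3, EI 800) constants are from memory, and the display encode is a plain 1/2.4 power standing in for a proper LogC-to-Rec709 LUT, so treat it as illustrative only.

    import numpy as np

    A, B, C, D = 5.555556, 0.052272, 0.247190, 0.385537  # LogC EI 800, from memory

    def logc_to_linear(t):
        # "Linearizing": LogC -> scene-linear. For comp math; looks dark on screen.
        return (10.0 ** ((t - D) / C) - B) / A

    def logc_to_display(t):
        # "Normalizing": LogC -> something viewable. Decode to linear, then
        # a simple 1/2.4 power as a stand-in for a real LogC-to-Rec709 LUT.
        return np.clip(logc_to_linear(t), 0.0, 1.0) ** (1.0 / 2.4)

    grey = 0.391  # roughly the LogC code value for 18% grey
    print(logc_to_linear(grey))   # ~0.18: right for the math, dark to look at
    print(logc_to_display(grey))  # ~0.49: mid grey sits where you expect it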

     

    13 hours ago, Nicolas Hanson said:

    Put another way, is the software built in LUT applying the same transformation as the camera is doing on set?

    DaVinci's LogC to Rec709 isn't exactly the same as the Rec709 signal you get out of an Alexa Mini or Amira; there are slight differences, but to me they're not noticeable unless compared side by side.

    So if the DOP wants to see the Rec709 as it was on set, you can safely use the DaVinci transform.

    There is a bigger difference with the Alexa Plus/XT (not sure about the SXT), as those use a 1D LUT and not a 3D one.

     

  21. 21 hours ago, Marc Wielage said:

    Short answer: No, there are no inexpensive HDR grading monitors. (Not yet.)

    Hi Marc!

    I was expecting this answer :(

    I'm reading quite a lot about HDR; thanks for sharing more.

    What I honestly can't understand is how "the industry" expects to get quality product out.

    I mean, TVs are already pushing HDR, and so are very small monitors like SmallHD, Atomos, etc., so people on set are starting to talk about it and expect to see it.

    I know that right now only Netflix and Amazon are demanding it, but when someone else starts demanding it as well, how are we going to be ready?

    Unfortunately, with HDR it's not just a matter of grading; as said before, it's understanding the limits: what is going to hurt the eyes, what's going to be "safe" for monitors, and even more, how to handle noise in the low levels, since everyone says that's the biggest pain in the ass.

    On top of that, I wonder about QC, how to handle metadata, gamma curves, and grading both HDR and SDR.

    I mean, without monitors to try things on and practice, it's going to be hard to be ready.

     

    Do you think it could be a good idea to use one of those SmallHD or Atomos units to practice?

    Or is it just a stupid thought?

    Here where I live (Italy) it's not possible to rent a Sony monitor; otherwise, I would have rented one for a month and done exactly that.

    I'm asking you guys, who have lots more experience, and who work in markets where I believe this issue will hit before it does here!

    thanks!