Orash Rahnema

Everything posted by Orash Rahnema

  1. In DaVinci I use a Color Space Transform OFX on the timeline level, setting color space and gamma to be converted from where you are to where you need to go.
  2. That's interesting, I always try to do the two deliverables, but I actually never thought about people watching YouTube and Vimeo on a TV. True that the surround is part of the colour space, but wasn't that the reason why computer screen manufacturers chose a 2.2 gamma? To compensate for a brighter surround? How do you attach metadata to a ProRes or an H.264 so that the player does the correct conversion? Thanks! Orash
  3. @Emily Haine when you talk about DRTs, are you talking specifically about an ACES world? Outside of ACES, would that be an output LUT? Is this done only for long-term projects or also for commercials? Could you please expand on the difference between print emulation and film emulation? My knowledge is quite limited and I thought a print emulation was a way to emulate the film stock the movie was printed to after the DI, so in the end it was still a film emulation. Thanks!
  4. Hi all, In many interviews and insights I read about using different debayering systems and algorithms to achieve different purposes, like getting deeper colour depth, better detail or lower noise and whatnot. I was wondering if anyone could explain to me how this is done. Is the footage debayered and rendered out in different software, or do systems like Baselight let you choose the debayer method? In DaVinci it looks like the choice is all made in the Camera Raw tab and there isn't really a separate debayer option. Am I wrong? Cheers, Orash
  5. I agree with Emily. I freelance in a small post house and they have a Sony BVM. Although the contrast is really nice, the colour fidelity and the green tint are really bad. I really don't like to work with it. I've only seen, but never worked with, the HDR Sony monitor; I think that one doesn't have any major issues, at least I hope so, seeing how much it costs.
  6. Hi Emily, I use it quite a lot in Resolve to add saturation, sometimes to add a colour tint, and to add sharpness only in the luma channel. I saw a Peter Doyle interview where he said he used it to control the deep blue dress in the Tim Burton movie. I wish I knew exactly what he did, and I wish I could learn more ways to use the Lab colour space.
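
A rough sketch of the kind of Lab-domain moves described in that post (sharpen only L, push a/b for saturation), done in Python with OpenCV; the file names and amounts are made up for illustration:

```python
import cv2
import numpy as np

img = cv2.imread("frame.png")                # hypothetical input frame (BGR, 8-bit)
lab = cv2.cvtColor(img, cv2.COLOR_BGR2Lab)
L, a, b = cv2.split(lab)

# Sharpen only the lightness channel (simple unsharp mask), leaving chroma untouched
blur = cv2.GaussianBlur(L, (0, 0), 2.0)
L_sharp = cv2.addWeighted(L, 1.5, blur, -0.5, 0)

# Add saturation by pushing a/b away from their neutral point (128 in 8-bit Lab)
sat = 1.2
a_sat = np.clip((a.astype(np.float32) - 128) * sat + 128, 0, 255).astype(np.uint8)
b_sat = np.clip((b.astype(np.float32) - 128) * sat + 128, 0, 255).astype(np.uint8)

out = cv2.cvtColor(cv2.merge([L_sharp, a_sat, b_sat]), cv2.COLOR_Lab2BGR)
cv2.imwrite("frame_lab.png", out)
```
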
  7. Hi Abby, unfortunately I've never tried 3D LUT Creator so I can't really answer that. What I know is that 3D LUT Creator can actually be used to build LUTs; Lattice not really. With Lattice you can take LUTs, analyse them by viewing curves or cubes, you can separate the contrast curve from the chroma, and you can combine different LUTs into one. But most of all, and this is what I use it for most, you can export LUTs in different formats. For example, I create a LUT in DaVinci, export it as a .cube and convert it into .aml to load it in the Alexa Mini.
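
For anyone curious what the .cube side of that round trip looks like, here is a minimal, hypothetical sketch of writing a 3D LUT in the Iridas/Adobe .cube text format with Python (the .aml conversion itself happens in Lattice and is not shown):

```python
import numpy as np

def write_cube(path, lut, title="grade"):
    # `lut` is an (N, N, N, 3) float array indexed [r][g][b];
    # in the .cube format the red index varies fastest.
    n = lut.shape[0]
    with open(path, "w") as f:
        f.write(f'TITLE "{title}"\n')
        f.write(f"LUT_3D_SIZE {n}\n")
        for b in range(n):
            for g in range(n):
                for r in range(n):
                    rr, gg, bb = lut[r, g, b]
                    f.write(f"{rr:.6f} {gg:.6f} {bb:.6f}\n")

# Identity cube as a sanity check
N = 33
grid = np.linspace(0.0, 1.0, N)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
write_cube("identity.cube", identity)
```
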
  8. Where can we see the whole movie?
  9. You can also use Lattice: orbit around the cube, and with a slider you can check what's affected.
  10. I would say yes, it does. If I think of Italian 70s and American 70s, they are completely different in style and look. That said, for Italians, American culture is well known look-wise, so the American 70s look is perceived the same as the Italian one. Not sure about the other way around. Now, emulsion-wise, in Italy at that time Ferrania and Eastmancolor were used quite a lot, in Japan they were using Fujicolor, and in the States Kodak, so a slight change in look was happening, but I don't think it's strong enough to be perceived.
  11. @Joseph Owens Yes, you are totally right, sorry, I overlooked that while writing. Although I have to say it has happened to me many times that, even with the supposedly correct IDT, I had trouble with some footage; most of the time it was matte paintings.
  12. Love it! Every frame!
  13. Just like Margus, I use ACES on quite a lot of my jobs for the exact same reasons he describes. Once rendered, it's the same as every other workflow. Doing commercials, most of the time I work in ACES, render to ProRes or DPX and then online and comp everything with no problem.
  14. If it's a VFX job, it's better if everyone uses the same colour science, to avoid problems, artefacts or anything else down the pipe. For example, a compositor comps Alexa footage with some Photoshop elements, stock elements or 3D elements under a Rec.709 display LUT. In Nuke it looks good, so no problem. Then he exports a flat comp for you to grade. You start working in ACES using an Alexa IDT; the Alexa footage works fine, but the comped elements go crazy because they get transformed with the Alexa IDT as well. So you end up going crazy doing lots of keys or masks to get things back where they belong. If you both work in ACES, the compositor sees the same thing you see (the beauty of ACES), he can take care of the problems in comp, and he knows exactly how you are going to see his comp as well. Or you both work in YRGB and the compositor knows roughly how you are going to see the result at the end, but it's a safe route that everyone knows. Hope that makes sense.
  15. You can output to whatever you like and the ODT will command the look of your output. So if your ODT is set to Rec.709, then your files, whatever they are (TIFF, DPX, EXR, MOV, MXF...), will be translated into Rec.709 plus your grade (if you grade it).
  16. I use a Color Space Transform OFX node on the timeline level, keeping the colour space as Rec.709 and just changing the gamma. I know it's not "right", but I figured DaVinci still transforms the linear ACES into a working gamma space, and that OFX works with the same math, so it can't be too much of a problem. In the end it might not be a perfect gamma transform, but it has done the trick so far. In DaVinci it's possible to add a DCTL, which is a custom CTL, but I believe that works before the node corrections, and I expect the ODT to work after. Wouldn't changing the gamma before create problems for what comes after?
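
As a rough sketch of the 2.4 to 2.2 trick being discussed (not the actual Resolve OFX, just the underlying math on a display-referred signal), assuming pure power-law gammas on both sides:

```python
import numpy as np

def regamma(encoded, src_gamma=2.4, dst_gamma=2.2):
    # Decode with the source display gamma, re-encode with the target one.
    encoded = np.clip(np.asarray(encoded, dtype=np.float64), 0.0, 1.0)
    linear = encoded ** src_gamma
    return linear ** (1.0 / dst_gamma)

# Equivalent to a single power of src/dst (2.4 / 2.2 is roughly 1.09)
print(regamma([0.05, 0.18, 0.5, 0.9]))
```
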
  17. Thanks Cary! It would be good if there were an ODT option to switch between Rec.709 gamma 2.4 and 2.2. I do a colour transform node on the timeline level to do that, but it becomes a pain in the ass once there are multiple timelines to manage.
  18. Resolve 12.5.3, ACEScct 1.0.2. I have just done a quick test to show what I mean: same scene as Rec.709, as Rec.709 with a gamma transform from 2.4 to 2.2, and with the ODT set to sRGB. I have done it at normal exposure and underexposed. I noticed that at normal exposure the transform is acceptable; the real problem appears in the black zones, so in an underexposed scene I don't think it really works. As you can see, the sRGB version is a lot more contrasty and the blacks are completely crushed. I would not expect the 2.2 and the sRGB images to look the same, as I know ACES takes into account how the colours are perceived and not just the math behind it. But still, I would not expect something like this, as I don't think it's right. [Screenshots attached: REC709, REC709 gamma 2.2 and sRGB, at normal exposure and underexposed.]
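
One thing that may contribute to the shadow difference: the piecewise sRGB encoding and a pure 2.2 power function are not the same curve near black (and the ODT's surround handling adds on top of that). A quick numeric check of just the raw curves, not the full ODT:

```python
import numpy as np

def srgb_encode(lin):
    # Piecewise sRGB encoding (IEC 61966-2-1)
    lin = np.asarray(lin, dtype=np.float64)
    return np.where(lin <= 0.0031308, 12.92 * lin, 1.055 * lin ** (1 / 2.4) - 0.055)

def gamma22_encode(lin):
    # Pure 2.2 power-law encoding
    return np.asarray(lin, dtype=np.float64) ** (1 / 2.2)

shadows = np.array([0.0005, 0.001, 0.005, 0.01, 0.05])
print(srgb_encode(shadows))     # sits noticeably lower (darker) near black...
print(gamma22_encode(shadows))  # ...than the pure 2.2 curve
```
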
  19. Hi all! I would say that by now half of my jobs are ACES; not that anyone asked me to do it, but I quite like it. Many times I need to output for the web, so once the job is done (graded on an FSI at gamma 2.4 using Resolve) I have tried to change the ODT from Rec.709 to sRGB. Unfortunately it never worked as expected: the gamma change is very strong and the signal gets crushed down really hard. So my workflow so far has been not to change the ODT and simply to do a gamma transform from 2.4 to 2.2 on the timeline level. Has anyone experienced this problem as well? Am I doing something wrong, or did I misunderstand the principle of the ODT? Do other systems, like Baselight, behave differently? Cheers, Orash
  20. Be careful: it has happened to me quite a few times that when the "proxy name or format" is shown, it often keeps other metadata like resolution and such. You might want to double-check it. Anyway, something I often do when it takes too long or gives problems (like importing proxies or proxy metadata) is to load all the raw footage and untick "import files". It goes a lot faster and usually it doesn't give any problems.
  21. Hi all, I have been asked to possibly do a remote grade online (in DaVinci I believe). I was wondering if anyone has any experience with it and if there are any particular requirements. Does DaVinci's built-in remote grading behave well? Would I need any kind of special hardware, and how fast does the internet connection have to be on my end and on the other end? I'm wondering if I could do this easily from my "home office" or if I have to find a grading suite that has some special setup. Thanks! Orash
  22. Exactly. EXR defaults to linear, and most VFX pipelines read and write linear, also to avoid confusion about colour space. Reading Abby's description of the image I would assume this is the case. But then again, it could have been rendered with DaVinci or something else and be an image with some weird colour transform applied. If it doesn't work with a lin-to-log or lin-to-Rec.709 transform, then the best solution is to ask whoever provided the files.
  23. Hi Abby, where did you get the EXR sequence from? If it's from a VFX department I would expect it to be in linear light rendered out of Nuke. Try to normalize it with a linear-to-LogC or linear-to-Rec.709 LUT and see if it works; at least you'll know if that's the case, and then you can take the best approach to it. This is my first thought.
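
As a small illustration of that "normalize it" idea, here is the BT.709 OETF applied to scene-linear values in Python; if viewing an EXR through something like this (or a LogC curve) suddenly looks sensible, the file was probably scene-linear:

```python
import numpy as np

def rec709_oetf(lin):
    # ITU-R BT.709 opto-electronic transfer function (scene-linear -> code value)
    lin = np.clip(np.asarray(lin, dtype=np.float64), 0.0, None)
    return np.where(lin < 0.018, 4.5 * lin, 1.099 * lin ** 0.45 - 0.099)

# Middle grey (0.18 scene-linear) should land around 0.41 on the encoded signal.
print(rec709_oetf([0.0, 0.01, 0.18, 1.0]))
```
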
  24. Hi Margus! I was hoping you would come to the rescue. I will definitely try 1000 and 4000 patches. In the meantime I wanted to ask: could it be an issue with data or video levels? Maybe I have set the DaVinci project up wrong. I did a pre-calibration with ColorNavigator NX, but the only things I set there were colour space, gamma and max luminance (I set it to 110 nits), and if I remember correctly I set my project to data levels. The other thing is in the options: I saw in your tutorial that you have some kind of matrix set under the probe section, and somewhere I read that for this to work a backlight-level matrix has to be set. I didn't set any of those as I have no idea where to find them. Last thing: I have set a gamma of 2.2. The manual says that a higher gamma is needed to avoid clipping; does that mean that if I want a 2.2 gamma I have to set it to 2.1 or to 2.3? In my mind a higher gamma means a lower value, but maybe it's the value itself that is intended. Anyway, I will try the patch thing, and if you have any answers I'd appreciate it! Thanks, Orash
  25. Hi everyone! I need your help as I am sure I am doing many things wrong. I am trying to figure out how to calibrate my Eizo CG246 with LightSpace. I'm trying to save some money, so I want to push this display as long as I can before changing it. I have followed the LightSpace guides (as well as I could understand them), the Mixing Light LightSpace guide and @Margus Voll's tutorial: http://www.liftgammagain.com/forum/...on-workflow-with-lightspace-and-resolve.5001/ To make a long story short, I did a quick profile and the values looked very strange. So I tried calibrating using Margus' tutorial with the 9000+ patches; the dialog box said I hit 98% of the target, then I uploaded the LUT and re-profiled. This is the result I got, and to me it doesn't look too good (as far as I understood from the various guides and tutorials): https://www.dropbox.com/s/dpmsretg2rzqupi/EIZO Profile 28-05-2017-Post.pdf?dl=0 What I don't get is, first, what I have done wrong, and then: I'm currently using this display on many jobs and I haven't had problems with it; if it were so far off I believe I would have noticed. Can anyone help me out with it? Cheers, Orash