Orash Rahnema

Everything posted by Orash Rahnema

  1. I was also looking at the Tvlogic Lem 205 A http://www.tvlogic.tv/Monitors/M_Spec.asp?sidx=52 In the last few months I worked quite a while with both the FSI DM250 and the Tvlogic, and the Tvlogic is definitely not bad at all. The price range is similar, if not a little cheaper, than the FSI DM250. As a personal monitor for my home color room and freelance jobs I was thinking of getting the FSI DM240; it's LED and not OLED. Does anyone have any experience with it? Would you suggest it?
  2. Thanks a lot for the time taken to write this article, super helpful! I can't wait to get baselight and try it out!!!
  3. Thanks for asking, I was about to do the same but with a much lower budget. If anyone can answer Nicolas' question, and maybe throw in a lower-budget option as well, that would be fantastic!
  4. Hi Bruno! I'm not talking about real-time playback, I would not expect that; compositing that is not online or in front of a client doesn't really require it. I mean, not even Flame and Smoke do it once you start painting and doing actual comp. I'm talking about how the software works: Nuke's paint tool is known to be slow, but if used in the right way, optimising the number of points used on the spline, it handles a huge amount of paint strokes without real issues. Fusion, on the other side, is painfully slow, not just on paint, unfortunately, but on lots of different things. I understand that; then again I'm not saying that Resolve has to include Fusion. I think the direct link is the best thing, and I think they should have done the same with Fairlight, that's it. What bothers me is that there are some issues with the color tools that are not resolved, and at this point I don't think they will be. I would much prefer to see those things fixed, or to have tools that would be useful to the colorist, rather than have the watercolor effect or a full audio suite at my disposal.
  5. Unfortunately Fusion is a complete disaster; it has to be rewritten top to bottom to make it fast and usable. Still, having some compositing and paint abilities in DVR would be a nice touch, and as far as I'm concerned a smarter move than the audio stuff.
  6. There are some interesting features to explore; still, I'm bugged that they didn't fix some old annoying stuff like the hue vs lum curves and the keyer (saturation).
  7. Oh... Next time I will get someone to read my posts first; I think we would all save a lot of trouble! There is no difference; it's a word I got used to using at work and it stuck with me in those explanations, I believe. Both are some kind of transformation/translation, or whatever you want to call it. When a LUT is used, so where math is involved, I'm used to using the word "transform", so log-to-linear or log-to-Rec709 is the same; another one could be going from a 2.4 gamma to 2.2. But normalizing an image doesn't have to be precise if there is no need. It's just expanding contrast and making the image look normal, maybe without following any rules. At least these are my thoughts about it. What I wanted to say in the very first post is that there is a massive difference between saying "make an image linear" and "make it Rec709".
  8. Sorry! As I said previously, English isn't my first language. I completely misunderstood your previous question. Well, hopefully it will be useful to someone else! To me, normalization really is whatever takes log footage to a pleasing, "normal" contrast; no difference whether it's a Rec709 LUT transform or a simple contrast expansion.
  9. These are 3 LogC images: the flat one is the log, as you know; the "normal" one is transformed with a LogCtoRec709 LUT; and the third one, the darkest, is the linear image. Basically, the light hits the sensor and the camera sees a linear image: light with 0 energy returns 0 as a value from the sensor, a light value of 0.1 returns 0.1, 0.5 is 0.5, and so on up to 1. Mathematically this is perfect, as the math is easy, BUT the physics of light says that to gain a stop you have to double the power. So if we keep using the straight linear output from the camera, we end up with an image where most of the energy is packed and wasted in the low range, where the sensor noise lives. To avoid this waste, the camera manufacturers transform the linear values the sensor sees with a logarithmic function, so they can distribute the energy collected by the sensor more efficiently, creating images that are flat and well distributed in exposure. This way we have images that hold detail and information across the whole range, but they're flat and ugly.
Rec709 (and others like P3 or Rec2020 and so on) is a gamma function that tells the display how to bend the signal to make it pleasing, while remaining within broadcast standards. It's only a display gamma, though, not a camera gamma. "Rec." stands for recommendation; in fact it's not a law (the only part that used to be a law was the color gamut). The actual gamma curve was decided by the broadcast associations around the world to make something that worked for everybody once worldwide broadcast signals arrived, and to get all displays looking similar. So the TV manufacturers took this recommendation and built their displays using this gamma function (or close to it). Broadcast cameras used to record using this recommendation as well, simply because they were going live straight to broadcast.
So, getting to today: we have cameras that record log to give us the best possible dynamic range, and broadcast displays with a "standard" gamma function. With a LUT we can make the log image look like it was shot with a Rec709 camera. But no one is stopping us from sending the log image to TV. It really all comes down to this: what I see on a Rec709 monitor is what I will see on TV, so it's up to the colorist to make the image he wants. OK, I'm done and I'm not even sure this is clear; writing about this kind of stuff in a non-native language is not easy at all, so I'm sorry if you can't understand. If you need, I can try to make it clearer. Orash
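The linear → log → display chain described in that post can be sketched in a few lines of Python. The log curve here is a generic illustrative one, NOT the real ARRI LogC formula, and the display function is a plain power curve rather than the exact piecewise Rec.709 transfer; both are assumptions for illustration only.

```python
import math

def log_encode(x, a=64.0):
    """Generic log encode: packs linear sensor values so each stop
    of light gets a similar share of the code range.
    (Illustrative curve, NOT the actual ARRI LogC formula.)"""
    return math.log(1 + a * x) / math.log(1 + a)

def display_gamma(x, gamma=2.4):
    """Simple display gamma: bend the signal for a Rec.709-style
    monitor (the real Rec.709 OETF is a piecewise curve)."""
    return x ** (1 / gamma)

# Linear values one stop apart: one under 18% grey, 18% grey, one over.
# Note how the log encode spaces the stops almost evenly, instead of
# cramming them into the bottom of the range like linear does.
for linear in (0.09, 0.18, 0.36):
    print(f"linear {linear:.2f} -> log {log_encode(linear):.3f}")
```

Running it shows each stop landing roughly the same distance apart in log, which is exactly why the flat log image "holds detail across the whole range".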
  10. Sorry to jump in, but there's something not clear to me on this. You want to "normalize" your log footage, not transform it to linear. By normalize I mean transform from a log gamma image to a display gamma image (e.g. Rec709). Transforming it into a linear state is something different, and Davinci does it automatically under the hood so the math works; or you do it yourself so you can render it out linear (without the working gamma applied), so if someone needs it for comp, or for mixing with other linear material, they won't have trouble with it. Davinci's LogC to Rec709 isn't exactly the same as what you get as a Rec709 signal out of an Alexa Mini or Amira; there are slight differences, but to me they are not noticeable unless compared side by side. So if the DoP wants to see the Rec709 as on set, you can safely use the Davinci transform. There is a bigger difference with the Alexa Plus/XT (not sure about the SXT) as they use a 1D LUT and not a 3D.
  11. Hi Marc! I was expecting this answer. I'm reading quite a lot about HDR, thanks for sharing more. What I honestly can't understand is how "the industry" expects to get quality product out. I mean, TVs are already pushing for HDR, same for very small monitors like SmallHD, Atomos, etc., so people on set are starting to talk about it and expect to see it. I know that right now only Netflix and Amazon are demanding it, but when someone else starts to demand it as well, how are we going to be ready? Unfortunately, with HDR it's not just a matter of grading; as said before, it's understanding the limits, what is going to hurt eyes, what's going to be "safe" for monitors, and even more, how to handle noise in the low levels, as everyone says that's the biggest pain in the ass. On top of that I wonder about QC, how to handle metadata, gamma curves, grading both HDR and SDR. I mean, without monitors to try and practice on, it's going to be hard to be ready. Do you think it could be a good idea to use one of those SmallHD or Atomos monitors to practice? Or is it just a stupid thought? Here where I live (Italy) it's not possible to rent a Sony monitor, otherwise I would have rented one for a month and done just that. I'm asking you guys with lots more experience, in markets where I believe this issue will hit before it does here! Thanks!
  12. Yeah, I know those ones; as I said in the previous post, those are extremely expensive. Are there any inexpensive HDR monitors? How do colorists start practicing grading HDR if there are no entry-level monitors?
  13. Hi all, I was wondering what's the status right now with HDR-capable reference monitors. At the moment almost any modern TV claims to be HDR (even if it's not 100% true), but the only monitors I keep hearing about are the Sony and the Dolby, and either costs something like $36k. And since lots of TVs are claiming to be HDR, lots of people are starting to talk about it (in a completely wrong way, by the way). Something like that cuts out every freelancer and small grading house, as someone like me, for example, can't even practice grading HDR and become proficient with the pros and cons. So, to cut it short: do you know if there even exists an HDR monitor, not as good as the Sony or Dolby, but good enough? How are you guys approaching and practicing this new technique? Cheers, Orash
  14. Hi guys, I'm sure that here, or on the millions of other forums around, this subject has been discussed a lot; unfortunately I still can't find a proper answer. How do you compress to get a file that has the right specs to go on, let's say, Vimeo? Every time I upload a file I feel it's just not right. It suffers from Vimeo's compression and becomes a little soft, mushy. If I upload something that has grain, the grain gets completely destroyed. But then I see some stunning videos on Vimeo that look pristine. So, I have tried H.264 straight out of Davinci, in HD, going from 10000 up to 30000 kb/sec (Vimeo suggests 10 to 20K for HD). I have tried Compressor, Handbrake, different settings and such. I always get the same results: I just can't keep the quality. So, how do you guys do it? Did you find a successful workflow? Do you upload straight ProRes 422 HQ? I wish someone could help me on this; sometimes it feels more like witchcraft. Cheers, Orash
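For what it's worth, one common approach people use for Vimeo uploads is ffmpeg with libx264 in CRF (quality-based) mode instead of a fixed bitrate, which tends to preserve grain better. This is only a sketch of one possible recipe, not Vimeo's official spec; the file names are placeholders and the CRF value is a starting-point assumption.

```python
# Build (not run) an ffmpeg command line for a Vimeo-friendly H.264 file.
# Assumes ffmpeg with libx264 is installed; file names are placeholders.
def vimeo_encode_cmd(src, dst, crf=18):
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264",
        "-preset", "slow",          # slower preset = better quality per bit
        "-crf", str(crf),           # quality target instead of a fixed bitrate
        "-pix_fmt", "yuv420p",      # broad player compatibility
        "-c:a", "aac", "-b:a", "320k",
        "-movflags", "+faststart",  # moov atom up front, streams sooner
        dst,
    ]

print(" ".join(vimeo_encode_cmd("master_prores.mov", "vimeo_upload.mp4")))
```

Lower CRF means higher quality (and bigger files); people often test values around 16–20 on a grainy clip before settling on one.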
  15. Can you play the same footage in real time from the internal HD? My experience with any level of Apple trashcan is that above 2K I can't play anything in real time, no matter what. As far as I'm concerned it's not a storage speed problem.
  16. Yeah, sorry, that's what I meant. I have never had shots that were filtered in camera. I always worked on normal shots that I was treating like they were red-filtered, using tools like the RGB mixer, or Splitter and Combiner, to get rid of the green and blue channels. But I always ended up needing to dial the effect down quite a lot because uncommon patterns were poking through. So it was interesting to see and know that shots filtered in camera worked much nicer and you were able to achieve such black skies and silvery skins.
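The RGB-mixer trick described there (keep red, discard green and blue, then dial the effect back when artifacts appear) can be sketched as a per-pixel mix. The weights and the blend-back are illustrative assumptions, not Resolve's actual RGB-mixer defaults.

```python
def red_filter_bw(r, g, b, mix=(1.0, 0.0, 0.0), amount=1.0):
    """Red-filter style B&W: weight the red channel heavily and
    discard green/blue, then blend back toward a neutral luma mix
    to 'dial the effect down' when noise or patterns poke through.
    Channel values are floats in 0..1; weights are illustrative."""
    filtered = mix[0] * r + mix[1] * g + mix[2] * b
    neutral = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec.709 luma weights
    return amount * filtered + (1 - amount) * neutral

# A blue sky pixel goes dark, a warm skin-tone pixel stays bright:
sky = red_filter_bw(0.2, 0.4, 0.8)    # -> 0.2
skin = red_filter_bw(0.8, 0.5, 0.4)   # -> 0.8
```

The noise problem the post mentions falls out of the math: with `mix=(1, 0, 0)` the output is built from a single channel, so there's no averaging across channels to smooth out sensor noise, which is why the effect often has to be dialled down via `amount`.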
  17. Yes, exactly, not proper noise, more an uncommon pattern; I guess it's the lack of information in the single channel that emerges.
  18. It definitely has the black-and-white red-filter look, above all in the first frame you posted. I thought it was done in Resolve, though, using the RGB mixer or something like that. I mean, it's a really tough decision to take on set, to filter the image so heavily knowing that there's no coming back. I really respect cinematographers like that. Margus, did having the image already red-filtered help with noise? I have done quite a few B&W treatments and many times I go towards the red-filter look, but most of the time I have to stop long before I would like to (I'd love to get a B&W image like you got), as the image gets really noisy.
  19. Hi everyone. I usually don't post much of my work around, but this is one of those pieces where I'd really love to get some feedback. It's a short movie that I recently graded. I quite like the end result; I was aiming to get it as "filmic" as possible. It was quite tough to grade, as it was mainly shot in extremely low light and it was really easy to end up with noisy images. It was shot on Alexa, ProRes 444, and graded in Resolve. http://www.orash.it/project/in-the-woods/ I'd be very happy to know what you think about it. Cheers, Orash
  20. I love how it looks "silver" and not black and white. I have never tried to grade black and white in ACES, how did you find working on it, did it have any benefit?
  21. Every single time I see things like this I understand how little I know about this art and science. At the same time it's frustrating, as I believe it's not just studying but also the work environment and the people you work with that help with learning, understanding, and finessing how to work in such a deep and knowledgeable way. Anyway, thanks for sharing; hopefully sooner or later I will understand a bit more of what he said in some parts of it and get more than just the surface!
  22. Hi all, this thread is really interesting! I wanted to ask whoever is already using it whether Baselight for Avid (or Nuke) is worth it, how similar it is to the full Baselight, and what tools are missing. I mean, I'm working in the Italian market and there are no Baselights in Milan (where I'm based); around here it's mostly Resolve and Lustre. I've always been very curious about Baselight (and Nucoda and Mistika and Scratch), as I believe it's good to learn how other tools work. Unfortunately there is no way to put your hands on these colour tools here where I am, so either the knowledge remains theoretical, based on tutorials, or it looks like the only real affordable solution would be to get a Baselight plugin. Unfortunately I can't afford 50k+ to get a Baselight machine (I wish I could). What I don't really understand is: if Filmlight managed to make a plugin for Avid and Nuke, why not do a standalone software as well?