Posts posted by Margus Voll
-
This is exactly how I have imagined it as well. Thanks!
-
What would represent it visually better?
Referring to "The ColourCrosstalk interface might not be our most intuitive one" -
I would also like to see the Vectorscope in conjunction with this tool, to see what you are manipulating.
I would assume that when you push some colors around an axis too much you also get all sorts of artifacts. -
Thanks!
So that basically confirms my reasoning for wanting to use BL more and more in my feature work.
-
Hi.
I'm a little bit confused about the "lens" logic over AAF back to Avid.
Let's say we work on a feature and have split everything up into reels. Now if I pass an AAF on to the editor and director to lay looks on a reel, and they change anything in the edit, say drop some shots or move their timing, will
it still reflect on the timeline as looks, and will a shot that was worked on before the change be indicated somehow as "not done"? Another bit is reconforming in BL after such an edit change to adjust the reels.
Would it mean reconforming reel versions and copy-pasting the looks already done, either by automation or manually?
Trying to sort the workflow side of this out in my head a bit better. -
I mean actually something like this:
What I mean by this is that it seems more intuitive and visually understandable to me.
The current one is a screen capture from Scratch, where it is a built-in feature; in Resolve I just got a plug-in for that.
I did not find anything like it in BL. The way @Andy Minuth mentioned works well, it is just not as clear visually.
A second benefit I see from a vector grid or square grid is that you can use it for extreme looks, like some people do in LUTs, where you start to attack hue ranges more violently. I have a job now where the DoP has made "home-made" looks with vectors like this, and for mimicking them it is much easier and seems faster to me to go with the same vector logic: compress them clearly, with your hues ideally spread under the grid so you can see which part you compress or manipulate.
Maybe what I am asking is silly, but it seems like a not very complex thing to do UI-wise.
In Resolve it was not very easy for me to mimic specific changes with regular curves, for example.
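To make the "hue spread under the grid" idea a bit more concrete, here is a minimal sketch in plain Python/NumPy of what such an overlay would be showing: it just bins the hues of a frame so you can see which ranges you would actually be compressing. Nothing Baselight- or Resolve-specific, and the random frame at the end is only a stand-in for a real image.

```python
import numpy as np

def hue_histogram(rgb, bins=24):
    """Rough hue distribution of an RGB frame (values in 0..1).

    Returns (bin_edges_in_degrees, counts) so you can see which hue
    ranges carry most of the image before you compress them.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mx = np.max(rgb, axis=-1)
    mn = np.min(rgb, axis=-1)
    c = mx - mn
    hue = np.zeros_like(mx)
    # standard piecewise RGB -> HSV hue, skipping near-neutral pixels
    m = c > 1e-6
    rmask = m & (mx == r)
    gmask = m & (mx == g) & ~rmask
    bmask = m & (mx == b) & ~rmask & ~gmask
    hue[rmask] = ((g - b)[rmask] / c[rmask]) % 6
    hue[gmask] = (b - r)[gmask] / c[gmask] + 2
    hue[bmask] = (r - g)[bmask] / c[bmask] + 4
    hue_deg = hue * 60.0
    counts, edges = np.histogram(hue_deg, bins=bins, range=(0, 360))
    return edges, counts

# example: random data standing in for a real graded frame
frame = np.random.rand(540, 960, 3)
edges, counts = hue_histogram(frame)
for lo, hi, n in zip(edges[:-1], edges[1:], counts):
    print(f"{lo:5.0f}-{hi:5.0f} deg : {n}")
```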
-
I wonder why not very much is said about Nuke Studio and Nuke integration.
I know there is a plugin that runs inside Nuke as a "view LUT", but there is not much mention of it in Nuke Studio in a similar way to the Avid and Flame timelines.
Ideally we would see all of the VFX side done in Nuke, with Nuke Studio holding it together and BL acting as the mastering tool, just showing up in Nuke and Nuke Studio as needed.
Would it be possible to cover this side of workflow a bit more?
-
Thanks.
Will try it out.
-
If you have an older Mac with FW800 and get a FW400 to FW800 cable, then that should work.
Software is another tricky bit.
-
In Resolve I'm now playing with Paul Dore's OFX.
Not sure how one would do it in Baselight?
I meant whether there are any native tools, or only the OFX approach. -
I'm thinking hypothetically here and wondering how one would approach this in Baselight.
I have a feature (still doing it in another app) but the problem seems very universal.
Sometimes an actor's face is not very consistent in color: different patches and blobs and whatnot, with light-colored translucent skin that needs fixing. My question is whether that is doable in very few steps, i.e. without a lot of masking and keying?
In some simpler apps there is the possibility to take some color vectors and "compress" them together to get such an effect without a key or mask.
Why I'm after it is purely speed: say you have 2000-3000 shots, and keying and masking all of them does not seem very practical.
(we don't have assistants here for that either)
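For what it is worth, the "compress color vectors together" idea those simpler apps use boils down to something like the small Python/NumPy sketch below. It is only an illustration of the principle; the centre, width and strength numbers are made up for the example and are not tied to any Baselight tool.

```python
import numpy as np

def compress_hues(hue_deg, sat, centre=25.0, width=40.0, strength=0.7):
    """Pull hues within +/-`width` degrees of `centre` towards the centre.

    hue_deg : per-pixel hue in degrees (0..360)
    sat     : per-pixel saturation, used so neutrals are left alone
    The default numbers are illustrative only.
    """
    # signed hue distance to the centre, wrapped into -180..180 degrees
    d = (hue_deg - centre + 180.0) % 360.0 - 180.0
    # linear falloff: full effect at the centre, none beyond +/-width
    w = np.clip(1.0 - np.abs(d) / width, 0.0, 1.0)
    # move each hue part of the way back towards the centre
    shift = -d * strength * w * np.clip(sat, 0.0, 1.0)
    return (hue_deg + shift) % 360.0

# example: a patchy range of skin-ish hues gets pulled towards 25 degrees
hues = np.array([10.0, 20.0, 35.0, 50.0, 200.0])
sats = np.array([0.4, 0.5, 0.5, 0.4, 0.3])
print(compress_hues(hues, sats))
```

Hues inside the window get pulled towards the centre while everything outside is left alone, which is why no key or mask is needed.
-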
I had a job that was shot with a red glass filter on set and then worked on under desat in color.
Overall it worked well, it's just that when pushing the channel mixer a lot it started to form some pattern noise at times in Resolve.
-
What do you have in mind?
-
Thanks!
I will try both options.
Got there by mistake but did not know how to get out.
-
Hi.
I wonder how you get the mouse cursor over to the external monitor to pick a key or such?
I have seen this described somewhere but currently don't remember where.
Since I attached an AJA device to my student setup, it seems the normal UI drops the video window and I'm a bit confused.
-
I would say it's half and half going P3, if it calibrates well, as you only see the wider gamut but you do not
get the big-screen behaviour (reflective instead of emissive). I would do a few test screenings after your P3 try and see.
I have been working like this, from Rec to cinema, for years now, and almost 3 years on ACES as well. Works great so far.
The main thing I miss from cinema is the reflective big screen, not so much the gamut usually. -
Also, it feels to me as if the concepts are a bit mixed up for you in that question?
I mean what are you after exactly?
-
Or have many screenings in a calibrated cinema and take notes, make changes and screen again.
Works well. Over time you get experienced and it should work out fine.
-
Really like the tools in development now.
The example he gives at the end is brilliant!
-
Have you seen the Dolby demos about it that ML presenters did?
Start here:
https://www.dolby.com/us/en/professional/content-creation/dolby-vision-tutorial-series.html
-
Brilliant, thanks!
-
Here, around 5:30, he talks about the clip and timeline section.
It seems that is highly usable compared to version 4, as you don't have to go out of the plugin at all. -
I wonder if it could be added as a function, so we could tag shots during normal usage and later look at the timeline and render plates
according to the tags, with Python as the example shows.
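Roughly the kind of thing I mean, as pseudo-Python: a function that walks the timeline and renders only the shots that were tagged during normal work. The names (`timeline.shots()`, `shot.tags`, `shot.render_plate()`) are purely hypothetical placeholders for illustration, not the actual plugin or Baselight API.

```python
# Hypothetical sketch only: timeline.shots(), shot.tags and shot.render_plate()
# are made-up placeholder names, NOT the real Baselight or plugin API.

def render_tagged_shots(timeline, wanted_tag="plate"):
    """Walk the timeline and render plates only for shots carrying a tag."""
    rendered = []
    for shot in timeline.shots():           # placeholder: iterate timeline shots
        if wanted_tag in shot.tags:         # tags set during normal grading work
            shot.render_plate()             # placeholder: trigger a plate render
            rendered.append(shot.name)
    return rendered
```
-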
Task-based tracks and making them compact is just brilliant!
Creative Boost: On-set. Near-set. Off-set. Stream(line) Production. (in Announcements)
I was looking for the recording this morning already, then I figured it is an upcoming event.