Welcome to my final post (for now?) on the benefits of compositing in Scene Referred Space, and how to get at least part of the way there even if you’re limited to footage from an inexpensive camera that can’t natively export Scene Referred values. In this post we wrap up by compositing CG elements against our Scene Referred plates and comparing the results to working with Display Referred plates.
For those more comfortable with a more traditional compositing package, this addendum post shows how to perform the same conversion using the OCIO implementation in the open source compositor Natron.
In this post we’ll complete our custom Color Transform: a LUT unique to our camera’s characteristics that can be used with Blender’s implementation of OCIO to convert 8-bit camera footage into Scene Referred values, perfect for more convincing compositing.
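To give a flavor of what hooking a custom LUT into OCIO looks like, here is a sketch of a colorspace entry as it might appear in an OCIO config file. This is a hypothetical fragment, not the config from the post: the colorspace name and LUT filename are placeholders, and Blender ships its own config (under `datafiles/colormanagement`) that you would extend rather than replace.

```yaml
# Hypothetical colorspace entry for an OCIO config.
# "My Camera Log" and the .spi1d filename are placeholders.
colorspaces:
  - !<ColorSpace>
    name: My Camera Log
    family: Input
    bitdepth: 8ui
    description: Converts 8-bit camera footage to scene linear
    isdata: false
    to_reference: !<FileTransform> {src: my_camera_to_linear.spi1d, interpolation: linear}
```

Once a colorspace like this is registered, it shows up as an input transform you can assign to your footage, and OCIO applies the 1D LUT to bring the plate into the config’s scene linear reference space.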
Working in Scene Referred Space will help you create more convincing, more accurate CG integration with background plates and remove much of the compositing guesswork of a Display Referred Space workflow. In this post we’ll take the data we captured from our camera and use some math to build a nice smooth response curve to generate a 1D LUT (Look Up Table) that will be used as a color transform in our favorite 3D package or compositor.
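As a rough illustration of the "some math" step, here is a minimal Python sketch of the idea: fit a smooth curve through measured samples and bake it out as a 1D LUT in OCIO’s `.spi1d` format. The sample values, the cubic-polynomial fit, and the output filename are all assumptions for illustration; the post’s actual measurements and curve model will differ.

```python
import numpy as np

# Hypothetical measurements: normalized 8-bit code values sampled at known
# exposure steps, paired with the linear scene values they should map to
# (middle grey 0.18 at 0 stops, doubling/halving per stop).
code_values = np.array([16, 44, 78, 118, 160, 200, 235]) / 255.0
scene_linear = 0.18 * 2.0 ** np.array([-3, -2, -1, 0, 1, 2, 3], dtype=float)

# Fit a smooth curve: a cubic polynomial in log2 space is one simple
# choice; a spline or a camera-specific model may fit better.
coeffs = np.polyfit(code_values, np.log2(scene_linear), 3)

def to_scene_linear(v):
    # Evaluate the fitted curve and convert back out of log2 space.
    return 2.0 ** np.polyval(coeffs, v)

# Sample the curve densely and write it as a 1D LUT in .spi1d format.
size = 4096
samples = to_scene_linear(np.linspace(0.0, 1.0, size))
with open("my_camera_to_linear.spi1d", "w") as f:
    f.write("Version 1\nFrom 0.0 1.0\nLength %d\nComponents 1\n{\n" % size)
    f.writelines("%.6f\n" % s for s in samples)
    f.write("}\n")
```

The resulting file is what gets referenced from the OCIO config, so every pixel of the 8-bit plate is pushed through this curve into Scene Referred values.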
The continuing adventures of a Scene Referred VFX workflow on a budget. It starts with capturing our camera package’s dynamic range.
Champagne logarithmic camera output dreams on a beer budget? In this post I introduce a methodology for getting Scene Referred video suitable for VFX out of a budget DSLR.
Did you know your computer is in an unhealthy relationship? One that’s holding it back?
I’m not talking about the relationship it has with you - that’s between you, your computer, and your Candy Crush addiction. No, I’m talking about the relationship your computer has with its display.
There are plenty of terrific tutorials that plop in an HDRI to light a scene, and many more that round out a render by using the same HDRI as a background for the shot. A couple of clicks - and hey presto - perfect lighting. Turns out in many cases that's only half the story.