The last week has been a whirlwind of testing Red's new 5K Epic digital cinema camera; special thanks to Blair Paulsen for lending us his camera, one of the first Epics available. Beyond its beautiful high-resolution sensor, the new kid in town is called "HDRx," a creative take on something digital cameras have lacked for a while: High Dynamic Range.
Leandro put together a video summing up our tests; check it out on Vimeo:
The Epic sensor without HDRx is already in the 11 to 13 stop range, depending on who you ask. With HDRx, however, you get an entirely separate exposure, 2 to 6 stops darker than your original capture. Here's how it works: the camera exposes at a normal shutter speed of 1/48 of a second, then immediately takes a quick second exposure at a fraction of that, say 1/96 of a second or shorter. What you get in your R3D files afterward is a video file containing two separate video tracks, "A" and "X," holding the different exposures: one with better-exposed shadows, and one with better-exposed highlights.
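The relationship between the stop offset and the X-track shutter time is simple: each stop darker halves the exposure duration. A minimal sketch (the function name is ours, not Red's):

```python
def x_track_shutter(base_shutter: float, stops: int) -> float:
    """Return the X-track exposure time for a given stop offset.

    base_shutter: A-track exposure time in seconds (e.g. 1/48).
    stops: how many stops darker the X track is (2 to 6 on the Epic).
    Each stop halves the exposure time.
    """
    return base_shutter / (2 ** stops)

base = 1 / 48
for stops in range(2, 7):
    denom = round(1 / x_track_shutter(base, stops))
    print(f"{stops} stops darker -> 1/{denom} s")
```

So a 2-stop HDRx setting pairs the 1/48 A-track exposure with a 1/192 X-track exposure, and so on down to 1/3072 at 6 stops.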
Processing these two separate tracks into something usable is a job for post, using a fast system, perhaps a Red Rocket card, and software that can take advantage of it natively, like Assimilate Scratch. RedCineX has a nifty little blending mode where you can slide between the two streams and fade them together until you're happy. We at Local Hero, however, did a little experimenting of our own to move beyond simple opacity blending, to see how good we could get an HDRx final image to look using more complex techniques.
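That slider amounts to a linear crossfade between the two frames. A minimal numpy sketch of the idea (our own illustration, not Red's code):

```python
import numpy as np

def opacity_blend(a_frame: np.ndarray, x_frame: np.ndarray, mix: float) -> np.ndarray:
    """Linear crossfade between the A and X frames.

    mix = 0.0 returns the A track untouched, mix = 1.0 returns the X track,
    and values in between fade the two together.
    """
    return (1.0 - mix) * a_frame + mix * x_frame
```

Because it applies the same mix to every pixel, this lifts the highlights and crushes the shadows by the same amount everywhere, which is exactly the limitation that pushed us toward keyed blends.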
The HDRx implementation in Scratch is pretty simple. When you bring in your R3D, you can choose to show either the "A" track (0) or the "X" track (1). Drag the second track into a scaffold layer, and you can choose how to combine them.
We chose to use a Luma Key, where you lay the X track on top and key out the darker areas of the image, then grade both images so that you can't see where one video track leaves off and the other takes over. This can come together quickly, or be finessed quite a bit to make an image that you're happy with. True, the images are slightly offset temporally, and exhibit different amounts of motion blur. If your shot has a lot of motion and hard edges, you may see some artifacting. In these cases there are software solutions to add motion blur back into the X track to help the two blend better.
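The luma-key merge can be sketched in a few lines of numpy. This is a hedged illustration of the technique, not Scratch's actual implementation; the threshold values and function names are our own, and a soft ramp stands in for the key softness you'd dial in by eye:

```python
import numpy as np

def luma(img: np.ndarray) -> np.ndarray:
    """Per-pixel luma from an RGB image using Rec. 709 weights."""
    return img @ np.array([0.2126, 0.7152, 0.0722])

def luma_key_merge(a_track: np.ndarray, x_track: np.ndarray,
                   lo: float = 0.6, hi: float = 0.9) -> np.ndarray:
    """Lay the X track over the A track, keyed by the A track's luma.

    Pixels darker than `lo` keep the A track (better shadows); pixels
    brighter than `hi` take the X track (better highlights); in between,
    a smoothstep ramp softens the key edge so the seam doesn't show.
    """
    y = luma(a_track)
    t = np.clip((y - lo) / (hi - lo), 0.0, 1.0)
    k = t * t * (3.0 - 2.0 * t)  # smoothstep for a soft key edge
    k = k[..., None]             # broadcast over the RGB channels
    return a_track * (1.0 - k) + x_track * k
```

Grading both tracks before the merge, as described above, is what actually hides the transition; the key just decides where each track contributes.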
Here are some stills (with uncompressed 16-bit TIFF links available) to compare on your own. They are scaled to 2K; perhaps I can get a full 5K version up in the future. Down at the bottom are some ProRes 4444 files as well.
And now the second test image, starting with a side-by-side of the ungraded A and X tracks:
And now the final videos, rendered to ProRes 4444 in 2K:
We hope to post the actual R3D files soon, as soon as we can figure out a practical way to do so.
Thanks for watching, let us know how you fared with your own grading tests in the comments below.
Chief Technologist, Local Hero Post