So far, the overall reception of Canon’s latest EOS C200 camcorder has been largely positive, at times even effusive. As with any other camera out there, however, the C200 has its shortcomings, although videographers do seem to really enjoy the newcomer, as it provides an impressive feature set for the price. Of course, one of the most appealing features of the camera is its ability to record 4096×2160 12-bit Cinema RAW Light at up to 30p, as well as 4096×2160 10-bit Cinema RAW Light at 60p.
Even among seasoned professionals, the topic of bit depth remains somewhat contentious, so if you’re interested in finding out for yourself how the available recording options on the C200 stack up against each other, filmmaker Rubidium of Crimson Engine gives you an excellent opportunity: he has put together a short video demonstrating the camera’s real-world performance across its recording options.
It’s no secret that plenty of consumer-grade video cameras currently still shoot at a standard 8-bit depth regardless of the codec available on board. For most applications, this limitation isn’t a real deal-breaker for content creators, unless one needs to shoot a documentary or TV broadcast production that specifically requires a 10-bit 4:2:2 codec at 50 Mbps or higher.
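To put those bit depths in perspective, here’s a quick back-of-the-envelope sketch (plain Python, nothing camera-specific) of how many discrete tonal values each bit depth can store per color channel:

```python
# Illustrative only: tonal levels per color channel at each bit depth
# discussed in this article. These are simple powers of two, not
# Canon-specific figures.

def levels_per_channel(bit_depth: int) -> int:
    """Return the number of discrete values one channel can store."""
    return 2 ** bit_depth

for bits in (8, 10, 12):
    levels = levels_per_channel(bits)
    # Approximate total colors assumes three independent channels.
    print(f"{bits}-bit: {levels:>5} levels per channel, "
          f"~{levels ** 3:,} total colors")
```

That sixteen-fold jump in levels per channel from 8-bit to 12-bit is a big part of why higher bit-depth footage holds up so much better under grading.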
Stepping up to a more expensive professional camcorder usually removes these restraints by providing higher bit-depth recording options. But just because you can shoot at a higher bit depth doesn’t necessarily mean that you always should, right? With that in mind, let’s take a closer look at the screen grabs below.
Starting with the 12-bit video shot in RAW, you can see that the overall image looks much more natural, sharper and true-to-life compared to the footage shot at lower bit depths. The blues seem more saturated and lifelike, the yellow orbs of light in the background are warmer, and the distinct colors of the skin still manage to blend in a way that doesn’t ‘smudge’ the image. All of this comes down to the fact that the higher bit depth retains far more of the color information captured by the sensor.
Looking at the 10-bit video, which was recorded to an external recorder in Apple ProRes 422, there is a subtle difference between the 12-bit and 10-bit images. While the 12-bit footage provides a vivid and accurate representation of the blue and yellow colors in the background, the 10-bit video renders those colors in a somewhat desaturated manner.
Even though the colors are less distinct, the palette can easily be improved with a bit of color grading. As for the skin tones, the colors still blend well together, although the intensity that was apparent in the 12-bit footage seems to have been lost to the lower color depth of the 10-bit recording.
Finally, when analyzing the traditional 8-bit video shot in MP4 4:2:0, the difference becomes even more evident compared to the shots recorded at higher bit depths. In the 8-bit 4:2:0 footage, the colors are noticeably flatter, and there isn’t much distinction between the saturation of the yellow orbs and the blue shelves and drawers. The colors on the face blend into a single overall tone, with patches of ‘whiteness’ and ‘redness’ scattered across the frame.
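The ‘4:2:0’ part of that spec hurts color just as much as the 8-bit part does. As a rough illustration (our own simplification of the standard J:a:b subsampling notation, not any camera’s actual pipeline), here is how many samples each scheme keeps for a 2×2 block of pixels:

```python
# Rough sketch of how chroma subsampling reduces stored color data.
# Counts the samples retained for a 2x2 pixel block under each scheme.

def samples_per_2x2_block(scheme: str) -> int:
    # Luma (Y) is always stored for all 4 pixels; chroma (Cb + Cr)
    # pairs are shared between pixels depending on the scheme.
    chroma_pairs = {"4:4:4": 4, "4:2:2": 2, "4:2:0": 1}[scheme]
    return 4 + 2 * chroma_pairs  # 4 luma samples + shared Cb/Cr pairs

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    n = samples_per_2x2_block(scheme)
    print(f"{scheme}: {n} samples per 2x2 block ({n / 12:.0%} of full color)")
```

In other words, 4:2:0 throws away half the data of full 4:4:4 color before bit depth even enters the picture, which is why fine color detail in skin tends to smear together.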
So, there is indeed a visible difference between bit depths in terms of color fidelity and tonal rendition. As for which is best, the award goes to the 12-bit RAW recording, although it’s important to note that higher bit depths come with more resource-intensive codecs, and with those codecs come even larger file sizes. While the 12-bit Cinema RAW Light provides the highest image quality, you should figure out which bit depth will work for you on a project-to-project basis, and whether or not shooting RAW would be a feasible option for any of your productions.
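To see why file size grows with bit depth, here’s a hypothetical uncompressed data-rate calculation for a 4096×2160 image, assuming three full-resolution samples per pixel. These figures deliberately ignore compression and chroma subsampling, so they are not the C200’s actual bitrates (Cinema RAW Light and MP4 are both compressed); they just show the raw scaling:

```python
# Back-of-the-envelope uncompressed data rates for 4096x2160 video.
# NOT real camera bitrates -- purely illustrative of how bit depth
# scales the amount of data per second.

def uncompressed_mbps(width, height, bit_depth, fps, samples_per_pixel=3):
    """Megabits per second for uncompressed video at the given settings."""
    bits_per_second = width * height * samples_per_pixel * bit_depth * fps
    return bits_per_second / 1e6

for bits in (8, 10, 12):
    rate = uncompressed_mbps(4096, 2160, bits, 30)
    print(f"{bits}-bit @ 30p: ~{rate:,.0f} Mbps uncompressed")
```

Real codecs compress these numbers down dramatically, but the proportional relationship holds: 12-bit carries 50% more data than 8-bit before any encoding, and that overhead follows the footage into storage, transfer and editing.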