Before answering this question, we should take a closer look at the definition and structure of video codecs.
There is a lot of confusion and misunderstanding about how codecs work: what the differences among them are, which one is better than another, and which codec should be used in a given situation.
In the in-depth video tutorial below, David Kong covers all the important aspects of codecs while steering clear of the dense technical detail that can be confusing for many filmmakers. If you simply enjoy shooting video and are curious to know more about codecs and how they actually work, the video should be well worth watching.
Here are the main topics that David covers in his video:
- What a codec is – And how it differs from a container.
- Different types of codecs – And why he frequently uses 4 different codecs on a single project.
- Bit Depth – What it means and why it matters.
- Chroma Subsampling – 4:4:4, 4:2:2, and 4:2:0, and when it becomes an issue.
- Spatial Compression and Blocking – One of the most common artefacts you see in everyday work.
- Temporal Compression – Long-GOP codecs, inter-frame compression, and ALL-I codecs.
- Lossless vs. Lossy compression – How image compression differs from data compression.
- Bit Rate – How to calculate bit rates and the differences between kbps/kBps/Mbps/MBps.
- Raw – Briefly, the difference between Raw, compressed, and uncompressed.
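To give a feel for the bit-rate arithmetic and chroma subsampling topics listed above, here is a minimal sketch (not from the video; the 1080p25 8-bit figures are illustrative assumptions) showing how subsampling and bit depth translate into raw data rates, and how Mbit/s differs from MB/s:

```python
# Rough uncompressed bit rates for 1080p25, 8-bit video under the three
# chroma subsampling schemes. "Samples per pixel" is the average number
# of Y/Cb/Cr samples stored per pixel: 4:4:4 keeps all three, 4:2:2
# halves the chroma horizontally, 4:2:0 halves it in both directions.
width, height, fps, bit_depth = 1920, 1080, 25, 8

samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

for scheme, spp in samples_per_pixel.items():
    bits_per_second = width * height * spp * bit_depth * fps
    mbps = bits_per_second / 1_000_000      # megabits per second (Mbit/s)
    mBps = bits_per_second / 8 / 1_000_000  # megabytes per second (MB/s)
    print(f"{scheme}: {mbps:.0f} Mbit/s = {mBps:.1f} MB/s")
```

Note the factor of 8 between megabits (lowercase b) and megabytes (uppercase B): quoting the wrong unit changes the number by an order of magnitude, which is exactly the kbps/kBps confusion David flags.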
How Codecs Work – Tutorial from David Kong on Vimeo.
As David Kong explains, different codecs suit different situations: it depends on whether you pick one as your primary capture codec or use it for editing, delivery, or archiving. Whatever codec you use, you should first be aware of its technical characteristics, such as bit depth, chroma subsampling, bit rate, and supported resolutions.
There is a good reason why many camera manufacturers have implemented robust, efficient, industry-standard intermediate codecs such as ProRes in their products. The benefit of an intermediate codec is that it retains higher quality than end-user codecs while still requiring far less expensive disk systems than uncompressed video.
ProRes, for instance, is a lossy video compression format developed by Apple for use in post-production that supports SD, HD, 2K, 4K, and 5K resolutions. It comes in different flavours:
- ProRes Proxy
- ProRes LT
- ProRes 422
- ProRes 4444
- ProRes XQ
DNxHD, on the other hand, is intended to be usable both as an intermediate format for editing and as a presentation format. It offers a choice of three user-selectable bit rates:
- 220 Mbit/s with a bit depth of 10 or 8 bits
- 145 Mbit/s with a bit depth of 8 bits
- 36 Mbit/s with a bit depth of 8 bits
DNxHD data is typically stored in an MXF container, although it can also be stored in a QuickTime container. Unfortunately, DNxHD doesn't support resolutions higher than 1080p.
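As a sense check on those fixed bit rates, it is easy to convert them into storage consumed per minute of footage (a quick sketch; figures are rounded):

```python
# Convert DNxHD's user-selectable bit rates into storage per minute.
for mbit_per_s in (220, 145, 36):
    mb_per_s = mbit_per_s / 8            # megabits/s -> megabytes/s
    gb_per_min = mb_per_s * 60 / 1000    # MB/s -> GB per minute
    print(f"{mbit_per_s} Mbit/s ≈ {gb_per_min:.2f} GB per minute")
```

So an hour of 220 Mbit/s footage consumes roughly 100 GB, which is why the choice of bit rate matters as much for your disk budget as for image quality.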
CineForm is an intermediate codec most commonly wrapped in an AVI or MOV container. Current implementations support 10-bit 4:2:2 YUV, 12-bit 4:4:4 RGB and RGBA, and 12-bit CFA Bayer RAW compression. Compression data rates typically range from 10:1 to 3.5:1, depending on quality settings, and there is also an uncompressed mode for RAW files. CineForm supports resolutions up to 4K as well.
It is safe to say that any of the above options could serve in almost any situation as a universal codec for capturing, editing, delivery, and archiving. However, when you shoot at resolutions above HD, as a rule of thumb it is better to choose ProRes or CineForm as the intermediate codec for your end-to-end workflow, depending on the operating system you use (Mac or PC).
[via philipbloom.net and davidkong.net]