
Using 8-bit Versus 10-bit Video in Standard Definition
While both 8-bit and 10-bit uncompressed video can provide excellent broadcast quality, 10-bit represents a significantly higher quality and is preferable in many situations when working with standard definition video. (When working with HD, you'll want to use Apple ProRes 422 in most cases. If the media has already been transferred via FireWire to the Mac, you might consider HDV or DVCPROHD as alternatives, although ProRes 422 will still generally be preferable.)
Because 10-bit video has four times the numerical precision of 8-bit video, its signal-to-noise ratio is 12 dB higher.
Visually, the difference between 8-bit and 10-bit video can be substantial. In 8-bit video, "contour lines" or "striations" may be visible, particularly in scenes with soft gradients such as a ramp or a sunset. For example, if a sky region is mostly one color but varies by only a few code values from one side of the picture to the other, you may see contour lines where the signal steps from one digital value to the next.
Since each quantization step in a 10-bit system is only one fourth as large as in an 8-bit system, these contours become invisible and the sky varies smoothly.
10-bit video is often used when the source and output video (or "master") are also 10-bit. Even if the input and/or output video is 8-bit, a 10-bit project will maintain higher quality when significant effects rendering is involved.
Industry standard professional mastering formats—Sony Digital Betacam for
Standard Definition and Panasonic D5 for High Definition—are both true 10-bit
formats.