Asked • 06/25/19

8bit vs 10bit vs 12bit?

I do a lot of HEVC encoding, and what I've read is that 10-bit is always better than 8-bit, **even from an 8-bit source**, because it avoids rounding errors. Is that true? And if so, is 12-bit better than 10-bit, or are the rounding errors insignificant at that point?
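To make the rounding-error claim concrete, here is a minimal sketch. It does not model HEVC's actual internals; it simulates the general idea with an illustrative, assumed processing step (a gamma adjustment): an 8-bit source value is transformed, stored at either 8-bit or 10-bit precision, then transformed back, and the worst-case error is compared.

```python
# Illustrative sketch only: the gamma step is an assumption standing in
# for any intermediate processing, not HEVC's real transform pipeline.

def quantize(value, bits):
    """Round a normalized [0, 1] value to the nearest code at `bits` depth."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

def roundtrip_error(bits, gamma=2.2):
    """Worst-case error (in 8-bit code units) over all 8-bit source values
    when the intermediate result is stored at `bits` of precision."""
    worst = 0.0
    for code in range(256):                    # every possible 8-bit source value
        x = code / 255                         # normalize to [0, 1]
        y = quantize(x ** (1 / gamma), bits)   # process, then store at `bits`
        back = y ** gamma                      # undo the processing step
        worst = max(worst, abs(back - x) * 255)
    return worst

print(f"worst-case error, 8-bit intermediate:  {roundtrip_error(8):.3f}")
print(f"worst-case error, 10-bit intermediate: {roundtrip_error(10):.3f}")
```

The 10-bit intermediate has four times as many code values, so the rounding error introduced at the storage step is several times smaller, which is the intuition behind encoding an 8-bit source at 10-bit.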

1 Expert Answer

Albert G. - Photography, Photoshop, Video Production
Tutor 4.5 (10)
Answered • 06/28/19
