Well, in general, I thought it was a hardware issue.
The drives read the "bits" as bytes (as they should). Reading a whole character at a time rather than individual bits, I'd guess, speeds things up for them.
Well, when they master discs, they have the luxury of controlling the actual bit layout, which is how they generate bad bytes/sectors. (Mind you, I'm a geek, but not [all]-knowing.)
Thus the problem: when they introduce error bits as their form of copy resistance, they do have access to the bits. When optical drives read the disc, they only see bytes. Also, our writers are not "industrial grade" (if such a thing exists); either way, the presumption is that our writers write in bytes, not bits.
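A rough sketch of the idea, purely to illustrate: the protection scheme expects certain sectors to be *unreadable* on an original pressed disc, while a consumer burner writes every sector cleanly, so the copy reads back too well. All the names and sector numbers here are made up for the example, not any real scheme's details.

```python
# Hypothetical model of "deliberate bad sector" copy resistance.
# A pressed original contains sectors mastered to be unreadable;
# a consumer burner "fixes" them when copying, which gives the copy away.

PROTECTED_SECTORS = {17, 42}  # assumed positions of the intentional bad sectors

def read_sector(disc, n):
    """Return sector data, or None if the sector is unreadable."""
    return None if n in disc["bad"] else f"data-{n}"

def looks_original(disc):
    # The check passes only if the *expected* sectors fail to read.
    return all(read_sector(disc, n) is None for n in PROTECTED_SECTORS)

original = {"bad": PROTECTED_SECTORS}
copy = {"bad": set()}  # the burner wrote every sector cleanly

print(looks_original(original))  # True: the bad sectors are there
print(looks_original(copy))      # False: the copy reads too cleanly
```

So the trick isn't hiding data; it's that an original fails to read in exactly the right places, and a byte-level copy can't reproduce that failure.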
Course, I could just be a "monkey's uncle".