rankus wrote on Tue, 17 October 2006 15:16 |
I read a few years back that the CDR acceptable error rate is x00 errors per second....(can't recall exact number but it was in the hundreds)
|
Red Book specifies two levels of Reed-Solomon coding (CIRC), which, if you weren't lazy like me, you could look up to find out exactly how many errors are correctable. It's obviously a fairly robust system. But that's for audio. I don't recall the exact ECC used for CD data discs, but suffice it to say: it works.
Quote: |
Someone had done some testing on multi generation CDR to CDR transfers of audio, and found that they could hear a generational loss at approx. 14 generations....
|
Here's the deal: a corrected error is indistinguishable from no error occurring at all. That's the magic behind error correction and convolutional coding. If you're getting uncorrectable errors, it's likely that the disc is damaged, or that the mechanism reading the disc is damaged.
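To make that concrete, here's a toy illustration. This is not the actual CIRC scheme from Red Book -- just a simple triple-repetition code -- but it shows the principle: as long as the errors stay within what the code can correct, the decoded output is bit-for-bit identical to the original.

```python
# Toy demo: a corrected error is indistinguishable from no error at all.
# NOT the Red Book CIRC scheme -- just a 3x repetition code, which can
# correct any single flipped bit per 3-bit group by majority vote.

def encode(bits):
    """Repeat each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Majority-vote each group of three back to one bit."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

original = [1, 0, 1, 1, 0, 0, 1, 0]
coded = encode(original)

# Flip one bit in every group of three -- the worst this code can correct.
corrupted = coded[:]
for i in range(0, len(corrupted), 3):
    corrupted[i] ^= 1

recovered = decode(corrupted)
print("recovered equals original:", recovered == original)  # prints True
```

The point: there's no "partially degraded" output from the corrector. Either the errors are within its capacity and you get exactly the original data back, or they aren't and you get an uncorrectable-error flag.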
To be honest, I don't know if those CD-R duplicator thingies read raw bits from the disc and make the dupes that way, or if they actually read and decode the data before writing. I suspect that they do the latter, since the IDE interface gives you the decoded data, not the raw bits. (The drives do all of the work.)
Having said all of that: it's fairly simple to test whether you've got identical data after n generations of copies. Simply write a program that reads the data from the original disc and from the copy disc, and compare. No magic "nulling" or any other silliness (like a listening test) -- just a direct data compare.
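A minimal sketch of that compare, assuming you've first ripped each disc to an image file (with `dd` or whatever ripper you like -- the filenames and the little self-test below are made up for the demo):

```python
# Direct data compare of two disc images, chunk by chunk, so large
# images never need to fit in RAM. The demo at the bottom fabricates
# two tiny throwaway "images" just to show usage.
import os
import tempfile

def images_identical(path_a, path_b, chunk_size=1 << 20):
    """Return True iff the two files are bit-for-bit identical."""
    if os.path.getsize(path_a) != os.path.getsize(path_b):
        return False
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        while True:
            a = fa.read(chunk_size)
            b = fb.read(chunk_size)
            if a != b:
                return False
            if not a:          # both files exhausted, no mismatch found
                return True

# Demo with two throwaway files standing in for real disc rips.
with tempfile.TemporaryDirectory() as d:
    p1 = os.path.join(d, "original.img")
    p2 = os.path.join(d, "generation14.img")
    data = bytes(range(256)) * 64
    for p in (p1, p2):
        with open(p, "wb") as f:
            f.write(data)
    print(images_identical(p1, p2))  # prints True
```

If that returns True after fourteen generations, whatever people think they're hearing isn't in the data.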
And, again, back to the original hard drive vs. copied hard drive argument: if you couldn't copy data from one drive to another with absolute confidence that the copy would be identical, we wouldn't be having this discussion, because none of our computers would work.
-a