On Thursday, Jun 10, 2004, at 15:57 Canada/Eastern, James Asherman wrote:

> On Thursday, June 10, 2004, at 03:25 PM, zhmmy harper wrote:
>
>> in the right circumstances, the normal human ear can even tell the
>> difference between the original CD and a CD-R copy.
>
> I wouldn't bet on that one. I would have to try it and see.

So you would.

>> I'm just talking about the normal frequency range of the human ear
>> not being able to hear ultra lows and highs. [...]
>
> No no no. Digitizing, limits the frequency range to rid us of surface
> noise. [...]

That's not the issue here, since we're talking only digitized music.

The issues are, first, lossy compression. The amount of audio data that goes in prior to compression (input) is the same as the amount of audio data generated for listening (output). Lossy compression means that some of the original data in the input is thrown away during compression and recreated in the output. In other words, the output is an approximation of the original sound data, just as a photocopy is an approximation of the original picture. And, as with a photocopy, how close the copy is to the original depends on many factors.

The second issue is CD technology, and the weakness of the Red Book standard, which defines the CD-DA format. Ripping CD-DA can be inaccurate, and how well it's done depends both on the CD drive and on the software used (and there is no Mac software that comes close to the precision offered by Exact Audio Copy on Windows). A further problem is posed by the technological difference between CD-ROM and CD-R. Because it uses a dye instead of metal, the latter has a considerably higher error rate during playback than the former, hence more frames end up as mechanical approximations of the original.

All this means that a CD-DA/CD-R copy is rarely an exact duplicate of the CD-DA/CD-ROM original, and, in the right circumstances -- a rich source, a good audio reproduction system, a trained (but normal) ear -- the difference can be detected.
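To make the lossy-compression point concrete, here is a toy sketch in Python. Crude requantization stands in for a real perceptual codec (MP3 and AAC work very differently), so this is only an illustration of the principle: information is discarded on the way in, and the decoded output merely approximates the input, sample count notwithstanding.

```python
def lossy_round_trip(samples, keep_bits=8):
    """'Encode' 16-bit samples by dropping low-order bits, then 'decode'."""
    drop = 16 - keep_bits
    encoded = [s >> drop for s in samples]   # information is discarded here
    decoded = [e << drop for e in encoded]   # reconstruction is only approximate
    return decoded

original = [12345, -2048, 30000, -1, 7]
copy = lossy_round_trip(original)

for o, c in zip(original, copy):
    print(f"in={o:6d}  out={c:6d}  error={o - c:4d}")
# Same number of samples in and out, but the values no longer match exactly.
```

Running it shows nonzero errors on most samples, which is the photocopy analogy in miniature: the output resembles the input without reproducing it.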
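And a minimal sketch of how one might check whether a CD-R rip is bit-identical to a rip of the original, by comparing checksums of the decoded PCM -- in the spirit of the CRCs that Exact Audio Copy reports, though far less sophisticated. The file names are hypothetical, and both files are assumed to be plain 16-bit/44.1 kHz WAV rips.

```python
import hashlib
import wave

def pcm_digest(path):
    """Return an MD5 digest of the raw PCM frames in a WAV file."""
    with wave.open(path, "rb") as w:
        return hashlib.md5(w.readframes(w.getnframes())).hexdigest()

a = pcm_digest("track01_original_rip.wav")
b = pcm_digest("track01_cdr_copy_rip.wav")

print("identical" if a == b else "rips differ at the sample level")
```

If the digests differ, the copy is, by definition, not an exact duplicate -- whether the difference is audible is then down to the source, the playback chain, and the ear, as argued above.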