Did any of you catch that "Wired Science" episode on PBS that supposedly compared analog recording to digital recording? Personally, I found their method of comparison impossibly bizarre: as far as I could tell, the test subjects could not have been listening to anything other than a digital recording for the entire listening test. I'm a friend of the sound mixer for the "Wired Science" series, and I e-mailed him about my recent experience setting up a direct-drive turntable in my studio's green room stereo.
My previous (broken) turntable was belt drive, and with a high-quality Grado cartridge on it I felt I was getting a very pure representation of my vinyl records through my excellent receiver feeding Mission speakers. With the direct-drive turntable, also using a Grado cartridge, my records sounded much stiffer. I have my studio's Coleman Audio passive speaker controller hooked up to the green room's receiver, so I can hear, for example, the MP3s on my studio computer through that same receiver, and I found the MP3s more cathartic to listen to than my excellent vinyl collection.
I concluded that there are aspects of analog reproduction that degrade sound in much the same way that digital degradation does. As far as my 'test' was concerned, both mediums were subject to what I would call a 'disruption of the fluidity of the time flow' in the reproduction medium. I have run into this problem with different kinds of wire, different digital clocks, different A/D/A converters, different DSP, and of course different microphones and speakers. I would like to postulate that the fundamental audible difference between so-called "analog" and so-called "digital" is a fidelity problem shared by both mediums.
Your thoughts?
Nicholas