So... I've been thinking (dangerously enough) about the thesis of this thread. I got to wondering whether the question posed might be rephrased in a way that makes it more interesting and more useful.
My sub-thesis is that the type and spectrum of noise in LP playback is a factor in the perceived warmth and yumminess of same, but only a factor. Pops and ticks aren't good, full stop, but the low-frequency noise encountered in LP playback (even with an inherently quiet deck), I suspect, imparts a sense of acoustic space that is part of the charm of the medium. I wonder if there's anything else? Heck, who knows, maybe phase shifts relating to the imposition, and subsequent correction, of the RIAA equalization curve applied when the lacquer is cut?
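For reference, the RIAA replay (de-emphasis) curve is defined by three time constants: 3180 µs, 318 µs, and 75 µs. Here's a quick sketch of its magnitude response, normalized to 0 dB at 1 kHz (the function name is mine; this is just the textbook single-zero, two-pole form, not any particular phono stage's implementation):

```python
import math

# RIAA playback (de-emphasis) time constants, in seconds
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_playback_db(freq_hz, ref_hz=1000.0):
    """Magnitude of the RIAA replay curve in dB, normalized at ref_hz."""
    def mag(f):
        w = 2 * math.pi * f
        # one zero at 1/T2, poles at 1/T1 and 1/T3
        return abs(complex(1, w * T2)) / (abs(complex(1, w * T1)) * abs(complex(1, w * T3)))
    return 20 * math.log10(mag(freq_hz) / mag(ref_hz))
```

Relative to 1 kHz, this works out to roughly +19.3 dB at 20 Hz and -19.6 dB at 20 kHz, which is why any phase error in applying or undoing that much EQ is at least a plausible suspect.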
Even many of the vinylista* will allow that a digitally captured needle drop sounds pretty good; i.e., pretty vinylistic.
This got me to wondering whether "null testing" could give some insight into the roots of the ('vinyl sounds better') phenomenon. Unfortunately, what probably wouldn't work is nulling the digitized output of playing an LP track against "the same" track from a digital source (i.e., not a needle drop). The master tape (or whatever) used to cut the LP lacquer would almost certainly be different from the one used to generate the digital version, if for no other reason than the imposition of RIAA EQ, mixing the bass to mono, and/or rolling off extremely low frequencies in the analog pathway.
I don't know enough about anything (as the ramblings above testify!) to devise an appropriately controlled comparison to see what's really different (besides clicks and pops) between the two sources -- but I'll bet there is a good way(?).
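For what it's worth, the arithmetic of a null test is the easy part; the hard part is everything a real LP-vs-digital comparison would also need (sub-sample alignment, speed-drift correction, undoing mastering EQ differences). A minimal sketch in Python/NumPy, assuming two already-digitized mono tracks at the same sample rate (the function name is mine):

```python
import numpy as np

def null_residual(ref, test):
    """Time-align and gain-match `test` against `ref`, subtract, and return
    the residual level in dB relative to `ref` (more negative = deeper null)."""
    n = min(len(ref), len(test))
    ref = np.asarray(ref[:n], dtype=float)
    test = np.asarray(test[:n], dtype=float)
    # Integer-sample alignment via cross-correlation. A real needle-drop
    # comparison would also need sub-sample alignment and drift correction,
    # which this sketch ignores.
    lag = int(np.argmax(np.correlate(test, ref, mode="full"))) - (n - 1)
    test = np.roll(test, -lag)
    # Least-squares gain match, so a simple level difference doesn't dominate
    gain = np.dot(ref, test) / np.dot(test, test)
    residual = ref - gain * test
    rms = lambda x: np.sqrt(np.mean(x ** 2))
    return 20 * np.log10(rms(residual) / rms(ref) + 1e-12)
```

Whatever survives in the residual -- surface noise, clicks, EQ and phase differences -- is exactly the "what's really different" in question, which is why the two sources would need to share a master for the result to mean anything.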
_______________
* of which I would have to count myself as a member, although not a rabid one.