AnalogSteph
Major Contributor
This is the phono MM input, where capacitive loading of the cartridge is critical and very much affects its frequency response well within the audible range. Try modeling an MM cartridge as roughly 750 ohm + 440 mH and you should see the problem - these buggers are super inductive, and their source impedance has climbed to >10 kOhm by the upper end of the audible range. The standard input impedance these days is 47k || 220 pF, and even that does not go down too well with quite a few higher-end AT cartridges once you add cable capacitance (~130 pF typ.), so it is common practice to reduce the input resistance to 36-39k for these. With a total of 600 pF, not too many cartridges will be playing as intended.

OK, so from my understanding the guy swapped the original 470 pF caps in the input-stage low-pass filter for 100 pF or 125 pF. This will probably push the filter's cutoff well above the original - and well above the human audible limit - but change nothing else.
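To see why the numbers matter, here's a quick sketch using the cartridge model above (750 ohm + 440 mH, values from the post). It computes the source impedance across the audible band and the undamped electrical resonance against the load capacitance - 350 pF (220 pF input + ~130 pF cable) vs. the ~600 pF total mentioned above. Note this ignores the damping from the 47k load resistor, so it only locates the resonance, not its height:

```python
import math

def cartridge_z(f, r=750.0, l=0.44):
    """Magnitude of a simple MM cartridge model: R in series with L, in ohms."""
    return math.hypot(r, 2 * math.pi * f * l)

def resonance(l, c):
    """Undamped LC resonance frequency in Hz."""
    return 1.0 / (2 * math.pi * math.sqrt(l * c))

# Source impedance climbs steeply across the audible band:
for f in (1_000, 10_000, 20_000):
    print(f"{f:>6} Hz: |Z| ~ {cartridge_z(f) / 1000:.1f} kOhm")

# Electrical resonance against the total load capacitance:
for c_pf in (350, 600):  # 220 pF input + 130 pF cable, vs. ~600 pF total
    f_res = resonance(0.44, c_pf * 1e-12)
    print(f"{c_pf} pF load: resonance ~ {f_res / 1000:.1f} kHz")
```

With 350 pF the resonance lands around 12.8 kHz (already marginal); with 600 pF it drops to roughly 9.8 kHz, squarely inside the audible band - which is the "not playing as intended" problem in a nutshell.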
Whacking in a 470 pF may have been needed to pass EMI testing, but from a functional POV it's clearly a fail.
Using 47 kOhms has always been a compromise between noise performance and practicality. Going higher would enable a few dB lower noise still, but total capacitance would then have to be even lower. This is only feasible if the preamp can be mounted inside the turntable or, even more radically, if a tiny, super-lightweight SMD buffer circuit is mounted right at the cartridge itself. (An idea that turned out to have been pioneered decades ago, if I remember the diyAudio discussion correctly...)
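The "few dB lower noise" claim can be put to numbers: the thermal noise current a load resistor injects into the (high) cartridge source impedance scales as 1/sqrt(R), so a larger load resistor is quieter in this respect. A minimal sketch, comparing 47k against 150k (the 150k is my own illustrative pick, not from the post) and counting only the resistor's Johnson noise, not the amplifier's:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 290.0           # room temperature, K

def noise_current(r):
    """Thermal (Johnson) noise current density of a resistor, in A/sqrt(Hz)."""
    return math.sqrt(4 * K_B * T / r)

i_47k = noise_current(47e3)
i_150k = noise_current(150e3)  # assumed illustrative higher-value load
print(f"47k : {i_47k * 1e15:.0f} fA/sqrt(Hz)")
print(f"150k: {i_150k * 1e15:.0f} fA/sqrt(Hz)")
print(f"improvement: {20 * math.log10(i_47k / i_150k):.1f} dB")
```

Going from 47k to 150k buys about 5 dB of current-noise reduction - consistent with "a few dB" - at the cost of needing even less load capacitance to keep the resonance out of the audible band.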