restorer-john
Grand Contributor
Although the device was developed 45 years ago, its performance (e.g. a distortion factor below 0.003%, its noise figures, etc.) has only really been surpassed in the past few years.
The SE-A1 power amplifier also set standards: a real 350 W of class A continuous power per channel (20 Hz - 20 kHz) into 4 and 8 ohms, a damping factor of 100 at 8 ohms, 275 W of input power at idle, a rated distortion factor of 0.003% (at rated power, 20 Hz - 20 kHz), a signal-to-noise ratio of 120 dB, and a total harmonic distortion of 0.01% over a power bandwidth of 5 Hz - 100 kHz (not only at 1 kHz ;o).
That was 1977!
The figures published as specifications were also extremely conservative. They were guaranteed worst-case numbers, unlike the best-case, single-frequency numbers published by so-called SOTA manufacturers these days.
Distortion numbers in today's 'SOTA' gear have not improved on those of proper preamplifiers. And a proper preamplifier has a full range of controls, inputs, tone circuitry, filters and a high-quality RIAA stage. The single-stage, op-amp-in-a-can boxes being held up as 'preamplifiers' by the uninformed in 2022 make me laugh.
Here's an example of the specs from one of my vintage (1983) Denon preamplifiers:
I cannot measure the THD of my unit: it is below the residual of my test gear and well below 0.001%, at any frequency. All I can see are mains spurs, which are the only thing reducing its S/N, and they are so far down it doesn't matter.
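To put those figures in perspective, here is a quick sketch (mine, not from the original specs) of the standard conversion from a distortion percentage to a level in dB below the fundamental, which shows why anything around 0.001% disappears into a typical analyser's residual:

```python
import math

def thd_percent_to_db(thd_percent):
    """Convert a THD figure in percent to dB relative to the fundamental.

    A ratio r expressed in dB is 20*log10(r); THD in percent is the
    ratio times 100.
    """
    return 20 * math.log10(thd_percent / 100)

# The SE-A1's rated 0.003% works out to roughly -90.5 dB,
# and 0.001% works out to -100 dB below the fundamental.
print(round(thd_percent_to_db(0.003), 1))  # -90.5
print(round(thd_percent_to_db(0.001), 1))  # -100.0
```

So a preamp measuring "much below 0.001%" is more than 100 dB down, which is indeed at or under the residual distortion of most bench test gear.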
Preamplifiers were a solved problem 45 years ago; since then they have only become more and more stripped down, devoid of actual usefulness, and basically a waste of money for anyone who owns more than one or two pieces of HiFi gear.