So we all know that 1% THD is -40dB. (If you didn't, you do now!)
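(The conversion, in case anyone wants it: dB = 20·log10(distortion ratio), so 1% → 20·log10(0.01) = -40dB, and -100dB → 10^(-100/20) = 0.001%.)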
In a lot of Amir's reviews, the power vs. THD+N chart shows distortion falling as power increases, and usually somewhere around 10-20% of full power, this straight line reaching for the bottom starts bending toward the side (I consider full power the point at which clipping occurs).

My read of the chart: while the line is falling in a straight line as power increases, harmonic distortion is entirely masked by the grass that is the hiss. When it stops falling at that constant rate, I believe harmonic distortion is beginning to poke through the noise (i.e. it's not entirely masked anymore). When the THD+N value starts rising quickly toward the top of the chart, that is when clipping is happening. Distortion rises fast there, but not extremely fast; the difference in power output between 1% and 10% distortion is actually pretty huge.

So say the amplifier is normal-good, and right before it begins clipping at 122 watts into 8 ohms, THD+N is -100dB, or 0.001%. Since manufacturers often give the power rating at 0.7% or 1% THD, I'd like to know, roughly, by what amount power output increases going from -100dB to -40dB. Basically I want to be able to take a rating like:
140W @ 1% THD+N
And multiply it by 0.94 to get where the amp [most likely] begins clipping.
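(With that 0.94, which is just my guess at the factor: 140W × 0.94 ≈ 131.6W as the estimated onset of clipping.)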
For this to be generally applicable and on the conservative side, what I'm really after is the power difference going from -80dB to -40dB THD+N, where the rise in THD+N is being driven by clipping rather than noise.
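To make the bookkeeping concrete, here's a minimal Python sketch of the dB/percent conversion plus the derating idea. The 0.94 default is just my placeholder guess from above, not a measured constant, and `estimate_clipping_power` is a made-up helper name:

```python
import math

def thd_fraction_to_db(fraction: float) -> float:
    """Convert a THD+N ratio (e.g. 0.01 for 1%) to dB (amplitude ratio)."""
    return 20 * math.log10(fraction)

def db_to_thd_percent(db: float) -> float:
    """Convert a THD+N level in dB back to a percentage."""
    return 10 ** (db / 20) * 100

# Sanity checks against the numbers in the post:
assert round(thd_fraction_to_db(0.01)) == -40      # 1% THD+N = -40dB
assert round(db_to_thd_percent(-100), 5) == 0.001  # -100dB = 0.001%

def estimate_clipping_power(rated_watts: float, derate: float = 0.94) -> float:
    """Scale a power rating given at 1% THD+N down to a guess at where
    clipping actually begins. The 0.94 default is a placeholder guess,
    not a measured constant."""
    return rated_watts * derate

print(estimate_clipping_power(140))  # 140W @ 1% THD+N -> ~131.6W
```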