Hey all, still getting used to this thing. I see that CD players are generally rated for 2 V rms output, so I started there. I've noticed that as you increase the output of the internal generator, the overall amplifier output distortion goes down.
For example, the 1 kHz generator level was lowest in the first pic, but at rated output I was over 0.4% THD.
I then bumped the 1 kHz output up to 1 V rms and got THD at the rated output of 45 watts down to 0.08%.
I bumped it up again to 2.5 V rms and got better than the rated 0.05% spec.
How high an input level are you supposed to use? I know it'll be different with a power amp, since some amps need several volts of input to reach rated output.
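To illustrate why the reading improves as the level goes up (made-up numbers, not my actual measurements): the analyzer reports THD+N, and the noise part of that is roughly constant at the amp's output, so it is a bigger fraction of a small signal than of a large one.

```python
import math

# Hypothetical sketch: THD+N as a percentage of the fundamental.
# Assumes a fixed 1 mV rms of output noise and harmonics at 0.02% of
# the signal -- invented numbers, just to show the trend.

def thd_n_percent(signal_vrms, harmonics_vrms, noise_vrms):
    """THD+N (%) = rms sum of harmonics and noise, relative to the signal."""
    return 100 * math.sqrt(harmonics_vrms**2 + noise_vrms**2) / signal_vrms

noise = 1e-3  # 1 mV rms, fixed noise floor at the amp output
for vout in (1.0, 6.0, 19.0):
    harm = 0.0002 * vout  # harmonics scale with the signal in this sketch
    print(f"{vout:5.1f} V out -> {thd_n_percent(vout, harm, noise):.3f}% THD+N")
```

With these numbers the percentage falls as the output rises, which matches what I was seeing on the analyzer.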
I would look at the gain when you are testing. A lot of amps/integrated amps/receivers will specify the input signal level needed to achieve the rated output power. THD may go down as power goes up, but THD+N tends to climb again as you get near the limits of the amp. Some amps with low gain are going to require a larger input signal. With a receiver or integrated amp (or a power amp with gain controls), the standard I have found is to set the gain to 29 dB and adjust the input signal level to achieve the desired output level.
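A quick sketch of the arithmetic behind that (my assumptions, not from the spec sheet: the 45 W figure from above, an 8-ohm load, and the 29 dB gain setting mentioned here):

```python
import math

def vout_for_power(power_w, load_ohms):
    """Output voltage (V rms) needed for a given power into a load: P = V^2 / R."""
    return math.sqrt(power_w * load_ohms)

def vin_for_output(vout_rms, gain_db):
    """Input voltage (V rms) needed, given the amp's voltage gain in dB."""
    return vout_rms / 10 ** (gain_db / 20)

vout = vout_for_power(45, 8)    # about 18.97 V rms for 45 W into 8 ohms
vin = vin_for_output(vout, 29)  # about 0.67 V rms of input at 29 dB gain
print(f"{vout:.2f} V rms out, {vin:.3f} V rms in")
```

So at the 29 dB setting, somewhere under a volt of input drives that amp to rated output, which is why a 2 V rms CD-player-level signal pushes it past its limits.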
Man, I appreciate that. I have a lot to learn in this area; much of what you said went over my head for now. I'm going to play around some more and report back.
Amir has a nice video on understanding amp measurements: