I was looking at the SNR spec for a mid-'80s power amp, for which the manufacturer claimed 105 dB. I measured it using the QA402 (A-weighted) as about 82 dB at ~200 W into 8 ohms. I was looking over a review of the amp in an AUDIO magazine from that time period, and they commented:
I have seen another amp from that time period, from another manufacturer, that specifies a high SNR, which I am guessing was arrived at for the same reason (taking the 1 W SNR measurement and adding the ratio between 1 W and the max power of the amp). I have always stated the SNR at the power level I am putting out, as reported by my QA402. I am curious if anyone knows why they did this, or does their measurements that way… Thanks for any input…
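For what it's worth, the arithmetic lines up: referencing the same noise floor to full power instead of 1 W adds 10·log10(P_max / 1 W), which for a 200 W amp is about 23 dB, roughly the gap between the 82 dB measurement and the 105 dB spec. A minimal Python sketch of that conversion (the function name is mine, and the numbers are just the ones from the post above):

```python
import math

def snr_referenced_to_full_power(snr_at_1w_db, max_power_w, ref_power_w=1.0):
    """Re-reference an SNR measured against a 1 W output to the amp's max power.

    The noise floor is unchanged, so raising the reference signal from
    ref_power_w to max_power_w simply adds 10*log10(max_power_w / ref_power_w)
    to the stated ratio.
    """
    return snr_at_1w_db + 10 * math.log10(max_power_w / ref_power_w)

# Illustrative numbers from the post: ~82 dB, 200 W amp
print(round(snr_referenced_to_full_power(82, 200), 1))  # 82 + ~23 dB ≈ 105.0
```

That doesn't prove the manufacturer did it this way, but it shows how a 105 dB spec and an 82 dB bench measurement can describe the same amplifier.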
Back in the day, most test instruments for audio didn't have the 120 dB kind of noise floor we see today. The HP 8903 (which I still occasionally use) has trouble getting to 100 dB. So I suspect this is why they used the method you describe.
That could very well be. I suppose I could try to find the standard for SNR back then; it should specify how it was to be measured… might do that yet… thanks for the reply.
I think petveot is right, but I also think you, Var, are right, because I remember that in my university days (more than 40 years ago) I was taught to measure SNR and then calculate it like this. I wouldn't be surprised if you found that the measurement standard back then is what you found in the magazine.