Hello
I’m using the QA402 for the development of a preamplifier with phono input. Measuring the phono inputs with the QA402 has some challenges which need to be considered.
Output noise
A phono preamp typically has a gain of +40dB for an MM cartridge or +60dB for an MC cartridge (@ 1kHz). Therefore the input to the preamp needs to be low level (e.g. 5mV @ 1kHz = -46dBV). The QA402 output stage has an output attenuator followed by an output driver stage. For such small output signals, the output driver stage adds noise (or the internal -30dB attenuator adds it). Therefore, the THD+N and SNR measurements are not accurate. Using a -40dB (9.9k Ohm + 100 Ohm) or -60dB (10k Ohm + 10 Ohm) passive attenuator to feed the preamp input gives better results.
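For reference, the divider values above are easy to sanity-check with a quick calculation (a minimal sketch; the resistor values are the ones from this post, and the formula assumes an unloaded divider):

```python
import math

def divider_db(r_series, r_shunt):
    """Attenuation of an unloaded resistive divider, in dB."""
    return 20 * math.log10(r_shunt / (r_series + r_shunt))

print(divider_db(9900, 100))   # exactly -40 dB
print(divider_db(10000, 10))   # about -60 dB
```

The shunt resistor is kept small (100 Ohm or 10 Ohm) on purpose: it is tiny compared with a typical 47k phono input impedance, so the preamp's loading barely shifts the ratio, and the low source impedance seen by the preamp keeps the attenuator's own noise contribution negligible.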
Frequency response
The frequency response of the phono preamp is not flat; it follows the RIAA (or IEC) curve. The weighting feature of the QA402 allows easy evaluation of the frequency response deviation of the preamp. The standard RIAA playback curve file of the QA402 has relatively low frequency resolution. Therefore the accuracy is not very good if measurements with high resolution (a high number of FFT bins) are made. I used a higher-resolution RIAA playback file (about 80 frequency/gain pairs) for my measurements. I can share this file if someone is interested.
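For anyone who wants to generate their own higher-resolution file, the RIAA playback curve can be computed directly from its standard time constants (3180 us, 318 us, 75 us) and normalized to 0 dB at 1 kHz. A sketch; the 78-point log spacing is my own choice, not a QA402 requirement:

```python
import math

# Standard RIAA time constants: poles at 3180 us and 75 us, zero at 318 us
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_playback_db(f):
    """RIAA playback (de-emphasis) gain in dB, un-normalized."""
    s = 1j * 2 * math.pi * f
    h = (1 + s * T2) / ((1 + s * T1) * (1 + s * T3))
    return 20 * math.log10(abs(h))

ref = riaa_playback_db(1000)  # normalize so 1 kHz = 0 dB
freqs = [20 * (20000 / 20) ** (i / 77) for i in range(78)]  # 78 log-spaced points
pairs = [(round(f, 1), round(riaa_playback_db(f) - ref, 3)) for f in freqs]
```

As a cross-check, the normalized curve should come out near the textbook values: about +19.27 dB at 20 Hz and about -19.62 dB at 20 kHz.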
Frequency response of the QA402
A second small issue is the frequency response of the QA402 itself. At 20kHz, the response is down -0.25dB. While this can be compensated by using the right channel as a reference for linear-gain amplifiers, I have not tried this yet for the phono preamp. As an alternative, I am considering adapting the weighting file accordingly, which would allow a two-channel frequency response measurement of the preamp.
Overall the QA402 is an excellent device for measuring audio equipment such as my preamp. Well done Matt and team!
I’d very much appreciate you sharing the high accuracy RIAA weighting file as I’d noticed there are some odd bumps in the phono response when testing various vintage amplifiers previously yet hadn’t made time to create a better one myself.
Also I’d noticed the noise and had originally tried external reverse compensation instead but that was worse. In the end I’d simply adjusted the signal level for MM to -30 to -35dBV to improve it.
The HF response error seems unusual to say the least. Most ADCs show negligible errors in the passband. Do you see the 0.25dB error in a straight loopback? Could there be some capacitive loading introducing the error? Maybe try a loopback with the attenuator in place between the output and the input?
Accurate measurement of RIAA to 0.1dB is not easy. You need both frequency and amplitude precision; a digital system should provide that. Effective source impedance is also important. Moving coils are very low, but a moving magnet cartridge is quite inductive, so that should be emulated to quantify the effects of the input capacitance.
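To put numbers on that source-impedance point: a minimal sketch of an MM cartridge modeled as resistance in series with inductance. The 1 kOhm / 500 mH values are ballpark figures I am assuming for a generic MM cartridge, not from any specific model:

```python
import math

def mm_source_impedance_ohms(f, r=1000.0, l=0.5):
    """Magnitude of a simplified MM cartridge source impedance:
    series resistance r (ohms) and inductance l (henries)."""
    return math.hypot(r, 2 * math.pi * f * l)

for f in (1000, 10000, 20000):
    print(f, round(mm_source_impedance_ohms(f)))
```

With these assumed values the source impedance rises from a few kilohms at 1 kHz to over 60 kOhm at 20 kHz, which is why the interaction with the 47k load and the input capacitance matters at the top of the band, while a low-impedance moving coil barely sees it.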
Hi 1audio
the 0.25dB error at 20kHz seems normal for the QA402. It can easily be measured with a loopback frequency response measurement - even without my passive attenuator. While the ADC and DAC show very little ripple at high frequencies, the QA402 has analog anti-aliasing filters which introduce this small error.
Best regards
Andy
Hi @Avo, yes indeed you are correct. On the QA402 release 0.997 the User button was moved over one slot to the left and renamed from User to User1. To the right of that, a User2 will be added shortly, and this will allow you to have two curves active simultaneously. So, you could have your high-resolution RIAA in User1 and a gentle correction for the ~0.25 dB around 20 kHz.
This will also be useful if you want to notch out powerline frequencies that can creep into high-gain measurements. Although sometimes I wonder if a generic 50 or 60 Hz IIR notch in SW might be useful.
You raise a good question on the DAC noise floor that should probably be part of the spec. Below I’m running the DAC output at 0 dBV into the QA480 notch. That is whacking the 1 kHz about -55 dB and ensuring the ADC isn’t contributing anything meaningful to the measurement. The Noise Minus Distortion (versus N+D) is -104.4 dBV (20 kHz, no weighting). Note the shape: the noise here is quite a bit higher than the QA480 noise, and thus you can see the notch filter impacting the noise across the band.
Both of these have the same settings including -40dBV SG output, averaging over 5 plots and maximum smoothing.
The right channel is within -1dB 10-30kHz through the amplifier’s line input, so the 20Hz -3dB on the right channel is due to the phono stage. The left channel has poor tone control calibration, so the bottom end prematurely rolls off. The OEM specification is 10-40kHz -1dB.
There’s still a bit of work for me to do to straighten this amp out. Can’t say I’d be worried about 0.25dB at 20kHz though.
I had just finished re-measuring the MM phono input on my mid 80’s NAD intg amp when I read this topic chain. I am measuring out of one of the tape outputs, with no extra capacitive loading applied to the input.
There is a little 60Hz spur that comes and goes; I just made a capture with it not there. I am very pleased with the FR of both channels and don’t care about the little bit of noise >10kHz. While we all like measurement perfection, we are talking about playing vinyl after all… I will look at the MC input of the intg amp as well since it is switchable (-66dBV input will be used).
Update- I had made a spreadsheet to calculate the RIAA curves prior to the user weighting function being enabled, and decided to see if I could make my own hi-res file, which I was able to do once I read how to combine cells in Excel with a “,”. Below is what I ended up with-
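If anyone wants to skip the spreadsheet step, the same comma-separated frequency/gain pairs can be written straight from a short script. This is only a sketch: I am assuming the weighting file is plain text with one "frequency,gain" pair per line (which is what the Excel "," trick above implies), and the filename is hypothetical; check the QA402 documentation for the exact format it expects.

```python
import math

# Standard RIAA time constants: poles at 3180 us and 75 us, zero at 318 us
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_db(f):
    """RIAA playback gain in dB, un-normalized."""
    s = 1j * 2 * math.pi * f
    return 20 * math.log10(abs((1 + s * T2) / ((1 + s * T1) * (1 + s * T3))))

ref = riaa_db(1000)  # normalize so 1 kHz = 0 dB
with open("riaa_hires.txt", "w") as out:   # hypothetical filename
    for i in range(80):                    # 80 log-spaced points, 20 Hz - 20 kHz
        f = 20 * (20000 / 20) ** (i / 79)
        out.write(f"{f:.1f},{riaa_db(f) - ref:.3f}\n")
```

The first and last lines should land near the textbook endpoints (about +19.27 dB at 20 Hz, about -19.62 dB at 20 kHz), which makes a quick sanity check of the generated file.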
I have been characterizing my vintage Carver C1 preamp prior to performing at least some of the “Bill D” mods. In the process of doing so, while looking at the Phono 1 preamp input (MM & set to 0pF), I noticed there seems to be a discrepancy when measuring the THD/SNR manually vs measuring via the automated test “Amp THD vs Frequency”. For the manual tests I am using my “hi res” RIAA weighting file mentioned above. Measurements were taken at the Tape 1 Monitor Out, with Gen 1 adjusted to give about ~800mV rms output (close enough to 775mV) per the service manual. Here is the 1kHz measurement:
The frequency response of the preamp stage varies about 2dB from min to max, so the change in RMS levels over frequency was expected. I have not made any THD measurements at 10kHz and guess that a reading of 0% is normal when measuring 20-20kHz. As a side note, per the service manual, the Phono 1 stage’s gain measures what it should at 100, 1000 and 10000 Hz. So, I wanted to try the automated test to measure the distortion at more points and used this setup:
I unchecked the Autoset Input Range since I felt I was in a good range and I was not sure how it worked. I also do not know how to use the Input Level Range Adder, but am guessing it relates to the Autoset Input Range, so I left it at its default. I wish you could just run a single sweep, but I chose to start at the -46dBV input level and jump to -37dBV where I’d done the measurements. As far as I could tell by watching the sweep, the User 1 weighting was still in effect. Here is what I got:
So at 100Hz the THD+N measurements are off by ~4.6dB, at 1kHz by ~7dB and at 10kHz by ~6.5dB.
I looked at 30Hz later and it was off by maybe a dB- not very good performance there:
I wanted to know if anyone else has found differences with the manual vs. automated tests for THD- at least for a phono preamp. (I also measured a line level input and found the same kind of difference between the Automated Test method and the manual one.) I also could not find anything in the User’s manual about the settings…
Hi @Var, Can you try to make some small ranges of measurements and compare? For example, in the setup below, there are 10 measurements from 1000 to 1010 Hz, all at a single amplitude. The 10 measurements make it easier to find on the graph, otherwise you are left with just two points that are almost impossible to locate visually. But this should let you compare a few areas manually.
If you uncheck the “autoset input range” then the full scale input you selected will be preserved. And with that unticked, the input range adder isn’t used.
Overall, the plugins should respect the settings you’ve set in the main window.
Matt- thanks again for the suggestion. I re-did the automated measurements, but still saw a difference between the two methods. I was able to do just a single sweep based on your example, which is nice! So, here is the 1kHz “manual” measurement, showing the Left Ch THD+N at ~ -81dB:
Comparing to the -4dBV inputs at 1kHz shows -77dB vs -81dB, though I guess on either side of 1kHz it is about -81dB. Similarly, at 100Hz I measured -81dB:
Which shows about -75dB THD+N vs -81dB for the left channel. I would expect the results from the two methods to be closer… am I doing something incorrect? Thanks again for your hard work!
I am looking at the phono FR of a Carver tuner/preamp from the early 90’s. It looks like crap to say the least (6dB of “flatness”). The other inputs look pretty good. I have never had great luck measuring the FR of phono preamp stages (just MM for now). I did see the very first post by @AVO where he used an input attenuator to improve the output noise, but there is no mention of trying to improve the FR. The THD/SNR of the phono preamp is pretty close to the published specs. The phono stage circuit is capacitively coupled by a 10uF cap:
Other phono preamps I have seen are not capacitively coupled, but they seem to have around a 47kohm input impedance, whereas the line inputs are up around 1Mohm. I suppose both 10uF caps on the input (& output) could be less than ideal after 30 years, but I would be surprised. I noticed that the Tape In stage is capacitively coupled with a 1uF cap and ~47k input impedance, so I will measure that as well- I have yet to look at a TAPE IN since that is not a function I would use, though I do listen to my cassette deck often.
So my question is whether the QA402’s output impedance is fine for driving most phono input stages (with -46dBV)?
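The loading effect behind that question is easy to estimate. A minimal sketch; the 100 Ohm source figure here is an assumption for illustration (check the QA402 datasheet for the actual output impedance), and 47k is the typical MM phono input impedance mentioned above:

```python
import math

def loading_loss_db(r_source, r_input):
    """Level drop (dB) when a source impedance drives an input impedance."""
    return 20 * math.log10(r_input / (r_source + r_input))

# Assumed 100-ohm source into a typical 47k MM phono input
print(round(loading_loss_db(100, 47000), 3))   # about -0.018 dB: negligible
```

Note that with the passive attenuator from the first post in line, the source impedance the preamp sees is dominated by the attenuator's small shunt resistor (100 Ohm or 10 Ohm), so the loss stays negligible there too.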
When refurbishing vintage hifi I use the QA-401 to directly test the response of phono stages. Depending on the design I’ve found between -35 and -55 dBV is fine for most MM inputs. I don’t do any specific impedance matching and use your high res RIAA file posted here previously.
First, to determine the input dBV, I check the spectral plot to make sure there’s plenty of headroom by looking at THD, and keep an eye on SNR too.
For example, last weekend I tested a 1980s Quad 34 Control preamp I’d just bought. It hadn’t been used at all for over seven years, after the owner passed away. This is what the Bode plot looks like before I’ve done any servicing:
Thanks for the input- that plot (I have not heard “Bode plot” used in about 40 years, since college) is really good. I will probably never use the phono preamp where that unit is used, but it bothers me, and I am retired and have time, though that may be changing next year. I will post a plot of what I am seeing now later on…
UPDATE- I don’t know what happened when I did my phono preamp measurement, but I have since repeated it a few times and it looks fine. The spec was +/-1dB RIAA from 20-20kHz. I probably should not make measurements past 10pm…:
I think the RIAA file provided by Matt in his post of 22 June is pretty good. Nevertheless, please find the link below to my file, which uses 78 frequency/gain pairs: