Hi @matt. I happened to use the “Output Gain (dB)” field in the “dBV option” context menu for the first time and realized that I had misinterpreted its operation. I thought that field worked like the “Input Gain” field, i.e. purely at the mathematical level, to adjust the displayed values. Instead, by connecting a DVM to the output of the QA403’s DAC, I realized that the field physically changes the output voltage (it adds or subtracts that value to/from the value set in the generators). You have probably already clarified how “Output Gain” works (and I missed it), but when time permits, could you explain how that field should be correctly interpreted and used? Thank you
Hi @Claudio, the way to think about both input and output gain is as if they were gain blocks outside of the analyzer.
There are two vantage points you can observe the system from. One is the analyzer vantage point, which is what you’d see if you made measurements at the analyzer terminals.
The other is the DUT vantage point. That is what you’d see if you made measurements at the DUT.
First, let’s think about input gain because it’s easy. The DUT might output 50 dBV, which we can readily measure at the DUT. But then there is an external gain block, which might be a 30 dB attenuator (gain = -30 dB). The analyzer actually sees 20 dBV. By specifying an external input gain, we can switch the view point from the analyzer perspective to the DUT perspective.
That one should be easy.
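The input-gain arithmetic above can be sketched in a few lines of Python (the values mirror the example; the helper name is just illustrative):

```python
def dut_level_dbv(analyzer_dbv, external_input_gain_db):
    """Refer an analyzer-terminal reading back to the DUT by
    removing the external gain block's contribution."""
    return analyzer_dbv - external_input_gain_db

# The DUT outputs 50 dBV, a 30 dB attenuator (gain = -30 dB) sits in
# between, so the analyzer terminals see 20 dBV. Specifying the
# external input gain recovers the DUT-referenced level:
print(dut_level_dbv(20, -30))  # 50
```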
Now, output gain is a bit more tricky, mostly because it’s not needed as frequently. With output gain, you are telling the analyzer: “there’s an external gain block, and when I specify a non-zero output gain, I want to reference levels at the DUT, and here is the conversion needed.”
Conceptually, that operates just like the input gain.
And so, if you specify an output gain of 10 dB and you want a 0 dBV signal, the output gain will adjust the analyzer output level so that you measure 0 dBV at the DUT. That means a -10 dBV signal at the analyzer’s output, and a 0 dBV signal at the DUT.
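The same conversion, going the other direction, can be sketched like this (helper name is illustrative):

```python
def analyzer_output_dbv(desired_dut_dbv, output_gain_db):
    """Level the analyzer must generate so that, after the external
    output gain block, the DUT sees the requested level."""
    return desired_dut_dbv - output_gain_db

# 10 dB of external gain, 0 dBV requested at the DUT:
# the analyzer must drive -10 dBV at its own output.
print(analyzer_output_dbv(0, 10))  # -10
```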
OK, so far so good I hope.
Now, the QA403 has differential inputs, and it doesn’t matter if you apply 1 Vrms to the L+ IN and ground the L- IN, or if you apply a balanced 0.5 Vrms to both L+ and L-. The hardware will correctly handle it and report the correct value.
But the differential outputs pose a challenge, because it’s not known whether the user is using the outputs in a balanced fashion or a single-ended fashion.
An analyzer with both balanced XLR and single-ended BNC output requires the user to connect to the right terminals and that answers the question right there. But because the balanced outputs can do double-duty (balanced or unbalanced, depending on how you connect), the user must specify to the analyzer how the outputs are being used.
So, if you have connected to an unbalanced device, you can use a 0 dB output gain, and that will ensure that if you specify a 0 dBV signal you will measure 0 dBV at one of the BNCs.
And if you are connecting balanced, you specify a 6.02 dB output gain, and that ensures that if you specify a 0 dBV output signal, you’ll measure 0 dBV across the L+ and L- outputs.
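The 6.02 dB figure comes from driving both legs: the voltage across L+ and L- is twice what either leg carries alone, and 20·log10(2) ≈ 6.02 dB. A quick sketch:

```python
import math

# Using both legs doubles the voltage seen across L+ and L-,
# i.e. a gain of 20*log10(2) dB:
balanced_gain_db = 20 * math.log10(2)
print(round(balanced_gain_db, 2))  # 6.02

# With a 6.02 dB output gain and a requested 0 dBV (1 Vrms) signal,
# each leg is driven at roughly half that voltage:
per_leg_vrms = 10 ** ((0 - balanced_gain_db) / 20)
print(round(per_leg_vrms, 3))  # 0.5
```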
Hopefully this helps.
Thanks @matt, that clarifies the issue. Actually the problem had arisen not with “Input Gain”, which is simple to understand, but with “Output Gain”, which is definitely more complicated to use correctly. With what you have written it is now much clearer, and I think these notes will be helpful to many people. Again, thank you for the quick response.
P.S. About setting “Output Gain” to 6.02 dB to get the correct reading when using the differential outputs, I wanted to point out a very small problem: if you press the “+6.02 dB” button, an “Output Gain” of 602 dB is set. I believe this is because I am using a European locale with a different convention for the decimal separator.