Is it possible to set a delay between generating a signal and reading and analyzing it in automatic tests? I read the thread about frequency response measurements on tape recorders, but the problem also applies to other measurements.
Hi @borax, how much delay are you looking at? There’s a setting (Edit->Settings) called pre-buffer that can help compensate for systems with lots of delay.
Thanks for your response. I need a delay of at least 1.5 s at the highest available sampling rate. Unless something is off in my copy of the program, the delay adjustment is limited to 131072 samples, which is not enough. The manual mentions using the ‘pause’ option on high-latency systems, but it is not clear how that is meant to be used.
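To put a number on that limit, here is a quick check (assuming a 192 kHz top sample rate, which may not match the actual hardware) showing that the 131072-sample cap covers well under the 1.5 s needed:

```python
# Rough sanity check: how much time does the 131072-sample delay cap cover?
samples = 131072
rate_hz = 192_000          # assumed top sample rate; substitute the hardware's actual maximum
print(samples / rate_hz)   # ~0.68 s, well short of the ~1.5 s needed
```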
Hi @borax, are you writing your own software to do the processing via REST, or using the standard app? What type of measurement (and generator source) do you need to make? What size FFT will you need?
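If you do end up scripting the measurement over REST, one workaround for a long generation-to-readback delay is to start the generator, wait out the settling time in your script, and only then trigger the acquisition. Below is a minimal sketch; the base URL, endpoint paths, and payloads are purely hypothetical placeholders, not the analyzer's actual REST API, so check the API documentation for the real calls.

```python
import time
import requests  # any HTTP client will do

BASE = "http://localhost:9402"  # hypothetical local REST endpoint of the analyzer


def measure_with_delay(freq_hz: float, level_dbv: float, settle_s: float = 1.5):
    """Start the generator, wait for the device under test to settle
    (e.g. a tape loop's record-to-play latency), then acquire and analyze.
    Endpoint paths and payloads below are placeholders, not the real API."""
    # Start the sine generator (hypothetical endpoint)
    requests.put(f"{BASE}/generator",
                 json={"frequency": freq_hz, "level_dbv": level_dbv, "enabled": True})

    # Wait out the generation-to-readback delay in the script,
    # instead of relying on the analyzer's pre-buffer setting
    time.sleep(settle_s)

    # Trigger an acquisition and fetch the result (hypothetical endpoints)
    requests.post(f"{BASE}/acquisition", json={"fft_size": 32768})
    spectrum = requests.get(f"{BASE}/acquisition/result").json()

    # Stop the generator once the capture is done
    requests.put(f"{BASE}/generator", json={"enabled": False})
    return spectrum


if __name__ == "__main__":
    result = measure_with_delay(1000.0, -10.0)
    print(result)
```

The point of this arrangement is that the delay lives in your own script rather than in the analyzer's pre-buffer, so it is not bound by the 131072-sample limit.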