Hi all-
First off, I’ve got a 401, so I’m running old software here: 1.924.
I’m trying to do something that should be dirt simple: spectrum-analyze an external pink noise signal and have it display as a flat horizontal line using a custom user curve.
Since pink noise falls off at 10 dB/decade, the curve is just a 10 dB/decade tilt across a few decades. It didn’t take me long to write one up covering 1 Hz to 10 kHz - four decades.
Here’s how I built it:
I pinned 1 kHz as my 0 dB gain point
I pinned 1/10/100/1000/10000 Hz in exact 10 dB increments
I filled in the interim values using linear spacing (~1.12 dB per step in the file below)
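Here’s a quick Python sketch of the arithmetic as I understand it (the function names are mine, and I’m assuming my “linear spacing” of the interim values amounts to interpolating the dB values linearly against frequency in Hz; the sign of the tilt depends on how the analyzer applies the curve):

```python
import math

def exact_db(f):
    # Ideal pink-noise tilt: 10 dB/decade, pinned to 0 dB at 1 kHz.
    return 10 * math.log10(f / 1000)

def linear_fill_db(f, f_lo, f_hi):
    # Interim value if the dB values are interpolated linearly against
    # frequency in Hz between two pinned decade points.
    frac = (f - f_lo) / (f_hi - f_lo)
    return exact_db(f_lo) + frac * (exact_db(f_hi) - exact_db(f_lo))

# Scan one decade (1-10 Hz; every decade has the same shape) for the
# worst-case gap between the ideal log curve and the linear fill.
droop = max(exact_db(x / 10) - linear_fill_db(x / 10, 1.0, 10.0)
            for x in range(10, 101))
print(f"worst-case droop: {droop:.2f} dB")  # about 2.7 dB, near mid-decade
```

The linear fill only touches the true 10·log10 curve at the pinned decade points, and the gap peaks at roughly 2.7 dB a bit below mid-decade - which looks suspiciously like the ~3 dB sag I’m seeing.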
This gives me severe droop between my pinned decade values: the decade points themselves sit right on the line, but between them the curve sags by about 3 dB.
When I go in and try to manually fix this, I get caught in an overshoot loop, and for the life of me it won’t cooperate. Am I doing something wrong? Thanks.