Hi folks,
I'm trying to determine the bandwidth of the amplifier I'll need as part of a control system.
So I wrote up a simulator script and got this waveform:
Fig.1: Waveform from simulator output
Then I tried to apply a low-pass filter to the waveform to see how the limited bandwidth would distort the signal.
I used the "lowpass()" function from the Signal Processing Toolbox:
Fig.2: Call of the lowpass function
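For reference, the general shape of what I'm doing is roughly the following (the exact argument list is in Fig.2; the test tone here is just a stand-in for my simulator waveform, and the numeric values are placeholders):

    res_t = 1e-9;                                     % simulator time resolution [s]
    fs    = 1 / res_t;                                % sampling rate [Hz]
    t     = 0:res_t:1e-6;                             % 1 us time axis
    y_sim = sin(2*pi*5e6*t) + 0.5*sin(2*pi*40e6*t);   % stand-in for the Fig.1 waveform
    bnd_f = 20e6;                                     % cut-off frequency [Hz]
    y_flt = lowpass(y_sim, bnd_f, fs);                % exact arguments I pass are in Fig.2
    plot(t, y_sim, t, y_flt);
    legend('simulated', 'band-limited');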
Here comes the weird part: the output does not change when I change the variable "bnd_f", which should be the cut-off frequency. But when I change the simulator time resolution "res_t" (i.e. 1/sampling rate), the output of "lowpass()" does change (a stripped-down version of this sweep is sketched after the figures below).
Fig.3: The changed variable res_t is highlighted
Fig.4: Resolution = 1ns
Fig.5: Resolution = 5ns
Fig.6: Resolution = 10ns
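To make the comparison concrete, here is a stripped-down version of the resolution sweep (same stand-in test signal as in the sketch above; my real input is the simulator waveform, so these plots won't match Figs. 4-6 exactly):

    bnd_f = 20e6;                                         % fixed cut-off [Hz] (placeholder value)
    for res_t = [1e-9 5e-9 10e-9]                         % the three resolutions from Figs. 4-6
        fs    = 1 / res_t;                                % sampling rate implied by res_t
        t     = 0:res_t:1e-6;                             % 1 us time axis
        y_sim = sin(2*pi*5e6*t) + 0.5*sin(2*pi*40e6*t);   % stand-in for the simulator waveform
        y_flt = lowpass(y_sim, bnd_f, fs);                % same kind of call as in Fig.2
        figure; plot(t, y_sim, t, y_flt);
        title(sprintf('resolution = %g ns', res_t*1e9));
    end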
I'm quite confused about what's going on. Any suggestions are highly appreciated!
Many Thanks,
Anyu