There was an oscilloscope article by Steven
B. Warntjes about sustained sample rate in digital
oscilloscopes, and I wanted to get some verification
from the experienced users in this group. Let's say
I have an oscilloscope with a memory depth of 25,000
samples, a real-time sample rate of 1 GSa/s, and
200 MHz (-3 dB) bandwidth.
If I had an FM carrier at 100 MHz modulated by a
1000 Hz (1 ms period) test signal, does that mean
that a digital oscilloscope with the specs above
will not be able to capture the waveform properly?
I was thinking that since the capture window (in
seconds) equals the memory depth (in samples)
divided by the sample rate (in samples/sec), the
maximum capture window I can use while still
maintaining the rated sample rate would be
25 µs (microseconds).
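Here's that arithmetic as a quick Python sanity
check (just back-of-the-envelope, nothing
scope-specific):

# capture window = memory depth / sample rate
memory_depth = 25_000          # samples
sample_rate = 1e9              # samples/sec (1 GSa/s)
window = memory_depth / sample_rate
print(f"{window * 1e6:.1f} us")    # -> 25.0 us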
So that means that if my carrier is deviating
between 100 MHz +/- 1 kHz at the 1 kHz modulation
rate, my capture window would have to be at least
1 ms to cover one full modulation cycle (?). With
a 1 ms capture window, my sample rate would drop
to 25 megasamples/sec, which is nowhere near
enough to sample the 100 MHz carrier. Is that
right?
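The same check in Python, with the Nyquist rate
for the carrier next to it (the 2x figure is just
the textbook minimum, ignoring the scope's 200 MHz
front end):

memory_depth = 25_000     # samples
window = 1e-3             # s, one cycle of the 1 kHz tone
carrier = 100e6           # Hz
rate = memory_depth / window
print(f"effective rate:  {rate / 1e6:.0f} MSa/s")         # -> 25 MSa/s
print(f"Nyquist minimum: {2 * carrier / 1e6:.0f} MSa/s")  # -> 200 MSa/s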
I know that I can just
use a spectrum analyzer, but I wanted to capture
the waveform and duplicate the time-domain plot
of frequency modulation (amplitude vs. time) that
I've seen in the books.
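In case it helps to see what I mean, here is a
short NumPy sketch of that textbook plot at the
full sample rate; the parameter values are just my
example numbers. It also shows how much memory one
modulation cycle would actually take:

import numpy as np

fs = 1e9       # full 1 GSa/s rate
fc = 100e6     # carrier
fm = 1e3       # modulating tone
dev = 1e3      # peak deviation (+/- 1 kHz)

n = int(fs * 1e-3)       # samples in one 1 ms modulation cycle
t = np.arange(n) / fs
# FM: phase = 2*pi*fc*t + (dev/fm)*sin(2*pi*fm*t)
y = np.cos(2 * np.pi * fc * t + (dev / fm) * np.sin(2 * np.pi * fm * t))
# y is the amplitude-vs-time trace you'd plot

print(f"samples for one cycle: {n:,}")   # -> 1,000,000 vs. 25,000 of memory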