Using a Delay from Sample Clock to Convert Clock
When using an internally generated AI Convert Clock, you also can specify a configurable delay
from AI Sample Clock to the first AI Convert Clock pulse within the sample. By default, this
delay is three ticks of AI Convert Clock Timebase.
Figure 4-17 shows the relationship of AI Sample Clock to AI Convert Clock.
Figure 4-17. AI Sample Clock and AI Convert Clock
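In NI-DAQmx, the conversion rate and the delay from AI Sample Clock to the first AI Convert Clock pulse are exposed as AI timing properties. The following sketch is illustrative only, assuming the NI-DAQmx C API property accessors DAQmxSetAIConvRate, DAQmxSetDelayFromSampClkDelayUnits, and DAQmxSetDelayFromSampClkDelay; the device name "Dev1" and the specific rates are placeholders, not values from this manual.

```c
#include <NIDAQmx.h>
#include <stdio.h>

int main(void)
{
    TaskHandle task = 0;
    int32 error = 0;

    /* Four-channel analog input task on a hypothetical device "Dev1". */
    error = DAQmxCreateTask("", &task);
    if (error < 0) goto done;
    error = DAQmxCreateAIVoltageChan(task, "Dev1/ai0:3", "",
                                     DAQmx_Val_Cfg_Default,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
    if (error < 0) goto done;

    /* AI Sample Clock: 1 kS/s per channel, finite acquisition (example values). */
    error = DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_FiniteSamps, 1000);
    if (error < 0) goto done;

    /* AI Convert Clock: interchannel conversion rate (example value). */
    error = DAQmxSetAIConvRate(task, 100000.0);
    if (error < 0) goto done;

    /* Delay from AI Sample Clock to the first AI Convert Clock pulse,
       expressed in ticks of AI Convert Clock Timebase (the default is 3). */
    error = DAQmxSetDelayFromSampClkDelayUnits(task, DAQmx_Val_Ticks);
    if (error < 0) goto done;
    error = DAQmxSetDelayFromSampClkDelay(task, 3.0);

done:
    if (error < 0) {
        char msg[2048];
        DAQmxGetExtendedErrorInfo(msg, sizeof msg);
        printf("DAQmx error: %s\n", msg);
    }
    if (task) DAQmxClearTask(task);
    return error < 0 ? 1 : 0;
}
```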
Other Timing Requirements
The sample- and conversion-level timing of M Series devices works such that clock signals are
gated off unless the proper timing requirements are met. For example, the device ignores both
AI Sample Clock and AI Convert Clock until it receives a valid AI Start Trigger signal. Once
the device recognizes an AI Sample Clock pulse, it ignores subsequent AI Sample Clock pulses
until it receives the correct number of AI Convert Clock pulses.
Similarly, the device ignores all AI Convert Clock pulses until it recognizes an AI Sample Clock
pulse. Once the device receives the correct number of AI Convert Clock pulses, it ignores
subsequent AI Convert Clock pulses until it receives another AI Sample Clock pulse. Figures 4-18,
4-19, 4-20, and 4-21 show timing sequences for a four-channel acquisition (using AI channels
0, 1, 2, and 3) and demonstrate proper and improper sequencing of AI Sample Clock and AI
Convert Clock.
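As an illustration of the constraint behind these figures (not a calculation defined in this manual), one AI Sample Clock period must be long enough to hold the delay from sample clock plus one convert period per scanned channel; otherwise sample clock pulses arrive while conversions are still pending and are gated off, as in Figure 4-18. A minimal check for a four-channel scan is sketched below; the rates and the 20 MHz timebase figure are assumed example values.

```c
#include <stdio.h>

/* Illustrative only: verifies that one AI Sample Clock period can hold the
   delay from sample clock plus one conversion per channel. */
int main(void)
{
    const double sample_rate_hz  = 1000.0;     /* AI Sample Clock */
    const double convert_rate_hz = 100000.0;   /* AI Convert Clock */
    const double delay_s         = 3.0 / 20e6; /* e.g., 3 ticks of a 20 MHz
                                                  AI Convert Clock Timebase */
    const int    num_channels    = 4;          /* AI channels 0, 1, 2, and 3 */

    double sample_period = 1.0 / sample_rate_hz;
    double scan_time     = delay_s + num_channels / convert_rate_hz;

    if (scan_time <= sample_period)
        printf("OK: %d conversions fit within one sample period\n", num_channels);
    else
        printf("Too fast: AI Sample Clock pulses would be gated off\n");
    return 0;
}
```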
Figure 4-18. AI Sample Clock Pulses Are Gated Off; AI Sample Clock Too Fast for Convert Clock
[Figure 4-17 signals: AI Convert Clock Timebase, AI Sample Clock, AI Convert Clock; annotations: Delay from Sample Clock, Convert Period]
[Figure 4-18 signals: AI Sample Clock, AI Convert Clock; Samples #1 through #3; Channel Measured: 0, 1, 2, 3 per sample]