
DC gain accuracy test when the input impedance is 1 MΩ
1. Connect the active head of the Fluke 9500B to CH1 of the oscilloscope, as shown in the
figure above.
2. Set the output impedance of the Fluke 9500B to 1 MΩ.
3. Output a +3 mV DC signal (Vout1) via the Fluke 9500B.
4. Configure the oscilloscope:
a. Press the front-panel key to enable CH1.
b. Click or tap the channel status label at the bottom of the screen to display the
Vertical menu. Then click or tap Probe to enter the Probe setting menu and set
the probe attenuation ratio to "1X".
c. In the Vertical system menu, click or tap 1 MΩ under Impedance to set the
input impedance of CH1 to 1 MΩ.
d. Set the vertical scale to 1 mV/div.
e. Set the horizontal timebase to 1 μs/div.
f. Set the vertical offset to 0 V.
g. In the Horizontal system menu, select "Average" as the Acquisition mode.
Then click or tap the input field of the Averages menu item and use the
pop-up numeric keypad to set the number of averages to 32.
h. Adjust the trigger level to avoid false triggering of the signal.
5. In the Measure menu, click or tap the Vertical measurement item and select "Vavg".
The Vavg measurement results are displayed in the list at the right section of the
screen. Read the value from the result list and record it as Vavg1.
6. Adjust the Fluke 9500B to output a -3 mV DC signal (Vout2).
7. With the average measurement function enabled, read and record Vavg2.
8. Calculate the relative error of this vertical scale:
|(Vavg1 - Vavg2) - (Vout1 - Vout2)| / Full Scale × 100%.
9. Keep the other settings of the oscilloscope unchanged.
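The relative-error calculation in step 8 can be sketched as follows. The example Vavg readings are hypothetical, and the full-scale value assumes 8 vertical divisions on screen (an assumption, not stated in this procedure); the Vout values and the 1 mV/div scale come from steps 3, 4d, and 6.

```python
# Sketch of the step-8 relative-error calculation.
# Assumption: full scale = vertical scale x 8 divisions.
DIVISIONS = 8                      # assumed number of vertical divisions
scale = 1e-3                       # vertical scale from step 4d: 1 mV/div
full_scale = DIVISIONS * scale     # 8 mV

vout1, vout2 = 3e-3, -3e-3         # Fluke 9500B outputs (steps 3 and 6)
vavg1, vavg2 = 3.05e-3, -2.98e-3   # hypothetical recorded Vavg readings

# |(Vavg1 - Vavg2) - (Vout1 - Vout2)| / Full Scale x 100%
rel_error = abs((vavg1 - vavg2) - (vout1 - vout2)) / full_scale * 100
print(f"relative error: {rel_error:.3f}% of full scale")
```

Compare the computed percentage against the DC gain accuracy limit specified for the DS70000 at this vertical scale.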
Performance Verification Test
Copyright ©RIGOL TECHNOLOGIES CO., LTD. All rights reserved.
DS70000 Performance Verification
15