
Provided the instrument vacuum system is clean and operating at the correct pressure, the detector will age as a function of the quantity of ions detected. As the electron multiplier’s gain gradually decreases, a reduction in sensitivity, particularly at high mass, will become evident. To restore the gain, run a detector scan and increase the high voltage applied to the detector by a suitable amount (typically in 50 V increments). Allow a settling time of 10 minutes after each incremental increase in voltage before running an analysis to avoid drift.
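As a rough illustration of this procedure, the sketch below steps the detector voltage up in 50 V increments with a 10 minute settling period after each step. The set_detector_voltage and measure_sensitivity callables are hypothetical placeholders, since the actual adjustment is made through the instrument control software rather than a scripting interface.

```python
# Illustrative sketch only: set_detector_voltage() and measure_sensitivity()
# are hypothetical placeholders for adjustments normally made in the
# instrument control software.
import time

STEP_V = 50          # typical increment per adjustment (V)
SETTLE_S = 10 * 60   # settling time after each increase (s)

def restore_gain(start_voltage, target_sensitivity, max_voltage,
                 set_detector_voltage, measure_sensitivity):
    """Raise the detector high voltage in 50 V steps until a reference
    sensitivity is recovered, settling 10 minutes after each step."""
    voltage = start_voltage
    while voltage + STEP_V <= max_voltage:
        voltage += STEP_V
        set_detector_voltage(voltage)   # hypothetical control call
        time.sleep(SETTLE_S)            # let the signal stabilize before analysis
        if measure_sensitivity() >= target_sensitivity:
            return voltage              # gain restored at this setting
    raise RuntimeError("Sensitivity not recovered within the allowed voltage range")
```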
When troubleshooting the mass spectrometer system for problems related to sensitivity loss, ensure that you eliminate other potential causes before increasing the detector voltage. The pulse counting DDEM detector is particularly rugged; therefore, if you notice a sudden loss in instrument performance, first run a reference method (typically one used for optimization) to double-check the sensitivity. Check the nebulizer, plasma and ion optics settings. Run a mass calibration to ensure that mass peaks are positioned correctly, since intensity will be low if the peaks drift off the correct position. Check the sample introduction system in case analyte transfer is impaired. In addition, check the cleanliness of the cones, as surface coating or clogging of the orifice, in particular on the skimmer cone, will cause a reduction in sensitivity. Check the vacuum pressures. Note that as a cone ages the orifice size increases, which results in a poorer vacuum and hence reduced sensitivity and higher background interference.
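The eliminations above can be treated as a simple pre-check list. The sketch below is only an illustration of that order of elimination; the check descriptions are hypothetical labels and do not correspond to functions of the instrument software.

```python
# Hypothetical checklist of the eliminations described above; labels are
# illustrative only, not instrument software functions.
PRE_CHECKS = [
    "Run a reference (optimization) method and compare the sensitivity",
    "Verify nebulizer, plasma and ion optics settings",
    "Run a mass calibration and confirm peak positions",
    "Inspect the sample introduction system for impaired analyte transfer",
    "Inspect the cones (especially the skimmer) for coating, clogging or orifice wear",
    "Check the vacuum pressures",
]

def checks_passed(results):
    """results: dict mapping each check description to True (OK) or False."""
    failed = [check for check in PRE_CHECKS if not results.get(check, False)]
    for check in failed:
        print("Resolve before raising the detector voltage:", check)
    return not failed
```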
The procedure to adjust the detector high voltage is to observe the intensity as a function of the applied high voltage during a single isotope scan using a moderate intensity signal (1E04 to 3E05 c/s). The isotope closest to the center of the mass range defined in the method (typically indium) is selected for the detector scan. A successful scan requires a valid mass calibration, so that the mass for the single ion scan can be located, and a method and sample of known sensitivity, such as the system setup method. Providing the voltage scan starts at a sufficiently low value, the signal initially rises rapidly, followed by a plateau region. Operating in the plateau region gives good sensitivity with little load on the high voltage supply and maximizes the output linear dynamic range. The optimal setting is typically 300 V above the ‘knee’. Do not scan the detector in the saturation region, which typically occurs 800 V above the ‘knee’.
The recommended scan start value for a new detector is 2000 V, and the plateau should begin at approximately 2400 V. The maximum scan window should not go beyond 3000 V when the detector is new. Over months of use, the plateau will migrate to higher voltages and the control voltage must be increased to compensate. Replace the detector when the ‘knee’ approaches 4 kV.
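A minimal sketch of how the scan data might be evaluated is shown below, assuming (voltage, intensity) pairs from the detector scan. The knee-finding rule used here (first point where the step-to-step rise falls below 10 %) is an illustrative assumption, not the algorithm used by the instrument software.

```python
# Hedged sketch: locate the plateau 'knee' from detector scan data and derive
# a recommended operating voltage. The knee rule (first point where the
# step-to-step rise drops under 10 %) is an illustrative assumption.
def recommend_detector_voltage(scan, plateau_rise=0.10):
    """scan: list of (voltage_V, counts_per_s) pairs in increasing voltage order."""
    knee = None
    for (v0, c0), (v1, c1) in zip(scan, scan[1:]):
        if c0 > 0 and (c1 - c0) / c0 < plateau_rise:  # signal has stopped rising rapidly
            knee = v0
            break
    if knee is None:
        raise ValueError("No plateau found; check the scan range and signal level")
    return {
        "recommended_V": knee + 300,       # typical optimum: 300 V above the knee
        "saturation_from_V": knee + 800,   # do not scan into this region
        "replace_detector": knee >= 3900,  # knee approaching 4 kV: replace the detector
    }

# Example with made-up data for a new detector (knee near 2400 V):
scan = [(2000, 1e3), (2100, 2e4), (2200, 1.5e5), (2300, 2.6e5),
        (2400, 2.9e5), (2500, 3.0e5), (2600, 3.05e5)]
print(recommend_detector_voltage(scan))
```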
Signal over ranges
There are 3 instances where an over range will be generated:
1. PC SOFTWARE: The PC software has a limit of 5 million counts. If the instrument sends a count value above this to the software, it will be displayed as an over range.
2. FIRMWARE: Every time the instrument moves to the next mass point to scan, it sets the attenuation to maximum and checks how many counts it gets in a 100 µs time period. If the counts in maximum attenuation mode exceed 100 in the 100 µs (equivalent to 1 million counts per second), the firmware flags an over range and will not scan that mass point. If there is no over range, the firmware sets the attenuation to whatever the user has chosen and scans that mass for the selected dwell time. If the counts during the dwell time exceed 1 million, the