Input amplitude range
The ratio, in dB, of the maximum signal (usually the maximum safe input level) to the minimum
signal (the displayed average noise level) that can be measured at the input of the microwave analyzer
within a specified accuracy. It is always much larger than the dynamic range that can be achieved in a
single measurement.
Input impedance
The terminating impedance that the microwave analyzer presents to the signal source. The impedance
of RF and microwave analyzers is typically 50 Ω. The nominal impedance of some systems (for example,
cable TV) is 75 Ω. The degree of mismatch between the nominal impedance and the actual impedance is
expressed by the voltage standing wave ratio (VSWR).
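As an illustration of how an impedance mismatch translates into VSWR, the short sketch below (a hypothetical Python helper, not taken from the analyzer documentation) computes the reflection coefficient and VSWR of a purely resistive load against a nominal 50 Ω input.

def vswr(z_load_ohms: float, z0_ohms: float = 50.0) -> float:
    """Return the VSWR of a purely resistive load against a nominal impedance.

    Uses the standard relations:
        gamma = (ZL - Z0) / (ZL + Z0)      (reflection coefficient)
        VSWR  = (1 + |gamma|) / (1 - |gamma|)
    """
    gamma = abs((z_load_ohms - z0_ohms) / (z_load_ohms + z0_ohms))
    return (1 + gamma) / (1 - gamma)

# Example: a 75-ohm cable-TV source driving a 50-ohm analyzer input
print(round(vswr(75.0, 50.0), 2))  # -> 1.5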
Display range
The difference between the maximum signal and the minimum signal that can be observed
simultaneously on the display. For a microwave analyzer with 10 amplitude scale divisions, the display
range is the scale factor × 10.
Displayed average noise level
With the minimum resolution bandwidth and minimum input attenuation selected, the video bandwidth is
reduced to minimize the peak-to-peak fluctuation of the noise; the level then observed on the microwave
analyzer display is the displayed average noise level, expressed in dBm. The displayed average noise
level of the microwave analyzer can be regarded as equivalent to its sensitivity.
Linear display
The vertical scale on the display is proportional to the input signal voltage. The bottom line of the screen
represents 0 V, and the top line represents the reference level (a non-zero value that depends on the
specific microwave analyzer). For most analyzers, the scale factor equals the reference level divided by
the number of vertical divisions. In linear display mode, the values measured by the microwave analyzer
are expressed in volts or watts.
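For instance, under this definition the linear scale factor is simply the reference level divided by the number of vertical divisions. The hypothetical Python sketch below illustrates this, together with a common dBm-to-volts conversion for a 50 Ω input; the function names and numeric values are assumptions chosen purely for illustration.

import math

def linear_scale_factor(ref_level_volts: float, divisions: int = 10) -> float:
    """Scale factor (volts per division) for a linear display:
    reference level divided by the number of vertical divisions."""
    return ref_level_volts / divisions

def dbm_to_volts(p_dbm: float, z_ohms: float = 50.0) -> float:
    """Convert a power level in dBm into RMS volts across the given impedance."""
    p_watts = 10 ** (p_dbm / 10) / 1000.0   # dBm -> watts
    return math.sqrt(p_watts * z_ohms)       # from P = V^2 / Z

ref = dbm_to_volts(0.0)                      # 0 dBm into 50 ohms is about 0.224 V rms
print(round(ref, 3), round(linear_scale_factor(ref), 4))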
Relative amplitude accuracy
The uncertainty of an amplitude measurement in which the amplitude of one signal is compared with
that of another, regardless of the absolute amplitude of either. Factors affecting relative amplitude
accuracy include frequency response, display fidelity, and changes in input attenuation, IF gain, scale
factor and resolution bandwidth.
Sensitivity
The ability of the microwave analyzer to measure a minimum-level signal. Sensitivity is further divided
into input signal level sensitivity and equivalent input noise sensitivity. The former is the input signal level
that produces an output approximately equal to twice the average noise value, and the latter is the
average level of internally generated noise referred to the input. The best sensitivity is obtained with the
narrowest resolution bandwidth, minimum input attenuation and adequate video filtering. Factors
affecting sensitivity include the input attenuator, the preamplifier, the insertion loss of front-end devices,
and the bandwidth and noise sidebands of the IF filters. Although video filtering does not improve
sensitivity, it can improve the discernibility and repeatability of measurements at low signal-to-noise
ratios. The best sensitivity may conflict with other measurement needs. For example, a smaller
resolution bandwidth increases the sweep time; 0 dB input attenuation increases the input voltage
standing wave ratio (VSWR), which reduces the measurement accuracy; and using a preamplifier to
improve sensitivity affects the dynamic range of the microwave analyzer. The relationship between
sensitivity and resolution bandwidth is as follows:
PdBm = -174dBm + FdB + 10logB
Where: PdBm - sensitivity of the microwave analyzer
FdB - noise figure of the microwave analyzer (dB)
B - 3 dB bandwidth of the microwave analyzer (in Hz)
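As a worked example of the relationship above, the sketch below (a hypothetical Python helper, with the noise figure and bandwidth values chosen purely for illustration) evaluates the sensitivity for a given noise figure and 3 dB bandwidth, assuming the usual -174 dBm/Hz thermal noise floor at room temperature.

import math

def sensitivity_dbm(noise_figure_db: float, rbw_hz: float) -> float:
    """Displayed average noise level (sensitivity) in dBm:
    thermal noise floor (-174 dBm/Hz at ~290 K) + noise figure + 10*log10(B)."""
    return -174.0 + noise_figure_db + 10.0 * math.log10(rbw_hz)

# Example: 25 dB noise figure, 1 kHz resolution bandwidth
print(round(sensitivity_dbm(25.0, 1e3), 1))   # -> -119.0 dBm

Narrowing the resolution bandwidth by a factor of 10 lowers this figure by 10 dB, which is why the best sensitivity is obtained with the narrowest resolution bandwidth, at the cost of a longer sweep time.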
Noise marker
A marker whose value represents the noise level within a 1 Hz equivalent noise bandwidth. When the
noise marker is selected, the sample detection mode is activated, averaging several trace points around