For this example, let's assume the measured output current from the transducer is 0.996 mA dc. Based on a maximum value of 1 mA equaling 500 Watts, the displayed Watts in the Transducer Output section of the Transducer Test Screen should read:

500 * 0.996 = 498.0 Watts
The Accuracy displayed in the Transducer Output section would be equal to the following:

((498.0 - 499.96) / 499.96) * 100 = -0.392 % accuracy
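The two calculations above can be sketched as follows. The full-scale values (1 mA = 500 Watts) and the 499.96 W applied power come from the worked example; the function names are illustrative only:

```python
# Sketch of the displayed Watts and percent Accuracy calculations
# from the worked example (1 mA full-scale output = 500 W full scale,
# 499.96 W applied).

def displayed_watts(measured_ma, full_scale_ma=1.0, full_scale_watts=500.0):
    """Scale the measured transducer output current to engineering units."""
    return full_scale_watts * (measured_ma / full_scale_ma)

def percent_accuracy(displayed, applied):
    """Percent error of the displayed value relative to the applied power."""
    return (displayed - applied) / applied * 100.0

watts = displayed_watts(0.996)           # 500 * 0.996 = 498.0 W
acc = percent_accuracy(watts, 499.96)    # ((498.0 - 499.96) / 499.96) * 100
print(round(watts, 1), round(acc, 3))    # 498.0 -0.392
```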
If this were a 0.5 % transducer, the firmware would compare the accuracy values between the Setting Screen and the Test Screen and would display PASS in the Transducer Output section of the test screen. If this were a 0.2 % transducer, it would display TEST FAILED.
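The manual does not spell out the exact comparison the firmware performs; a plausible sketch, assuming the test passes when the magnitude of the measured percent accuracy is within the transducer's rated accuracy class, would be:

```python
# Hypothetical sketch of the PASS / TEST FAILED decision: assume the
# result passes when the measured percent accuracy is within the
# transducer's rated accuracy class. The -0.392 % figure comes from
# the worked example in the text.

def transducer_verdict(measured_accuracy_pct, accuracy_class_pct):
    if abs(measured_accuracy_pct) <= accuracy_class_pct:
        return "PASS"
    return "TEST FAILED"

print(transducer_verdict(-0.392, 0.5))  # PASS (0.5 % transducer)
print(transducer_verdict(-0.392, 0.2))  # TEST FAILED (0.2 % transducer)
```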
Note: All of the calculations are very similar when testing VAR 1 Element transducers. The primary difference is replacing the COS function with the SIN function. For example, let us assume that the test angle for the VAR transducer is 30 degrees. This is an important point, since if the voltage and current were in phase, the SIN of 0° is 0, and thus the VAR contribution is 0 at the in-phase condition. Only by rotating the phase angle to 30 degrees do we create a measured VAR output, which would be:

120.01 * 4.166 * SIN 30° = 249.98 VARs
Note: Calculations for VA transducers are the same, except there are no COS or SIN functions. Therefore, the apparent power (VA) calculation simplifies to Volts * Amps. For example, for the calculation above the apparent power is:

120.01 * 4.166 = 499.96 VA
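The VAR and VA examples above can be sketched together. The 120.01 V, 4.166 A, and 30 degree test angle values are taken from the text:

```python
import math

# Sketch of the VAR and VA calculations from the examples above.
# 120.01 V, 4.166 A, and the 30 degree test angle come from the text.

volts, amps, angle_deg = 120.01, 4.166, 30.0

vars_out = volts * amps * math.sin(math.radians(angle_deg))  # reactive power
va_out = volts * amps                                        # apparent power

print(round(vars_out, 2), round(va_out, 2))  # 249.98 499.96
```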
3.8.10.7.2 Power Factor 1 Element
The single element power factor transducer requires 1 voltage and 1 current to test. The MPRT will automatically select the first voltage and current channels available, V1 and I1. The test will initially start at the default values for voltage and current that are set in the Default Setting Screen; for example, 120 Volts L-N and 5 Amperes. The power factor transducer has a range of operation that correlates to the leading or lagging phase angle relationship between the voltage and current inputs. Therefore, when the user selects Power Factor 1 Element, the MIN and MAX nomenclature will change to read LEAD and LAG power factor values. The user is required to input the LEAD (MIN) and LAG (MAX) power factor values into the provided spaces (normally the same values, i.e. 0.5). The power factor is the trigonometric decimal equivalent value of the COS of the
angle between the V1 voltage and I1 current. For example, when the user inputs the LEAD and LAG power factor values in the Transducer Setting Screen, the firmware can calculate the required test angles for full scale values. Thus, for a LAG power factor value of 0.5, the current would need to lag the voltage by 60°. The Lead and Lag phase angles require that the vector display be changed to show angles as ±180°. If the default angle representation is 0 to 360 LAG, then the angle between the voltage and current will be considered lagging (current lags voltage). In this situation, the typical test angles may vary between 0 to 90 degrees lag and 359.9 to 270 degrees lag (90 degrees leading). This could cause some confusion to the user. Forcing the display to ±180° simplifies the testing considerably. The test will start at unity power factor, or ±0°. Since the default