IPTV Tabs
MDI/TR 101 290
MDI (Media Delivery Index)
The use of the Media Delivery Index as a testing metric provides the tools to
measure and diagnose network-induced impairments for IPTV streaming
media. The Delay Factor (DF) and the Media Loss Rate (MLR) together provide
a measure of the quality (quality of service) of a delivered media stream,
which can be directly correlated with the end user's ultimate Quality of
Experience (QoE).
Delay Factor (ms): The Delay Factor (DF) provides a measure of the
maximum packet delay variation over a period of 1 second. In other
words, the metric presents in milliseconds how much buffer would be
required in the next downstream network element to compensate for
the media packet jitter. Note that by definition (as detailed in RFC 4445),
a DF value representing a minimum of one media packet (in ms) is
reported when no jitter exists in the network. This represents the
minimum buffer size (in ms) required to properly process a media
packet, and this value changes depending on the media rate of the
stream. For example, if no jitter exists in the network, a typical
Standard Definition Television stream with a media rate of 3.75 Mbps
would exhibit a Delay Factor of 2.81 ms, while a High Definition
Television stream of 10 Mbps would exhibit a Delay Factor of 1.05 ms.
Average, Minimum, and Maximum values are also displayed.
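
The 2.81 ms and 1.05 ms figures above correspond to the time needed to
drain one media packet at the nominal stream rate. The short sketch below
is illustrative only; it assumes the common IPTV encapsulation of seven
188-byte MPEG transport packets per IP datagram (1316 payload bytes) and
simply reproduces those two values.

    # Minimum (no-jitter) Delay Factor, assuming seven 188-byte MPEG-TS
    # packets per IP datagram (1316 payload bytes = 10528 bits).
    TS_PACKET_BYTES = 188
    TS_PACKETS_PER_IP = 7
    PAYLOAD_BITS = TS_PACKET_BYTES * TS_PACKETS_PER_IP * 8

    def min_delay_factor_ms(media_rate_bps: float) -> float:
        # Time (in ms) to drain a single media packet at the nominal rate.
        return PAYLOAD_BITS / media_rate_bps * 1000.0

    print(round(min_delay_factor_ms(3.75e6), 2))  # 2.81 ms (SDTV example)
    print(round(min_delay_factor_ms(10e6), 2))    # 1.05 ms (HDTV example)
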
Media Loss Rate (pps): Indicates the count of lost packets in the last
second (packets per second). As per RFC 4445, out-of-order and
duplicate packets are considered lost packets.
Average, Minimum, and Maximum values are also displayed.
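
The counting rule above can be pictured with a small, hypothetical sketch
that scans one second's worth of sequence numbers (for example, the RTP
sequence field) and treats gaps, out-of-order arrivals, and duplicates as
lost. It illustrates the rule only; it is not the analyzer's implementation.

    # Per-second loss count from a monotonically increasing sequence number.
    # Gaps, out-of-order arrivals, and duplicates all count as lost packets.
    def media_loss_rate(seq_numbers):
        lost = 0
        expected = None
        for seq in seq_numbers:
            if expected is None:          # first packet seeds the counter
                expected = seq + 1
            elif seq == expected:         # in-order packet
                expected = seq + 1
            elif seq > expected:          # gap: the skipped packets are lost
                lost += seq - expected
                expected = seq + 1
            else:                         # out-of-order or duplicate packet
                lost += 1
        return lost

    print(media_loss_rate([1, 2, 3, 5, 6, 6, 7]))  # 2 lost (one gap, one duplicate)
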
Virtual Buffer Size (Bytes): Provides a measure of the buffer size that
would be required by a downstream network element to handle the delay
variation over the last second.
Average, Minimum, and Maximum values are also displayed.
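
The Virtual Buffer Size and the Delay Factor describe the same behaviour in
different units: a buffer draining at the nominal media rate must hold
roughly one Delay Factor's worth of stream data. The conversion below is an
approximation for orientation only, not the analyzer's exact computation.

    # Approximate buffer size (bytes) implied by a given Delay Factor.
    def virtual_buffer_bytes(delay_factor_ms, media_rate_bps):
        return delay_factor_ms / 1000.0 * media_rate_bps / 8.0

    # A 2.81 ms Delay Factor on a 3.75 Mbps SDTV stream is roughly one
    # 1316-byte media packet of buffering.
    print(round(virtual_buffer_bytes(2.81, 3.75e6)))  # ~1317 bytes
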