In Calibrating Fiber Optic Instruments, I discussed calibrating fiber optic power meters, which measure optical power. This article discusses calibration for measuring optical loss, the most common measurement in fiber optics. But first, I must define loss.
Loss is the difference in optical power measured before and after some component(s) in a length of fiber. That component could be a splice, connector, patchcord or cable. It could also include a splitter used in a passive optical network (PON).
There are two different ways to measure loss in fiber optics: directly, using a light source and power meter, or indirectly, using an OTDR. Either way, the measurement of loss is the difference between two power levels, so the accuracy of the measurement depends on the linearity of the instrument’s measurement of optical power. Fortunately, modern instruments are controlled by microprocessors and linearity is generally not a problem, but it is checked when the instruments are serviced or calibrated.
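Since loss is just the difference between two power levels, the arithmetic is simple. Here is a minimal sketch (the function name and example power values are my own, not from the article) showing loss computed two equivalent ways: as a ratio of powers in milliwatts, and as a subtraction of power levels already expressed in dBm:

```python
import math

def loss_db(power_in_mw, power_out_mw):
    """Optical loss in dB from two power measurements in milliwatts."""
    return 10 * math.log10(power_in_mw / power_out_mw)

def to_dbm(power_mw):
    """Convert a power in milliwatts to dBm (dB relative to 1 mW)."""
    return 10 * math.log10(power_mw)

# Example: 1.0 mW launched, 0.5 mW received -> about 3 dB of loss.
print(round(loss_db(1.0, 0.5), 2))                      # 3.01

# Equivalently, subtract the two dBm readings.
print(round(to_dbm(1.0) - to_dbm(0.5), 2))              # 3.01
```

The second form is what a power meter effectively does when it displays loss relative to a stored reference level.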
The calibration of loss is quite another issue, though. How do you measure the optical power before the loss to have a reference level that is essentially no loss, or how do you calibrate the instrument and measurement setup for “0 dB” loss?
The most common measurement with a light source and power meter is the loss of a fiber optic cable plant. The test setup you are probably familiar with looks like this:
What we want to measure is the loss of the cable plant under test, which may include several segments of connected or spliced cables and long lengths of fiber, plus the loss of the connectors on each end.
To calibrate the loss measurement, we must have a reference point for “0 dB,” which is the output of the launch reference cable. During setup for the test, use the power meter to measure the optical power output of the launch cable and set this as the “0 dB” reference on the power meter.
To complete the loss test setup, connect the cable plant under test to the launch cable; this adds a connection that will have some loss. On the other end of the cable plant, where we connect the power meter, add another reference cable (the receive cable) to the meter. When the receive cable is connected to the cable plant under test, that connection allows measuring the loss of the connector on that end of the cable plant.
When the test is made, the light output of the source goes down the launch cable fiber to the connection to the cable plant under test, where it will have loss due to the connection. The light will have loss as it travels down the fiber in the cable plant through other components such as splices, connectors or splitters. Finally, the light will be attenuated by the connection to the receive reference cable.
What the meter measures after all the loss is the difference in power after the cable plant loss compared to the calibrated “0 dB” at the output of the launch reference cable, which is exactly what we want to know.
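The procedure above can be sketched in a few lines. This is a hypothetical illustration of the meter's arithmetic, not any instrument's actual firmware: the tech measures the launch cable output, stores it as the "0 dB" reference, and the meter then reports loss as the reference level minus the measured level:

```python
import math

def to_dbm(power_mw):
    """Convert a power in milliwatts to dBm."""
    return 10 * math.log10(power_mw)

# Step 1: measure the output of the launch reference cable and
# store it as the "0 dB" reference (example value, assumed 1.0 mW).
reference_dbm = to_dbm(1.0)

# Step 2: connect the cable plant under test plus the receive cable
# and measure the power at the far end (example value, assumed 0.25 mW).
measured_dbm = to_dbm(0.25)

# The displayed loss is the difference from the stored reference.
# It includes the connections on both ends of the cable plant.
loss = reference_dbm - measured_dbm
print(round(loss, 2))   # 6.02
```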
I created a diagram (below) that illustrates this. The cable plant to test is connected between points 1 and 2, and the loss measurement includes all the loss in the cable plant, including the connections on each end.
You can see how the 0 dB calibration is before the initial connection, and the measurement of loss includes the final connection to the receive cable.
This calibrated 0 dB loss reference needs checking periodically to ensure the light source has not varied in output and the launch cable has not degraded from repeated use. So essentially, the test technician calibrates the meter, source and launch cable for the test as necessary.
The other way to test loss is with an OTDR. The OTDR uses an intrinsic characteristic of an optical fiber, backscatter, to make measurements. The OTDR display looks just like the lower half of the drawing above, as it can take a snapshot of the loss of the fiber under test. We can modify the drawing above and make it an OTDR test diagram, which should be familiar to anyone who has used an OTDR.
This drawing shows the OTDR markers (vertical blue lines) that define the two points between which the OTDR measures loss. The “A” marker is at the beginning of the test, before the connection at the launch cable, and the “B” marker is at the end of the cable plant under test, after the connection to the receive cable. The “A” marker sets the 0 dB calibration point, and the “B” marker is the point at which the instrument measures loss.
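The OTDR's two-point measurement reduces to the same subtraction, only applied to the backscatter trace rather than to transmitted power. A minimal sketch, with hypothetical trace levels of my own choosing: the loss is the trace level at marker A minus the level at marker B:

```python
# Hypothetical backscatter trace levels in dB at the two markers.
# Marker A: before the launch-cable connection (the 0 dB point).
# Marker B: after the connection to the receive cable.
trace_level_a = -10.0
trace_level_b = -17.5

# Two-point loss is simply the drop in trace level between markers,
# capturing everything in between: both end connections, splices,
# connectors, splitters and the fiber itself.
two_point_loss = trace_level_a - trace_level_b
print(two_point_loss)   # 7.5
```

The values are illustrative only; a real OTDR reads these levels off its measured trace at the marker positions.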
Comparing the two diagrams shows how similar the two measurements are, but remember how differently they are made. The different measurement techniques may result in different loss values and have different sources of measurement uncertainty. However, both begin with a calibration of 0 dB loss to make the measurement.
About The Author
HAYES is a VDV writer and educator and the president of the Fiber Optic Association. Find him at www.JimHayes.com.