Fiber Optic Standards

What would we do without them?

Standards affect all of us, everywhere we go. The minute details of standards are required to allow widespread use of products and to facilitate commerce. Standards define measurements such as the foot and pound (or meter and kilogram for those in the rest of the world), and standards define systems, such as our electrical distribution system (120V AC, 60 Hz here in the United States; 220V AC, 50 Hz in most of the world). Imagine the problems we would have if electrical power specifications differed so that not only were the voltage and frequency different, but even the plugs and outlets were unique to each system.

The more complicated the product or system, the more detail is needed in the standards. When you get to something as complicated as our digital phone systems or computer networks, the standards have to include every operational detail, from the cabling to the way digital data is formatted and transferred. The standards must be detailed enough to allow any manufacturer to develop products that will successfully connect to any other manufacturer's products. This “interoperability” is the basis of most standards.

When fiber optics was first being commercialized in the late 1970s, practically all the components and systems products were under AT&T’s control, since the company held the telephone monopoly. As the fiber optic market grew and divestiture loomed, industry groups were formed to develop fiber optic standards. In the United States, the EIA (Electronic Industries Association) sponsored the first fiber optic standards meetings.

In the beginning, meetings were spent deciding which standards were necessary—and arguing whether standards would slow the development of technology. Eventually everyone agreed we needed standards, so a short list of necessary standards became the first assignments. These covered specifications for such components as fiber, cable, connectors and splices, and how to test component and system performance (loss, bandwidth) and durability (temperature, humidity, altitude, pulling tension, abrasion, flexing, etc.).

In the first few years, we (yes, I was there as a participant) developed these basic standards to allow commerce and interoperability. Once we had standard specifications for fiber and cable, connectors and splices, plus standard procedures for testing the loss, bandwidth and other performance parameters, we had the basic standards in place.

Unlike the TIA/EIA 568 standard for structured cabling, the TIA/EIA fiber optic standards do not specify a standard network architecture. While 568 was designed around UTP cabling used in buildings, fiber standards are applicable to all fiber installations, whether premises or outside plant (buried, aerial, undersea, etc.) and do not dictate how the components are used. In fact, specific fiber standards cover how to measure the performance of fiber optic components in highly diverse environments, such as being installed in buildings, aerially, underground or even on jet aircraft.

For most installers and end users, the most important specifications are those that cover testing after installation, as these provide feedback on the quality of installation. Standards for loss tests after fiber optic cabling installation are covered under OFSTP-14 for multimode cable and OFSTP-7 for single-mode cable. Both standards require using an optical loss test set (or power meter and source) to make an insertion loss test.
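The insertion loss test described in these standards boils down to simple arithmetic: the power measured through the cable under test is compared against the 0 dB reference power. A minimal sketch, with illustrative function names and power levels of my own choosing (not values from the standards):

```python
# Hypothetical sketch of the arithmetic behind an insertion loss test.
# The function name and the example power levels are illustrative only.

def insertion_loss_db(reference_dbm: float, measured_dbm: float) -> float:
    """Insertion loss is the drop from the 0 dB reference power to the
    power received through the installed cable, expressed in dB."""
    return reference_dbm - measured_dbm

# Example: reference set at -20.0 dBm, power through the cable -21.5 dBm,
# giving 1.5 dB of insertion loss.
print(f"{insertion_loss_db(-20.0, -21.5):.1f} dB")
```

The test set does exactly this subtraction internally, which is why setting the 0 dB reference correctly matters so much, as the next section explains.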

The most confusing part of these two standards is that they offer three different methods of making the test, and the methods can give different loss results. The methods differ in how one sets the zero reference with the reference test cables. All tests require attaching a launch reference cable to the source and a receive reference cable to the meter, but the 0 dB reference may be set with just the launch cable, with both launch and receive cables, or with both reference cables plus a known-good third cable representing the cable to be tested. Since these three methods include different numbers of connectors in the reference measurement (zero, one or two), the measured loss on the same cable will differ by the amount of loss in those connectors, approximately 0.3 dB per connector.
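The effect of the reference method on the reported loss can be sketched numerically. This is illustrative arithmetic only; the 0.3 dB/connector figure is the approximation cited above, and real connector losses vary:

```python
# Illustrative arithmetic: how the choice of reference method shifts the
# reported loss on the same cable plant. 0.3 dB/connector is the
# approximate figure cited in the text; actual connector losses vary.

CONNECTOR_LOSS_DB = 0.3  # approximate loss per mated connector pair

def reported_loss(cable_loss_db: float, connectors_in_reference: int,
                  connectors_in_link: int = 2) -> float:
    """Connectors already included when the 0 dB reference was set are
    subtracted out of the measurement; the rest appear in the result."""
    connectors_measured = connectors_in_link - connectors_in_reference
    return cable_loss_db + connectors_measured * CONNECTOR_LOSS_DB

# Same cable (say 1.0 dB of fiber loss, two end connectors), measured
# with the one-, two- and three-cable reference methods:
for refs in (0, 1, 2):
    print(f"{refs} connector(s) in reference: {reported_loss(1.0, refs):.1f} dB")
```

The spread between the three reported values (here 1.6, 1.3 and 1.0 dB for the same cable) is exactly why a test report must state which method was used.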

Why would three methods be written into a standard? Over the 25-year history of fiber optics, there have been more than 85 different types of fiber optic connectors. Some, like the ST or SC, are easy to adapt to testers, while others, like the MT-RJ, are difficult and usually require hybrid reference cables for testing. Sometimes, with hybrid reference cables, it is impossible to do the one-cable or even two-cable reference. The three methods allow the installer to choose the test that best suits the connectors on the cable plant.

International standards, meanwhile, add a fourth method, using an OTDR, usually required for outside plant installations where splice verification is desirable. Confusing? Not really, as long as test reports provided to the customer disclose the method used. EC

HAYES is a VDV writer and trainer and the president of The Fiber Optic Association. Find him at

