The Future Was in the Past: The evolution of PQ instruments through the decades

Published On: Jun 15, 2021

In a recent interview for a historical report on power quality instruments, I was asked to describe the evolution of PQ monitors during my 41-year tenure with one of the leading design/manufacturing companies.

In the late 1970s, the Model 606 was the first commercially available, microprocessor-based PQ instrument. Trigger settings were made with a flathead screwdriver on rotary switches on the front panel. It had a rectified input and switching power supply, and a room full of them made a very audible screeching sound that caused your brain to feel as if it would explode like in the movie “Scanners.” Perhaps its most memorable feature was the thermal printer, which wrote on a 2-inch-wide roll of paper tape that turned blue where the printhead recorded the INCs, DECs, SAGs, SURGES (what swells were originally called) and IMPULSES (what all transients were initially called). Should that screwdriver (not the operator, of course) have inadvertently set a switch in the wrong position, you were often greeted at the next visit to the monitoring site with a large pile of thermal paper on the floor. Finding the useful data was a challenge.

The next revolution, in the 1980s, provided much more detailed information about PQ events, also called disturbances, including graphical capabilities. A picture is only worth a thousand words if someone knows how to interpret voltage waveforms. The new capabilities helped someone skilled in the art determine the source of the disturbances, a task usually handled by the 5–10 engineers in an electric utility’s PQ department. In many cases, PQ problems originate within the facility itself, caused by some of its own loads and affecting other, more susceptible loads. Facility managers put their instruments on their side of the meter, with the utility engineers monitoring from the other.

The ’90s brought instruments with multiprocessor architectures using several digital signal processors (DSPs), microprocessors designed to do intense math very quickly and efficiently. This was required because the calculation of harmonics and power parameters was overloading the traditional CPU. The growth of surface-mount parts and programmable logic devices allowed the 45-lb. instruments to shrink down to handheld units with color LCD screens and several hours of battery life.
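For readers curious what that DSP workload looks like, here is a minimal sketch in Python (hypothetical sample rate and voltages, not taken from any actual instrument's firmware) of per-cycle harmonic magnitudes computed with an FFT. Real instruments follow the more careful windowing and grouping rules defined in the measurement standards.

```python
import numpy as np

def harmonic_magnitudes(cycle_samples, max_harmonic=13):
    """RMS magnitude of each harmonic over exactly one fundamental cycle,
    via the FFT, the kind of per-cycle math that was offloaded to DSPs."""
    n = len(cycle_samples)
    spectrum = np.fft.rfft(cycle_samples)
    # Bin k of a one-cycle window is the k-th harmonic; scale to RMS volts.
    return {k: 2 * abs(spectrum[k]) / n / np.sqrt(2)
            for k in range(1, max_harmonic + 1)}

# Hypothetical test waveform: 120 V fundamental plus 5% fifth and 3% seventh.
fs, f0 = 7680, 60                        # 128 samples per 60 Hz cycle
t = np.arange(128) / fs                  # exactly one fundamental cycle
v = 170 * (np.sin(2 * np.pi * f0 * t)
           + 0.05 * np.sin(2 * np.pi * 5 * f0 * t)
           + 0.03 * np.sin(2 * np.pi * 7 * f0 * t))

mags = harmonic_magnitudes(v)
print(round(mags[1], 1), round(mags[5], 1), round(mags[7], 1))  # ~120.2, 6.0, 3.6
```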

Communication methods evolved in the late ‘90s from phone-line modems to Ethernet and wireless connections to LANs (local area networks) and WANs (wide area networks). PQ notifications and information were now available over an internet connection, viewed simultaneously by facility personnel, utility personnel and third-party consultants. Data storage in PQ instruments had grown from 2,000 bytes in the first instrument to gigabytes of solid-state memory. Deploying numerous PQ monitors throughout the distribution system and within a facility allowed engineers to watch the effects of a PQ disturbance propagate, or travel, hundreds of miles, and it enabled large-scale benchmark studies.

One of the capitalistic results of the growth of portable and permanently installed PQ monitors was that dozens of manufacturers began producing instruments. Unfortunately, data from one manufacturer’s instrument often couldn’t be correlated with another’s, since they might use different algorithms for calculations, even for something as simple as root mean square. The IEEE and IEC standards-making committees addressed this problem, producing a dozen or more standards on how to measure PQ phenomena with reproducible, verifiable results. These standards applied to instruments and to PQ analysis software, itself a growing market served by instrument manufacturers and independent software companies.
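To illustrate how even something as simple as root mean square can differ from one instrument to the next, here is a minimal sketch in Python (hypothetical sample rate and sag, not any manufacturer's algorithm) contrasting a single whole-record RMS with a one-cycle RMS refreshed every half cycle, the style of measurement later formalized in standards such as IEC 61000-4-30. The two approaches report very different numbers for the same sag.

```python
import numpy as np

def rms_whole_record(samples):
    """One RMS value over the entire capture, an early, simple approach."""
    return np.sqrt(np.mean(np.square(samples)))

def rms_half_cycle_updated(samples, samples_per_cycle):
    """One-cycle RMS windows refreshed every half cycle, similar in spirit
    to the half-cycle-updated RMS later standardized for sag detection."""
    half = samples_per_cycle // 2
    return np.array([
        np.sqrt(np.mean(np.square(samples[i:i + samples_per_cycle])))
        for i in range(0, len(samples) - samples_per_cycle + 1, half)
    ])

# Hypothetical waveform: nominal 120 V with a three-cycle, 50% voltage sag.
fs, f0, n_per_cycle = 7680, 60, 128
t = np.arange(0, 0.2, 1 / fs)
v = 170 * np.sin(2 * np.pi * f0 * t)
v[(t >= 0.05) & (t < 0.10)] *= 0.5

print(round(rms_whole_record(v), 1))                           # ~108 V, sag averaged away
print(round(rms_half_cycle_updated(v, n_per_cycle).min(), 1))  # ~60 V, sag clearly seen
```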

By the turn of the century, we had climbed the pyramid from data to information to answers, heading toward the next level: knowledge. While data was now being measured in gigabytes, the number of capable analysts began to shrink through retirements, reductions in utilities’ PQ departments (which weren’t considered profit centers), the overload of facility managers wearing multiple hats and the consolidation of PQ instrument manufacturers through mergers and acquisitions. More instruments produced more data, transmitted to more locations, but there were fewer people to determine what made the lights blink, what stopped the processes, or even what caused a blackout.

What most people really want is just for their processes to work efficiently and productively. They don’t care about the squiggly lines that excite PQ engineers. If something happened or is about to happen, they want to be told what it was, who is going to fix it quickly and how to prevent it from happening again. They want the next step, knowledge, or maybe even wisdom, but on their terms, not in “geek speak.”

This leads back to the title. In the mid-‘90s, a company out of San Diego was well ahead of its time with a software program called A.I. Power. This was long before A.I. became a trendy buzzword. It tried to deliver exactly those answers with the tools and skill sets available at the time. Unfortunately, if you aren’t 99.9% accurate, one misdiagnosis will sour the user on the technology. Maybe it’s time for the next generation of PQ instruments to seriously revisit that vision and make PQ technology truly valuable to those who don’t even know they need it.

About the Author

Richard P. Bingham

Power Quality Columnist

Richard P. Bingham, a contributing editor for power quality, can be reached at 732.248.4393.
