According to the Nuclear Energy Institute (NEI), the nuclear energy industry’s policy organization, U.S. nuclear power plants posted an average daily capacity factor of more than 98 percent, peaking at 99.6 percent, during the first two weeks of August. The plants played a vital role in maintaining electricity service through that two-week stretch of very high temperatures.
Capacity factor is a measure of power plant efficiency: the ratio of the electricity a plant actually generates to the amount it could have produced running continuously at full power over the same period.
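As a rough illustration of that calculation, here is a minimal sketch in Python; the function name and the plant figures are hypothetical, not taken from NEI:

```python
def capacity_factor(actual_mwh, capacity_mw, hours):
    """Ratio of actual generation to the maximum possible generation.

    actual_mwh:  electricity actually generated over the period (MWh)
    capacity_mw: nameplate generating capacity (MW)
    hours:       number of hours in the period
    """
    return actual_mwh / (capacity_mw * hours)

# Hypothetical example: a 1,000 MW plant that generates 330,000 MWh
# over a two-week (336-hour) stretch of near-continuous operation.
cf = capacity_factor(330_000, 1_000, 336)
print(f"{cf:.1%}")  # 98.2%, roughly the fleet average NEI reported
```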
The 104 nuclear power plants operating in 31 states have a combined generating capacity of 100,125 megawatts of electricity, enough to meet the yearly electricity needs of approximately 62 million Americans. According to NEI, nuclear power plants account for about 11 percent of U.S. total electricity generation capacity, but because they operate at high levels of efficiency and reliability, they produce nearly 20 percent of the country’s annual electricity supply.
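The gap between those two percentages is straightforward arithmetic: a fleet's share of annual generation equals its share of capacity scaled by the ratio of its own capacity factor to the grid-wide average. A short sketch of that relationship, where the grid-wide average capacity factor is an assumed figure for illustration only:

```python
# Generation share = capacity share x (own CF / grid-average CF).
nuclear_capacity_share = 0.11  # from the article: ~11% of U.S. capacity
nuclear_cf = 0.90              # roughly the fleet average cited here
grid_avg_cf = 0.50             # assumption for illustration, not an NEI figure

generation_share = nuclear_capacity_share * (nuclear_cf / grid_avg_cf)
print(f"{generation_share:.0%}")  # ~20% of annual electricity supply
```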
U.S. nuclear power plants have performed at an average industry capacity factor of more than 87 percent for the past seven years. Last year, nuclear plants produced the second-highest amount of electricity in the industry’s history—more than 787 billion kilowatt-hours (kWh). According to NEI, only three countries—China, Japan and Russia—generated more electricity from all sources than U.S. nuclear power plants alone.
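Those annual figures hang together: dividing the year's output by the most the fleet could have produced over the year's 8,760 hours gives the implied capacity factor. A quick consistency check using the numbers above:

```python
annual_kwh = 787e9       # from the article: >787 billion kWh last year
capacity_mw = 100_125    # from the article: combined fleet capacity
hours_per_year = 8_760

max_kwh = capacity_mw * 1_000 * hours_per_year  # convert MW to kW
implied_cf = annual_kwh / max_kwh
print(f"{implied_cf:.1%}")  # ~89.7%, in line with the 87%+ seven-year average
```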
U.S. nuclear plants also operated at a record-low production cost of 1.72 cents/kWh in 2006, counting both fuel and operations and maintenance expenses. Coal-fired power plants produced electricity at 2.37 cents/kWh that year, and natural gas-fired plants averaged 6.75 cents/kWh, according to Global Energy Decisions data. EC