Nuclear Power Industry Headed in Two Directions

On May 8, 2019 the National Public Radio website posted two articles related to the nuclear power industry. Each article reported on a separate, unrelated event. Taken together, however, they reveal two contrasting directions for the nuclear power industry.

The first article, entitled “Three Mile Island Nuclear Plant to Close, Latest Symbol of Struggling Industry,” could be considered the closing chapter of the story of the Three Mile Island nuclear accident that occurred 40 years ago.

The General Public Utilities (GPU) Three Mile Island Nuclear Generating Plant was located close to Harrisburg, Pennsylvania. At the time of the plant’s construction, generation, transmission and distribution facilities were all considered to be part of the utility’s regulated system. Under that regulatory model GPU could decide what type of generation facilities to build, and it would recover the costs of those facilities from its ratepayers through regulated rates.

Large base load nuclear power plants, like Three Mile Island, were supposed to be the perfect answer for our electricity-hungry economy. Nuclear plants do not emit pollutants, and the electricity from those plants was expected to be exceedingly cheap. The Chairman of the Atomic Energy Commission is reported to have said that electricity from nuclear power was “going to be so inexpensive it would not even have to be metered.”

But nuclear power did not turn out to be inexpensive. In fact, because of design changes found to be required during construction, it turned out to be an extremely expensive source of power. In addition, because of the recession of the 1970s, industrial electric consumption was lower than anticipated, and there was a question of whether the new plants were even needed. By the mid-1970s consumer advocates were arguing that regulatory agencies should order utilities to discontinue construction of their nuclear power plants and keep the costs out of rates.

The regulators were not initially sympathetic to the arguments of the consumer advocates. They did not order the discontinuation of construction and they approved rates that included recovery of the nuclear plant costs. However, that all changed on March 28, 1979, when an accident in Three Mile Island’s Unit 2 caused a partial meltdown of the nuclear fuel rods.

After the accident, those who opposed nuclear power because of its impact on rates were joined by those who opposed nuclear power because of concerns about its safety. This opposition was effective. Utility orders for 120 nuclear reactors were cancelled as virtually all plans for new plants were abandoned.

Even though new construction was halted, plants that were already in operation lived on. In the United States there are still 60 nuclear power plants with 98 reactors in operation. This includes Unit 1 at Three Mile Island, which was not damaged by the 1979 accident. In 2018 these 98 reactors produced about 20% of the nation’s electricity. And most importantly, they produced that electricity without emitting any carbon dioxide or other greenhouse gases.

With all of the concern about climate change, it would seem to make sense to find a way to retain, if not expand, nuclear power’s share of the nation’s electric production. However, things have changed since 1979. In most parts of the country generation is no longer considered to be part of the utility’s regulated system. In other words, most utilities can no longer build the generation plant they want and expect to recover the costs through their rates. For those utilities generation is now a competitive service, and its costs can only be recovered if the plant successfully competes with other sources of electric production.

Three Mile Island Unit 1 is typical of nuclear generating plants located in areas where generation is considered to be a competitive service. It has, in recent years, struggled to remain competitive with electricity produced by renewables and low cost gas produced by fracking. Now these nuclear units are at an age where they need expensive upgrades to continue in operation. The current competitive prices for electricity do not support the cost of those upgrades.

As explained in the NPR article, Exelon, the current owner of Three Mile Island Unit 1, sought subsidies from the Commonwealth of Pennsylvania to keep the plant in operation. However, Pennsylvania did not agree to the subsidies and Exelon announced the closure of Unit 1 effective in September 2019.

The fate of Three Mile Island Unit 1 likely reflects the fate of most of the other large base load nuclear generating plants. Where their owners are unable to recover costs either through regulated rates or government subsidies the plants are being retired.

And there is little likelihood that new large base load nuclear generating plants will be built to take their place. The only such plants currently under construction are Vogtle Units 3 and 4 which, if completed, will be owned primarily by Georgia Power Company. Vogtle Units 3 and 4 are turning out to be extremely expensive – current cost projections exceed $18 billion. These facilities rely on huge government subsidies and Georgia Power’s continuing ability to recover its generation costs through its regulated rates. In the absence of the subsidies and regulated rate recovery, this type of facility would be very difficult, if not impossible, to finance and construct.

Although it appears that large scale base load nuclear generation is going to be used less and less, the second article on the NPR website – entitled “This Company Says the Future of Nuclear Energy is Smaller, Cheaper and Safer” – describes a different type of nuclear generation that may be ready to take its place. The article describes the efforts of an Oregon company named NuScale Power to build smaller, simpler and less expensive nuclear generating plants. NuScale plans to build these modular plants at its own factory and to ship the completed plants to their points of use.

NuScale contends that its plants are safer than traditional nuclear plants because they do not rely upon pumps and generators – which can fail in the event of an emergency – to provide cooling for the reactors. Instead, the reactors are located in a containment vessel in a pool of water which provides passive cooling. The following video depicts the unique operation of the NuScale plant.

https://youtu.be/wDMsXEur1fs

NuScale claims that its plants can be used either grouped together as a base load facility or as a small scale back-up for the intermittent generation from a wind or solar farm. NuScale further claims that its generation will be less expensive than electric storage, the other electric source commonly considered as a back-up to renewables.

NuScale currently plans to install its first nuclear plant at the Idaho National Laboratory in 2026. Power from the plant will be used to operate the Lab and sold to the Utah Associated Municipal Power Systems for resale to its members’ customers.

David Rosenstein worked as an attorney and consulting engineer in the electric utility industry for 40 years. When he retired he wrote a book entitled Electrifying America: From Thomas Edison to Climate Change, which describes the evolution of the electric industry from the time Edison invented the light bulb until today. Each of his posts in this Blog describes a different aspect of electricity, the electric industry or the issues currently faced by the electric industry.

What is the Difference Between Central Station Generation and Distributed Generation?

In 1882, when Thomas Edison lit his first commercial light bulbs, he used direct current electricity produced by a dynamo located across the street from the building containing his new lighting fixtures. Edison attempted to maintain this business model, selling lighting systems throughout the world consisting of a small generating source and a short distribution line running from that source to the point of use.

George Westinghouse, with the help of Nikola Tesla, saw the shortcomings of Edison’s system. They developed an alternating current system that relied upon remote central station generating plants whose electricity could be delivered long distances to multiple customers. Because Westinghouse’s system was much more efficient than Edison’s, Westinghouse won the War of the Electric Currents.

Remote central station power plants, connected to customers by a complex delivery system of transmission lines, have now become the standard in the industry. The following explains how electricity is generated at a central station power plant:

The following explains how the electric transmission system is used to deliver electricity from a central station power plant to a local distribution system for final delivery to customers:

https://www.youtube.com/watch?v=TQg2Y0kp2vl

Westinghouse’s system was, however, far from perfect. The fossil-fueled central station generating plants emitted pollution and, because of their size, had to be added in large increments, often before they were needed by the utility’s customers. The transmission system required rights-of-way in controversial areas, was maintained by utilities with varying levels of commitment to that maintenance, was subject to outages caused by weather, faulty equipment and terrorist attacks, and lost as much as 10% of the energy it carried. Even with these flaws, for more than 100 years the system was the best method available for the delivery of reliable and affordable electric service.

That may, however, be changing. Distributed Generation (DG), that is, small scale generation located close to the point of use and similar to what Edison used in his early lighting systems, may be an efficient substitute for at least some portion of the current system of remote central station plants and the transmission network.

DG can come in the following forms:

  • Back-up generation used to ensure continued operation during an outage of the larger Grid. This type of DG has historically been used by health care facilities but has recently been expanded to more and more residential and commercial facilities.
  • A combination of generation sources (possibly including small scale thermal generation along with one or more renewable resources) that provide service to a major institution such as a university, a hospital or a government campus as well as the surrounding community. This is sometimes referred to as a micro-grid and can be operated either along with, or independent from, the larger Grid.
  • Site specific generation, such as an industrial facility’s cogeneration plant or residential roof top solar panels where the energy generated can be sold to the larger Grid.
  • Behind the meter generation where the output is used solely to reduce the owner’s purchases from their local utility and none of the output is sold to the larger Grid.

See this FERC White Paper for a full discussion of the potential uses and benefits of DG.

DG is currently installed primarily by customers who see a benefit from such use. Their benefit may be in the form of back-up service in the event of an outage, a reduction in costs, or a desire to consume electricity that is produced without carbon emissions.

DG can also provide benefits to the overall utility system in the form of reduced losses during long distance transmission, reduced pollution from central station thermal plants and improved system reliability. DG has not, however, historically been viewed very favorably by utilities. In fact, utilities have found ways to discourage its use by customers.

In recent years, regulatory agencies have imposed requirements on utilities that reduce their ability to discourage customer-installed DG. And utilities are now well aware of the benefits that they can gain from this DG. The system-wide benefits will not, however, be fully realized until the utilities can fully incorporate DG into their system operations and planning. And that will not occur until there is better implementation of the Smart Grid, under which the utility will have complete information regarding the operational status of all DG on its system. The following is an example of how a utility can use DG and the Smart Grid to benefit its system.

https://www.youtube.com/watch?v=uBdO7N88o98
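To give a rough sense of the arithmetic behind that idea, here is a minimal sketch (not a description of any actual utility system) of how DG output reported through the Smart Grid reduces the net load that central station plants and the transmission system must serve. All of the load figures and DG categories below are hypothetical.

```python
# Illustrative only: hypothetical system load and hypothetical DG telemetry.
system_load_mw = 5_000                # total customer demand at this hour (assumed)

# Output reported by DG installations through the Smart Grid (assumed values)
dg_output_mw = {
    "rooftop solar": 220,
    "campus micro-grid": 35,
    "industrial cogeneration": 150,
}

# Net load is what central station plants and the transmission system must supply
net_load_mw = system_load_mw - sum(dg_output_mw.values())
print(f"Net load served by central station generation: {net_load_mw:,} MW")
# -> 4,595 MW instead of 5,000 MW
```

With that kind of visibility, the utility can reduce dispatch from its central station plants, and the associated transmission losses, by the amount of DG output it can count on at any given hour.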

Who Controls the Electric Grid?

On September 4, 1882 Thomas Edison flipped a switch and, when 400 light bulbs lit up in an office building in New York’s financial district, everyone cheered. Today we flip a switch and, when our lights, our appliances and our computer equipment work, we think nothing of it.

Edison lit his light bulbs with a single dynamo generating electricity that was delivered under the street and up through the walls of the building on a simple distribution system. Today our service comes from a bulk electric system consisting of 7,000 power plants whose generation is delivered by 360,000 miles of high voltage transmission lines (over 115 kV) to a local distribution system managed by your utility. That high voltage transmission system is generally referred to as the Grid.

But who manages the Grid and how do they make sure that the lights come on every time that we flip the switch?

A short time ago the answer would have been simple.  Your local utility owned and managed the portion of the grid that interconnected its distribution system to its generating plants. Your utility probably also owned high voltage lines that interconnected to neighboring utilities so that they could buy and sell supplemental power to each other when needed.  

The weakness of this system first appeared in 1965 with a blackout of the Northeast United States that left 30 million people without power. The cause of the blackout was the weakness of the inter-utility interties created to facilitate utility-to-utility sales. In response to the 1965 Blackout the utility industry agreed that the utility-by-utility planning was not working. They promised to start planning high voltage transmission systems on a regional basis and to voluntarily implement uniform reliability procedures.

The Grid, as we now know it, developed as a result of this regional planning process. It enabled utilities to reliably transmit power over multiple utility systems. In effect, the Grid could be operated like the interstate highway system where a power plant operating at almost any point on the Grid could deliver its power to almost any other point on the Grid. 

In 1995 the Federal Energy Regulatory Commission (FERC) issued its Open Access Orders requiring every utility to provide non-discriminatory access to its high voltage transmission system. This meant that every utility was required to transport power produced by any other utility or by any non-utility generator to any customer located anywhere on the Grid. It also meant that generation could be separated from transmission and distribution service and sold as a competitive commodity.

When it issued its Open Access Orders, the FERC was concerned that utilities could not be trusted to provide access on a non-discriminatory basis and would instead favor their own generation at the expense of other parties’ generation. The FERC also feared that it would have to deal with a raft of complaints from generators claiming that utilities were violating the non-discriminatory access provisions of the Open Access Orders.

In order to avoid dealing with these disputes, the FERC strongly urged utilities to turn control of their transmission facilities over to new entities called Independent System Operators or Regional Transmission Organizations (ISO/RTOs). ISO/RTOs are non-profit entities run by an independent Board of Directors whose members are elected by the ISO/RTO members, which include utilities, generators and customers.

Utilities that join an ISO/RTO retain ownership of their high voltage transmission facilities. But they operate those facilities at the direction of the ISO/RTO. The ISO/RTO is responsible for coordinating and directing the flow of electricity over its region’s high-voltage transmission system. The ISO/RTO also performs the studies, analyses and planning to make sure that the region’s electricity needs will be met in future time periods.

The following video, prepared by the California ISO, describes an ISO/RTO’s responsibilities with respect to operation of its portion of the Grid:

The ISO/RTOs also manage the wholesale power markets in which competitive generation is bought and sold. The operation of those markets will be addressed in a later Post.  

The ISO/RTO is considered to provide all transmission on the regional Grid and ensures that such transmission is provided on a non-discriminatory basis. The revenues collected by the ISO/RTO for transmission service are turned over to the utilities whose facilities are used to provide that service and are designed to reimburse each utility for its prudently incurred costs plus a reasonable return on its investment.

The following are the ISO/RTOs that have been created in the United States:

  • California Independent System Operator (CAISO)
  • Electric Reliability Council of Texas (ERCOT)
  • Southwest Power Pool (SPP)
  • Midcontinent Independent System Operator (MISO)
  • PJM Interconnection (PJM)
  • New York Independent System Operator (NYISO)
  • ISO New England (ISO-NE)

The utilities in the Southeast, the Northwest and the Southwest (other than California) have not joined ISO/RTOs and continue to both own and operate their own high voltage transmission facilities.  

In 2003 there was another blackout of the Northeast United States. This time 50 million people were without power. The 2003 Blackout was the result of the failure of at least one utility (FirstEnergy Corporation) to adhere to the voluntary reliability standards adopted by the industry after the 1965 Blackout. Congress decided that the reliability of the Grid was too important to rely upon utility promises of voluntary compliance. It, therefore, gave the FERC authority to make compliance with the standards of the NERC (the North American Electric Reliability Corporation) mandatory and to impose penalties of up to $1 million per day for failure to comply.

Therefore, the answer to the question of who manages the Grid has three parts:

First, the utilities still own the high voltage transmission lines that make up the Grid. They are responsible for maintaining those facilities and keeping them in good working order.

Second, in most parts of the country the ISO/RTOs are responsible for directing the operation of the Grid and for long term planning.

Third, the FERC is responsible for making sure that the utilities operate and maintain their facilities in compliance with the reliability standards adopted by the NERC.

Northeast Blackout of 2003 – The Failure of Voluntary Compliance

August 14 was one of the hottest days of 2003 in the Northeast United States. Late in the afternoon people were beginning to leave work in anticipation of a cool evening in their air-conditioned homes. 

But at 4:10 P.M. a wide scale electric blackout shut off power to over 50 million people from Detroit, Michigan to Toronto, Canada. Homes and businesses were left in the dark without air conditioning. Workers were stuck on elevators and on subways that suddenly came to a stop. Commuters were caught in gridlock as traffic lights stopped working. And, eventually, the water supply was at risk because the electric pumps used to move the water had no power to operate.

The 9/11 attack on the World Trade Center was still fresh in the public’s mind in 2003 and, initially, there were fears that this was another terrorist attack. But terrorists had nothing to do with the 2003 Blackout. It was, instead, a failure of an aging electric grid.

This was not supposed to have happened. After a similar blackout in 1965, the electric utility industry had promised that they would take action to prevent future wide scale outages.  They committed to creating regional organizations to coordinate transmission planning among multiple utility systems. And they promised to create a National Electric Reliability Council whose role would be to develop reliability standards and practices for adoption by the utilities and the regional organizations. 

There may have been some discussion of government oversight of implementation of the utilities’ promises. But the utilities convinced all regulators that they understood their responsibility to keep the lights on and that there should be no concern regarding their commitment to do whatever it took to prevent future wide scale outages. They promised voluntary compliance with the standards being promulgated by the National Electric Reliability Council.  

Fast forward 38 years and it turned out that not all of the utilities shared the same commitment to do whatever it took to enhance the reliability of the transmission grid. Instead, many of the utilities focused their efforts on maximizing profits in a rapidly deregulating electric industry. 

A prime example was FirstEnergy Corp. FirstEnergy is the current corporate name of what was initially Ohio Edison Company, a local electric utility headquartered in Akron, Ohio. Ohio Edison grew during the late 1990s and early 2000s when it acquired the Centerior Energy Corporation (consisting of the old Cleveland Electric Illuminating Company and the old Toledo Edison Company) and General Public Utilities (consisting of the old Jersey Central Power and Light, Pennsylvania Electric Company and Metropolitan Edison). 

During the years that it was focused on its growth strategy, FirstEnergy grew lax in meeting its reliability obligations. The 2003 Blackout started when a FirstEnergy-owned high voltage line went out of service after coming into contact with a tree. Had FirstEnergy complied with the North American Electric Reliability Council’s standards, the tree in question would have been trimmed so that the contact never occurred.

However, the failure to trim the tree was not the only issue. A computer system required by the reliability standards should have notified FirstEnergy operators when the line went out of service so that they could take action to prevent the outage from spreading. At the time of the outage, however, that computer system was itself out of service. And even if it had been in service, there have been suggestions that the FirstEnergy operators were not adequately trained to react upon receipt of the computer signal.

The 2003 Blackout was the result of a failure of voluntary compliance. Congress decided that, if we are to be assured of the reliability of the transmission grid, compliance with reliability standards would have to be mandatory.

In the Energy Policy Act of 2005 Congress gave the Federal Energy Regulatory Commission (FERC) authority to enforce mandatory reliability standards and to assess penalties of up to $1.0 million per day for failure to comply. Since that time nine Regional Entities have been established that are responsible for enforcing the reliability standards adopted by the NERC (now renamed the North American Electric Reliability Corporation). Those reliability standards include practices required to defend the system against cyber-attacks.

It is never easy for businesses to conform their operations to a new regulatory scheme. However, one would have thought that, when faced with potential penalties in the millions of dollars, the industry would have done its best to comply. NERC gave the industry a phase-in period of several years during which it assessed primarily nominal penalties. After the phase-in period was over, the utilities should have been up to speed and compliance with the reliability standards should have been part of their day-to-day business. So industry watchers were surprised when, in February 2019, Duke Energy, a utility owning and operating transmission facilities from Florida to Ohio, was fined $10 million for as many as 125 violations of the NERC standards going back over a period of three years.

After the 1965 Blackout the utilities showed that they could not be trusted to comply with voluntary reliability standards where there was no risk of penalty for non-compliance. Unfortunately, it is not yet clear that the utilities are any better at complying with mandatory reliability standards where the risk of penalty for non-compliance is significantly higher.

How Do Regulatory Agencies Set Just and Reasonable Rates?

As discussed in the Post entitled “Why Are Electric Utilities Regulated?” virtually every state, as well as the Federal Government, has incorporated the Regulatory Compact into its utility regulatory law. That means that the utilities are required to provide service and the government is required to authorize the utility to charge Just and Reasonable rates. 

Up until the 1990s, utilities’ regulated service – the service for which Just and Reasonable rates were determined – was considered to include all three components of such service: generation, transmission and distribution. A simple explanation of the generation, transmission and distribution functions can be found in the following YouTube video:

Beginning in the 1990s, in many parts of the country, the generation component of service began to be available on a competitive basis. The impact of this conversion to partial deregulation on the rate-setting process is discussed in more detail at the end of this Post. However, the following description of the rate-setting process applies to whatever portion of the service remains subject to regulatory ratemaking.

Just and Reasonable rates are those rates that allow the utility to recover its operating costs plus a reasonable return on its investment. Over time this has led to the following formula, which is used to determine the total revenues (referred to as the “Revenue Requirement”) that a utility will be permitted to recover over any 12 month component of the rate effective period:

Revenue Requirement = Prudently Incurred Costs of Operations + (Reasonable Rate of Return x Investment in Used and Useful Plant)

So, to calculate a utility’s Revenue Requirement, the state or Federal regulatory agency must determine three rate components for a representative 12 month period: (1) prudently incurred costs; (2) a reasonable rate of return; and (3) investment in used and useful plant. Those determinations are made in a judicial-like proceeding where the utility and other interested parties can present testimony and exhibits and argue for their interpretations of each of those components.
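To make the formula concrete, here is a minimal sketch of the calculation in Python. All of the dollar figures and the rate of return are hypothetical, chosen only to illustrate how the three components combine.

```python
# Hypothetical inputs, for illustration only
operating_costs = 450_000_000     # prudently incurred operating costs ($ per year)
rate_of_return = 0.075            # reasonable (weighted average) rate of return
rate_base = 2_000_000_000         # investment in used and useful plant ($)

revenue_requirement = operating_costs + rate_of_return * rate_base
print(f"Revenue Requirement: ${revenue_requirement:,.0f} per year")
# -> Revenue Requirement: $600,000,000 per year
```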

Prudently Incurred Operating Costs

Operating Costs include things like labor, materials and fuels. The process begins with the operating costs taken directly from the utility’s books and records for a representative 12 month period. For rate-setting purposes the regulatory agency will adjust the raw data for that period to reflect any “known and measurable changes” that are going to occur during the rate effective period. For example, labor costs may be adjusted to reflect negotiated wage increases that will take effect during the period in which the rates will be in effect.

Parties representing customer interests may argue for the disallowance, or exclusion, of certain operating costs that they believe were non-recurring or imprudently incurred. For example, they might argue that costs incurred by the utility to recover from an equipment outage that occurred during the representative 12 month period should not be included in rates because the outage was out of the ordinary and is not expected to occur again, or because the outage was caused by the utility’s failure to properly maintain the equipment.
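As a hypothetical illustration of these adjustments, the sketch below starts from the same assumed $450 million of test-year operating costs used above, adds a known and measurable wage increase, and removes a one-time storm restoration cost that a consumer advocate has argued should be disallowed. Every figure is invented for the example.

```python
# Hypothetical test-year adjustments, for illustration only
book_operating_costs = 450_000_000   # operating costs from the representative 12 months
labor_portion = 120_000_000          # labor expense included in that total
negotiated_wage_increase = 0.03      # known and measurable change effective in the rate period
storm_restoration_cost = 4_000_000   # one-time cost argued to be non-recurring

wage_adjustment = labor_portion * negotiated_wage_increase              # +$3,600,000
adjusted_costs = book_operating_costs + wage_adjustment - storm_restoration_cost

print(f"Operating costs used for rate-setting: ${adjusted_costs:,.0f}")
# -> $449,600,000 if the disallowance is accepted
```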

Reasonable Rate of Return

The Reasonable Rate of Return is a weighted average of the utility’s cost of capital. It includes the interest rate on the utility’s long term bonds, the dividends on any preferred stock, the interest rate on any short term debt and a return on outstanding equity. 

The return on equity is the profit component of the utility’s rates.  Therefore, the Reasonable Rate of Return component will include the taxes paid on the return on equity component. 

The return on equity should be equal to a return that is comparable to the return on other available investments in the market that are of a similar risk to the utility. Historically, the utility and the parties representing customer interests presented extensive arguments for their preferred returns on equity for the utility. In recent years regulatory agencies have found ways to reduce the contentiousness of this issue. 
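The weighted average itself is simple arithmetic. The sketch below uses an entirely hypothetical capital structure and hypothetical cost rates; it leaves out the income tax gross-up on the equity return discussed above.

```python
# Hypothetical capital structure: (component, share of total capital, cost rate)
capital = [
    ("long-term debt",  0.45, 0.050),   # bond interest rate
    ("preferred stock", 0.05, 0.060),   # preferred dividend rate
    ("short-term debt", 0.05, 0.035),
    ("common equity",   0.45, 0.095),   # authorized return on equity
]

# Weight each component's cost by its share of the capital structure
weighted_rate_of_return = sum(share * cost for _, share, cost in capital)
print(f"Weighted average rate of return: {weighted_rate_of_return:.2%}")
# -> about 7.00%
```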

Investment in Used and Useful Plant

As with the Operating Cost component, the Investment component starts with net investment for operating plant taken from the utility’s books and records for the representative 12 month period. For ratemaking purposes the actual amounts are adjusted to reflect plant that will be taken out of service and plant that will be added to service during the rate effective period. 

The Investment component was very contentious during the 1980s when large new nuclear plants came on line at a time when it did not appear that they were going to be needed to meet diminishing customer requirements. Parties representing customers argued that the investment in the new plants should not be included in rates because they were not going to be “used and useful” in providing service.

Conversion of Revenue Requirement to Rates

The Revenue Requirement, as determined above, is the total revenue that the utility is permitted to recover during any 12 month component of the rate effective period. So how is that Revenue Requirement converted to rates that customers will see on their bills?

The first step is to equitably allocate the Revenue Requirement among the utility’s rate classes – typically industrial, commercial and residential. The second step is to break the portion of the Revenue Requirement allocated to each rate class down into rates – typically a customer charge, a per kW demand charge (usually only for industrial customers) and a per kWh energy charge – based on projected customer usage during a 12 month period within the rate effective period.
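The sketch below shows, with entirely hypothetical numbers, how the portion of the Revenue Requirement allocated to a residential class might be broken down into a customer charge and a per kWh energy charge. (An industrial class would typically also carry a per kW demand charge.)

```python
# Hypothetical residential class allocation and billing determinants
class_revenue_requirement = 180_000_000   # $ allocated to the residential class
customers = 500_000                       # residential customers
projected_kwh = 4_500_000_000             # projected class energy use over 12 months

monthly_customer_charge = 10.00           # assumed fixed charge, $ per customer per month
customer_charge_revenue = customers * monthly_customer_charge * 12   # $60,000,000

# The remainder of the class Revenue Requirement is recovered per kWh
energy_charge = (class_revenue_requirement - customer_charge_revenue) / projected_kwh
print(f"Energy charge: {energy_charge * 100:.2f} cents per kWh")
# -> 2.67 cents per kWh
```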

Rates approved by the regulatory agency through the above described process will remain in effect until the regulatory agency changes the rates again. This could be one year or it could be many years. During the years that the rates are in effect, if the costs, return, investment and customer usage are the same as those used in the rate-setting process, the utility will earn the profit that was projected in that process. However, in years when any of these components differ from what was used in the rate-setting process, the utility will earn more or less profit than that which was projected.

Impact of Partial Deregulation

Where the generation component of service is available on a competitive basis, the customer is considered to be buying only transmission and/or distribution service from the utility, and the rate-setting process in such a case is based on the Operating Costs, Return and Investment related to those services. The customer will then receive an invoice for the regulated services from the utility and an invoice for the competitive generation services from the generation supplier. (In some cases the utility may collect the revenues for the supplier as a separate line item on the utility invoice.)

In cases where competitive generation is available but the customer elects to continue to purchase all three components from the utility, the utility will purchase the generation in the competitive market and will include the cost of the generation component as a pass-through charge on its invoice.
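As a simple illustration, the sketch below computes a hypothetical monthly bill in which the utility's regulated delivery charges appear alongside a generation charge passed through at the supplier's price. The rates and usage are assumed, not taken from any actual tariff.

```python
# Hypothetical monthly bill with a competitive generation pass-through
usage_kwh = 900                     # customer's monthly usage (assumed)
customer_charge = 10.00             # regulated fixed charge, $ per month (assumed)
delivery_rate = 0.045               # regulated transmission/distribution rate, $ per kWh (assumed)
generation_price = 0.055            # competitive supplier's price, $ per kWh (assumed)

regulated_charges = customer_charge + usage_kwh * delivery_rate       # $50.50
generation_passthrough = usage_kwh * generation_price                 # $49.50, no utility markup

print(f"Delivery (regulated):      ${regulated_charges:.2f}")
print(f"Generation (pass-through): ${generation_passthrough:.2f}")
print(f"Total bill:                ${regulated_charges + generation_passthrough:.2f}")
```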

Where partial deregulation for the generation component of service has occurred the regulatory agencies have not been relieved of their obligation to ensure that electric rates are Just and Reasonable. However, where they formerly applied the above described rate-setting process to the cost of generation to ensure Just and Reasonable rates they now ensure such rates by making sure that the market for the generation component is truly competitive.

Why Are Electric Utilities Regulated?

In the late 1800s and early 1900s anyone could build the facilities necessary to serve one customer, a few customers or a large community of customers. Most of these initial supply arrangements consisted of a small generator located at or near the point of usage.

But it soon became clear that economies of scale could best be achieved by using one or more generating plants interconnected to many customers by a network of transmission and distribution lines. Suppliers who had the financial capability to build the required network of plants and delivery facilities could undercut the prices of smaller providers. Those smaller providers either went out of business or merged with a larger provider. 

Eventually, most communities had only a single monopoly supplier. Advocates for consumers were concerned that a single investor-owned utility would try to maximize its profits by charging excessive rates for its service.  

And the investor-owned utilities had their own concerns. They were worried that municipalities might create a publicly owned utility or that another privately owned supplier might come to town and start a price war. Privately owned utilities were willing to give up their ability to fully capitalize on their monopoly position in exchange for some guarantee that they could retain that monopoly position.

Policy makers of the time arrived at something called the “Regulatory Compact”.  The Regulatory Compact is basically an agreement between the utility and the government.  Under that agreement the utility promises to invest in facilities necessary to provide service to customers within its service territory and to charge rates for those services that are set by the government. In exchange, the government promises to protect the monopoly status of the utility within a defined service territory and to authorize rates that permit the utility to recover its operating costs plus a reasonable return on its investment. 

Virtually every state has now incorporated a form of the Regulatory Compact in its state Public Utility Act. Those Public Utility Acts empower a three- or five-member regulatory agency (known as a Public Service Commission or a Public Utility Commission) to establish franchise service territories for each utility and to set “Just and Reasonable” rates for the services provided by the utility.

In 1935, with passage of Title II of the Federal Power Act, Congress gave to the Federal Power Commission (now named the Federal Energy Regulatory Commission) authority to set Just and Reasonable rates for interstate wholesale sales between utilities (a transaction that had previously been held by the US Supreme Court to be beyond the authority of the state regulatory agencies).
