An upheaval is underway in the nation’s electricity sector, and Virginia is ground zero. As the data center capital of the world, the state faces surging demand, ballooning utility bills, and a bottlenecked grid — all challenges that policymakers are navigating while maintaining a legally mandated course toward carbon neutrality.
Now, the state is poised to become the first in the nation to quantify and examine ways to reduce waste on the electric grid — a potentially monumental move toward reining in rates and speeding the clean energy transition. Maximizing usage of our existing network of power lines and related infrastructure, backers say, could also help close the gap between the public interest and that of investor-owned utilities.
House Bill 434 would direct Appalachian Power Co. and Dominion Energy, the state’s two predominant vertically integrated utilities, to gather and report detailed data on their grid utilization. The measure won final approval from Virginia’s Democratic-controlled legislature this week and now heads to the desk of Gov. Abigail Spanberger — a Democrat whose victory in November was fueled in part by anxiety over rising electricity costs. As one of the earliest proposals Spanberger offered after her election to address energy affordability, the bill looks certain to become law.
Many experts say the information the measure would require is itself meaningful: Utilities have long resisted gathering and reporting such metrics, in part because doing so could hurt their case to build out more infrastructure that pads their bottom lines.
But advocates for HB 434 say its real impact could come after the utilization data has been reviewed by regulators, who must then establish a timeline for utilities to optimize grid usage. The bill directs officials to give special consideration to “non-wires alternatives” like batteries and line sensors.
“The fact that Virginia became the first state to introduce this sort of legislation is pretty significant,” said Charles Hua, the founder and executive director of PowerLines, a nonprofit that aims to lower utility bills and supports HB 434. “But this would just be the first step of a long journey.”
The legislation is premised on a striking reality: Roughly half the electric grid goes unused about 99% of the time. Poles, wires, substations, and other components are built out to deliver electrons during periods of maximum demand, such as the recent cold snap brought on by Winter Storm Fern. But those peak events are rare.
“This is where this conversation has been stuck for 20 years,” said Pier LaFarge, the co-founder and CEO of Sparkfund, which helps utilities deploy and manage distributed energy sources. “We’ve built the grid to peak … then said, ‘How much space is left?’ But what’s amazing is, the grid only is at peak 50 to 200 hours a year out of 8,760.”
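LaFarge's figures are easy to check with back-of-envelope arithmetic. The sketch below takes the 50-to-200-peak-hours range from his quote; the example system's demand numbers are hypothetical, chosen only to illustrate how much capacity sits idle on average.

```python
# Back-of-envelope math behind the utilization claim above. The 50-200
# peak-hour range comes from LaFarge's quote; the example system's demand
# figures are hypothetical, for illustration only.
HOURS_PER_YEAR = 8760

def peak_hour_share(peak_hours: float) -> float:
    """Fraction of the year the grid actually runs at its design peak."""
    return peak_hours / HOURS_PER_YEAR

def load_factor(avg_demand_mw: float, peak_demand_mw: float) -> float:
    """Average demand as a share of the peak the system is sized for."""
    return avg_demand_mw / peak_demand_mw

low = peak_hour_share(50)    # ≈ 0.006: at peak well under 1% of the year
high = peak_hour_share(200)  # ≈ 0.023: at peak at most ~2% of the year

# Hypothetical system: 100 GW of peak capacity serving 55 GW of average
# demand means nearly half the delivery capacity sits idle on average.
lf = load_factor(55_000, 100_000)  # 0.55
```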
Another factor is that some kilowatt-hours are lost as they travel from the point of generation to the customer, especially along lower-voltage AC distribution lines.
“Local poles and wires, that is, the distribution grid, is really not that efficient,” Hua said. “But you never would really know, because there’s not a ton of transparency around spending.”
HB 434 would prompt Appalachian Power and Dominion to examine and quantify these utilization gaps and inefficiencies as part of a regulatory proceeding this fall. The state’s utilities commission would then review and approve that data and direct the companies to increase grid utilization.
The measure requires regulators to evaluate key technologies — from energy storage to synchronous condensers, which reduce line loss — to improve use of the grid. It also opens the door for regulators to weigh grid utilization when considering utility proposals to instead expand their infrastructure.
In theory, these steps should lead to lower rates for customers. “Electricity rates are a math equation,” Hua said, where the top of the fraction is the cost of grid infrastructure, among other investments, and the bottom half is the number of kilowatt-hours sold.
Increasing grid utilization divides the fixed cost of the poles and wires — roughly the same numerator — by more electrons, a much higher denominator. “Therefore, you’re lowering the per-unit price of electricity,” Hua said, “and you’re lowering utility bills for all consumers.”
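Hua's fraction can be made concrete in a few lines. This is a minimal sketch of the denominator effect he describes; the dollar and sales figures are hypothetical, chosen only for round numbers.

```python
# A minimal sketch of the "denominator effect": the same fixed grid cost
# spread over more kilowatt-hours yields a lower per-unit rate.
# The cost and sales figures below are hypothetical.
def avg_rate_cents_per_kwh(annual_grid_cost_usd: float, kwh_sold: float) -> float:
    """Revenue requirement (numerator) over sales volume (denominator)."""
    return annual_grid_cost_usd / kwh_sold * 100

# Hypothetical utility: $12B in annual costs over 100B kWh of sales...
before = avg_rate_cents_per_kwh(12e9, 100e9)  # 12.0 cents/kWh
# ...then the same fixed costs over 10% more sales from better utilization:
after = avg_rate_cents_per_kwh(12e9, 110e9)   # ≈ 10.9 cents/kWh
```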
Exactly how significant this “denominator effect” will be isn’t clear yet — not without the data HB 434 requires utilities to compile. But experts say that growing the bottom of the fraction is a win for both customers and the investor-owned utilities, which make more money the more kilowatt-hours they sell.
Grid optimization also gives these utilities a pathway to making capital investments that earn them a guaranteed profit more quickly than building new power plants. That pathway runs through grid-scale batteries, according to LaFarge.
“Batteries have enormous value to the grid because they’re electron time machines. You can charge them up when there’s plenty of energy on the grid and no congestion or scarcity,” LaFarge said, and then discharge them when demand is at its height. “It creates more room on the grid using the grid you have. That unique nature of batteries is their superpower.”
While storage technology has been around for a decade, until very recently it was more expensive than building poles and wires and harder to justify to regulators.
“What has changed in the last 18 to 24 months is batteries have gotten staggeringly cheap,” LaFarge said, and utilities can invest in them and improve their bottom lines. “This is one of our most important messages around utilization: Utilities can earn more on capital assets [and] have higher revenue while delivering cheaper power to people.”
LaFarge’s company has worked with Dominion on other forms of distributed generation, including EV charging. For batteries, he said, “the Virginia utilization bill certainly creates an even bigger opportunity.”
To be sure, increased grid utilization is far from the only step Virginia lawmakers can take to tamp down skyrocketing electricity costs. Tying rates to performance metrics such as affordability and efficiency, increasing targets for batteries and other cheap sources of clean energy, and enabling more large-scale solar projects are among a host of legislative proposals that would also help lower prices — and that all could also become law this year.
It’s also true that the one-page HB 434 is more suggestion than mandate, and its speedy passage through the Virginia General Assembly — including by a nearly unanimous vote in the House of Delegates — raises questions about its impact. And the onus will be on the state’s utilities to measure, report, and improve grid utilization, albeit with prodding from regulators.
Still, Jigar Shah, a longtime energy entrepreneur and the director of the U.S. Department of Energy Loan Programs Office under former President Joe Biden, believes the legislation will put utilities on the hook, even as it gives them leeway to collect and analyze utilization data.
“What’s not acceptable is for folks to say, ‘It’s not possible and rates are going up 9% a year,’” said Shah, who helped shape and advocate for the bill as an adviser to the nonprofit Deploy Action. He also pointed to Spanberger’s support for the bill and regulators’ engagement with it.
“It’s not something that we expect to be buried in a [utility] filing and it goes to die,” he said. “I think there’s actual interest in it from folks on the commission to continue moving it.”
For LaFarge, the broad consensus around the legislation is a reason for optimism, not skepticism.
“This is a bipartisan idea that really is having its moment, and we’re excited to see the successes of this bill replicated in dozens of states,” LaFarge said. “I think the regulated utility compact is about to surprise people with its ability to solve these problems to the benefit of the climate, the economy, and people who use energy in their daily lives.”
Disclosure: Charles Hua is a member of Canary Media’s board of directors. The board has no influence over Canary Media’s reporting.
Electricity consumption growth rates are increasing across the United States, driven, in part, by a boom in hyperscale data center development. Although the long-term market outlook remains uncertain, the Lawrence Berkeley National Laboratory predicts that data center demand will grow from 176 terawatt hours (TWh) in 2023 (or, about 4.4% of total U.S. electricity consumption) to between 325-580 TWh (6.7-12.0%) by 2028.1 In some parts of the country, AI-driven energy demand is outpacing available capacity, driving companies to delay projects, contract power directly from private producers, or install multiple inefficient natural gas reciprocating generators.
Data centers may impact grid reliability in some regions. In July 2024, a voltage fluctuation in northern Virginia triggered the simultaneous disconnection of 60 data centers, prompting a 1,500-megawatt (MW) power surplus, which forced emergency adjustments to prevent cascading outages.2 Investors claim that massive investments in energy generation and grid infrastructure are needed to power data center development while mitigating outage risks. However, if the anticipated demand does not materialize, utilities (and their consumers) could face stranded costs.3
Data centers have enjoyed discounted energy tariffs and tax incentives, as state and local governments compete to attract business. Although these early incentives have driven substantial data center investments, emerging regulatory debates are impacting market development across the country. Policy shifts in major data center markets, such as the passage of Texas Senate Bill 6, signal the probability of future market intervention by both regulators and policy makers to address local-level concerns over reliability and affordability.
As data center infrastructure continues to expand, developing effective regulatory policies becomes critical. The future of data centers and their energy needs, as well as the policy decisions made in this realm, will impact U.S. technological competitiveness for decades to come. While overregulation could hinder AI development, insufficient regulation risks grid instability, rising consumer costs, reliance on high-emission energy sources, public backlash, and setbacks to state and corporate climate goals.
This policy brief outlines the current state (and potential consequences) of U.S. data center electricity usage and corresponding grid expansion. The paper provides an overview of the current data center and grid landscape followed by a discussion of potential engineering and policy approaches to address ensuing challenges. The foundations laid herein will inform our future research under the Project on Grid Integration at the Harvard Kennedy School (HKS) and the Harvard School of Engineering and Applied Sciences (SEAS). This initiative aims to advance 1) the development of new regulatory tools to incentivize increased grid flexibility and 2) the creation of more equitable cost-sharing mechanisms in the wake of expanding data center development. The brief concludes by outlining several critical questions that will guide the Project’s research over the next year.
According to the National Telecommunications and Information Administration (NTIA), there were over 5,000 data centers in the United States in 2024, with demand for data center services expected to grow through 2030.4 Accordingly, capital spending on hyperscale data center infrastructure has risen to unprecedented levels over the past five years. Amazon CEO Andy Jassy noted that AWS’s AI-related revenue is already a multibillion-dollar business “growing at a triple-digit, year-over-year percentage.” In 2024, Amazon, Microsoft, Google, and Meta collectively spent over $200 billion on capital expenditures (CapEx), representing a 62% year-over-year increase from 2023. Each firm’s spending reached an all-time high: Amazon’s CapEx was $85.8 billion5 (up 78% year-over-year), Microsoft’s was $44.5 billion6 (up 58%), Google’s was $52.5 billion7 (up 63%), and Meta’s was $39.2 billion8 (up 40%). Looking ahead, Amazon’s total CapEx9 in 2025 is projected to surpass $100 billion, while Microsoft’s and Google’s are each expected to exceed $80 billion. The data center buildout race reflects both strategic and financial drivers, as companies compete to secure long-term returns and future competitive advantages. By investing ahead of demand, these companies are ensuring infrastructure is available when customers need it. From the industry’s perspective, failure to build ahead of demand places companies at a competitive disadvantage.
While data center financing stems primarily from parent-company balance sheets, corporate bonds, and public incentives, project finance is occasionally used, with green bonds emerging as a supplementary tool. Financing the electricity infrastructure upgrades needed to power data centers, however, is a much more challenging endeavor, as utilities operate under tight financial and regulatory constraints that complicate the large-scale capital deployment needed to fund expansive upgrades.
As data centers continue to seek rapid power interconnection, alternative financing mechanisms for power procurement—through both utilities and third-party providers—are gaining prominence. For example, firms are increasingly relying on third-party power contracts, which include collateral commitments, long-term power purchase agreements (PPAs),10 availability payments, and upfront capital payments. Additionally, companies are weighing the costs and benefits of co-locating data centers and power generation, despite challenges surrounding siting rules, asset ownership, and regulatory oversight. Overall, this unprecedented capital outlay exposes both firms and utilities to a range of risks, from increased stranded assets to rising financing costs; therefore, the sustainability of the data center buildout depends on both resilient financing structures and continued demand realization.
Future data center market expansion, and its consequent energy usage, remains highly uncertain. Past data center energy studies display numerous flaws. In a review of 258 data center energy consumption estimates, Mytton & Ashtine (2022) found systematic defects in study methodologies, particularly with regard to data availability and transparency.11 The opacity of data center operations, site planning, and energy efficiency complicates energy estimates and projections.12 Consequently, institutional projections of data center electricity demand for 2030 range from about 200 TWh to over 1,000 TWh, according to the World Resources Institute. This range complicates medium-to-long-term grid planning, as utilities struggle to determine both the true magnitude of the industry’s future energy needs and its relationship to economywide electrification.
The 2024 United States Data Center Energy Usage Report13 attempted to clarify the extent of current and future data center energy consumption. After a period of stagnation from 2014 to 2016, data center energy demand grew in 2017 due, in part, to expanded efforts to digitalize data across economic sectors. From 2018 to 2023, data center energy use increased from roughly 76 TWh (comprising 1.9% of the nation’s total annual electricity consumption) to 176 TWh (4.4%); future data center energy usage could range from 325 to 580 TWh by 2028, or 6.7-12.0% of 2028 national electricity consumption. However, this range remains uncertain, due to the continued opacity of data center and utility planning as well as uncertain data center market trajectories.14
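The width of that range is easier to appreciate as implied growth rates. This sketch simply makes the arithmetic behind the LBNL trajectory explicit; no figures beyond those quoted above are assumed.

```python
# The LBNL trajectory quoted above implies a wide band of compound annual
# growth rates; this sketch just makes that arithmetic explicit.
def cagr(start_twh: float, end_twh: float, years: int) -> float:
    """Implied compound annual growth rate between two annual totals."""
    return (end_twh / start_twh) ** (1 / years) - 1

# 176 TWh in 2023 growing to 325-580 TWh by 2028 (five years):
low = cagr(176, 325, 5)   # ≈ 0.13, i.e., ~13% per year
high = cagr(176, 580, 5)  # ≈ 0.27, i.e., ~27% per year
```

The low and high ends of the same projection thus imply annual growth rates that differ by a factor of two, which is why utilities struggle to plan around it.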
Project risks are assumed by external stakeholders, not just data center companies. For example, utilities face stranded-asset risks with regards to generation and transmission buildout; if infrastructure is built to serve projected data center demand and said demand does not materialize, these assets could be underutilized. Furthermore, increased contract-based financing has shifted projects away from guaranteed “rate-base” recoveries, instead favoring special tariffs and PPA contracts, arrangements which lack transparency and may shift power costs onto other consumers.
These threats raise urgent questions about who should shoulder data center buildout costs and whether returns (and cost recovery) to the utility will remain predictable. Who should pay for grid improvements spurred, at least in part, by data center development? Who are the beneficiaries of these improvements? How should costs be allocated across consumers? How can local communities be protected from rising energy costs and natural resource depletion as data centers expand to new markets across the United States? Rigorous policy, economic, and engineering research—in conjunction with increased transparency from data center operators and utilities serving them—is crucial for future grid planning as well as for mitigating unwanted environmental, social, and economic impacts.
As data center markets continue to expand, regional differences in electricity market design and energy needs are shaping regulatory and market reforms. Simultaneously, local-level impacts are introducing additional variables for policy consideration. This section surveys two of the largest U.S. data center markets, Virginia and Texas, to demonstrate how locales facing similar challenges differ in the pace and substance of their responses.15
Virginia is the epicenter of the global data center industry, with over 4,900 MW of operating capacity (and another 1,000 MW under construction) in Northern Virginia alone.16 By some estimates, about 70% of global internet traffic passes through the region daily.17 The area’s dense fiber network, linkages with federal facilities, and systemic incentives enabled its market dominance. First, Northern Virginia was an early node in the U.S. government’s ARPANET18 and still hosts major internet exchange points.19 Second, the state’s low power costs, strong electric reliability, economic incentives, and mild climate reduce data center operation costs, while some Northern Virginia counties provided early permit acceleration for large campuses.
Data center growth in Virginia will add thousands of megawatts of nearly constant demand over the next few years, thereby compressing planning timelines and raising new questions around who should bear the costs of system improvements. Dominion’s20 2024 resource plan projects nearly 27 GW of new generation by 2039, including 21 GW of renewable energy (i.e., solar, wind, and nuclear small modular reactors [SMRs]) and 5.9 GW of gas.21 Simultaneously, Virginia’s energy rates are increasing. In February 2025, Dominion proposed its first base-rate increase since 1992, adding about $8.51 per month in 2026 and $2.00 per month in 2027 for a typical household.22
Furthermore, rapid demand growth has led PJM, Virginia’s regional transmission organization, to review how it both defines firm service and manages reliability obligations. The region’s wholesale design depends on a balance between competitive generation, long-term capacity procurement, and regulated local service. This dynamic is strained by data center expansion, as a single, fast-growing class of customers with unique load profiles presents system needs that differ from those around which PJM was built. Data centers draw large, steady electricity loads with limited ability to reduce (or ramp down) their power usage; at the same time, their energy demand can fluctuate with equipment usage and job complexity. This pattern differs from the more gradual, weather-sensitive load patterns of traditional residential and commercial customers. Overall, Virginia is under pressure to embrace new rate, financing, and reliability tools to allocate risks to the drivers of this new demand: data centers.
As the data center industry continues to expand, the Virginia grid must adapt. Cost allocation rules and policy incentives will evolve as the state considers how to sustain reliability investments while stabilizing rates for other customers. Several policy reforms have been proposed. For example, lawmakers have debated scaling back Virginia’s data center sales and use tax exemptions, but proposals to repeal these incentives stalled in the budget process. Furthermore, several 2025 bills sought 1) to tie tax-incentive eligibility to improved energy efficiency or clean energy performance, 2) to pause new projects in Northern Virginia, and/or 3) to set uniform development standards, but none of these advanced.23,24 A separate bill establishing statewide standards, including land use reviews, reached the governor’s desk but was vetoed.25 That said, local governments are considering enhanced land use and environmental regulations in order to slow the data center buildout. As of this writing, the state tax exemptions remain in place through 2035, signaling Virginia’s intent to support competitive market development, but serious concerns around land use and affordability loom on the horizon.
Texas, with its lightly regulated, “energy-only” electricity market structure, offers a contrasting example of how U.S. electricity systems are responding to rapid data center development. The state demonstrates how a market that historically favored low-friction interconnection processes is adjusting its regulatory framework in response to unprecedented new load growth.
Over the past several years, Texas data center investments have been attracted by the state’s competitive electricity prices, business-friendly policies (including state sales and use tax exemptions on servers, cooling equipment, backup energy, and other hardware), and rapid interconnection speeds. As a result, the Dallas-Fort Worth area has emerged as one of the largest data center markets in the United States and continues to see massive buildout. The Electric Reliability Council of Texas (ERCOT)26 projects that peak summer power demand could approach 145 GW by 2031, up from 85 GW in 2024; this represents a significant acceleration relative to the gradual 1-2% annual growth in demand experienced over the past two decades. Over half of this new demand (about 32 GW) is projected to come from data centers (including cryptocurrency miners).27 Unlike past gradual and dispersed growth, the current demand surge is rapid, lumpy, and increasingly clustered around specific localities, leading to increased concerns around demand-supply mismatch, insufficient energy reserve margins, and transmission congestion.28
By mid-2024, state lawmakers grew increasingly alarmed by emerging energy risks, particularly with regard to: (1) fairness in cost recovery, with concerns that data centers’ speculative or duplicative29 interconnection requests could shift upgrade costs onto smaller customers; (2) behind-the-meter (BTM) co-location that might pull existing grid-facing generation behind a private fence, reducing available capacity in the system under30 tight conditions; and (3) managing resource adequacy and emergency operations if large loads remained uncurtailed31 during an emergency.
In June 2025, the Texas Legislature enacted Senate Bill 6 (SB6), a package of planning, interconnection, cost-sharing, transparency, and emergency operations reforms aimed at strengthening and protecting the state’s energy grid. The law formalizes ERCOT’s Large Load Interconnection Study (LLIS) process;32 directs the Public Utility Commission of Texas (PUCT) to determine a “reasonable share” of upgrade costs for new large loads;33 and requires improved disclosure to reduce speculative filings.34 Overall, SB6 signals the growing potential for expanded regulation across regional markets in response to increased energy affordability and cost-sharing concerns.
In conclusion, Virginia and Texas face similar energy challenges in the wake of rapid data center development, but their approaches demonstrate different regulatory philosophies. The actions (or lack thereof) taken in these states will serve as models for regulators elsewhere across the country.
Future policy and regulatory solutions for data center energy usage will only work if they are technically feasible, economically sound, and politically acceptable. Data center interconnection is often framed as a choice between grid reliability and economic growth. However, past policies have not been anchored in how large loads behave in the real world. Effective policy solutions must account not only for local-level impacts and cost sharing concerns, but also for computational realities. A modeling-first approach can elucidate policy opportunities by first screening for system reliability, then evaluating system-wide price and congestion effects under certain operational criteria that reflect real flexibility. This exercise will require close collaboration between policymakers, engineers, and business leaders across both the energy grid and corporate sectors.
Ongoing research at the John A. Paulson School of Engineering and Applied Sciences (SEAS) aims to address this gap. By linking security-constrained operations (i.e., reliability screening, congestion and ramping limits) with market outcomes (i.e., price volatility, renewable curtailment risks, and uplift payments), the SEAS team is developing realistic engineering solutions to be integrated into real-world policy tools. This analysis will extend across operational levels, considering everything from hosting capacity to transformer loading to thermal equipment aging. Together, these views link system-wide constraints to local reliability and power-quality considerations to develop standardized, transparent workflows that can align planner decisions, regulatory approvals, and developer obligations on predictable timelines.
Rigorous modeling of data centers’ reliability and economic impacts across transmission and distribution enables evidence-driven policymaking. For example, planners could maintain a public shortlist of locations where the grid can reliably host new large loads, aligning private proposals with places with sufficient grid capacity. A similar structure could apply to transmission and distribution by clarifying non-negotiable conditions (such as contingency margins and equipment limits) and possible trade-offs (such as construction timelines). This transparency would enable faster construction, fairer decisions, and clearer expectations among all stakeholders.
At the same time, AI data center power consumption still lacks a standard electricity load profile. Such a baseline would help grid operators, planners, renewable energy developers, and policymakers compare scenarios, estimate future energy costs, gauge resource adequacy, design demand-side flexibility incentives, and set accurate emissions policies. Job submission scheduling provides opportunities to enhance data center demand-side flexibility. Using a bottom-up, minute-by-minute model informed by real job data (i.e., job-arrival traces, per-job resource demands, GPU power profiles, and standard cluster resource allocation mechanisms), SEAS researchers have demonstrated that queuing dynamics (or, how jobs arrive, wait, and are scheduled under finite resources) shape electricity demand. This detailed modeling provides a more granular understanding of power profile dynamics across multiple time scales, ranging from seconds to hours, thereby clarifying the impact of job dynamics on the energy system. This work will provide the basis for regulatory tools designed to mitigate excess power usage and fluctuations stemming from job-level dynamics.
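The bottom-up idea described above can be sketched in miniature: jobs arrive, wait for free accelerators under a scheduler, and the cluster's power draw at each minute follows from how many jobs are executing. Every parameter below (arrival rate, runtimes, per-job power, FIFO dispatch) is a hypothetical stand-in; the SEAS model uses real job traces and far richer resource allocation mechanisms.

```python
# Toy, minute-by-minute sketch: queuing dynamics (how jobs arrive, wait,
# and are scheduled under finite resources) shape electricity demand.
# All parameters are hypothetical stand-ins for real trace data.
import random

def simulate_power_profile(minutes=1440, n_gpus=64, arrival_prob=0.3,
                           mean_runtime=90, idle_kw=20.0, per_job_kw=0.7,
                           seed=0):
    rng = random.Random(seed)
    queue = []    # runtimes of jobs waiting for a free GPU
    running = []  # remaining runtimes of executing jobs
    profile = []  # cluster power draw (kW) at each minute
    for _ in range(minutes):
        if rng.random() < arrival_prob:  # a new job may arrive this minute
            queue.append(max(1, round(rng.expovariate(1 / mean_runtime))))
        running = [r - 1 for r in running if r > 1]  # finished jobs drop out
        while queue and len(running) < n_gpus:       # FIFO dispatch
            running.append(queue.pop(0))
        profile.append(idle_kw + per_job_kw * len(running))
    return profile

# One simulated day: arrivals, waits, and dispatch show up directly as
# minute-scale swings in the power profile.
profile = simulate_power_profile()
```

Even this crude version reproduces the qualitative point: power demand fluctuates on the time scale of job arrivals and completions, not just with weather or time of day.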
While the outlook for data centers and their energy needs remains uncertain, future solutions must leverage robust policy instruments to spur technological and/or operational changes. For example, data centers may be able to improve grid reliability by reducing their power usage during peak periods; however, it is unclear which incentives would best encourage these practices. Theoretical solutions must be translated into effective, real-world policy initiatives that consider economic, political, and social realities as well as technological feasibility. Rigorous policy, economic, and engineering research—in conjunction with increased transparency from data center operators and utilities serving them—will facilitate successful reforms.
The Project on Grid Integration (PGI) is well-positioned to address these challenges. A joint project of the Harvard Kennedy School of Government (HKS) and the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), the Project aims to develop new policy, technical, and operational tools that leverage the data center boom in order to strengthen and modernize the U.S. electric grid; at the same time, the project works to minimize the economic, social, and environmental repercussions of rapid data center expansion.
Moving forward, the Project will examine the following questions:
The views expressed in this paper are the opinion of the authors and do not reflect the views of PJM Interconnection, L.L.C. or its Board of Managers of which Le Xie is a member.
The U.S. desperately needs to make more room on its electricity grid. But for years, the country has struggled to build new power lines at a reasonable pace, and despite fast-rising electricity demand, there’s no sign of that changing in the near term.
A project taking shape near Boston could help make the case for an alternative to expanding the grid: big, strategically placed batteries.
In fact, energy storage has already helped defer the need for costly, slow-moving transmission upgrades in Australia, Europe, and South America. But it hasn’t yet caught on in the U.S.
The Trimount battery project, four miles north of Boston, could spur grid planners and operators to take another look at this concept of using storage as a transmission asset. At the very least, it will be hard for them to ignore. With 700 megawatts of power capacity and 2.8 gigawatt-hours of stored energy, the battery installation would be one of the largest in the nation, and by far the largest in New England.
The Trimount project is targeted for a key pinch point in the region’s grid. It will be located at a former Exxon Mobil oil-storage facility in the city of Everett and will plug into a major substation that connects Boston to the greater New England grid. Boston is a “load pocket,” a spot on the grid where peak electricity demand sometimes exceeds what transmission lines can supply — whether because of emergencies or more predictable spikes in usage on hot and cold days.
But those moments tend to be relatively short-lived, making batteries a viable tool for weathering imbalances. Batteries can store electricity when it is abundant and then discharge it when the transmission system faces high demand.
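The Trimount specs quoted above imply a discharge duration that matches these short-lived peaks. This sketch just makes that arithmetic explicit, using only the figures already reported.

```python
# Quick arithmetic on the Trimount specs: 2.8 GWh of stored energy
# behind 700 MW of power capacity works out to four hours at full
# discharge -- roughly the length of an evening peak in a load pocket.
def duration_hours(energy_mwh: float, power_mw: float) -> float:
    """Hours the battery can sustain its maximum discharge rate."""
    return energy_mwh / power_mw

trimount = duration_hours(2_800, 700)  # 4.0 hours
```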
“At hours when the grid is overly stressed, the ability to discharge the batteries in the middle of the load pocket alleviates the strain on all the major lines going into the metro area,” said Hans Detweiler, senior director of development for Jupiter Power, the Austin, Texas–based company behind the battery project.
Jupiter Power is seeking approval from Massachusetts’ Energy Facility Siting Board for Trimount and hopes to secure utility contracts later this year, Detweiler said. If everything goes according to plan, the company expects to break ground in 2027 and start operating in late 2028 or early 2029.
That will put Trimount smack-dab in the middle of near-term and long-range planning for ISO New England (ISO-NE), the entity that manages the region’s transmission grid. And ISO-NE is actively searching for ways to relieve Boston’s peak electricity demands.
To that end, Jupiter Power hired RLC Engineering to conduct a study of how energy storage could help solve challenges identified in ISO-NE’s “Boston 2033 Needs Assessment” report. Specifically, the study looked at options for managing when two major transmission lines go out of commission successively, called an N-1-1 event, which could force utilities to institute widespread power outages.
Trimount’s “pivotal” position in the grid could allow it to keep power flowing during such an emergency, RLC’s study said. The alternative would be upgrading a number of high-voltage transmission lines, many of them buried underground — a costly, disruptive, and time-consuming process in dense urban environments.
RLC’s analysis found that the Trimount battery project could provide an “avoided transmission cost benefit” of about $2.27 billion by avoiding those upgrades — “a much more cost-effective way to solve the reliability issue.”
“There are all these ways that storage can save consumers money,” Detweiler said. “One is that storage — at least in certain locations, like our project — can avoid massive transmission upgrades.”
This use of batteries as a sort of shock absorber for the grid has gained more traction outside the U.S.
Take the work of Fluence, one of the world’s largest energy-storage companies, for example. The firm, a joint venture of Siemens and AES Corp., is building what could be the world’s biggest storage-as-a-transmission-asset project in Germany, and it has more than 1.2 gigawatt-hours of projects with transmission-asset components around the world, according to Suzanne Leta, the company’s vice president of policy and advocacy.
If the idea catches on in the U.S., the impact could be significant.
A study from Astrapé Consulting commissioned by the Natural Resources Defense Council found that building 3 gigawatts of energy storage by 2030 could obviate the need for about $700 million in transmission upgrades to serve Illinois as it closes fossil-fueled power plants to meet state climate goals.
And in New York, adding battery storage as a transmission asset could “mitigate grid congestion, reduce renewable curtailment, and defer the uncertain need for new power lines,” according to a study by Quanta Technology on behalf of the New York Battery and Energy Storage Technology Consortium.
But right now, it’s hard to make these projects happen in the U.S., Leta said. The reason? ISO-NE and other regional grid operators require such batteries to be exclusively used to aid the transmission grid. The battery owners cannot make money from performing other services.
“You have a transmission revenue stream — that may need first priority. But you need additional revenue streams,” Leta said. “The reason that hasn’t happened is generally because policymakers have not allowed for those combined revenue streams.”
That’s the case for the Trimount project, which won’t earn money for any grid relief the battery might provide. Instead, like the other large-scale battery projects being built in Massachusetts, it will earn revenue through the state’s Clean Peak Energy Standard, which offers credits for charging up with renewable energy and discharging it during times of peak demand. Jupiter is also seeking to contract the project to one of Massachusetts’ major utilities, which are under a state mandate to procure 5 gigawatts of energy storage by 2030.
But if ISO-NE wants to take advantage of the potential transmission savings of Trimount and similar battery projects, it may need to work with stakeholders on another way of doing it. At present, the grid operator’s “storage as a transmission-only asset” (SATOA) structure, approved by federal regulators in 2023, bars batteries from doing anything else if they’re used to relieve transmission constraints.
There’s a market rationale for this separation. Grid operators draw a hard line between transmission assets and other energy-market resources like power plants and batteries. If a battery project is collecting money for being a transmission asset, that revenue could subsidize the other energy-market services it provides, giving it an unfair advantage over competitors.
The same kind of limitations apply to the storage-as-transmission-asset rules at the Midcontinent Independent System Operator, which manages the transmission grid and energy markets across 15 U.S. states from Louisiana to North Dakota. It has limited its use of those rules to only one relatively small project to date.
Other major grid operators, such as PJM Interconnection, which covers Washington, D.C., and 13 states from Virginia to Illinois, have yet to develop rules for storage as a transmission asset. In PJM, that absence has played a role in stymieing proposals to use batteries to facilitate the closure of aging fossil-fueled plants.
Alex Lawton, a director at trade group Advanced Energy United, suggested that if grid operators want energy storage to help relieve their increasingly urgent transmission shortfalls, they may need to find ways for batteries to earn revenue across both energy markets and transmission services.
“Yes, we are going to need to build more lines. But we want to do that cost-efficiently,” he said. “If it can be solved with a battery, that needs to at least be considered. And we want an analysis that shows all those things.”
Market rules aren’t the only barrier. There’s also the issue of forcing these projects to be part of the glacial pace of planning, approving, and building power lines. Under ISO-NE’s SATOA plan, any battery meant to help defer a grid build-out has to be identified through regional transmission plans, which take years to develop.
Currently, ISO-NE’s soonest opportunity to update its approach to integrate batteries into its transmission planning may be as part of its upcoming work to comply with the Federal Energy Regulatory Commission’s 2024 order to modernize long-term transmission planning, Lawton said. Among the mandates in that sprawling order, FERC calls on grid operators and utilities to incorporate advanced transmission technologies, which can expand the capacity and flexibility of existing power lines.
“We’ve always advocated with long-term transmission planning that there should be a robust process to evaluate alternative transmission technologies,” he said. “Storage is, in some cases, the most cost-effective solution.”
But just as companies that own power plants jealously guard their market position against new competitors, utilities that own and operate transmission grids tend to guard their incumbent advantages in winning contracts to build new power lines. ISO-NE’s current SATOA rules don’t provide incentives for transmission owners to consider adding battery storage as an alternative to building power lines, which earn them guaranteed rates of profit, Lawton noted.
The Trimount project “could be a really excellent case study to make a case for revisiting SATOA, and strengthening it and expanding it,” he said. It will be worth watching how closely the project’s future behavior, charging with excess clean energy and discharging during peak hours as the Clean Peak Energy Standard incentivizes it to do, lines up with relieving congestion on that part of the transmission grid.
In the meantime, building an enormous battery right next to a major city will bring multiple benefits, Jupiter’s Detweiler noted. The company commissioned a study by Aurora Energy Research that found the Trimount project could save ISO-NE customers about $1.6 billion in capacity market costs over its 20-year lifetime by deferring the need to build other power plants to serve the region’s peak needs.
It remains unclear how ISO-NE will choose to incorporate the Trimount project into its transmission planning once it’s operational, Detweiler said. “We are confident that they will notice when a project like ours goes up. The question is how they do the valuation.”
Sitting below sea level along the hurricane-prone Gulf of Mexico, New Orleans is particularly vulnerable to losing power during extreme weather. But the city plans to tackle that problem by helping residents buy backup batteries, a move that will also make the grid more resilient.
In December, the New Orleans City Council ordered local utility Entergy New Orleans to design a $28 million battery incentive program for homes, businesses, and nonprofits (plus $2 million for administration and implementation). Crucially, the scheme won’t cost New Orleanians a dime: It will be paid for by a settlement Entergy reached with the city over problems at one of the utility’s nuclear power plants.
Entergy has until March 1 to file an implementation plan for the program, which is expected to launch later this year. Once it’s up and running, the incentives could support batteries at around 1,500 homes and 150 community institutions. Those systems would provide backup power for the properties they’re sited on and inject power onto the grid when it’s strained.
This would propel New Orleans to the forefront of localities adopting virtual power plants, the concept of aggregating energy devices in homes and businesses and wielding them like a traditional power plant for the good of the broader community. Vermont’s biggest utility has used home batteries to lower costs during heat waves; California tapped home batteries to meet demand in extreme moments; Texas has opened up a market-based version of the concept. But New Orleans would become a pioneer of virtual power plants in the Deep South, and would stand out for the scale of the program relative to the size of the territory.
“We hope if you were already on the fence about getting a battery, here’s a chance to participate in a utility program,” said Ross Thevenot, senior project manager at Entergy New Orleans, who oversees the customer-facing battery effort. “We’re the Crescent City — we’ve got water on all sides of us. Customer resilience is obviously important.”
The new investment builds on Entergy’s pilot virtual power plant, which enrolled nearly 140 customer-owned battery systems across the city last year. EnergyHub, a cleantech startup acquired by smart-home company Alarm.com in 2013, manages the distributed controls for the pilot and will run the expanded program. The initiative also builds on a grassroots effort called Community Lighthouse, which formed after 2021’s Hurricane Ida and has installed backup-battery systems at nearly 20 churches so that they can offer shelter and light to neighbors during grid failures.
“We’ve seen how useful those can be when there’s a power outage,” said Nathalie Jordi, who works with Together New Orleans, the nonprofit that spearheaded Community Lighthouse, and who advocated for the new virtual power plant. “But how great would it be if, when the power goes out long-term after a hurricane, we have nursing homes that don’t lose their power, we have hardware stores, we have bodegas, we have firehouses?”
If the emerging plan succeeds, New Orleans could teach other parts of the U.S. how to build a cleaner, more responsive grid in a way that brings the whole community along.
Arushi Sharma Frank, a D.C.-based distributed energy expert, got an urgent message from Jordi in September 2024. The New Orleans City Council, which, unusually, serves as the city’s utility regulator, wanted to hear how the Community Lighthouse locations had performed during outages from Hurricane Francine earlier that month. Together New Orleans knew there was settlement money available, and it wanted to bring the council a full-fledged virtual power plant proposal that could put those funds to work. Jordi wondered if Frank could propose a turbocharged virtual power plant like the ones she’d helped design in Texas and Puerto Rico.
For Frank, this offered a chance to harness existing grid technologies to save lives in the aftermath of a hurricane or other disaster.
“There are life-threatening conditions that can be averted if people can get to shelter with power and cooling quickly,” Frank said. Small-scale batteries could ensure that “we have a place that any human in New Orleans can walk to in 15 minutes that has power after a storm.”
She got to work, compiling a proposal in 72 hours and arranging for people to testify from 12 other states with operating virtual power plants. The last-minute blitz worked: The City Council green-lit an effort to explore the concept, culminating in the December order.
Often, the companies selling energy devices to regular people cast themselves as electric Davids taking on the utility Goliath — as disrupters of a failing status quo.
In New Orleans, Frank said, the community groups were able to “remove this tone of adversarialism” that frequently crops up in virtual power plant proceedings around the country, and instead design something “generative, as opposed to extractive.”
The program creates a new market opportunity for solar-battery installers, with upfront incentives that can shave up to $10,000 off the cost of batteries for homes or $100,000 for businesses. It will still be up to cleantech companies — local ones or national brands like Sunrun or Tesla — to compete for customers’ business and guide them through the sales process. Those companies will be the ones designing the systems to provide backup power in the event of outages. And the order earmarks 40% of the residential funds for households with low to moderate income, ensuring installers don’t just pitch to more-affluent customers.
Once the batteries are installed and hooked up to EnergyHub’s control software, it becomes Entergy’s job to decide how and when to use them to benefit the power system more broadly. The regulated monopoly utility has knowledge that battery vendors don’t: which parts of the grid need more capacity or struggle to manage voltage when clouds interrupt rooftop solar production, for example, and other such nuances of a complex interconnected network.
Since Entergy runs the grid and charges customers for the service, it’s also able to pass along savings in the event that the virtual power plant lowers overall grid costs.
“Nonparticipating ratepayers are definitely enjoying the benefits of just having more affordable power, because VPPs are cheaper than traditional grid infrastructure and much quicker to stand up,” said Gabriela Olmedo, EnergyHub’s manager of policy and regulatory affairs.
If Entergy can eventually harness tens of megawatts of aggregated battery capacity, Thevenot said, the utility could bid that into the Midcontinent Independent System Operator’s regional grid and use the ensuing revenue to pay down costs for the overall customer base.
Utilities habitually seek an extended trial phase for “new” technology, even if the same equipment has been operating successfully for years elsewhere in the country. Sometimes, that preference for diligent study pushes off adoption of viable grid technologies. In this case, though, New Orleans was able to move swiftly on its virtual power plant because Entergy’s initial foray had laid a careful groundwork.
Under its existing pilot project, EnergyHub manages those nearly 140 batteries — mostly in homes, but also about a dozen in Community Lighthouse installations. The program pays homes up to $600 per year for sending energy to the grid for two-hour stints when demand is especially high. Last year was the first full year this system operated, and Entergy dispatched it six times, Olmedo said, largely to test that the system works.
“We started slow and steady: Let’s learn what the positives and potential speed bumps are,” Thevenot said. “It was a true pilot. We were trying to learn as much as possible.”
Entergy “got great data,” he added, and learned to troubleshoot in situations when batteries didn’t respond because of issues like internet-connectivity lapses or system settings preventing power from being dispatched.
Six dispatches per year falls on the leisurely end of the virtual power plant spectrum. A program in Oahu, Hawaii, for instance, pays customers to set their batteries to discharge for two hours every evening, when the island grid reliably sees high demand.
That said, in this pilot phase, Entergy wanted to be judicious about using the batteries that customers had already bought and paid for, Thevenot said. And the summer of 2025 proved to be far less stressful for the local grid than the previous summer, dampening the need for battery assistance.
The plan had been to increase dispatches to 30 per year, Olmedo noted. (The forthcoming implementation plan will decide what the target is going forward.)
Each dispatch will make a far bigger difference once the new funds get disbursed: The incentives are expected to support roughly 10 megawatts of residential batteries and 10 megawatts of nonresidential, Olmedo said. All that capacity will fall within the city boundary, making for a far more concentrated impact than programs that sprawl over, say, the state of California.
Normally, a small customer base can make it hard for a utility like Entergy to propose spending on innovative programs like a virtual power plant, Frank said. The cost of a battery subsidy would be divided among the customer base, and there simply aren’t many customers to split the tab; many New Orleans households earn a low or moderate income, making them especially sensitive to jumps in utility bills.
“If we were forced to do this and run $28 million through some kind of rider we’d have to collect from customers, that would be a different conversation,” Thevenot said.
The pot of settlement dollars circumvented this dynamic, funding innovation without adding to anyone’s monthly bill. “Any dollar that they do spend on creating socialized infrastructure, it also goes further because of the same math,” Frank added.
This may limit how replicable the New Orleans experience can be in other locales. “Wait for a bucket of utility penalty funds to materialize” is not a particularly actionable directive for would-be grid reformers. But New Orleans can show the world what good a bunch of batteries can do, and quantify eventual operational savings for the whole customer population. Then, advocates can argue for funding this sort of program on its own merits, based on evidence of how useful it has been in the Crescent City.
Jeff St. John contributed reporting.
Artificial intelligence’s bubblitude fizzes with circular transactions, risk concealment, and exotic real-estate debt finance. In a frenzy to build AI data centers, Big Tech recently borrowed and bonded more money in 11 weeks than in the previous three years combined. More than a thousand new data centers are under construction or planned nationwide. Though they don’t yet know how many of those facilities will eventually materialize, energy suppliers are using AI data centers’ ravenous appetite for electrons to justify vast new investments in gas and nuclear power plants and the revival of uneconomic coal plants, claiming that all are needed to win the AI arms race and keep the lights on.
This trillion-dollar surge is transforming not only equity and capital markets but also the future U.S. power mix, locking in decisions that will shape energy affordability for decades. Smarter, cheaper, cleaner, less-risky options for powering data centers exist — if decision-makers choose them.
To meet all the expected new electricity demand, the amount of U.S. gas-fired capacity under development has ballooned in 2025. For context, at the start of 2024, only 4 gigawatts of gas-fired power in the U.S. development pipeline were explicitly earmarked for powering data centers. Today, over 100 gigawatts are.
And developers are proposing to invest over $400 billion to build more than 250 gigawatts of new U.S. gas-fired power plants — nearly tripling the gas power pipeline in a year, mostly driven by speculative AI projects subsidized by 37 heavily lobbied state governments.
Some data centers are even being designated as “critical defense facilities” to be built on federal land, alongside otherwise uneconomical nuclear plants exempted from strict Nuclear Regulatory Commission scrutiny, all at taxpayer expense. This is happening, ironically, in Texas — the nation’s free-enterprise leader in solar, wind, and batteries. Those renewable resources made up 97% of the state’s 2025 capacity additions, while fossil fuels accounted for 3% and nuclear for none. But in the past two years, planned gas plants in Texas nearly quadrupled, to 80 gigawatts. Only China has more gas plants under development than Texas, and nearly half the Texas plants are meant to power data centers directly.
We’ve seen this movie before. A quarter century ago, the coal industry warned that the Internet would overwhelm the grid without massive new coal capacity. Actual demand proved more than 10 times lower than predicted. The dot-com bubble burst in 2000, permanently vaporizing $120 billion of electricity investments and embalming another $80 billion in infrastructure built long before it was needed. Today’s AI mania rhymes: Gas and nuclear vendors that can’t beat energy efficiency and renewables in competitive markets are leveraging hype into mandates and subsidies to rescue their losers.
Yet capital markets increasingly fear that AI looks like a bubble set to pop. That’s because each new data center effectively bets against at least 10 plausible outcomes that would make the investment unwise. Among them: Scaling large language models could fail to achieve superintelligence; customer revenue could disappoint; inaccuracy may persist; smaller and leaner models might keep outperforming giants; copyright infringements may have to be paid for; data centers may go on quadrupling their energy efficiency every year; and flexible interconnection might stretch existing grid assets to serve all new demand.
Each new power plant also bets against the ways that data centers may access cheaper electricity, such as adding pop-up microgrids, colocating renewables and storage at idle gas plants, and buying efficiency, flexible load, storage, and clean supply from other customers. Betting against any one of these realities is risky. Betting against all of them strains credulity.
Many utilities are already trimming projections toward reality. Regulators in data-center hot spots are scrambling to shield customers from accelerating and politically sensitive rate hikes — already up 16% in Illinois, 13% in Virginia, 12% in Ohio, and 6% nationwide. Meanwhile, actual data-center demand still barely shows up in national totals. U.S. weather-adjusted electricity use fell in 2023, then rose by 2% in 2024, about one-twentieth due to new data centers. Nearly all the growth comes instead from air conditioning, electrifying buildings and vehicles, and reshoring industry. These needs can all be more cheaply met by better efficiency, and by another vast and potent competitor to fossil fuels: renewables.
Globally, data centers — roughly one-ninth of which are devoted to AI — use about 1.5% of today’s electricity. The International Energy Agency forecasts their electricity use will grow this decade, while renewable supplies grow 11 times more. Thus, solar and wind power, now swiftly displacing costlier fossil-fueled and nuclear power, dwarf the AI boom. Speed to market is paramount for AI developers, so many smart tech companies choose renewables to get their data centers built and running quickly and cheaply.
However, other AI firms have rushed for gas power, and that stampede has doubled gas-plant costs and backlogged gas turbine deliveries to past 2030, to the point that two-thirds of gas-plant project proposals have no named turbine manufacturer. This jam has pushed about a fifth of projects to substitute off-grid gas power, often using adapted aircraft jet engines. These turbine generators are easily available but engineered to meet peak demand, so they’re inefficient, noisy, and dirty. Running them constantly to power data centers would quickly inflate electricity costs and magnify public health damages. U.S. data centers were already projected to cause more than $20 billion per year in asthma and cardiopulmonary disease costs by 2030. Communities will not welcome additional pollution, water stress, noise, and rate hikes.
Gas markets magnify the financial risks of turning to gas to power data centers. New gas wells decline faster than old ones, while falling oil prices can make new drilling and refracking unattractive. At the same time, exuberant exports of liquefied American gas (and gas pipelined to Mexico) are pushing gas toward both global glut and domestic scarcity. The analysts at BloombergNEF predict that new gas-fired AI power could tip the 2025–30 U.S. gas surplus into a deficit, spiking already-volatile gas prices for heating, industry, and utilities. Indeed, BloombergNEF says wholesale gas futures for 2028–30 are unsustainably priced below production cost. And whatever the gas price, new gas-fired power plants are likely to become underutilized, subsidized assets that burden electricity customers long after today’s AI ebullience fades. While many data centers will be built, many won’t, and many won’t actually run at full tilt for decades to come — stranding gas plants and pipelines built to power them.
Even as national policy reinforces a gas lock-in, power choices that can scale at AI speed already dominate actual markets. Renewables captured over 92% of the world’s new generating capacity in 2024 and (including storage) about 90% of U.S. additions in 2025, with 93% expected in 2026. They are far cheaper than gas power, keep getting cheaper, sell on constant-price contracts for decades, and finance like low-risk annuities. They’re virtually unlimited and deploy at industrial speed.
Last May, China added 1 gigawatt of solar and wind power roughly every six hours around the clock. Pakistan displaced 30% of its utility power with solar in four years. Vietnam added solar equivalent to half of its coal generation in two years. South Australia generates 75% of its annual electricity from renewables and will reach 100% by 2027, driving 37 firms to propose relocating there to secure stable, low-cost power. Global metals giants Rio Tinto and BHP are relying on “renewable baseload” power to smelt aluminum and mine copper. Apple’s data centers have run on fully renewable energy for more than a decade. Google just announced that on-site solar, wind, and battery power will get its new 850-megawatt Texas data center online in 18 months, not five-plus years.
Critics have long claimed that variable renewables are too unreliable: The wind doesn’t always blow, and the sun doesn’t always shine. But evidence shows that intermittency concerns are now generally unfounded. Ten proven carbon-free balancing methods already make high-renewable grids reliable and economic in many countries. One of those methods, batteries, costs 96% less today than it did in 2010. BloombergNEF finds that battery-firmed solar and wind deliver steady power more cheaply than any new fossil or nuclear plants, and many operating ones. That’s why three-fourths of India’s new firm capacity today is solar-plus-storage.
Renewables also offer essential speed. In Sparks, Nevada, the world’s largest solar-powered microgrid continuously powers modular data centers. Solar panels laid on desert ground feed hundreds of second-life electric-vehicle batteries joined to form a superbattery. It was all built in four months and delivers electricity that’s cheaper, quieter, and more reliable than grid power; uses virtually no water; emits nothing; and is even portable. This is what clean, scalable, market-speed power looks like. Gas isn’t it.
AI does have some valuable applications. No one yet knows, though, if its revenues can repay the immense and swiftly depreciating investments required. But while markets are answering that trillion-dollar question, the AI boom must not be allowed to undermine American energy affordability and security.
Utilities and regulators can protect existing customers with a simple safeguard, giving teeth to vague qualitative pledges: Sell power to new data centers only under “take or pay” contracts that repay the entire electricity investment regardless of how much power the data center ultimately uses. Those agreements should be backed by robust bonds or insurance, priced by capital-market risk experts (not by developers), to ensure that if an AI venture collapses, losses fall on the developer, not on households and small businesses.
If markets, and not mandates, determine the outcome, the conclusion is already clear. Gas, coal, and nuclear are too slow, too costly, and too risky to anchor the next wave of U.S. power demand. The only technologies that scale quickly enough, cheaply enough, and reliably enough for AI already dominate global additions. Policy will now decide whether Americans will enable the new energy system or protect the old — and whether they’ll pay for stranded gas plants or profit from the cheapest and most secure electricity in history.
Back in the summer of 2024, Minnesota utility Xcel Energy proposed a novel approach to building virtual power plants, the networks of rooftop solar systems, home batteries, and other energy equipment that can operate in tandem to reduce strain on the electric grid.
Instead of working with other companies to cobble together solar arrays and batteries at homes and businesses — the traditional model for VPPs — Xcel wanted to install, own, and control those devices itself, using its grid expertise to deliver a better bargain for its customers at large.
Now, a year and a half later, the plan is in — and clean energy advocates, solar industry groups, and state agencies say it doesn’t live up to Xcel’s promises.
In filings with the Minnesota Public Utilities Commission, these groups say Xcel’s Capacity*Connect (C*C) plan, unveiled in October, is likely to be slower, more costly, and less impactful in relieving grid stresses and energy costs than the customer-centered VPP programs already in place or being rolled out — including one by Xcel in Colorado.
As Minnesota’s Office of the Attorney General wrote in its initial comments, “Although Xcel suggests that C*C is uniquely innovative, it may simply be a uniquely expensive way to accomplish the same thing other states have accomplished for less ratepayer money.”
Xcel is asking for permission to spend at least $152 million to deploy 50 megawatts of batteries, and up to $430 million for 200 megawatts, through 2028. Those costs will be borne by its customers. And as capital expenditures, they will offer the utility a guaranteed profit on every dollar spent — a perk Xcel wouldn’t get if it relied on the traditional VPP model.
In its petition to regulators, Xcel says the plan is a first step in learning how to best integrate distributed energy resources across its grid, as called for by state utility policy for the past decade. It also argued that “non-utility-owned resources could deliver, at best, a portion of the anticipated system and customer benefits.”
Backers of this utility-led approach include Jigar Shah, a Department of Energy official during the Biden administration who has long championed the value of using batteries and other distributed energy resources — DERs in the jargon — as an alternative to big, costly, and hard-to-build power plants and transmission lines.
“For the first time in my professional career, we have a utility company formally agreeing with the fact that distributed power plants are essential to maintaining reliability and meeting load growth,” Shah wrote in a December LinkedIn post. “This is a huge win for our entire industry, and efforts by industry groups to torpedo this proposal can’t see the forest for the trees.”
But John Farrell, co-director of the nonprofit consumer advocacy group Institute for Local Self-Reliance and a longtime utility critic, argues that Xcel Energy is trying to monopolize the grid value of solar and battery systems, which customers are already willing to pay for to save money and provide backup power.
Utility ownership might be an acceptable alternative if it could be done faster and cheaper than the VPPs being put together by solar and battery installers like Sunrun, Tesla, and a host of other companies, Farrell said. But “if utilities are supposed to be so good at this, why is the cost-benefit analysis underwater?” he asked. “And why is it so slow?”
Logan O’Grady, executive director of the Minnesota Solar Energy Industries Association, doesn’t want to be too critical of Xcel’s plan. After all, his group and other solar advocates have spent years pushing utilities to rely more on rooftop solar, backup batteries, and other DERs. It hasn’t been easy. Utilities have long been leery of the reliability of these technologies, and instead prefer tried-and-true grid upgrades and utility-controlled equipment.
“This has been a tricky one, because for 10 years, people on our side have been saying to the commission and utilities, there’s value in the distribution system — you should invest there,” he said.
That argument is backed by an analysis from the DOE, promoted by Shah during his tenure, that found rooftop solar systems, backup batteries, electric vehicles, smart thermostats, and grid-responsive water heaters could provide 80 to 160 gigawatts of VPP capacity by 2030 in the U.S. That would be enough to meet 10% to 20% of the nation’s peak grid needs and save utility customers roughly $10 billion in annual grid costs.
“So when [Xcel’s] proposal first came out, in one sense it was like, ‘They’re finally listening to us,’” O’Grady said. “But in another sense it was, ‘They’re going too far by proposing only utility ownership.’”
That’s a significant departure from the status quo, the Minnesota Solar Energy Industries Association, Coalition for Community Solar Access, and Solar Energy Industries Association trade groups wrote in comments to the Minnesota PUC. “Traditional VPPs are technology-agnostic portfolios of customer-sited and third-party-owned resources,” they wrote. “Participation is open, competitive, and decentralized.”
By contrast, Xcel’s C*C plan would rely completely on utility-owned batteries of between 1 and 3 megawatts, the kind that usually come in shipping containers. Xcel plans to pay an undisclosed amount to businesses or nonprofits willing to host those batteries on their properties. But rather than connecting the equipment in those customers’ buildings, the utility would instead connect the batteries directly to its grid, preventing them from providing emergency backup power to participating customers.
To secure customers willing to host those batteries, Xcel Energy has proposed hiring Sparkfund, a company founded in 2013 that has promoted the “distributed capacity procurement” concept that forms the basis of the C*C plan. Xcel’s plan marks its first stab at implementing distributed capacity procurement.
But deploying utility-owned batteries via a single commercial partner is “unprecedented in VPP programs and raises significant competitive-market concerns,” the solar trade groups wrote.
Chris Villarreal, president of consultancy Plugged In Strategies and former director of policy at the Minnesota PUC, shares those concerns. In comments filed on behalf of the R Street Institute, a free market–oriented think tank where he serves as an associate fellow, Villarreal recommended that regulators reject the plan or, at a minimum, “ensure Xcel does not exercise monopoly power at the expense of other competitive and potentially lower-cost alternatives.”
“There are a couple of things that annoy me about this from a practical perspective,” Villarreal told Canary Media. “One is the exercise of monopoly power over competitors.” Xcel is proposing to give Sparkfund access to grid and customer data that “no competitor would be able to get” without signing nondisclosure agreements, he said. “Meanwhile, we have community solar gardens, solar developers, storage developers, that want to do the same thing.”
This lack of grid transparency is troubling, O’Grady said, given Xcel’s track record of making it difficult for customers and third-party developers to add batteries and community and rooftop solar to its grid. “Minnesota has a grid-congestion problem, and lack of utility investment to solve that problem,” he said.
At the very least, Xcel should subject its battery systems to the same process third-party developers and customers must go through to connect to the grid, O’Grady said. Under the C*C plan, “they circumvent that entire waitlist to interconnect — and that doesn’t seem fair.”
State regulators anticipated these concerns. The Minnesota PUC’s 2024 order allowing Xcel Energy to pursue the C*C plan required the utility to compare the costs and benefits with those of “alternative models” using customer and third-party-owned resources.
But Xcel Energy appears to have given that requirement short shrift, said Erica McConnell, a staff attorney at the nonprofit Environmental Law & Policy Center. Instead of offering a cost comparison, Xcel asserted in its petition that “anything less than full operational control and visibility of these assets — which will operate functionally as part of our system — could present safety risks for our employees and the public and could create cybersecurity risks for our system.”
These statements appear to ignore the experience of other utilities managing VPP programs, McConnell said. In essence, she said, the utility dismissed the prospect of alternative approaches by saying, “‘It’s dangerous if we let other parties do it.’ That’s disappointing to us. We need alternative pathways.”
Xcel Energy disputes that it ignored regulators’ instructions. The utility lacks “quantitative information” on those alternatives, and “would need to speculate on these costs and benefits, which would inevitably lead to unresolvable disputes,” it wrote in reply comments.
Xcel also highlighted that it’s offering customers and third-party developers other pathways to add solar and batteries to its grid, including its long-running community solar program and incentives for backup batteries. Nearly all of the more than 1.3 gigawatts of distributed solar and storage on Xcel’s system in Minnesota is owned by third parties, it noted.
But the C*C program is focused on solving a much broader range of challenges on its grid, which requires greater precision than Xcel can achieve from customer-owned batteries, the utility said. It argues that it needs such rigorous control over the systems to cut costs and improve overall grid reliability for customers at large, in what it called a “marked shift in distributed energy policy.”
Critics have their doubts, however, about whether the benefits of Xcel’s plan will outweigh the costs.
The Minnesota Office of Attorney General wrote in its comments that it supports efforts to meet the state’s carbon-cutting goals while keeping rising energy and grid costs in check. But it also asked regulators to put a “hard cap” on Xcel’s spending, noting that it “stands to be a quite expensive program.”
Xcel’s C*C budget calls for spending up to $430 million to deploy 200 megawatts of batteries, it wrote, which equates to $2,150 per kilowatt of battery installed — well above typical costs for grid batteries.
It’s also more expensive than what Xcel Energy intends to spend on a gas-fired “peaker” power plant it’s planning to build in Lyon County, Minnesota, the office noted. That’s despite data from DOE’s VPP report indicating that typical VPP capacity can be more than 40% cheaper than that of conventional peaker plants, which run only at times of extremely high demand.
And Xcel’s proposed budget is well above what the Public Service Co. of Colorado, Xcel Energy’s utility in that state, intends to spend on its proposed Aggregator Virtual Power Plant pilot program. That program will pay third-party aggregators that equip customers with resources — including batteries, smart thermostats, smart water heaters, smart heat pumps, and EV chargers — that can inject electricity onto the grid or reduce power use. It is targeting 125 megawatts of capacity for a five-year budget of $78.5 million, or roughly $625 per kilowatt.
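The per-kilowatt figures cited above follow directly from the budget and capacity numbers in the filings. A quick back-of-the-envelope check, using only the dollar and megawatt figures quoted in this story:

```python
# Back-of-the-envelope cost-per-kilowatt comparison, using the budget and
# capacity figures quoted above from the Minnesota and Colorado filings.

def cost_per_kw(total_dollars: float, megawatts: float) -> float:
    """Program cost divided by capacity, in dollars per kilowatt."""
    return total_dollars / (megawatts * 1_000)  # 1 MW = 1,000 kW

# Xcel's Minnesota C*C plan: up to $430 million for 200 MW of batteries.
mn = cost_per_kw(430_000_000, 200)

# Xcel's Colorado Aggregator VPP pilot: $78.5 million for 125 MW.
co = cost_per_kw(78_500_000, 125)

print(f"Minnesota C*C: ${mn:,.0f}/kW")       # $2,150/kW
print(f"Colorado pilot: ${co:,.0f}/kW")      # $628/kW, roughly $625
print(f"Ratio: {mn / co:.1f}x")
```

The two programs are structured differently (asset ownership for 20 years versus five years of aggregator payments), so this ratio is a starting point for scrutiny rather than an apples-to-apples verdict — which is exactly the dispute the parties lay out below.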
Xcel says these comparisons don’t tell the whole story. The Colorado program covers only five years of payments to aggregators, while the Minnesota program is modeled to cover the cost of assets for 20 years, Xcel spokesperson Theo Keith told Canary Media in an email. “When you model both programs over 20 years, their costs are similar.”
“Capacity*Connect will be more complex to operate and coordinate than the Colorado [program],” Keith added, because it’s designed to do more than simply reduce peak electricity demands across the entire grid.
Instead, C*C is meant to target particular points on the utility’s distribution grid that might otherwise need costly upgrades. This is the portion of the system that, unlike giant transmission lines that cover long distances, brings power directly to homes and businesses. Costs related to the distribution grid are the single biggest driver of rising utility bills in the U.S.
“Through the deployment of distributed batteries, we (and thus our customers) will save more money by avoiding more expensive grid upgrades than the payments made to program participants,” Keith wrote.
But it will be years before Xcel’s plan puts its batteries to work on this kind of deferral. The plan’s initial phase will limit them to reducing systemwide energy and capacity costs — the same kind of task that demand-response programs have been doing for decades. Not until “Phase 3” of its plan, set for between 2028 and 2031, will Xcel “seek opportunities to stack additional distribution value streams,” like finding ways for batteries to defer costly grid upgrades.
Delaying that work doesn’t sit well with nonprofit groups such as the Environmental Law & Policy Center, Vote Solar, Solar United Neighbors, and Farrell’s Institute for Local Self-Reliance. In their comments, they asked the Minnesota PUC to require Xcel to set a mid-2027 deadline to “take concrete steps to advance distribution value” — and to set up a way for third-party and customer-owned technologies to participate.
The Minnesota Department of Commerce concurred. In its comments to regulators, it laid out a series of changes that it and clean energy advocacy groups agreed Xcel should make to its plan to more quickly take on the advanced grid services it’s currently proposing to delay for years to come.
For one, the department recommended that regulators require Xcel to target its batteries to fix known reliability issues or “defer specific, budgeted infrastructure investments” on the distribution grid — something that utilities in California, Massachusetts, and other states are doing in pilot projects.
Another recommendation, one that other utilities are already putting into practice, is for Xcel to use its batteries to make room on congested parts of the grid for more customer-owned or community solar to come online. That could help solve the long-standing interconnection bottlenecks that rooftop and community solar providers have been complaining about.
Shannon Anderson, a policy director at the nonprofit Solar United Neighbors, which helps households organize to secure cheaper rooftop solar, highlighted one big difference between the approaches taken by Xcel in Minnesota and in Colorado. In Colorado, the utility’s VPP approach is guided by a law passed by the state legislature in 2024. Minnesota lacks such a policy; a VPP bill failed to pass last year, although its sponsors plan to reintroduce the legislation this year.
“The Minnesota story is part of a national trend,” said Anderson, who is leading Solar United Neighbors’ work with a coalition sponsoring VPP legislation in multiple states. “The more legislative direction can give them guidance and political support, the better.”
A massive new battery has entered service in southern Maine, providing a much-needed boost to the Northeast’s efforts to expand clean and affordable energy.
Developer Plus Power wrapped up its Cross Town Energy Storage project in late November, but publicly inaugurated it last week in a ceremony featuring Gov. Janet Mills, a Democrat, who has championed clean energy for the state and is currently running for Senate. Now, the small town of Gorham, nine miles inland from Portland, hosts a battery plant capable of injecting 175 megawatts for up to two hours, a bigger capacity than any other battery in New England.
“During Winter Storm Fern, we were 100% available and ready to contribute capacity with no emissions,” said Polly Shaw, chief external relations officer at Plus Power. “With a response capability of 250 milliseconds, there’s no faster asset that New England can rely on to help when they need capacity or grid services.”
New England states have issued a raft of energy storage targets in recent years, meant to complement their bevy of commitments to grid decarbonization. By 2030, Massachusetts aims to have 5 gigawatts, Connecticut 1 gigawatt, and Maine 400 megawatts. So far, however, it’s been slow going, even as storage has taken off in states like California and Texas. New England has managed to build just two battery installations with more than 100 megawatts: Plus Power’s Cross Town and its 150-megawatt Cranberry Point Energy Storage project, which came online in Massachusetts in June.
Plus Power has distinguished itself by entering into markets before they become saturated. For these two projects, the company won seven-year contracts in a 2021 forward capacity auction held by the Independent System Operator New England (ISO-NE), which runs wholesale power markets for the region’s six states. ISO-NE subsequently switched its capacity auctions to one-year awards — a move that complicates storage development in the region, as short-term contracts make it harder to attract project financing.
As it stands, Plus Power can claim the federal investment tax credit for 30% of the cost of the storage plant. Then it can earn revenue from the capacity contract and by bidding ancillary services in the wholesale market. Batteries can also arbitrage energy by buying when it’s cheap (typically when there’s an influx of renewable production) and selling when it’s expensive (typically when there’s increased reliance on gas-burning peaker plants).
Plus Power hired 25 full-time employees during construction of Cross Town and will employ two permanent maintenance staff now that the largely automated facility is running. It will contribute $8 million in tax revenue to Gorham, Shaw said.
Cross Town has an advantageous location in southern Maine, near Portland, the state’s biggest city. That allows it to work around transmission constraints, charging up when onshore wind farms are producing farther north, and then making that power available when Portland or points south need it, Shaw noted.
This serves Maine’s target of 90% renewable and 100% clean energy by 2040, among the more assertive clean-energy goals in the country. Batteries can advance that goal by improving utilization of renewable electricity. Maine passed its 2030 storage mandate in 2021; the Cross Town project in Gorham single-handedly delivers nearly half of it.
The state is planning a competitive storage solicitation this year to keep moving toward the target.
“Maine is such a leader on renewable energy, climate policy, and battery storage policy that it sent a long-term signal to come and invest in Maine,” Shaw said.
The battery could also tie into evolving conversations around energy affordability, which has become a primary political concern across the country. Mainers pay among the highest rates in the nation for electricity and home heating. State energy analysts recently published a report that pinpointed fossil gas prices as a key driver of higher energy prices, since gas-burning plants typically set the market price for power in the region. Batteries provide peak power on demand without burning gas — and a broader build-out of facilities like Cross Town could put downward pressure on those sky-high prices.
Since the late 1800s, the grid has used more or less the same devices to convert electricity to different voltages. They’re called transformers — and they’re in increasingly short supply as power demand surges nationwide.
A crop of startups wants to solve that problem and modernize transformer technology at the same time — and they’re raising financing to do it.
On Wednesday, solid-state transformer startup Heron Power closed a $140 million Series B round from investors including Andreessen Horowitz’s American Dynamism Fund and Breakthrough Energy Ventures.
The new financing will allow the Northern California–based startup to build a factory at a yet-to-be-disclosed U.S. location capable of churning out 40 gigawatts of its medium-voltage power-conversion gear annually. It plans to start full-scale production in the second half of 2027 and have hundreds of megawatts of equipment produced by the end of that year.
Heron Power has already lined up 50 gigawatts of orders with more than a dozen prospective customers that are “actively engaged in technical product collaborations,” according to CEO Drew Baglino, who founded the startup in 2025 after an 18-year career at Tesla.
The firm is looking to initially sell not to utilities but rather to operators of solar and battery farms and data center campuses, which need to convert electricity as well. So far, it has disclosed only two of its early customers: Intersect Power, a major clean-energy developer that Google is acquiring for $4.75 billion, and Crusoe, a data center developer building a 1.2-gigawatt campus in Abilene, Texas.
While Baglino declined to share details about other prospective customers, he did say that Heron Power has been bringing many of them into its lab to see the prototype equipment being put through its paces. “We’re also doing integrated full-system deployments later this year,” he said. “It helps immensely for folks to get a sense of what we’re talking about and see the power processing in front of them.”
Heron Power isn’t the only company building next-generation power-conversion equipment. DG Matrix is planning to deploy its solid-state transformer via strategic partnerships with PowerSecure, a major developer of microgrids and data-center power systems that’s owned by utility Southern Co., and with Exowatt, a startup providing solar and thermal energy storage systems to data centers.
On Wednesday, the Raleigh, North Carolina–based DG Matrix announced a $60 million investment led by Engine Ventures and including Mitsubishi Heavy Industries and electrical-equipment manufacturing giant ABB. The Series A funding will enable the company to scale up manufacturing and deepen “strategic partnerships with datacenter developers, hyperscalers, utilities, and industrial customers,” according to the company’s press release.
Another startup in the space, Resilient Power, was acquired last year by electrical equipment giant Eaton in a deal worth as much as $150 million.
Solid-state transformers digitally manipulate the flow of electricity, employing the same kind of power electronics that are used in solar and battery inverters and in electric vehicle drivetrains. “Solid state” refers to the semiconductors that make that digital power manipulation possible. “Transformers” is a nod to the 19th-century electromechanical devices that convert the voltage of alternating current via copper wires wound around iron cores.
Solid-state transformers are a timely replacement for those devices for a couple of reasons. They’re far more flexible than old-school electromagnetic devices, meaning engineers can do more things with one device. They’re also urgently needed because conventional power equipment — particularly transformers — has been unable to keep up with the demand created by the fast-growing electricity sector.
The technology itself is not brand new. High-frequency digital power-switching technologies are already used for specialized purposes such as massive high-voltage direct current (HVDC) converters. And inverters — another form of digital power-switching tech — are an integral part of EV chargers and solar and battery installations.
Over the past decade or more, various efforts to expand the role of solid-state power-conversion technologies to replace a wider array of systems have struggled to gain traction, given high costs and technical challenges. But Heron Power’s Baglino thinks that the time is right for this tech, as costs come down and major customers seek out effective alternatives to the backlogged and increasingly expensive conventional options.
As with many other digital technologies, “power semiconductors have had their own version of Moore’s law,” Baglino said. In the past five years or so, these improvements have made it “not only feasible but economically attractive to replace inverter skids — with an old-school transformer at solar and battery facilities — with a power electronics solution.”
Those “inverter skids” he mentioned are shipping-container-size combinations of electrical gear — step-down and step-up transformers, switching and protection gear, and inverters themselves — that convert direct current from solar panels and batteries to grid-ready alternating current. Similar combinations of gear are used to convert grid electricity to direct current needed to power heftier commercial and industrial sites — such as data centers.
Unlike traditional high-efficiency transformers, solid-state power-conversion devices don’t need specialized grain-oriented electrical steel, which is now in short supply. Instead, they use the same silicon carbide and gallium nitride semiconductor supply chains feeding EV markets, Baglino said, “and the EV supply chain has expanded rapidly over the past decade or so.”
Solid-state transformers also weigh less and take up less space than the gear they replace, he said. They’re capable of a wider range of functions, including regulating power quality fluctuations, which can wreak havoc on data centers, and they can be used for multiple applications, unlike traditional equipment.
As for the cost, Baglino said prices for Heron Power’s electronics are competitive with those for traditional tech. “We’re not asking for any premium over the solutions they’re buying right now.”
Like DG Matrix and Resilient Power, Heron Power is targeting data centers, solar and battery farms, and dense EV charging sites for early adoption, since that’s a “fast-growing market with motivated customers,” Baglino said.
Heron Power’s Heron Link devices are designed to handle typical utility distribution substation voltages of 34.5 kilovolts and to deliver 600-volt direct current. That higher-than-typical voltage aligns with the latest data center power architectures being pursued by major AI players such as Nvidia.
“But we have every intention of bringing the benefits of solid-state transformers to the AC-to-AC world,” he said, referring to the need for transformers to step voltage up and down without converting it to direct current. “A single SST can decouple faults, it can do power factor control, it can do voltage regulation, frequency regulation, all this monitoring and control of the power flow that utilities don’t have with passive transformers.”
While these are all useful capabilities, utilities are not eager adopters of novel technologies. Over the past decade, companies that have built power electronics for utility distribution grids have closed up shop or been acquired and fallen from public view.
But the combination of technical improvements and growing grid pressures may make this decade different. “Once we prove the technology is performing well” for solar farms and data centers, Baglino said, “we can go back to utilities.”
Before temperatures plunged to the teens in the wee hours of Feb. 2 in North Carolina, Duke Energy pleaded with customers like me to conserve.
Since electricity supplies would be strained, the utility said in a blanket email, we could help avoid planned blackouts by lowering our thermostats and perhaps putting on a sweater. I got a text, too, asking me to cut back on “nonessential energy use.” In other words: Embrace my inner Jimmy Carter.
The missives worked, in that Duke didn’t have to schedule outages around the state, but they also provoked resentment. At public hearings, some complained that large customers like data centers probably didn’t get the same appeal. On social media, I saw at least one energy policy wonk contend that the utility should be paying customers — not just asking them nicely — to reduce their energy use.
But it turns out that Duke also does that. I should know: Late last year, I joined the throng of Tar Heels who let the company remotely adjust our smart thermostats by a few degrees when needed in exchange for a credit on our bills. It’s just one example of the sort of demand-response program that clean energy advocates say should be expanded not just in North Carolina but also nationwide, as climate change leads to more frequent extreme weather that taxes electricity supplies.
While broad solicitations like the one I received on Feb. 1 can help relieve stress on the grid when every watt counts, paying customers to enroll in ongoing programs can have a more substantial effect. Plus, they offer some much-needed utility-bill relief for households dealing with skyrocketing energy costs in North Carolina and beyond.
A version of the incentive program I participate in has been around for nearly two decades, after a 2007 law required Duke to invest more in energy efficiency. Long an option in the summer for those with central air conditioning, the scheme was recently extended into the winter. Around 500,000 customers are enrolled in the warm months, Duke says, and some 66,000 are signed up in the cold months. (Participation is lower in the winter partly because many customers heat their homes with gas rather than electricity, per the utility.)
Duke hasn’t yet analyzed the precise effectiveness of this one residential incentive program during this year’s unusually frigid temperatures. But it says the combination of this household initiative, similar ones for business customers, and the mass conservation request all made a difference.
“The collective efforts of customers in our demand response programs and those who voluntarily reduced their energy use made a substantial impact during the stretch of extreme cold and unusually high energy demand,” spokesperson Jeff Brooks said in an email. “Across our Carolinas service areas, customers helped reduce demand on the grid by contributing hundreds of megawatts of electric load reduction.”
Hundreds of megawatts is no small matter. It’s the equivalent of the grid getting an additional small gas-fired power plant — but without the associated pollution or cost.
For consumers, there were clear upsides, too.
In Raleigh, where I live, the scheme is called EnergyWise. In other parts of Duke’s territory, it’s called Power Manager. Everywhere, the idea is the same: Customers with electric heat and thermostats connected to the internet get a $150 credit for enrolling, then $50 a year after that, plus whatever money we save by using a little less heat than we might otherwise. It’s not a staggering amount, but since the average Duke household in North Carolina spends about $154 a month on electricity, it’s not nothing, either.
For my part, the savings have been meaningful. I live in a small house powered partly by solar panels, so I’m not a prototypical Duke customer. But since joining the program in early December, I’ve paid the utility all of $6.45, thanks to the sign-up incentive. (My bill due in March, to be fair, is close to $130.) With Duke proposing rate increases of 15% in the coming years — and a 2025 law requiring households to shoulder more of the burden when the company buys power from outside the state — I’ll take the extra dollars where I can.

“Active savings events,” whereby Duke lowers my temperature setting a few degrees for one or two hours, happen a few times a month, per the company, or not at all if the weather is mild. A message on my physical thermostat, and on the phone app that controls it, tells me when an event is underway. I can opt out at any time by changing the temperature as I see fit.
A Gen Xer, I grew up in a household where only one person — my father, born in the throes of the Great Depression — could control the thermostat. His rule was kind but firm, with winter settings that never exceeded the high 60s. Sometimes he would cheerfully encourage an extra layer and start a fire. At night, he always set the temperature much lower.
Perhaps that upbringing, together with my career as an energy reporter, explain why I’ve felt the need to override a savings event only once so far. It wasn’t to raise the temperature but to lower it during the recent cold snap: I woke up in the middle of the night and realized I’d accidentally set the thermostat higher than normal. While Duke’s system had adjusted the heat down a few degrees, I wanted it to be colder still — a little bit for the planet, a little bit for bedtime coziness, but mostly for my wallet.
Of course, plenty of people will balk at giving Duke — a monopoly that almost by definition breeds distrust — control over their thermostats. And I can surely see how the entreaty for households to voluntarily conserve left a bitter taste when the company was reporting sky-high profits.
But I suspect there are scores of people like me, who are happy to do their part and save a little money at the same time with basically no risk. As for the half million North Carolinians already enrolled in the program, I know one thing for sure: They aren’t all energy reporters with solar panels.
A correction was made on Feb. 19, 2026: This story originally misstated the date of the Duke Energy email as Feb. 2; it came on Feb. 1.
Five years ago, Winter Storm Uri brought the Texas power grid to its knees. Temperatures plunged across the state for nearly a week, power plants froze, natural gas supply lines failed, and the grid operator came within minutes of a total system collapse. More than 4 million Texans lost electricity, many for days. Over 200 people died. It was the worst infrastructure failure in modern Texas history.
In the years since, Texas has quietly built one of the largest renewable energy and battery storage fleets in the world. According to capacity data from the Electric Reliability Council of Texas, the state has added roughly 31 gigawatts of solar capacity and 17 GW of battery energy storage — enough to power millions of homes. Over the same period, the legislature mandated weatherization of power plants and natural gas infrastructure, ERCOT improved its operational procedures, and new market mechanisms were introduced to better coordinate solar and storage.
The results speak for themselves. Since Uri, the Texas grid has faced three major winter storms that each set new all-time winter peak demand records. In every case, the grid held. No rolling blackouts. No load shedding. No emergency curtailments. Demand kept climbing, and the grid kept delivering.
This track record matters because a prominent Texas think tank, the Texas Public Policy Foundation, has published a widely circulated analysis arguing that ERCOT’s reliance on solar and battery storage is making the grid less reliable in winter. The analysis is authored by Brent Bennett and uses real ERCOT data. But as this article will show, Bennett’s own numbers contradict his conclusions — and the actual performance of the grid over the past five years contradicts them even more decisively.
The following chart I worked up offers a quick summary: Texas’ reliability has increased dramatically in recent years in direct proportion to the renewables and battery storage it has added.
[Chart: Solar and battery storage capacity on the ERCOT grid at each major winter storm since Uri, shown alongside record winter peak demand and grid performance]
The above data tells the story. At the time of Uri, ERCOT had roughly 5 GW of solar and less than 1 GW of battery storage. When Winter Storm Elliott arrived in December 2022, it had 14 GW of solar and 2 GW of storage. By Winter Storm Heather, in January 2024: 22 GW and 4 GW. By Winter Storm Kingston, in February 2025: 30 GW and 9 GW. And now, as we pass the fifth anniversary of Uri: approximately 35 GW of solar and 15 GW of battery storage.
During each of these storms, peak winter demand set a new record — climbing from 74,525 MW during Elliott to 78,349 MW during Heather to 80,525 MW during Kingston. Just three weeks ago, the grid sailed through yet another major winter storm with over 11,000 MW of operating reserves, and ERCOT said it did “not anticipate any reliability issues on the statewide electric grid.”
In none of these events did ERCOT order load shedding. This is the track record that Bennett’s analysis asks you to ignore.
Now let’s turn to Bennett’s projected numbers for 2030. His Figure 1 posits that ERCOT could have 103,802 MW of firm output against a speculative peak demand of 110,000 MW — his estimate, not ERCOT’s. That’s a gap of roughly 6 GW. His projected battery fleet by 2030? Forty-three gigawatts.
Read that again: a 6-GW shortfall covered by 43 GW of batteries.
Bennett’s response to this rather obvious mismatch is to reframe the question entirely. Instead of asking whether batteries can cover peak demand windows — which is what they’re designed to do — he converts the entire battery fleet into a single energy metric: 77 GWh, which he says is “equivalent to running a single 1 GW thermal power plant for the duration of this three-day storm.” It’s a striking comparison. It’s also irrelevant to how batteries actually operate in ERCOT.
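The arithmetic behind both framings is easy to check. The short script below reproduces the numbers from Bennett’s figures as quoted above; the four-hour peak window and three storm mornings are my own illustrative assumptions, not parameters from his model:

```python
# Back-of-envelope check of the two battery framings, using figures
# quoted in the text. Window length and morning count are illustrative
# assumptions, not values from Bennett's analysis.

firm_output_mw = 103_802   # Bennett's projected 2030 firm output
peak_demand_mw = 110_000   # his speculative 2030 peak demand estimate
gap_gw = (peak_demand_mw - firm_output_mw) / 1000
print(f"Projected shortfall: {gap_gw:.1f} GW")

fleet_energy_gwh = 77      # projected 2030 battery fleet, energy
storm_hours = 72           # Bennett's three-day storm duration

# Bennett's framing: smear the fleet's entire energy across 72 hours.
print(f"As 72-hour baseload: {fleet_energy_gwh / storm_hours:.2f} GW")

# Peak-shaving framing: cover the shortfall only during the coldest
# pre-sunrise hours -- here, a hypothetical 4-hour window on each of
# three storm mornings.
window_hours, mornings = 4, 3
energy_needed_gwh = gap_gw * window_hours * mornings
print(f"Energy to cover the gap in peak windows: "
      f"{energy_needed_gwh:.0f} GWh of {fleet_energy_gwh} GWh available")
```

Run it and the two framings fall out immediately: spread over 72 hours, 77 GWh is indeed only about 1 GW of continuous output — Bennett’s comparison — but covering the roughly 6.2 GW shortfall during the actual peak windows takes on the order of 74 GWh, comfortably within the projected fleet, and that is before counting the 43 GW of instantaneous power the fleet can inject.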
Nobody designs, operates, or dispatches battery storage as a 72-hour baseload resource. Batteries are designed to shave peaks, provide rapid frequency response, and bridge the morning and evening demand ramps when solar output is low. A 43-GW battery fleet can inject enormous amounts of power during exactly the narrow peak windows that Bennett’s own Figure 2 identifies as the problem periods. During Winter Storm Heather, ERCOT’s post-storm analysis confirmed that batteries were “partially supplementing the lack of solar generation available” during the coldest pre-sunrise hours — the exact scenario Bennett says they can’t handle.
Perhaps the most revealing aspect of Bennett’s analysis is what he doesn’t discuss: the massive existing fleet of gas, coal, and nuclear generation that forms ERCOT’s backbone. He projects 103,802 MW of firm winter output in 2030. That fleet — overwhelmingly fossil and nuclear — carries the grid through the vast majority of every storm hour in his model. The assumed thermal outage rate is only 12% — a figure drawn from ERCOT’s reliability assessments — meaning 88% of the thermal fleet performs through the modeled storm.
Bennett constructs a scenario in which batteries fail by defining success as continuous 72-hour discharge, while simultaneously taking for granted the thermal fleet of 80-plus GW that keeps the lights on during the bulk of his modeled event. The batteries aren’t replacing that fleet. They’re supplementing it during the peak demand windows that the thermal fleet alone can’t quite cover — which is precisely the role that ERCOT’s system planning envisions for them.
The contrast between Bennett’s theoretical model and actual ERCOT performance is stark. During Winter Storm Elliott, solar contributed roughly 8 GW at peak, and real-time prices dropped from over $3,000/MWh to under $100 within 90 minutes of sunrise. During Heather, large flexible loads curtailed voluntarily, demonstrating the demand-side response that Bennett barely acknowledges. ERCOT CEO Pablo Vegas has specifically identified the growth in battery capacity as “perhaps the most significant factor affecting grid stability,” while University of Texas energy professor Michael Webber credited “significant investments in more solar and more batteries and demand response” as key factors in the grid’s most recent winter storm performance.
None of these experts are claiming the grid faces zero risk. ERCOT’s probabilistic risk assessment, as reported in NERC’s winter reliability assessment, puts the chance of controlled load shed this winter at about 1.8% — low, but not zero. The question is whether Bennett’s framework for evaluating that risk is sound, and on that point, the data he himself relies on says no.
Bennett’s piece concludes that ERCOT needs “market design changes that redirect revenue away from wind and solar and toward resources that can work in all types of weather conditions.” That’s a policy preference dressed up as an engineering conclusion. His own data doesn’t support it.
What his data actually shows is that ERCOT has a manageable peak-demand gap that battery storage is well positioned to address, supplemented by a massive thermal fleet that provides the overwhelming majority of firm capacity during winter events. The December 2025 launch of ERCOT’s Real-Time Co-optimization Plus Batteries (RTC+B) market is specifically designed to optimize exactly this kind of coordination — dispatching storage where and when it creates the most grid value.
The real question isn’t whether batteries can run for 72 hours straight. No one is asking them to. The question is whether the combination of 100-plus GW of firm thermal capacity, a rapidly growing battery fleet, improving demand-response capabilities, and better weatherization standards can keep the lights on during winter storms. The last five years of actual performance — including three consecutive record-breaking winter peaks — provide a clear answer.
Bennett’s analysis works only if you accept his premise that battery storage should be evaluated as a baseload replacement rather than what it actually is: a fast-dispatching, peak-shaving complement to the thermal fleet, which helps dramatically in firming up renewables like wind and solar. Reject that premise, and his crisis narrative dissolves into the numbers he himself provides.