No Carbon News

(© 2024 No Carbon News)

Discover the Latest News and Initiatives for a Sustainable Future

(© 2024 Energy News Network)
New Orleans’ latest bid for a better grid: a citywide virtual power plant
Feb 24, 2026

Sitting below sea level along the hurricane-prone Gulf of Mexico, New Orleans is particularly vulnerable to losing power during extreme weather. But the city plans to tackle that problem by helping residents buy backup batteries, which will make the grid more resilient.

In December, the New Orleans City Council ordered local utility Entergy New Orleans to design a $28 million battery incentive program for homes, businesses, and nonprofits (plus $2 million for administration and implementation). Crucially, the scheme won’t cost New Orleanians a dime: It will be paid for by a settlement Entergy reached with the city over problems at one of the utility’s nuclear power plants.

Entergy has until March 1 to file an implementation plan for the program, which is expected to launch later this year. Once the plan is up and running, the incentives could support batteries at around 1,500 homes and 150 community institutions. Those systems would provide backup power for the properties they’re sited on, but also inject power onto the grid when it’s strained.

This would propel New Orleans to the forefront of localities adopting virtual power plants, the concept of aggregating energy devices in homes and businesses and wielding them like a traditional power plant for the good of the broader community. Vermont’s biggest utility has used home batteries to lower costs during heat waves; California tapped home batteries to meet demand in extreme moments; Texas has opened up a market-based version of the concept. But New Orleans would become a pioneer of virtual power plants in the Deep South, and would stand out for the scale of the program relative to the size of the territory.

“We hope if you were already on the fence about getting a battery, here’s a chance to participate in a utility program,” said Ross Thevenot, senior project manager at Entergy New Orleans, who oversees the customer-facing battery effort. ​“We’re the Crescent City — we’ve got water on all sides of us. Customer resilience is obviously important.”

The new investment builds on Entergy’s pilot virtual power plant, which enrolled nearly 140 customer-owned battery systems across the city last year. EnergyHub, a cleantech startup acquired by smart-home company Alarm.com in 2013, manages the distributed controls for the pilot and will run the expanded program. The initiative also builds on a grassroots effort called Community Lighthouse, which formed after 2021’s Hurricane Ida and has installed backup-battery systems at nearly 20 churches so that they can offer shelter and light to neighbors during grid failures.

“We’ve seen how useful those can be when there’s a power outage,” said Nathalie Jordi, who works with Together New Orleans, the nonprofit that spearheaded Community Lighthouse, and who advocated for the new virtual power plant. ​“But how great would it be if, when the power goes out long-term after a hurricane, we have nursing homes that don’t lose their power, we have hardware stores, we have bodegas, we have firehouses?”

If the emerging plan succeeds, New Orleans could teach other parts of the U.S. how to build a cleaner, more responsive grid in a way that brings the whole community along.

Democratize battery access

Arushi Sharma Frank, a D.C.-based distributed energy expert, got an urgent message from Jordi in September 2024. The New Orleans City Council, which, unusually, serves as the city’s utility regulator, wanted to hear how the Community Lighthouse locations had performed during outages from Hurricane Francine earlier that month. Together New Orleans knew there was settlement money available, and it wanted to bring the council a fully fledged virtual power plant proposal that could put those funds to work. Jordi wondered if Frank could propose a turbocharged virtual power plant like she’d helped design in Texas and Puerto Rico.

For Frank, this offered a chance to harness existing grid technologies to save lives in the aftermath of a hurricane or other disaster.

“There are life-threatening conditions that can be averted if people can get to shelter with power and cooling quickly,” Frank said. Small-scale batteries could ensure that ​“we have a place that any human in New Orleans can walk to in 15 minutes that has power after a storm.”

She got to work, compiling a proposal in 72 hours and arranging for people to testify from 12 other states with operating virtual power plants. The last-minute blitz worked: The City Council green-lit an effort to explore the concept, culminating in the December order.

Often, the companies selling energy devices to regular people cast themselves as electric Davids taking on the utility Goliath — as disrupters of a failing status quo.

In New Orleans, Frank said, the community groups were able to “remove this tone of adversarialism” that frequently crops up in virtual power plant proceedings around the country, and instead design something “generative, as opposed to extractive.”

The program creates a new market opportunity for solar-battery installers, with upfront incentives that can shave up to $10,000 off the cost of batteries for homes or $100,000 for businesses. It will still be up to cleantech companies — local ones or national brands like Sunrun or Tesla — to compete for customers’ business and guide them through the sales process. Those companies will be the ones designing the systems to provide backup power in the event of outages. And the order earmarks 40% of the residential funds for households with low to moderate income, ensuring installers don’t just pitch to more-affluent customers.

Once the batteries are installed and hooked up to EnergyHub’s control software, it becomes Entergy’s job to decide how and when to use them to benefit the power system more broadly. The regulated monopoly utility has knowledge that battery vendors don’t: which parts of the grid need more capacity or struggle to manage voltage when clouds interrupt rooftop solar production, for example, and other such nuances of a complex interconnected network.

Since Entergy runs the grid and charges customers for the service, it’s also able to pass along savings in the event that the virtual power plant lowers overall grid costs.

“Nonparticipating ratepayers are definitely enjoying the benefits of just having more affordable power, because VPPs are cheaper than traditional grid infrastructure and much quicker to stand up,” said Gabriela Olmedo, EnergyHub’s manager of policy and regulatory affairs.

If Entergy can eventually harness tens of megawatts of aggregated battery capacity, Thevenot said, the utility could bid that into the Midcontinent Independent System Operator’s regional grid and use the ensuing revenue to pay down costs for the overall customer base.

Building on a small-battery success

Utilities habitually seek an extended trial phase for ​“new” technology, even if the same equipment has been operating successfully for years elsewhere in the country. Sometimes, that preference for diligent study pushes off adoption of viable grid technologies. In this case, though, New Orleans was able to move swiftly on its virtual power plant because Entergy’s initial foray had laid a careful groundwork.

Under its existing pilot project, EnergyHub manages those nearly 140 batteries — mostly in homes, but also about a dozen in Community Lighthouse installations. The program pays homes up to $600 per year for sending energy to the grid for two-hour stints when demand is especially high. Last year was the first full year this system operated, and Entergy dispatched it six times, Olmedo said, largely to test that the system works.

“We started slow and steady: Let’s learn what the positives and potential speed bumps are,” Thevenot said. ​“It was a true pilot. We were trying to learn as much as possible.”

Entergy ​“got great data,” he added, and learned to troubleshoot in situations when batteries didn’t respond because of issues like internet-connectivity lapses or system settings preventing power from being dispatched.

Having six dispatches per year falls on the leisurely end of the virtual power plant spectrum. A program on Oahu, Hawaii, for instance, pays customers to set their batteries to discharge for two hours every evening, when the island grid is bound to have high demand.

That said, in this pilot phase, Entergy wanted to be judicious about using the batteries that customers had already bought and paid for, Thevenot said. And the summer of 2025 proved to be far less stressful for the local grid than the previous summer, dampening the need for battery assistance.

The plan had been to increase dispatches to 30 per year, Olmedo noted. (The forthcoming implementation plan will decide what the target is going forward.)

Each dispatch will make a far bigger difference once the new funds get disbursed: The incentives are expected to support roughly 10 megawatts of residential batteries and 10 megawatts of nonresidential, Olmedo said. All that capacity will fall within the city boundary, making for a far more concentrated impact than programs that sprawl over, say, the state of California.
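Back-of-the-envelope math on those figures (a sketch only, assuming the capacity is spread evenly across the participating sites) suggests typical system sizes:

```python
# Rough per-site battery sizing implied by the article's figures.
# Assumption (not stated in the article): capacity is split evenly across sites.
residential_mw = 10       # expected residential battery capacity
nonresidential_mw = 10    # expected nonresidential battery capacity
homes = 1500              # approximate number of participating homes
institutions = 150        # approximate number of community institutions

kw_per_home = residential_mw * 1000 / homes
kw_per_institution = nonresidential_mw * 1000 / institutions

print(f"~{kw_per_home:.1f} kW per home")             # in line with a typical home battery
print(f"~{kw_per_institution:.1f} kW per institution")  # a small commercial-scale system
```

Those averages land in the range of off-the-shelf residential and small commercial battery systems, which is consistent with the program relying on existing installer offerings rather than custom hardware.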

Normally, a small customer base can make it hard for a utility like Entergy to propose spending on innovative programs like a virtual power plant, Frank said. The cost of a battery subsidy would be divided among the customer base, and there simply aren’t many customers to split the tab; many New Orleans households earn a low or moderate income, making them especially sensitive to jumps in utility bills.

“If we were forced to do this and run $28 million through some kind of rider we’d have to collect from customers, that would be a different conversation,” Thevenot said.

The pot of settlement dollars circumvented this dynamic, funding innovation without adding to anyone’s monthly bill. ​“Any dollar that they do spend on creating socialized infrastructure, it also goes further because of the same math,” Frank added.

This may limit how replicable the New Orleans experience can be in other locales. ​“Wait for a bucket of utility penalty funds to materialize” is not a particularly actionable directive for would-be grid reformers. But New Orleans can show the world what good a bunch of batteries can do, and quantify eventual operational savings for the whole customer population. Then, advocates can argue for funding this sort of program on its own merits, based on evidence of how useful it has been in the Crescent City.

Jeff St. John contributed reporting.

Will the EU water down its new carbon tariff?
Feb 24, 2026

At the start of this year, the European Union officially launched the world’s first tariff on the carbon footprints of imports. It’s already looking to carve out a loophole.

The EU’s carbon border adjustment mechanism, or CBAM, requires importers to pay a fee based on the carbon dioxide emissions of the goods they bring in.

It’s a major global policy experiment — one that will have plenty of opportunities to prove itself as the EU busily expands its trade deals. In January, European Commission President Ursula von der Leyen inked a sweeping free-trade deal with the South American bloc known as Mercosur, putting an end to 25 years of negotiations. The following week, she announced a landmark pact with India, which included pledges to ramp up purchases of steel, pharmaceuticals, and heavy equipment from the world’s most populous nation.

Both agreements share a notable trait: They keep CBAM in full force. But in the background, the von der Leyen administration has been less than steady on the policy and has floated a major change that could undermine the law’s efficacy.

Currently, CBAM doesn’t allow for any exemptions from the tariff. But in an amendment drafted in mid-December, the European Commission pitched giving itself the discretionary authority to temporarily remove the carbon levy from particular imports, and even retroactively apply the exemption.

Heavy industry groups and European parliamentarians across the political spectrum have balked at the commission’s proposal in recent weeks.

That’s because while the carbon tariffs are meant to reduce emissions, they also serve as a type of industrial policy that can level the playing field between foreign and domestic manufacturers. Industrial firms in the EU have long been required to pay for their emissions by buying allowances on the bloc’s Emissions Trading System, driving the cost of their products higher than those of goods manufactured in countries without such rules.

CBAM rectifies this price discrepancy by subjecting foreign firms to the same carbon fees. For now, the price of carbon dioxide emissions will be calculated by averaging the auction price of Emissions Trading System credits each quarter. Starting next year, the pricing is set to more precisely track the ebbs and flows of overseas factories’ emissions by moving to a weekly average.
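To illustrate the mechanism, here is a minimal sketch of the fee calculation; every number below (tonnage, emissions intensity, auction prices) is hypothetical and chosen only for the example, not taken from EU rules:

```python
# Illustrative CBAM fee: fee = embedded emissions x carbon price.
# All inputs are hypothetical, for illustration only.
tonnes_imported = 1000           # e.g., tonnes of steel brought into the EU
emissions_per_tonne = 1.8        # assumed embedded tCO2 per tonne of product
ets_auction_prices = [68.0, 71.5, 70.2, 69.8]  # hypothetical ETS prices, EUR/tCO2

# Current rule described in the article: average the auction price over the quarter.
carbon_price = sum(ets_auction_prices) / len(ets_auction_prices)

fee = tonnes_imported * emissions_per_tonne * carbon_price
print(f"CBAM fee: EUR {fee:,.0f} at EUR {carbon_price:.2f}/tCO2")
```

Moving from a quarterly to a weekly average, as planned for next year, changes only how `carbon_price` is computed; the fee itself still scales directly with the embedded emissions of the shipment.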

The idea that the EU may choose to exempt certain products could sap confidence in the policy and make it harder for foreign firms to justify long-term investments in new, cleaner assembly lines.

“It’s a big deal even before passing into law,” said Antoine Vagneur-Jones, the head of trade and supply chains at the consultancy BloombergNEF. ​“The prospect of sectoral exclusions, even temporary by nature, communicates worrying uncertainty to business at a time when the relevant investments in low-carbon European production require long-term policy visibility.”

The proposal, called Article 27a, has what he called ​“a few hoops to jump through first” before the European Commission could start removing tariffs on any industry. Namely, the European Parliament and the European Council will need to vote to approve the amendment. A vote is not yet scheduled.

The pressure for this ​“emergency brake” on CBAM is coming from a familiar force in EU politics: farmers.

French and Italian agricultural ministers pressed Brussels for the amendment over concerns that CBAM could drive up the price of fertilizer, squeezing farmers’ margins. Such a price shift could risk widespread protests like the so-called nitrogen wars that started in 2019, when the Dutch government’s crackdown on agricultural emissions triggered a revolt among farmers.

But even if the amendment is passed into law, the European Commission cannot halt tariffs at its pleasure. Before suspending the tariff for any products, it must run assessments to determine whether CBAM’s impact on prices of relevant goods, like fertilizer, is significant enough to justify doing so.

“It isn’t clear to me that fertilizer prices would go up by enough to warrant activating 27a over, say, the coming year — even if punitive default values were used,” Vagneur-Jones said. ​“My sense is that we won’t be seeing the provision’s activation anytime soon.”

Still, the proposed pullback highlights ​“a growing conviction among Europeans that the EU is more or less fighting climate change alone,” said Adam Błażowski, the supervisory board chairman of the climate group WePlanet, which advocates for what it sees as pragmatic solutions, such as nuclear power and genetically modified crops.

“This may not be entirely accurate, but this trend only increased after the United States’ second departure from the Paris Agreement,” he said. ​“Mechanisms like CBAM function to preserve a certain level of equality between different economies, but rapidly progressing climate change is a global problem that needs global solutions. Unfortunately in an age of kinetic and trade wars all around Europe, this seems to be an increasingly difficult task.”

Equipping CBAM with an emergency brake may also have practical benefits that go beyond placating political constituencies. It could reduce regulatory complexity, for example — something of increasing importance to EU leaders who want to rejuvenate domestic industries and protect the continent against geopolitical aggression from Russia, China, and, of late, the U.S.

“Looking to the future of a low-carbon economy, we may not be able to move as fast as we might like, but we have to move as fast as we can,” said Joseph Hezir, the former finance chief of the Department of Energy and current president of the EFI Foundation, a nonpartisan energy-policy think tank. ​“The 27a discussion right now is really about, how fast can we move in that direction?”

Europe’s carbon tariff may soon have company. Several other countries are considering what Hezir called “CBAM-like programs,” including the United Kingdom, Canada, and Taiwan. In a Friday post on X in response to the Supreme Court’s decision to strike down President Donald Trump’s tariffs, U.S. Sen. Bill Cassidy, a Louisiana Republican, called on the White House to champion his “Foreign Pollution Fee” bill, which “levels the playing field.”

CBAM’s impact is already being felt outside the 27-nation EU. In December, analyst Jian Wu cited the European tariffs as a major force behind China’s ​“thriving” hydrogen-fired metallurgy this year. With CBAM entering into force, he wrote in his newsletter China Hydrogen Bulletin, ​“Chinese steel exporters are facing real pressure to decarbonize their businesses.”

With 60% of its steel exports already headed for Europe before the signing of last month’s free-trade deal, India is also feeling the spur of CBAM on its notoriously coal-choked industrial and utility sectors.

Still, the policy is colliding with a harsh political climate. Anything that raises prices on European consumers is becoming radioactive, said Josh Freed, the chair of Catalyse Europe, a climate policy group.

“Since Russia’s invasion of Ukraine sent energy prices way up, Europeans’ tolerance for any policies that they perceive as increasing prices is nonexistent,” he said. As a result, ​“slowing down and adjusting” both CBAM and the Emissions Trading System schemes ​“is just policy meeting reality.”

Gigantic Form Energy battery to power Google data center in Minnesota
Feb 24, 2026

Form Energy invented a novel iron-air battery to store clean energy for much longer timeframes than conventional lithium-ion batteries can. The startup is still constructing its first commercial project, in Minnesota, but today revealed it has clinched a potentially game-changing follow-up in the same state to support a Google data center.

The utility Xcel Energy will install 300 megawatts of Form’s batteries in Pine Island, Minnesota. It’s a big battery installation for the Midwest, but developers have built several grid storage plants elsewhere with more megawatt capacity. What shoots this project into the energy-storage stratosphere is that it will dispatch energy for up to 100 hours straight — enough to pump clean energy through multiday weather patterns that would limit renewable production. That unique capability means the Pine Island Form plant, fully charged, will hold 30 gigawatt-hours of energy, an astonishing amount for the grid as we know it.
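The headline numbers are internally consistent; power times maximum discharge duration gives the stated energy capacity:

```python
# Energy capacity = power rating x maximum continuous discharge duration.
power_mw = 300      # rated power of the Pine Island installation
duration_h = 100    # maximum continuous discharge, per Form Energy

energy_gwh = power_mw * duration_h / 1000   # MWh -> GWh
days = duration_h / 24                      # continuous output in days

print(f"{energy_gwh:.0f} GWh of storage, ~{days:.1f} days of continuous output")
```

For comparison, a typical large lithium-ion grid battery discharges for about four hours, so at the same power rating this plant stores roughly 25 times as much energy.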

The deal is also notable in that it proves Form has found commercial traction even before its first installation for a utility customer is complete. That outcome was possible because Xcel has seen Form develop its technology for years, said Form CEO Mateo Jaramillo, who co-founded the firm in 2017.

“Xcel in particular has been with us through every step of the journey — when the chemistry was in a very small bucket, essentially, to complete deployed systems,” Jaramillo said. ​“They saw the challenging things that we worked through. They saw us solve hard problems. They saw us come out the other side.”

The arrangement also offers one of the clearest examples yet of how tech giants could power their data centers with clean energy without raising costs for regular customers, if those companies care to try.

Under the agreement, Google will pay Xcel to build 1.4 gigawatts of wind and 200 megawatts of solar. Those resources make cheap, clean power, but they can’t match a data center’s 24/7 operating profile. That’s where the Form batteries come in: They can charge up whenever renewable production exceeds momentary demand and then deliver on-demand power for more than four days.

For anyone still concerned about climate change, that’s an enticing vision at a time when the titans of AI seem happy to toss clean energy out the window. Amazon and Meta have readily endorsed major fossil-gas-plant construction to power their AI sites. Just this week, SoftBank subsidiary SB Energy, which has been an avid clean energy developer, teamed up with the Trump White House to propose the biggest fossil-gas power plant in the world to help fuel the AI computing build-out. Other companies have turned to less efficient, smaller-scale fossil-fueled generators to hack together enough power for their data center plans, as chronicled by analyst Michael Thomas.

Xcel, which provides electricity to nearly 4 million people across eight states, also took great care in its statement to describe the data center not as serving the general AI arms race, but as one that ​“will support core services — including Workspace, Search, YouTube and Maps — that people, communities and businesses use every day.”

The companies also took steps to protect Xcel’s other customers from price impacts to serve the data center: ​“Google will cover any new grid infrastructure costs associated with the project and has planned carefully with Xcel Energy to ensure electricity in the area remains reliable and affordable for all of Xcel Energy’s customers,” the utility noted.

This arrangement lets Xcel pitch the data center as something that actually helps the broader Minnesota community: It will bring investment, construction jobs, and higher clean-energy generation — all without increasing electricity bills at a time when they’re rising fast in much of the country.

Potentially transformative new battery technologies tend to get trapped in yearslong cycles of small-scale pilots and demonstrations, before utilities feel comfortable spending their customers’ dollars on the new thing. Some caution is warranted, as far more novel battery startups have gone bankrupt than have built at multi-megawatt scale. And again, even Form has yet to finish its first commercial installation.

In this case, however, Google is picking up the (still undisclosed) bill. If the batteries don’t work as advertised, that could frustrate Google’s carbon accounting, but Xcel customers would not be on the hook.

Form demonstrated its capabilities with internal installations that Xcel could examine, Jaramillo noted. The startup has also been honing its production quality at its factory in the former steel town of Weirton, West Virginia — a process that required making 60 miles of electrode materials, he noted.

“They don’t treat us like mom and give us cookies when we feel bad — they hold us to a very high standard,” Jaramillo said of Xcel. ​“And we want them to feel good about the product, that it’s safe, that it’s reliable, that it scales.”

Form expects to start delivering batteries to the utility in 2028. That year, the Weirton factory is supposed to reach 500 megawatts of annual production capacity, so the Pine Island project will represent a major share of Form’s manufacturing operations. Xcel expects the clean energy installations to come online in phases from 2028 to 2031.

Meanwhile, Form’s initial project in Minnesota — which was supposed to come online in 2023 — is now set to finish installation this year.

The nascent long-duration storage sector has needed eager patrons to give the technology a shot. Form clinched its first, much smaller contracts with vertically integrated utilities that could take a more holistic long-term planning view than the fast-paced competitive power markets allow for. Now, the data center build-out brings potential customers with mountains of cash and a burning desire to move quickly — an ideal pairing for Form, which has a factory and a need to prove its worth.

An update was made on Feb. 25, 2026: New information about Xcel Energy’s timeline for building the clean energy projects was added.

AI, Data Centers, and the US Electric Grid: A Watershed Moment
Feb 23, 2026

Electricity consumption growth rates are increasing across the United States, driven, in part, by a boom in hyperscale data center development. Although the long-term market outlook remains uncertain, the Lawrence Berkeley National Laboratory predicts that data center demand will grow from 176 terawatt-hours (TWh) in 2023 (or about 4.4% of total U.S. electricity consumption) to between 325-580 TWh (6.7-12.0%) by 2028.[1] In some parts of the country, AI-driven energy demand is outpacing available capacity, driving companies to delay projects, contract power directly from private producers, and/or install multiple, inefficient reciprocating generators using natural gas.

Data centers may impact grid reliability in some regions. In July 2024, a voltage fluctuation in northern Virginia triggered the simultaneous disconnection of 60 data centers, prompting a 1,500-megawatt (MW) power surplus, which forced emergency adjustments to prevent cascading outages.[2] Investors claim that massive investments in energy generation and grid infrastructure are needed to power data center development while mitigating outage risks. However, if the anticipated demand does not materialize, utilities (and their consumers) could face stranded costs.[3]

Data centers have enjoyed discounted energy tariffs and tax incentives, as state and local governments compete to attract business. Although these early incentives have driven substantial data center investments, emerging regulatory debates are impacting market development across the country. Policy shifts in major data center markets, such as the passage of Texas Senate Bill 6, signal the probability of future market intervention by both regulators and policy makers to address local-level concerns over reliability and affordability.

As data center infrastructure continues to expand, developing effective regulatory policies becomes critical. The future of data centers and their energy needs, as well as the policy decisions made in this realm, will impact U.S. technological competitiveness for decades to come. While overregulation could hinder AI development, insufficient regulation risks grid instability, rising consumer costs, reliance on high-emission energy sources, public backlash, and setbacks to state and corporate climate goals.

This policy brief outlines the current state (and potential consequences) of U.S. data center electricity usage and corresponding grid expansion. The paper provides an overview of the current data center and grid landscape followed by a discussion of potential engineering and policy approaches to address ensuing challenges. The foundations laid herein will inform our future research under the Project on Grid Integration at the Harvard Kennedy School (HKS) and the Harvard School of Engineering and Applied Sciences (SEAS). The initiative aims to advance 1) the development of new regulatory tools to incentivize increased grid flexibility and 2) the creation of more equitable cost-sharing mechanisms in the wake of expanding data center development. The brief concludes by outlining several critical questions that will guide the Project’s research over the next year.

2. The U.S. Data Center Landscape: An Overview

According to the National Telecommunications and Information Administration (NTIA), there were over 5,000 data centers in the United States in 2024, with demand for data center services expected to grow through 2030.[4] Accordingly, capital spending on hyperscale data center infrastructure has risen to unprecedented levels over the past five years. Amazon CEO Andy Jassy noted that AWS’s AI-related revenue is already a multibillion-dollar business “growing at a triple-digit, year-over-year percentage.” In 2024, Amazon, Microsoft, Google, and Meta collectively spent over $200 billion on capital expenditures (CapEx), representing a 62% year-over-year increase from 2023. Each firm’s spending reached an all-time high: Amazon’s CapEx was $85.8 billion[5] (up 78% year-over-year), Microsoft’s was $44.5 billion[6] (up 58%), Google’s was $52.5 billion[7] (up 63%), and Meta’s was $39.2 billion[8] (up 40%). Looking ahead, Amazon’s total CapEx[9] in 2025 is projected to surpass $100 billion, while Microsoft’s and Google’s are each expected to exceed $80 billion. The data center buildout reflects both strategic and financial drivers, as companies race to secure long-term returns and future competitive advantages. By investing ahead of demand, these companies are ensuring infrastructure is available when customers need it. From the industry’s perspective, failure to build ahead of demand places companies at a competitive disadvantage.
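The spending figures quoted in this paragraph can be cross-checked against each other (using only the numbers stated above):

```python
# Cross-check of the 2024 hyperscaler CapEx figures quoted in the text ($ billions).
capex_2024 = {"Amazon": 85.8, "Microsoft": 44.5, "Google": 52.5, "Meta": 39.2}

total_2024 = sum(capex_2024.values())
implied_2023 = total_2024 / 1.62   # from the stated 62% year-over-year increase

print(f"2024 total: ${total_2024:.1f}B")          # consistent with 'over $200 billion'
print(f"Implied 2023 total: ~${implied_2023:.0f}B")
```

The sum comes to $222.0 billion, and dividing by the stated 62% growth implies roughly $137 billion of combined spending in 2023, so the individual and aggregate figures hang together.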

While data center financing stems primarily from parent-company balance sheets, corporate bonds, and public incentives, project finance is occasionally used, with green bonds emerging as a supplementary tool. Financing the electricity infrastructure upgrades needed to power data centers, however, is a much more challenging endeavor, as utilities operate under tight financial and regulatory constraints that complicate raising the large-scale capital needed to fund expansive upgrades.

As data centers continue to seek rapid power interconnection, alternative financing mechanisms for power procurement—through both utilities and third-party providers—are gaining prominence. For example, firms are increasingly relying on third-party power contracts, which include collateral commitments, long-term power purchase agreements (PPAs),[10] availability payments, and upfront capital payments. Additionally, companies are weighing the costs and benefits of co-locating data centers and power generation, despite challenges surrounding siting rules, asset ownership, and regulatory oversight. Overall, this unprecedented capital outlay exposes both firms and utilities to a range of risks, from increased stranded assets to rising financing costs; therefore, the sustainability of the data center buildout depends on both resilient financing structures and continued demand realization.

Future data center market expansion, and its consequent energy usage, remains highly uncertain. Past data center energy studies display numerous flaws. In a review of 258 data center energy consumption estimations, Mytton & Ashtine (2022) found systematic defects within study methodologies, particularly with regard to data availability and transparency.[11] The opacity of data center operations, site planning, and energy efficiency complicates energy estimations and projections.[12] Consequently, institutional projections of data center electricity demand range from about 200 TWh to over 1,000 TWh by 2030, according to the World Resources Institute. This range complicates medium-to-long-term grid planning, as utilities struggle to determine both the true magnitude of the industry’s future energy needs and its relationship to economywide electrification.

The 2024 United States Data Center Energy Usage Report[13] attempted to clarify the extent of current and future data center energy consumption. After a period of stagnation from 2014 to 2016, data center energy demand grew in 2017 due, in part, to expanded efforts to digitalize data across economic sectors. From 2018 to 2023, data center energy use increased from roughly 76 TWh (comprising 1.9% of the nation’s total annual electricity consumption) to 176 TWh (4.4%); future data center energy usage could range from 325 to 580 TWh by 2028, or 6.7-12.0% of 2028 national electricity consumption. However, this range remains uncertain, due to the continued opacity of data center and utility planning as well as uncertain data center market trajectories.[14]
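A quick consistency check on those shares (a sketch assuming, as the report frames it, that the percentages are shares of total U.S. electricity consumption):

```python
# Implied total U.S. electricity consumption from the data-center figures above.
dc_2023_twh, dc_2023_share = 176, 0.044
low_2028_twh, low_2028_share = 325, 0.067
high_2028_twh, high_2028_share = 580, 0.120

us_2023 = dc_2023_twh / dc_2023_share            # implied 2023 national total
us_2028_low = low_2028_twh / low_2028_share      # implied 2028 total, low scenario
us_2028_high = high_2028_twh / high_2028_share   # implied 2028 total, high scenario

print(f"Implied U.S. totals: 2023 ~{us_2023:.0f} TWh; "
      f"2028 ~{us_2028_low:.0f}-{us_2028_high:.0f} TWh")
```

Both 2028 scenarios imply national consumption of roughly 4,800-4,850 TWh, up from about 4,000 TWh in 2023, so the report's TWh ranges and percentage shares are mutually consistent under a single assumed growth path for overall demand.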

Project risks are assumed by external stakeholders, not just data center companies. For example, utilities face stranded-asset risks with regard to generation and transmission buildout; if infrastructure is built to serve projected data center demand and that demand does not materialize, these assets could be underutilized. Furthermore, increased contract-based financing has shifted projects away from guaranteed “rate-base” recoveries, instead favoring special tariffs and PPA contracts, arrangements that lack transparency and may shift power costs onto other consumers.

These threats raise urgent questions about who should shoulder data center buildout costs and whether returns (and cost recovery) to the utility will remain predictable. Who should pay for grid improvements spurred, at least in part, by data center development? Who are the beneficiaries of these improvements? How should costs be allocated across consumers? How can local communities be protected from rising energy costs and natural resource depletion as data centers expand to new markets across the United States? Rigorous policy, economic, and engineering research—in conjunction with increased transparency from data center operators and utilities serving them—is crucial for future grid planning as well as for mitigating unwanted environmental, social, and economic impacts.

3. Virginia and Texas – Two Sides of the Regulatory Coin

As data center markets continue to expand, regional differences in electricity market design and energy needs are shaping regulatory and market reforms. Simultaneously, local-level impacts are introducing additional variables for policy consideration. This section surveys two of the largest U.S. data center markets, Virginia and Texas, to demonstrate how locales facing similar challenges differ in the pace and substance of their responses.15

3.1. Virginia

Virginia is the epicenter of the global data center industry, with over 4,900 MW of operating capacity (and another 1,000 MW under construction) in Northern Virginia alone.16 By some estimates, about 70% of global internet traffic passes through the region daily.17 The area’s dense fiber network, linkages with federal facilities, and systemic incentives enabled its market dominance. First, Northern Virginia was an early node in the U.S. government’s ARPANET18 and still hosts major internet exchange points.19 Second, the state’s low power costs, strong electric reliability, economic incentives, and mild climate reduce data center operation costs, while some Northern Virginia counties provided early permit acceleration for large campuses.

Data center growth in Virginia will add thousands of megawatts of nearly constant demand over the next few years, thereby compressing planning timelines and raising new questions around who should bear the costs of system improvements. Dominion’s20 2024 resource plan projects nearly 27 GW of new generation by 2039, including 21 GW of renewable energy (i.e., solar, wind, and nuclear small modular reactors [SMRs]) and 5.9 GW of gas.21 Simultaneously, Virginia’s energy rates are increasing. In February 2025, Dominion proposed its first base-rate increase since 1992, adding about $8.51 per month in 2026 and $2.00 per month in 2027 for a typical household.22

Furthermore, rapid demand growth has led PJM, Virginia’s regional transmission organization, to review how it both defines firm service and manages reliability obligations. The region’s wholesale design depends on a balance between competitive generation, long-term capacity procurement, and regulated local service. This dynamic is strained by data center expansion, as a single, fast-growing class of customers with unique load profiles presents system needs that differ from those around which PJM was built. Data centers impose large, steady electricity loads with limited ability to reduce (or ramp down) their power usage; at the same time, their energy demand can fluctuate with equipment usage and job complexity. This pattern differs from the more gradual, weather-sensitive load patterns of traditional residential and commercial customers. Overall, Virginia is under pressure to embrace new rate, financing, and reliability tools that allocate risks to the drivers of this new demand: data centers.

As the data center industry continues to expand, the Virginia grid must adapt. Cost allocation rules and policy incentives will evolve as the state considers how to sustain reliability investments while stabilizing rates for other customers. Several policy reforms have been proposed. For example, lawmakers have debated scaling back Virginia’s data center tax exemptions, but proposals to repeal these incentives stalled in the budget process. Furthermore, several 2025 bills sought 1) to link eligibility for tax incentives to improved energy efficiency or clean energy performance, 2) to pause new projects in Northern Virginia, and/or 3) to set uniform development standards, but none of these advanced.23,24 A separate bill establishing statewide standards, including land use reviews, reached the governor’s desk but was vetoed.25 That said, local governments are considering enhanced land use and environmental regulations in order to slow the data center buildout. As of this writing, the state tax exemptions remain in place through 2035, signaling Virginia’s intent to support competitive market development, but serious concerns around land use and affordability loom on the horizon.

3.2. Texas

Texas, with its lightly regulated, “energy-only” electricity market structure, offers a contrasting example of how U.S. electricity systems are responding to rapid data center development. The state demonstrates how a market that historically favored low-friction interconnection processes is adjusting its regulatory framework in response to unprecedented new load growth.

Over the past several years, Texas data center investments have been attracted by the state’s competitive electricity prices, business-friendly policies (including state sales and use tax exemptions on servers, cooling equipment, backup energy, and other hardware), and rapid interconnection speeds. As a result, the Dallas-Fort Worth area has emerged as one of the largest data center markets in the United States and continues to witness massive buildout. The Electric Reliability Council of Texas (ERCOT)26 projects that peak summer power demand could approach 145 GW by 2031, up from 85 GW in 2024; this represents a significant acceleration relative to the gradual 1-2% annual demand growth experienced over the past two decades. Over half of this new demand (about 32 GW) is projected to come from data centers (including cryptocurrency miners).27 Unlike past gradual and dispersed growth, the current demand surge is rapid, lumpy, and increasingly clustered around specific localities, raising concerns about demand-supply mismatch, insufficient energy reserve margins, and transmission congestion.28
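The scale of that acceleration is easy to quantify from the figures above (a rough sketch; the implied growth rate is a derivation, not an ERCOT-published number):

```python
# Rough check of the ERCOT projection cited above (figures taken from the text).
peak_2024, peak_2031 = 85.0, 145.0                 # GW of summer peak demand
implied_cagr = (peak_2031 / peak_2024) ** (1 / 7) - 1

# Share of the projected 60 GW of growth attributed to data centers (~32 GW):
dc_share = 32.0 / (peak_2031 - peak_2024)

print(f"implied growth: {implied_cagr:.1%} per year; "
      f"data center share of growth: {dc_share:.0%}")
```

The projection implies roughly 8% annual peak-demand growth, several times the 1-2% historical rate, with data centers accounting for over half of the increment.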

By mid-2024, state lawmakers grew increasingly alarmed by emerging energy risks, particularly with regard to: (1) fairness in cost recovery, with concerns that data centers’ speculative or duplicative29 interconnection requests could shift upgrade costs onto smaller customers; (2) behind-the-meter (BTM) co-location that might pull existing grid-facing generation behind a private fence, reducing available capacity in the system under30 tight conditions; and (3) managing resource adequacy and emergency operations if large loads remained uncurtailed31 during an emergency.

In June 2025, Texas lawmakers enacted Senate Bill 6 (SB6), a package of planning, interconnection, cost-sharing, transparency, and emergency operations reforms aimed at strengthening and protecting the state’s energy grid. The law formalizes ERCOT’s Large Load Interconnection Study (LLIS) process;32 directs the Public Utility Commission of Texas (PUCT) to determine a “reasonable share” of upgrade costs for new large loads;33 and requires improved disclosure to reduce speculative filings.34 Overall, SB6 signals the growing potential for expanded regulation across regional markets in response to increased energy affordability and cost-sharing concerns.

In conclusion, Virginia and Texas face similar energy challenges in the wake of rapid data center development, but their approaches demonstrate different regulatory philosophies. The actions (or lack thereof) taken in these states will serve as models for regulators elsewhere across the country.

4. Technological Opportunities for Data Center Energy Mitigation

Future policy and regulatory solutions for data center energy usage will only work if they are technically feasible, economically sound, and politically acceptable. Data center interconnection is often framed as a choice between grid reliability and economic growth. However, past policies have not been anchored in how large loads behave in the real world. Effective policy solutions must account not only for local-level impacts and cost-sharing concerns, but also for computational realities. A modeling-first approach can elucidate policy opportunities by first screening for system reliability, then evaluating system-wide price and congestion effects under operational criteria that reflect real flexibility. This exercise will require close collaboration among policymakers, engineers, and business leaders across the energy and corporate sectors.

Ongoing research at the John A. Paulson School of Engineering and Applied Sciences (SEAS) aims to address this gap. By linking security-constrained operations (i.e., reliability screening, congestion and ramping limits) with market outcomes (i.e., price volatility, renewable curtailment risks, and uplift payments), the SEAS team is developing realistic engineering solutions to be integrated into real-world policy tools. This analysis will extend across operational levels, considering everything from hosting capacity to transformer loading to thermal equipment aging. Together, these views link system-wide constraints to local reliability and power-quality considerations to develop standardized, transparent workflows that can align planner decisions, regulatory approvals, and developer obligations on predictable timelines.

Rigorous modeling of data centers’ reliability and economic impacts across transmission and distribution enables evidence-driven policymaking. For example, planners could maintain a public shortlist of locations where the grid can reliably host new large loads, aligning private proposals with places with sufficient grid capacity. A similar structure could apply to transmission and distribution by clarifying non-negotiable conditions (such as contingency margins and equipment limits) and possible trade-offs (such as construction timelines). This transparency would enable faster construction, fairer decisions, and clearer expectations among all stakeholders.
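The hosting-capacity shortlist described above can be illustrated with a minimal screening sketch. This is a hypothetical example, not the SEAS methodology: it uses power transfer distribution factors (PTDFs) to estimate whether a proposed large load at each candidate bus would push any monitored line past its thermal limit; all numbers are illustrative assumptions.

```python
import numpy as np

# Hypothetical hosting-capacity screen using PTDFs (illustrative values only).
ptdf = np.array([[0.6, 0.3],      # rows: monitored lines, cols: candidate buses
                 [0.4, 0.7]])     # fraction of injected load flowing on each line
flow = np.array([80.0, 50.0])     # present line flows, MW
limit = np.array([100.0, 100.0])  # thermal limits, MW
new_load = 60.0                   # proposed data center, MW

feasible = {}
for bus in range(ptdf.shape[1]):
    post = flow + ptdf[:, bus] * new_load   # flows after interconnecting here
    feasible[bus] = bool(np.all(post <= limit))
print(feasible)  # with these numbers: {0: False, 1: True}
```

A planner running such a screen over all candidate buses could publish the feasible set as the “shortlist,” with the contingency margins and equipment limits above acting as the non-negotiable conditions.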

At the same time, AI data center power consumption still lacks a standard electricity load profile. Such a baseline would help grid operators, planners, renewable energy developers, and policymakers compare scenarios, estimate future energy costs, gauge resource adequacy, design demand-side flexibility incentives, and set accurate emissions policies. Job submission scheduling provides opportunities to enhance data center demand-side flexibility. Using a bottom-up, minute-by-minute model informed by real job data (i.e., job-arrival traces, per-job resource demands, GPU power profiles, and standard cluster resource allocation mechanisms), SEAS researchers have demonstrated that queuing dynamics (that is, how jobs arrive, wait, and are scheduled under finite resources) shape electricity demand. This detailed modeling provides a more granular understanding of power profile dynamics across multiple time scales, ranging from seconds to hours, thereby clarifying the impact of job dynamics on the energy system. This work will provide the basis for regulatory tools designed to mitigate excess power usage and fluctuations stemming from job-level dynamics.

5. Conclusion and Looking Ahead

While the outlook for data centers and their energy needs remains uncertain, future solutions must leverage robust policy instruments to spur technological and/or operational changes. For example, data centers may be able to improve grid reliability by reducing their power usage during peak periods; however, it is unclear which incentives would best encourage these practices. Theoretical solutions must be translated into effective, real-world policy initiatives that consider economic, political, and social realities as well as technological feasibility. Rigorous policy, economic, and engineering research—in conjunction with increased transparency from data center operators and utilities serving them—will facilitate successful reforms.

The Project on Grid Integration (PGI) is well-positioned to address these challenges. A joint project of the Harvard Kennedy School of Government (HKS) and the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), the Project aims to develop new policy, technical, and operational tools that leverage the data center boom in order to strengthen and modernize the U.S. electric grid; at the same time, the project works to minimize the economic, social, and environmental repercussions of rapid data center expansion.

Moving forward, the Project will examine the following questions:

  • Cost Sharing: Who should pay for grid improvements spurred, at least in part, by data center development? Who are the beneficiaries of these improvements? How should costs (and associated risks) be allocated across consumers? What cost allocation mechanisms would be fair, methodologically feasible, and politically possible?
  • Behind-the-Meter Energy: To what extent will data centers rely on third-party energy systems to power their operations? How might behind-the-meter energy buildouts be regulated at the state and federal levels?
  • Grid-Level Impacts: How might data center operational reforms be used to increase grid capacity, enhance system reliability, and improve operational stability? How might data centers be incentivized to limit their load during peak hours in order to preserve whole-of-system reliability? What impact is high frequency oscillation in large-scale AI training jobs having on energy generation systems and the electric grid at large? How might these impacts be mitigated?
  • Utility Buildouts: Will utilities be able to access sufficient capital to build necessary infrastructure improvements? How can the cost of such improvements be balanced with electric affordability?
  • Alternative Energy and Externalities: How can renewable energy and battery storage solutions be better integrated into data center energy needs? How can local communities be protected from rising energy costs and natural resource depletion as data centers expand to new markets across the United States?
  • Data Center Site Selection: How might data-driven analyses of the electric grid inform future data center site selection processes? How can tariff designs and public policy incentivize data center developers to choose grid-optimal sites?

Disclaimer

The views expressed in this paper are the opinion of the authors and do not reflect the views of PJM Interconnection, L.L.C. or its Board of Managers of which Le Xie is a member.

Can a big battery help Boston save billions on the power grid? Maybe.
Feb 23, 2026

The U.S. desperately needs to make more room on its electricity grid. But for years, the country has struggled to build new power lines at a reasonable pace, and despite fast-rising electricity demand, there’s no sign of that changing in the near term.

A project taking shape near Boston could help make the case for an alternative to expanding the grid: big, strategically placed batteries.

In fact, energy storage has already helped defer the need for costly, slow-moving transmission upgrades in Australia, Europe, and South America. But it hasn’t yet caught on in the U.S.

The Trimount battery project, four miles north of Boston, could spur grid planners and operators to take another look at this concept of using storage as a transmission asset. At the very least, it will be hard for them to ignore. With 700 megawatts of power capacity and 2.8 gigawatt-hours of stored energy, the battery installation would be one of the largest in the nation, and by far the largest in New England.

The Trimount project is targeted for a key pinch point in the region’s grid. It will be located at a former Exxon Mobil oil-storage facility in the city of Everett and will plug into a major substation that connects Boston to the greater New England grid. Boston is a ​“load pocket,” a spot on the grid where peak electricity demand sometimes exceeds what transmission lines can supply — whether because of emergencies or more predictable spikes in usage on hot and cold days.

But those moments tend to be relatively short-lived, making batteries a viable tool for weathering imbalances. Batteries can store electricity when it is abundant and then discharge it when the transmission system faces high demand.
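The mechanics can be sketched in a few lines. The only real figures below are Trimount’s proposed ratings (700 MW, 2,800 MWh, i.e., a 4-hour battery); the import limit and the demand series are illustrative assumptions, not data about the Boston load pocket.

```python
# Minimal sketch of a load-pocket battery covering demand above a transmission
# import limit. Only the 700 MW / 2,800 MWh ratings come from the article;
# the import limit and demand series are made-up illustration values.
IMPORT_LIMIT = 1000.0                    # MW the lines into the pocket can carry
POWER, ENERGY = 700.0, 2800.0            # MW, MWh (a 4-hour battery)
soc = ENERGY                             # state of charge; start full

demand = [900, 1100, 1400, 1600, 1500, 1200, 950, 800]  # hourly MW (assumed)
unserved = []
for d in demand:
    shortfall = max(0.0, d - IMPORT_LIMIT)      # demand the lines can't serve
    discharge = min(shortfall, POWER, soc)      # battery covers the excess
    soc -= discharge
    unserved.append(shortfall - discharge)

print(f"unserved energy: {sum(unserved):.0f} MWh, "
      f"remaining charge: {soc:.0f} MWh")
```

In this toy peak, the battery rides through the entire event with charge to spare; the battery would then recharge overnight when imports are below the limit.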

“At hours when the grid is overly stressed, the ability to discharge the batteries in the middle of the load pocket alleviates the strain on all the major lines going into the metro area,” said Hans Detweiler, senior director of development for Jupiter Power, the Austin, Texas–based company behind the battery project.

Jupiter Power is seeking approval from Massachusetts’ Energy Facility Siting Board for Trimount and hopes to secure utility contracts later this year, Detweiler said. If everything goes according to plan, the company expects to break ground in 2027 and start operating in late 2028 or early 2029.

That will put Trimount smack-dab in the middle of near-term and long-range planning for ISO New England (ISO-NE), the entity that manages the region’s transmission grid. And ISO-NE is actively searching for ways to relieve Boston’s peak electricity demands.

To that end, Jupiter Power hired RLC Engineering to conduct a study of how energy storage could help solve challenges identified in ISO-NE’s ​“Boston 2033 Needs Assessment” report. Specifically, the study looked at options for managing when two major transmission lines go out of commission successively, called an N-1-1 event, which could force utilities to institute widespread power outages.

Trimount’s ​“pivotal” position in the grid could allow it to keep the grid up and running during such an emergency, RLC’s study said. The other alternative would be upgrading a number of high-voltage transmission lines, many of them buried underground — a costly, disruptive, and time-consuming process in dense urban environments.

RLC’s analysis found that the Trimount battery project could provide an ​“avoided transmission cost benefit” of about $2.27 billion by avoiding those upgrades — ​“a much more cost-effective way to solve the reliability issue.”

“There are all these ways that storage can save consumers’ money,” Detweiler said. ​“One is that storage — at least in certain locations, like our project — can avoid massive transmission upgrades.”

Barriers to using batteries as a transmission solution

This use of batteries as a sort of shock absorber for the grid has gained more traction outside the U.S.

Take the work of Fluence, a global leader in energy storage solutions, for example. The firm, a joint venture of Siemens and AES Corp., is building what could be the world’s biggest storage-as-a-transmission-asset project in Germany, and it has more than 1.2 gigawatt-hours of projects with transmission-asset components around the world, according to Suzanne Leta, the company’s vice president of policy and advocacy.

If the idea catches on in the U.S., the impact could be significant.

A study from Astrapé Consulting commissioned by the Natural Resources Defense Council found that building 3 gigawatts of energy storage by 2030 could obviate the need for about $700 million in transmission upgrades to serve Illinois as it closes fossil-fueled power plants to meet state climate goals.

And in New York, adding battery storage as a transmission asset could ​“mitigate grid congestion, reduce renewable curtailment, and defer the uncertain need for new power lines,” according to a study by Quanta Technology on behalf of the New York Battery and Energy Storage Technology Consortium.

But right now, it’s hard to make these projects happen in the U.S., Leta said. The reason? ISO-NE and other regional grid operators require such batteries to be exclusively used to aid the transmission grid. The battery owners cannot make money from performing other services.

“You have a transmission revenue stream — that may need first priority. But you need additional revenue streams,” Leta said. ​“The reason that hasn’t happened is generally because policymakers have not allowed for those combined revenue streams.”

That’s the case for the Trimount project, which won’t earn money from any grid relief the battery might provide. Instead, like the other large-scale battery projects being built in Massachusetts, it will earn money through the state’s Clean Peak Energy Standard, which offers credits for charging up with renewable energy and discharging it during times of peak demand. And Jupiter Power is seeking to contract the project to one of Massachusetts’ major utilities, which are under state mandate to procure 5 gigawatts of energy storage by 2030.

But if ISO-NE wants to take advantage of the potential transmission savings of Trimount and similar battery projects, it may need to work with stakeholders on another way of doing it. At present, the grid operator’s ​“storage as a transmission-only asset” (SATOA) structure, approved by federal regulators in 2023, bars batteries from doing anything else if they’re used to relieve transmission constraints.

There’s a market rationale for this separation. Grid operators draw a hard line between transmission assets and other energy-market resources like power plants and batteries. If a battery project is collecting money for being a transmission asset, that revenue could subsidize the other energy-market services it provides, giving it an unfair advantage over competitors.

The same kind of limitations apply to the storage-as-transmission-asset rules at the Midcontinent Independent System Operator, which manages the transmission grid and energy markets across 15 U.S. states from Louisiana to North Dakota. It has limited its use of those rules to only one relatively small project to date.

Other major grid operators, such as PJM Interconnection, which covers Washington, D.C., and 13 states from Virginia to Illinois, have yet to develop rules for storage as a transmission asset. In PJM, that absence has played a role in stymieing proposals to use batteries to facilitate the closure of aging fossil-fueled plants.

Alex Lawton, a director at trade group Advanced Energy United, suggested that grid operators may want to find ways for batteries to make money across both energy markets and transmission services in order to use energy storage to help relieve their increasingly urgent transmission shortfalls.

“Yes, we are going to need to build more lines. But we want to do that cost-efficiently,” he said. ​“If it can be solved with a battery, that needs to at least be considered. And we want an analysis that shows all those things.”

Market rules aren’t the only barrier. There’s also the issue of forcing these projects to be part of the glacial pace of planning, approving, and building power lines. Under ISO-NE’s SATOA plan, any battery meant to help defer a grid build-out has to be identified through regional transmission plans, which take years to develop.

Currently, ISO-NE’s soonest opportunity to update its approach to integrate batteries into its transmission planning may be as part of its upcoming work to comply with the Federal Energy Regulatory Commission’s 2024 order to modernize long-term transmission planning, Lawton said. Among the mandates in that sprawling order, FERC calls on grid operators and utilities to incorporate advanced transmission technologies, which can expand the capacity and flexibility of existing power lines.

“We’ve always advocated with long-term transmission planning that there should be a robust process to evaluate alternative transmission technologies,” he said. ​“Storage is, in some cases, the most cost-effective solution.”

But just as companies that own power plants jealously guard their market position against new competitors, utilities that own and operate transmission grids tend to guard their incumbent advantages in winning contracts to build new power lines. ISO-NE’s current SATOA rules don’t provide incentives for transmission owners to consider adding battery storage as an alternative to building power lines, which earn them guaranteed rates of profit, Lawton noted.

The Trimount project ​“could be a really excellent case study to make a case for revisiting SATOA, and strengthening it and expanding it,” he said. It will certainly be worth observing how the project’s future patterns of charging up with excess clean energy and discharging during peak hours, which it’s incentivized to do under the Clean Peak Energy Standard, coincide with relieving the congestion on that part of the transmission grid.

In the meantime, building an enormous battery right next to a major city will bring multiple benefits, Jupiter’s Detweiler noted. The company commissioned a study by Aurora Energy Research that found the Trimount project could save ISO-NE customers about $1.6 billion in capacity market costs over its 20-year lifetime by deferring the need to build other power plants to serve the region’s peak needs.

It remains unclear how ISO-NE will choose to incorporate the Trimount project into its transmission planning once it’s operational, Detweiler said. ​“We are confident that they will notice when a project like ours goes up. The question is how they do the valuation.”

How Oregon is building back smarter after wildfire
Feb 23, 2026

Carole and Alan Balzer have called the town of Talent home since 1998. They met in college in nearby Ashland and never left southwestern Oregon. They love the small-town life and the bucolic setting of orchards, vineyards, and ranches.

On the morning of Sept. 8, 2020, Carole was at work a few towns away when she heard that a fire had ignited in a grassy field in Ashland. Like most people living in the Rogue Valley, the Balzers were used to seasonal drought and the occasional wildfire in the surrounding hills. But that summer had been brutally dry, and every bit of vegetation was parched.

Carole called Alan, who was at their house without a car.

“Do you think I should come home?” she asked.

She never got there. Fueled by unusually strong winds, the fire roared northwest along the valley’s Bear Creek corridor. Alan had just enough time to gather their cat, a computer, and a box of photos before evacuating with a neighbor.

The fire destroyed the Balzers’ home and most of their neighborhood, along with portions of Ashland, Talent, Phoenix, and Medford. Carole didn’t go back to her property until volunteers from Samaritan’s Purse were cleaning up the site a few weeks later.

“They found the three parts of my flute, but of course it was destroyed,” Carole recalls. ​“They gave me a chair to sit in, and I just started bawling.”

The Almeda Fire burned approximately 3,000 acres and damaged more than 3,000 structures; over 2,500 of those were residences. Nearly 40 percent of the students in the Phoenix-Talent School District were displaced from their homes.

The Balzers were among thousands of people who had to find temporary housing after the fire. They were lucky — with the help of friends, they found a rental in Ashland.

Five other big conflagrations and a number of smaller fires also swept through Oregon that September weekend in 2020. Collectively, the ​“Labor Day Fires” burned over 1 million acres, destroyed more than 5,000 structures, and killed at least nine people. It was the most expensive disaster in Oregon’s history; afterward, the state faced the monumental task of helping residents and businesses rebuild.

In early 2021, the Oregon Legislature voted to temporarily relax building codes — mandatory construction standards usually determined by states and updated once every three years. These codes include energy-efficiency standards, which set minimum levels of performance for windows, insulation, heating and cooling systems, and other equipment.

With Oregon’s postfire legislation, buildings replacing those constructed before 2008 were required to meet the 2008 codes, while buildings replacing those from after 2008 had to comply with the codes that were in effect at the time of the original construction. It’s a strategy that jurisdictions in California and Colorado have also employed after devastating wildfires.

Though meant to make rebuilding easier and more affordable, weakening energy-efficiency standards in particular has long-term consequences.

The Oregon Department of Energy estimates that an average new home built to the state’s 2021 residential code is 30% to 35% more energy efficient than a similar home built to the 2008 code. Buildings are collectively responsible for 40% of energy use in the United States, so these codes are an important way to lower greenhouse gas emissions and help Oregon meet its ambitious climate targets. Moreover, reducing energy use lowers costs for individual households and businesses, and it stabilizes power supplies, which helps avoid the construction of new power plants and keeps utility costs lower overall.

Given these benefits, the Oregon Department of Energy looked for ways to encourage residents to prioritize energy efficiency as they rebuilt.

“We allowed people to build to energy-efficiency standards in the past, but we also put on the table incentives to encourage them to build to contemporary standards,” says state Rep. Pam Marsh, a Democrat whose district encompasses southern Jackson County, where the Almeda Fire occurred.

Money for these programs happened to be available. Oregon had received pandemic relief funding through the American Rescue Plan Act of 2021, the $1.9 trillion Covid-19 relief package that directed federal funds to state, local, and tribal governments to mitigate public health and economic impacts.

Meanwhile, Energy Trust of Oregon, a nonprofit that supports energy-efficiency programs and is funded by utility customers, worked closely with the state and officials in fire-affected communities. They revamped existing programs to make them work for fire victims, adding incentives to promote energy-efficient redevelopment.

A lot was available ​“to encourage people to try to build in the most efficient and fire-resilient way possible,” Marsh says.

With the extra support, a large number of developers, builders, and homeowners ended up prioritizing both wildfire resilience and energy efficiency. Five years after the disaster, many of the new homes in the Almeda Fire footprint, including the Balzers’ residence, are among the most energy efficient in the country. This carrots-instead-of-sticks approach to rebuilding could serve as a model for other states grappling not only with how to build back after disasters but also with how to prevent such disasters from happening again.

Incentives and progressive builders drive efficiency

On the morning of Sept. 8, Charlie Hamilton was driving south on Interstate 5 when he noticed a puff of smoke near an Ashland subdivision his company, Suncrest Homes, had helped build. He raced over; to his relief, the neighborhood had escaped the fire.

“Later that day, the phone started ringing,” Hamilton says.

Suncrest Homes has been building residences in Ashland and Talent since the early 1990s. For over a decade, every project has met the standards of Earth Advantage, a national green building program.

“We had to get all our subs trained, and it’s a little bit more expensive,” Hamilton says. ​“But it is such a better house, and it’s so much more efficient — for the homeowner and their utility bills — that it’s worth a little bit of extra effort and a little bit of extra cost.”

Immediately after the Almeda Fire, Hamilton called the Balzers, who were old family friends, to see if he could help.

“We said, ​‘Yeah, maybe you could build a house for us,’” Alan Balzer says.

Kasey Hamilton, who runs Suncrest Homes with her father, Charlie, helped the Balzers and many former clients who had also lost homes apply to Oregon’s Fire Hardening Grant Program. A partnership between the state building codes division and Oregon counties, this program offered rebates for fire-resistant siding and roofing, ember-resistant vents that help keep sparks out of attics, and other measures that make homes more resistant to wildfire damage.

She helped families obtain additional rebates through the Energy Efficient Wildfire Rebuilding Incentive program, which the Oregon Department of Energy created in the wake of the fires. It offered $3,000 for a home rebuilt to the current code and $6,000 for one rebuilt to an above-code standard. For low- and moderate-income households, the incentives jumped to $7,500 and $15,000, respectively.

Meanwhile, Suncrest Homes was able to take advantage of boosted incentives through Energy Trust of Oregon’s energy performance score program, EPS New Construction. The company had long participated in the program, which offers rebates to builders who implement energy-efficient measures. A third-party verifier inspects a home and tests for air leakage and duct tightness to determine its EPS score; the lower the score, the more efficient — and the greater the incentive.

The boosted incentives were designed to encourage developers to rebuild homes that were lost in the Labor Day Fires as efficiently as possible.

“The incentive we had for going up and above code was doubled, and that’s where we saw a lot of uptake,” says Scott Leonard, residential program manager at Energy Trust.

A team from the nonprofit worked with the Jackson County Long-Term Recovery Group to create new bonus incentives for measures that also hardened rebuilt homes to wildfire.

“Here in Jackson County, we were really interested in not just energy-efficiency recovery, but what are the energy-efficiency measures that also have fire-resilience features,” says Karen Chase, senior community strategies manager at Energy Trust and a member of the Long-Term Recovery Group board of directors. After extensive modeling, Energy Trust landed on three factors that save energy while protecting homes from fire: triple-pane windows, exterior rigid insulation, and unvented attics.

There’s a strong overlap between energy efficiency and fire resilience. Windows, for example, transfer heat readily and are responsible for about half the energy loss in a typical home. Triple-pane windows are 40% more efficient than double-pane options and are more likely to stay intact during a wildfire, preventing fire and heat from penetrating the structure.

Suncrest Homes has taken advantage of the boosted EPS incentives in all 35 homes it has rebuilt in the fire zone.

“We basically stopped building any homes outside of fire rebuilds for two years,” Charlie Hamilton says. ​“I will say the single most rewarding thing I’ve ever done in my career is to hand keys back to somebody who’s lost everything.”

The Balzers’ new home was the very first to be rebuilt in the Almeda Fire zone. Their backyard, landscaped with native plants, includes a swale that captures stormwater. They avoided planting any vegetation next to the house — one of several ​“firewise” steps that should make their home much less vulnerable to fire.

Their house has an electric, ductless ​“mini-split” heating and cooling system and heat-recovery ventilator, which ensures an adequate fresh-air exchange, and a superefficient electric heat-pump water heater — typical in all Suncrest Homes. (Suncrest does occasionally specify gas-fired tankless water heaters, as Energy Trust EPS incentives for builders are funded by both gas and electric utility customers and thus are ​“fuel agnostic.”)

“The incentives reward the builder for choosing more-efficient equipment and better fixtures,” says Fred Gant, a local energy rater for the EPS program. The EPS score also helped verify that homes qualified for the Oregon Department of Energy incentives. ​“Our Energy Trust program manager worked very closely with ODOE to qualify those homes,” he says. So far, Gant has rated 220 homes in the fire zone — an impressive number, considering the size of the region.

“One of the things that helped Energy Trust connect with the rebuild was that we had so many EPS builders already working with us in the Rogue Valley,” Chase says. ​“And through this process, more builders signed up to work with us.”

Chase recommends that other communities invest in recruiting and training skilled energy raters. That way, when disaster strikes, knowledgeable experts are in place.

“Fred already knew what to do. He just showed up, and that’s why it worked so well,” says Chase. ​“To have this many highly energy-efficient homes in one community may make it one of the most energy-efficient cities in the country. It really is the epitome of ​‘build back better.’”

Ensuring everyone can rebuild

These days, it’s hard to believe that the Balzers’ Talent neighborhood — with its new homes, neat yards, and fresh landscaping — was an ash-covered moonscape just five years ago.

Single-family homes have been rebuilt far more quickly than other types of residences that burned down in the Almeda Fire. Homeowners with good insurance coverage were able to replace their houses, sometimes with larger dwellings that had better floor plans and features they had always wanted. Having witnessed the total destruction wrought by the fire, they were motivated to rebuild in ways that enhance resilience, and many were able to take advantage of the available incentives.

But half the dwellings lost in the Almeda Fire were manufactured homes, many of which housed some of the valley’s most vulnerable people: seniors, low-income households, and Latine residents, including farmworkers. Many of these units were underinsured or not insured at all.

​“Housing was already a problem in the Rogue Valley,” Chase says. ​“Disasters bring to bear in such stark ways where we are weakest.”

Kathy Kali was a manager and a resident at Bear Creek Mobile Home Park, a 71-unit park nestled along Bear Creek in far-north Ashland, when the fire broke out. She was home with her kids when she first noticed smoke billowing to the south. Before long, she was helping neighbors evacuate.

While she knocked on doors, her husband wrangled the kids and the cats. ​“We ended up sleeping in our car with two teenagers and two cats in a parking lot in Canyonville near the casino,” Kali says.

All but three of the park’s units burned.

Kali and her husband had insurance that covered nearly six months of temporary housing, and they were eventually able to put a down payment on a duplex. But she estimates that only about a quarter of the park’s homes were insured.

“It was so shocking for me to see the discrepancy between our situation and [that of] many of my former neighbors,” she says.

Soon after the disaster, Kali began working for the Almeda Fire Zone Captains, a network of community leaders who connected fire survivors with resources. She helped Bear Creek residents find emergency housing assistance and apply for grants to replace their lost units — and simply listened as they shared their traumatic stories of the fire.

Even before the Labor Day Fires, Oregon Housing and Community Services, Energy Trust, and other partners had identified the energy-savings opportunity of replacing old, leaky, mold-prone manufactured homes with new, efficient ones. Over half the state’s inventory of manufactured homes was built before 1976, when the federal government began regulating standards for this housing type. Oregon Housing and Community Services expanded the Manufactured Home Replacement Program in 2021 to better accommodate wildfire victims.

Then in 2024, Oregon Housing and Community Services launched the federally funded Homeowner Assistance and Reconstruction Program. To take advantage of these resources, replacement manufactured homes had to meet the standards of the Northwest Energy-Efficiency Manufactured Housing Program. In addition, Energy Trust offered generous incentives for replacement manufactured homes that met those standards and Energy Star standards.

Kali estimates that she has helped 25 people obtain various forms of grant funding. At Bear Creek Mobile Home Park, around 30 of the burned units were replaced within two years, even as many other parks lay vacant.

“It basically got rebuilt faster than any of the other mobile home parks because they had advocacy — they had me and a hands-on owner who was supportive,” says Kali, who now works as a real estate agent. In contrast, many of the residents in parks with absentee or corporate landlords ​“got dispersed and had no way to know about the resources,” she says.

The uneven recovery of the manufactured home sector has frustrated residents, lawmakers, and advocates. Still, there are some other standout success stories.

After the Almeda Fire destroyed all but 10 units of Talent Mobile Estates, two residents there formed a nonprofit called Coalición Fortaleza to help the park reemerge as the Talent Community Cooperative, a resident-owned manufactured home community. They partnered with Casa of Oregon, an affordable-housing developer, to help residents collectively purchase the land and rebuild.

Casa used a $7.5 million loan to buy the land from the private company that owned it. Portland-based Salazar Architect took on master planning and hosted design workshops to engage residents.

The project was largely funded through Oregon Housing and Community Services, which coordinated the purchase and installation of the new manufactured homes. Because the homes met Energy Star standards, they qualified for Energy Trust incentives of $10,000 for a single-wide or $15,000 for a double-wide.

The homes have noncombustible siding, ember-resistant vents, multipane windows, and other features that make them both more efficient and resilient.

Peter Hainley, Casa’s executive director, stresses that a project like the Talent Community Cooperative is possible only because of coordinated funding.

“So much of this is controlled by money,” Hainley says. ​“The legislature came through pretty quickly because there was the flood of money coming from the federal government — not because of these [fire] disasters, but because of the pandemic.”

Future-proofing communities

The Balzers like to joke that their new home is a kitchen with a house designed around it. Since energy efficiency is part of the package in a Suncrest Home, the couple didn’t have to research high-performance windows or HVAC equipment. Instead, they focused on the custom features they really wanted, like wainscoting and a large kitchen island.

The built-in energy efficiency will keep them comfortable and buffer them from skyrocketing utility rates for as long as they remain in their home. But it’s not just the Balzers who will benefit. Collectively, energy-efficient construction makes communities more resilient and helps mitigate climate change by lowering energy demand across the board.

It’s a lesson that other jurisdictions might bear in mind. Weakened building codes may make it easier to rebuild, but they don’t help homeowners, communities, or states in the long run. In some cases, the rollbacks don’t even save money. For example, after the devastating Los Angeles wildfires in January 2025, the city’s mayor exempted fire rebuilds from a city ordinance that requires new buildings to be all-electric. A recent report shows that all-electric construction is more affordable, not to mention healthier for occupants.

With enough funding and the right political will, incentives can help ensure that the burden of rebuilding to high energy-efficiency standards doesn’t fall on homeowners and builders who can’t afford the extra cost. States should consider such incentives as an investment in the future.

As climate change worsens, massive disasters like the Almeda Fire will keep happening. Cities, counties, and states will have to help communities rebuild equitably and thoughtfully in ways that are affordable and that ensure homes are less likely to burn down again. High-performance, energy-efficient construction is a key strategy for both responding to and mitigating these disasters — especially since those who live in fire-prone areas are reluctant to leave the places they call home.

Carole Balzer admits she gets anxious now whenever there’s a red-flag warning in the summer. But she and Alan have never considered moving away from the Rogue Valley.

“We have a lot of close friends — that’s what’s keeping us here,” she says. ​“Plus, it’s a beautiful area.”

Green steelmaker Boston Metal to cut jobs following equipment failure
Feb 23, 2026

Green-steel startup Boston Metal has suffered a major setback following an industrial accident at its facility in Brazil.

The Massachusetts-based company announced it will lay off 71 people in the U.S. after the incident at its Brazilian plant last month thwarted a key funding deal, Boston Business Journal first reported. The turn of events was ​“sudden, dramatic, and unexpected,” company sources told the news outlet.

Boston Metal is among the handful of well-funded startups advancing newer and cleaner ways of making steel — a process that traditionally relies on polluting, coal-fueled furnaces. Since spinning out of MIT in 2013, the company has raised over $400 million from a range of investors, including global steel giant ArcelorMittal, the venture-capital arm of oil giant Saudi Aramco, and Microsoft’s Climate Innovation Fund.

On Jan. 30, Boston Metal experienced an ​“unforeseen critical equipment failure” in its manufacturing facility in Brazil, the company told Canary Media in a statement on Monday. Although the incident was ​“fully contained, with no injuries or environmental impact,” the equipment damage prevented Boston Metal from hitting an operational milestone that was tied to a pending financing transaction.

“As a result, we lost access to committed capital essential to supporting our operations in both Brazil and the U.S.,” the company said, forcing the need to reduce its American workforce. Before the accident, Boston Metal employed over 300 professionals in the United States and Brazil.

Globally, steel production accounts for between 7% and 9% of human-caused greenhouse gas emissions. The bulk of that pollution comes from burning coal to transform iron ore into iron, which is turned into higher-strength steel in a separate furnace. While companies like Stegra and SSAB, both in Sweden, are looking to replace coal with green hydrogen in the ironmaking stage, Boston Metal is attempting to reinvent this process entirely.

The startup is developing a novel approach called ​“molten oxide electrolysis,” which involves using electric current to heat iron ore to around 1,600 degrees Celsius to drive chemical reactions, without emitting any carbon dioxide. The resulting material then cools into blocks of steel.

Last March, Boston Metal said it had moved one step closer to commercializing its technology after successfully producing steel from its industrial-size system in the Boston suburb of Woburn. The accomplishment ​“de-risks our technology and validates scalability to achieve commercial production,” the company said in a press release.

Yet as Boston Metal works to refine its green-steel system, it has also been pursuing projects in Brazil that it hopes could become a reliable source of revenue in the nearer term.

Boston Metal’s same molten oxide electrolysis process can be used to extract high-value metals such as niobium, chromium, and manganese from mine-waste tailings. That could reduce the need for other companies to pull those materials directly from the earth.

Adam Rauwerdink, Boston Metal’s senior vice president of business development, told Canary Media last June that the company was initially focusing on extracting and selling niobium — a valuable alloying element used in steel production — to start bringing in money. At the time, niobium sold for about $82 per kilogram (about $74,000 per ton), while steel went for roughly $900 per ton.
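The price comparison above is easy to sanity-check. The sketch below redoes the unit conversion, assuming the article’s “ton” means a U.S. short ton (about 907 kg); the prices are the illustrative figures quoted in this article, not independent market data.

```python
# Sanity check of the niobium-vs.-steel price comparison quoted above.
# Assumption: "ton" is a U.S. short ton (907.185 kg); the prices are the
# article's illustrative figures, not live market quotes.
NIOBIUM_USD_PER_KG = 82
KG_PER_SHORT_TON = 907.185
STEEL_USD_PER_TON = 900

niobium_usd_per_ton = NIOBIUM_USD_PER_KG * KG_PER_SHORT_TON
print(f"Niobium: ~${niobium_usd_per_ton:,.0f} per ton")   # ~$74,389 per ton
print(f"Roughly {niobium_usd_per_ton / STEEL_USD_PER_TON:.0f}x the price of steel")
```

At roughly 80 times the per-ton value of steel, the appeal of niobium as a nearer-term revenue source is easy to see.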

Prior to last month’s accident, Boston Metal said it had already restructured its business to concentrate on advancing its operations in critical metals. ​“There is strong near-term demand for critical metals, while the cost and complexity of developing molten oxide electrolysis [for steel] have outpaced what our current revenue and available capital can support,” the company said in this week’s statement.

Boston Metal’s Brazilian subsidiary, Boston Metal do Brasil, built and began operating a pilot facility in the state of Minas Gerais in 2023. Last year, it completed construction on an industrial critical-metals plant, and the subsidiary was set to start ​“generating revenue with industrial-scale production” this year, according to a company fact sheet.

Though Boston Metal says it will press ahead with its high-value metals strategy, it’s unclear how the industrial accident in Brazil will affect that production timeline or impact Boston Metal’s broader expansion plans in the United States. The announcement of layoffs in Massachusetts comes shortly after the office of Democratic Gov. Maura Healey awarded Boston Metal over $950,000 in capital grants to upgrade its Woburn operations — public backing that was reportedly expected to lead to local job growth.

“In the coming months, our priority will be restoring operations in Brazil and scaling the critical metals business in Brazil, the U.S., and internationally,” the company said.

Is clean coal really clean?
Feb 22, 2026

This article was originally published by Project Drawdown and is republished here with attribution.

Key Takeaways

  • “Clean” coal isn’t new – it’s the same coal with added pollution controls, and it still carries major environmental and health harms.
  • Carbon capture and sequestration (CCS) sounds promising, but it’s rare, expensive, and can consume up to 25% of a coal plant’s own energy.
  • Only two coal plants worldwide use CCS, largely because the economics make it difficult to scale.
  • Propping up aging coal plants raises costs for operators and consumers, especially as coal’s share of U.S. power continues to decline.
  • “Clean” coal doesn’t address upstream pollution, such as spontaneous coal combustion, which releases large amounts of greenhouse gases and toxins.
  • Renewables like solar and wind are now cheaper, faster to build, and better suited to meet growing electricity demand than coal.

It’s no secret that the Trump Administration wants coal to make a comeback.

As America’s appetite for electricity grows, Trump and his appointed officials have stated, often and plainly, that they want to use “clean” coal to meet this demand.

However, you shouldn’t be fooled by the rebrand. This isn’t some new type of coal; it’s the same coal humans have been burning for centuries, just with additional steps to capture some of the pollution. “Clean” coal still has all of the environmental, health, and economic issues associated with its mining, transportation, and use in power plants. Not to mention the inconvenient but often overlooked fact that coal can spontaneously combust.

Let’s take a closer look at “clean” coal: what it is, whether or not it actually works, and what the alternatives are for meeting rising energy demand.

What is “clean” coal?

Clean coal refers to coal burned in power plants using technologies that aim to reduce the amount of pollution released during combustion. The number of pollutants targeted under the umbrella of clean coal has changed over time, but these technologies have been used effectively to reduce pollutants like sulfur dioxide (a major cause of acid rain), nitrogen dioxide (which contributes to ground-level ozone), and particulates (which can cause respiratory distress).

Now, “clean” coal is meant to go even further, using recent developments in carbon capture and sequestration (CCS) technologies to capture carbon dioxide from the exhaust gas of coal power plants and store it underground. While this sounds great in theory, in practice, it is much more complicated.

Is “clean” coal actually feasible?

Currently, only two coal power plants in the world have CCS technologies installed, one in Canada and the other in the U.S. Part of the reason so few plants have added CCS is the high initial installation cost and high energy consumption for operation. To operate CCS, a coal power plant would have to consume 20–25% of the energy it produces.

And while coal is still a sizeable chunk of U.S. power generation, its share of overall generation is dropping. Yet the Trump administration is putting its thumb on the scale, using emergency orders to force coal power plants that were set to retire to stay up and running. Unfortunately, keeping aging infrastructure running can be costly, both for the operators themselves and their customers.

All of this makes the economics of CCS very challenging, as plant operators will have less electricity to sell, more expenses to sustain and retrofit aging facilities, and higher ongoing operating costs. To recoup those costs, operators will have to sell electricity at a higher price, a tall order given already skyrocketing energy prices across the country.
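The arithmetic behind that squeeze is straightforward. Here is a minimal sketch: the 1,000 MW plant size is a hypothetical example for illustration, while the 20–25% energy penalty is the range cited above.

```python
# Illustrative sketch of the CCS energy-penalty arithmetic described above.
# The 1,000 MW plant size is a hypothetical example; the 20-25% penalty
# range is the one cited in the article.
gross_output_mw = 1000.0

for ccs_penalty in (0.20, 0.25):
    # Capture equipment consumes part of the plant's own generation,
    # leaving less electricity to sell.
    net_output_mw = gross_output_mw * (1 - ccs_penalty)
    # Spreading the same fixed costs over fewer salable megawatt-hours
    # implies roughly this increase in the break-even price per MWh:
    price_increase = 1 / (1 - ccs_penalty) - 1
    print(f"{ccs_penalty:.0%} penalty: {net_output_mw:.0f} MW salable, "
          f"break-even price up ~{price_increase:.0%}")
```

Even before counting retrofit capital costs, a 25% penalty alone means selling a quarter less power and needing roughly a third higher break-even prices.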

And those are just the economic and political challenges facing “clean” coal. There’s also the problem of upstream emissions and the inconvenient truth that coal pollution doesn’t start at the power plant.

A problem that even “clean” coal does not address

Even if “clean” coal were economically and politically viable, it still wouldn’t solve one of the biggest but most often overlooked risks of the dirty fuel: spontaneous combustion.

Spontaneous coal combustion (SCC) occurs when coal suddenly catches on fire without any ignition source. While the causes of SCC are not well understood, the negative repercussions are abundantly clear.

SCC reduces the value of coal resources, releases pollution into the environment, damages mining equipment, and threatens the health and safety of workers. Indeed, coal fires are among the greatest safety hazards for miners and coal mining communities, emitting planet-warming and health-harming gases such as carbon monoxide, carbon dioxide, methane, ethylene, and ethane, along with particulate matter and coal tar, both of which are hazardous to people’s health and the environment.

Though quantifying greenhouse gas emissions from SCC is challenging, one study in China estimated that the annual loss of 20 Mt of coal to SCC may be responsible for as much as 42 Mt of carbon dioxide equivalent, roughly the same as the annual emissions from 9 million gas-fueled cars. Such emissions would still occur even if every coal plant in the country were retrofitted with CCS technologies.
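That car-equivalence figure can be cross-checked with one outside number: the EPA’s commonly cited average of about 4.6 metric tons of CO2 per typical passenger vehicle per year. That per-car figure is an assumption for this check, not a number from the study itself.

```python
# Cross-check of the study's car-equivalence claim.
# Assumption: ~4.6 metric tons of CO2 per typical passenger vehicle per
# year (EPA's commonly cited average; not from the study quoted above).
scc_co2e_tons = 42e6       # 42 Mt of CO2 equivalents, expressed in tons
co2_per_car_tons = 4.6

cars_equivalent = scc_co2e_tons / co2_per_car_tons
print(f"~{cars_equivalent / 1e6:.1f} million cars")   # ~9.1 million cars
```

The result lands on the study’s 9-million-car figure, so the comparison holds up under that assumption.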

Meeting energy demand without “clean” coal

Once upon a time, coal power made sense. It was an abundant, cheap fuel source that could be used virtually anywhere, and even though it had its downsides, it was economically more sensible than other energy sources. But the economic realities of electricity generation have changed over the past decade.

Electricity generated by renewables like solar and wind is now cheaper than electricity from coal power plants. The International Energy Agency found that over 96% of the newly installed solar and wind capacity in 2024 had lower power generation costs than new coal and gas plants, and over 92% of total power expansion that year came from renewables.

In addition, with new and growing demands for electricity for everything from vehicle electrification to AI data centers, a large amount of power is needed quickly, and solar and wind are among the fastest sources to install. An average renewable energy facility takes between one and three years to come online, while coal- and gas-fired power plants can take five years or more.

Ultimately, when deciding whether to keep U.S. coal plants online or even build new ones, it’s good to remember a prescient quote often attributed to a surprising source, a former Saudi oil minister: “The Stone Age didn’t end because we ran out of stones.” It’s never been clearer that the global leaders of tomorrow will be those who harness the clean, abundant, and cheap renewable energy sources now available to us. The real question, then, is not whether the U.S. should support “clean” coal, but whether it will put down the stones and move boldly toward a better future.

Jason Lam, BSc, MEL, is a research fellow focusing on the buildings, electricity, and industry sectors. Jason has a Bachelor of Science degree in biosystems engineering with an environmental specialization from the University of Manitoba. He also has a Master of Engineering Leadership in clean energy engineering from the University of British Columbia, expanding his technical knowledge in clean energy and developing his business acumen.

This work was published under a Creative Commons CC BY-NC-ND 4.0 license. You are welcome to republish it following the license terms.

About Project Drawdown
Project Drawdown is the world’s leading guide to science-based climate solutions. Our mission is to drive meaningful climate action around the world. A 501(c)(3) nonprofit organization, Project Drawdown is funded by individual and institutional donations.

Chart: Grid battery installations soared to a new high in 2025
Feb 20, 2026

See more from Canary Media’s ​“Chart of the Week” column.

It’s official: Grid batteries broke another record.

More than 13 gigawatts of energy storage was installed across the U.S. last year, per a new report from the Business Council for Sustainable Energy and BloombergNEF. That’s up from the roughly 12 GW installed in 2024.

It’s the latest reminder of the meteoric rise of battery storage, a quick-to-deploy technology that’s key to cutting emissions from the electricity system. Storage enables the grid to bank electricity when it’s cheap and abundant — like when surplus solar is generated in the middle of a sunny day — and deploy it when prices are high and electrons are scarce.

Less than a decade ago, the sector was little more than an intriguing possibility. Energy storage in America mostly meant massive, decades-old pumped-hydro storage projects and a handful of small lithium-ion battery plants.

In 2017, only 500 megawatts of grid battery capacity was online in the U.S.; now, there are individual battery installations larger than 500 MW. Still, the sector had big expectations for itself back then: In 2017, the Energy Storage Association set a goal of reaching 35 GW of storage capacity by 2025.

Last year, the sector smashed that goal, hitting it in July and ending the year with nearly 45 GW of installed capacity.

Increasingly abundant solar power, rising energy demand, and declining battery costs have combined to propel the storage sector to these lofty heights. To date, most utility-scale batteries have been plugged into the grids of Texas and California, two solar-soaked states with radically different approaches to encouraging storage growth.

In the coming years, the storage sector has a smoother path to continued growth than do renewables.

Yes, it faces some challenges. Federal tax incentives are now contingent on compliance with strict but vague anti-China supply-chain rules. Developers also have to deal with tariffs and increasing local opposition.

But, unlike for solar and wind, tax credits for storage were spared in the One Big Beautiful Bill Act that President Donald Trump signed into law in July. Also unlike solar and wind, the battery industry has not yet attracted much explicit trash-talking from either Trump administration officials or Trump himself. Storage is also increasingly cheap and fast to build.

These facts, plus the urgent need for new sources of affordable energy as utility bills rise, have the storage industry poised for continued growth in the years to come.

Where’s New York on climate goals? Falling behind.
Feb 20, 2026

This analysis and news roundup come from the Canary Media Weekly newsletter. Sign up to get it every Friday.

President Donald Trump has all but dismantled U.S. efforts to curb pollution that’s warming the planet and harming human health.

Yet with every federal blow to climate action, states have launched a counterpunch. Take Colorado: After Trump and congressional Republicans ended federal EV tax credits, the state juiced its own clean-car incentives. California has meanwhile inked a deal with the United Kingdom to cooperate on clean energy and climate efforts. And several other states are considering ​“climate superfund” laws, which seek to hold fossil fuel companies financially responsible for climate change–induced damages.

But instead of doubling down on decarbonization in this critical hour, and despite touting that it has one of the most ambitious climate laws in the country, New York is quietly backing away from its efforts.

The most recent and symbolically loaded move concerns that very same climate law, the 2019 Climate Leadership and Community Protection Act. New York’s utility regulator is currently considering suspending its marquee clean-energy goal, which requires the state to get 70% of its power from renewables by 2030 and 100% by 2040.

To be clear, New York is not on track to meet this target anyway. But behind the proposed rollback is a petition, signed by two natural gas company veterans, which claims that the target will jeopardize grid reliability, Gothamist reports. New York’s grid operator has cautioned that power shortfalls are a mounting risk, but environmental advocates point out that the warning doesn’t take a ton of soon-to-connect clean energy projects into account. That includes two offshore wind projects that have been slowed by the Trump administration.

It’s just the latest climate retreat by Gov. Kathy Hochul, a Democrat who is up for reelection this year.

Last year, New York was poised to implement a first-of-its-kind ban on fossil-fueled heating and appliances in new homes and buildings. But in November, just before the rule was set to take effect, the state said it wouldn’t enforce the regulation while a lawsuit continued to play out.

Hochul has also repeatedly delayed the implementation of the cap-and-invest program that’s essential to New York’s emissions goals, leaving what could be billions of dollars for renewables construction and energy-efficiency projects in limbo.

And while Hochul has called for more clean energy and nuclear power to meet rising demand, she has also signaled natural gas is essential to the state’s energy strategy, as she allowed a previously rejected pipeline to move forward.

Hochul’s motive for most of these moves has been clear: She’s worried about rising power prices in the state and has cited a need to ​“govern in reality” amid the federal government’s clean energy assault. But as a warming climate puts New York and the rest of the world increasingly at risk, running in the wrong direction on decarbonization is anything but governing in reality.

More big energy stories

What the endangerment finding rollback means for automakers

The EPA last week revoked the endangerment finding, which underpins the U.S. government’s authority to regulate greenhouse gas emissions. The rollback has upended federal tailpipe emissions regulations, a move the administration says will lower vehicle prices and save Americans as much as $1.3 trillion by 2055.

But the EPA’s own analysis tells a different story, The Guardian reports. It estimates Americans will rack up more than $1.4 trillion in added costs as they buy more fuel, pay for more repairs, and face increased traffic and noise, essentially negating those touted savings.

And it’s unlikely that U.S. automakers will backtrack on vehicle efficiency like the EPA wants, experts tell The New York Times. The rest of the world is still moving toward electric vehicles, so cars that use more gasoline are the opposite of what other countries — and many Americans — will buy.

Fake public comments are clean energy’s latest threat

As if clean energy didn’t have enough challenges to deal with.

Last June, Southern California’s top air-quality authority rejected a plan that would have pushed the region away from gas appliances. Regulators received tens of thousands of comments opposing the plan, but a Los Angeles Times investigation found at least 20,000 of them appear to have been AI-generated. The agency’s staffers also reached out to some alleged commenters, and at least three said they hadn’t written a comment.

The incident is similar to what Canary Media’s Kathiann M. Kowalski reported on earlier this month. In Ohio, state regulators may reject a solar farm that received dozens of public comments opposing it. But as the project’s developer found, and Kowalski verified, more than 30 of those commenters appear to have used fake names or lied about living in the county where the solar farm would be built.

Clean energy news to know this week

Another coal-plant prop-up: Documents indicate that the Trump administration will move to let coal plants emit more hazardous pollutants, including mercury, in an attempt to juice the industry. (New York Times)

Chillin’ with Duke Energy: Canary reporter Elizabeth Ouzts let North Carolina utility Duke Energy remotely lower her thermostat in exchange for bill credits, and even in a recent cold spell, Ouzts says she found the savings meaningful. (Canary Media)

Illinois’ nuclear reversal: Illinois Gov. JB Pritzker (D) calls for the state to build at least 2 gigawatts of nuclear capacity, just a month after lifting the state’s long-standing moratorium on nuclear construction. (Bloomberg)

Cancer Alley’s new threat: Louisiana residents already saturated with petrochemical pollution now face a wave of “blue ammonia” plants, which will burn fossil fuels and potentially saddle them with even more emissions. (Floodlight)

Power surge: 2025 was a tough year for clean energy in the U.S., but grid batteries still set another installation record as more solar power came online and power demand rose. (Canary Media)
