As global temperatures rise, extreme weather events are becoming more intense and more frequent all around the world.
Over the past two decades, the cutting-edge field of extreme weather attribution has sought to establish the role that human-caused warming has played in these events.
There are now hundreds of attribution studies, assessing extremes ranging from heatwaves in China and droughts in Madagascar through to wildfires in Brazil and extreme rainfall in South Africa.
Carbon Brief has mapped every attribution study published to date, revealing that three-quarters of the extremes analysed were made more intense or likely due to climate change.
Alongside this explosion of new studies, the methods of attribution science have evolved and expanded over the past two decades.
For example, the World Weather Attribution service was established in 2015 to provide rapid-response studies, streamlining the process of estimating the human contribution to extreme events in a matter of days.
Meanwhile, a growing community of researchers are developing the “storyline approach” to attribution that focuses more on the dynamics of the specific events being studied.
Other researchers are using weather forecasts to attribute events that have not even happened yet. And many studies are now combining these methods to get the best of all worlds in their findings.
In this detailed Q&A, Carbon Brief explores how the field of attribution science has evolved over time and explains the key methods used today.
- What are the origins of ‘extreme weather attribution’?
- What is ‘probabilistic’ attribution?
- Which weather extremes can scientists link to climate change?
- Why do scientists perform ‘rapid’ attribution studies?
- Can the impacts of extreme weather be linked to climate change?
- How do scientists attribute ‘unprecedented’ events?
- How can weather forecasts be used in attribution studies?
- What are the applications of attribution science?
- What are the next steps for attribution research?
What are the origins of ‘extreme weather attribution’?
The Intergovernmental Panel on Climate Change (IPCC) made its first mention of attribution in its first assessment report (pdf), published in 1990. In a section called “Attribution and the fingerprint method”, the report refers to attribution as “linking cause and effect”.

In these early days of attribution science, experts used statistical methods to search for the “fingerprint” of human-caused climate change in global temperature records.
However, the 1990 report says that “it is not possible at this time to attribute all or even a large part of the observed global mean warming to the enhanced greenhouse effect on the basis of the observational data currently available”.
As the observational record lengthened and scientists refined their methods, experts became more confident about attributing global temperature rise to human-caused climate change. By the time its third assessment report was published in 2001, the IPCC could state that “detection and attribution studies consistently find evidence for an anthropogenic signal in the climate record of the last 35 to 50 years”.
Just two years later, Prof Myles Allen – professor of geosystem science at the University of Oxford – wrote a Nature commentary from his home in Oxford that would open the door for attributing extreme weather events to climate change. The article begins:
“As I write this article in January 2003, the floodwaters of the River Thames are about 30 centimetres from my kitchen door and slowly rising. On the radio, a representative of the UK Met Office has just explained that although this is the kind of phenomenon that global warming might make more frequent, it is impossible to attribute this particular event (floods in southern England) to past emissions of greenhouse gases. What is less clear is whether the attribution of specific weather events to external drivers of climate change will always be impossible in principle, or whether it is simply impossible at present, given our current state of understanding of the climate system.”
Just months after Oxford’s floodwaters began to recede, a now-infamous heatwave swept across Europe. The summer of 2003 was the hottest ever recorded for central and western Europe, with average temperatures in many countries reaching 5C higher than usual.
The unexpected heat resulted in an estimated 20,000 “excess” deaths, making the heatwave one of Europe’s deadliest on record.
In 2004, Allen and two other UK-based climate scientists produced the first formal attribution study, published in Nature, which estimated the impact of human-caused climate change on the heatwave.
To conduct the study, the authors first chose the temperature “threshold” that would define their heatwave. They settled on 1.6C above the 1961-90 average, because the summer of 2003 was the first on record in which European average temperatures exceeded this level.
They then used a global climate model to simulate two worlds – one mirroring the world as it was in 2003 and the other a fictional world in which the industrial revolution never happened. In the second case, the climate is influenced solely by natural changes, such as solar energy and volcanic activity, and there is no human-caused warming.
The authors ran their models thousands of times in each scenario from 1989 to 2003. As the climate is inherently chaotic, each model “run” – individual simulations of how the climate progresses over many years – produces a slightly different progression of temperatures. This means that some runs simulated a heatwave in the summer of 2003, while others did not.
The authors counted how many model runs crossed the 1.6C threshold in the summer of 2003 under each scenario. They then compared the likelihood of crossing the threshold in a world with – and a world without – climate change.
They concluded that “it is very likely that human influence has at least doubled the risk of a heatwave exceeding this threshold magnitude”.
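For readers who want to see the mechanics, the counting step can be sketched in a few lines of Python. The Gaussian “model output” and the size of the warming shift below are illustrative assumptions, not values from the 2004 study; the point is simply how two ensembles and a threshold yield a risk ratio.

```python
# Sketch of the counting step in probabilistic attribution.
# Synthetic Gaussian anomalies stand in for real model output;
# the means, spread and ensemble size are illustrative only.
import numpy as np

rng = np.random.default_rng(42)

n_runs = 10_000   # ensemble size in each "world"
threshold = 1.6   # C above the 1961-90 baseline, as in the 2004 study

# Assumed summer-mean temperature anomalies: the "actual" world is
# taken to be ~0.5C warmer on average than the natural counterfactual
natural = rng.normal(loc=0.0, scale=0.6, size=n_runs)
actual = rng.normal(loc=0.5, scale=0.6, size=n_runs)

# Fraction of runs crossing the threshold in each world
p_natural = (natural > threshold).mean()
p_actual = (actual > threshold).mean()

print(f"P(exceed | natural world): {p_natural:.4f}")
print(f"P(exceed | actual world):  {p_actual:.4f}")
print(f"Risk ratio: {p_actual / p_natural:.1f}")
```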
A Nature commentary linked to the study called the paper a “breakthrough”, stating that it was the “first successful attempt to detect man-made influence on a specific extreme climatic event”.
In the decade following the heatwave study, more teams from around the world began to use the same methods – known as “probabilistic”, “risk-based” or “unconditional” attribution.
Prof Peter Stott is a science fellow in climate attribution at the UK Met Office and an author on the study. Stott tells Carbon Brief that the basic methods used in this first attribution study are “still used to this day”, but that scientists now use more “up-to-date” climate models than the one used in his seminal study.
What is ‘probabilistic’ attribution?
As the 2004 Nature study demonstrated, probabilistic attribution involves scientists running climate models thousands of times in scenarios with and without human-caused climate change, then comparing the two.
This allows them to say how much more likely, intense or long-lasting an event was due to climate change.
Many studies since have added a third scenario, in which the planet is warmer than present-day temperatures, to assess how climate change may impact extreme weather events in the future.
The figure below shows the distribution of many simulated events under three different scenarios. The x-axis (horizontal) represents the intensity of the climate variable – in this instance, temperature – with lower temperatures on the left and higher temperatures on the right. The y-axis (vertical) shows the likelihood of the variable taking each value.
Each curve shows how the climate variable behaves in a different scenario, or “world”. The red-shaded curve shows a pre-industrial world that was not warmed by human influence, the yellow-shaded curve indicates today’s climate, while the dashed line shows a future, warmer world. The curves shift from left to right as the climate warms.
The peak of each curve shows the most likely temperatures, while likelihood is lowest at the far left and far right of each curve, where temperatures are most extreme. The hatched areas show the temperatures that cross a predefined “threshold” temperature. (In the attribution study on the 2003 European heatwave, this threshold was defined as 1.6C above the 1961-90 average.)
The three curves show how the threshold is more likely to be crossed as the world warms.
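The figure’s logic can be reproduced with a toy calculation, sketched below. The bell-curve shapes, centres and spread are assumptions chosen for illustration; only the qualitative behaviour – the hatched tail area growing as the curve shifts right – mirrors the figure.

```python
# Toy version of the figure: a fixed threshold and three shifted
# distributions. All distribution parameters are illustrative.
from scipy.stats import norm

threshold = 1.6  # the predefined threshold (C above baseline)

# Assumed centres of the temperature distribution in each "world"
worlds = {"pre-industrial": 0.0, "present-day": 0.8, "warmer future": 1.5}

for name, centre in worlds.items():
    # norm.sf gives the tail area above the threshold (the hatched region)
    p = norm.sf(threshold, loc=centre, scale=0.6)
    print(f"{name:14s} P(exceed threshold) = {p:.3f}")
```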

Which weather extremes can scientists link to climate change?
In 2011, the American Meteorological Society decided to include a “special supplement” about attribution research in its annual report.
The supplement presented six different attribution studies. It generated significant media interest and the “Explaining Extreme Events” report has been published by Bulletin of the American Meteorological Society almost every year since.
As the research field has grown, so too has the range of different extremes that have been studied.
Heatwaves are generally considered the simplest extreme events to attribute, because they are mainly driven by thermodynamic influences. In contrast, storms and droughts are more strongly affected by complex atmospheric dynamics, so can be trickier to simulate in a model.
The graphic below shows the relative confidence of attributing different types of extreme events.

Attribution studies on extreme heat often assess how much hotter, long-lasting or likely an event was due to climate change. For example, one study finds that the summer heatwave that hit France in 2019 was made 1.5-3C hotter due to climate change and about 100 times more likely.
Heatwaves are the most-studied extreme event in attribution literature, but are becoming “less and less interesting for researchers”, according to a Bloomberg article from 2020.
Assessing extreme rainfall is more complicated – in part because the Earth’s chaotic weather system means that the size and path of a storm or heavy rainfall event has a large element of chance, which can make it challenging to identify where climate change fits in.
Nevertheless, many teams have published studies attributing extreme rainfall events and storms. For example, one study (pdf) found that climate change doubled the likelihood of the intense rainfall that fell in northern China in September 2021.
Scientists also study more complex events, such as drought, wildfires and floods, which are impacted by factors including land use and disaster preparedness.
For example, there are many different ways to define a drought. Some are linked just to rainfall, while others consider factors including soil moisture, groundwater and river flow. Some attribution studies investigating the impact of climate change on drought focus only on rainfall deficit, while others (pdf) study temperature or vapour pressure deficit – the difference between the amount of moisture in the air and how much moisture the air can hold when it is saturated.
A scientist’s decision about which type of drought to study sometimes depends on the available data and the type of impacts caused by the drought. In other cases, the choice may come down to what caused the biggest impact on people.
For example, in late 2022, South America was plagued by a severe drought that caused widespread crop failure. An attribution study on the event, therefore, focused on “agricultural” drought, which captures the effect of rainfall deficits on soil moisture and is the measure most relevant to crop health.

Meanwhile, a study on drought in Madagascar over 2019-21 chose to focus on rainfall deficit. The study says “this was because recent research found rainfall deficits were the primary driver of drought in regions of East Africa with very similar climatic properties to south-west Madagascar”.
Wildfires are affected by conditions including temperature, rainfall, wind speed and land use. While some wildfire attribution studies focus on vapour pressure deficit, others quantify the fire weather index, which looks at the effects of fuel moisture and wind on fire behaviour and spread.
Tropical cyclones are also complex. There is evidence that climate change can increase the peak “rain rates” and wind speeds of tropical cyclones, and that storm tracks are shifting poleward. There are many aspects of a cyclone that can be analysed, such as rainfall intensity, storm surge height and storm size.
Why do scientists perform ‘rapid’ attribution studies?
As extreme weather attribution became more mainstream, researchers began to produce studies more quickly. However, challenges in communicating the findings of attribution studies in a timely way soon became evident.
After conducting a study, writing it up and submitting it to a journal, it can still take months or years for research to be published. This means that, by the time an attribution study is published, the extreme event has likely long passed.
The World Weather Attribution (WWA) initiative was founded in 2015 to tackle this issue. The team uses a standard, peer-reviewed methodology for their studies, but does not publish the results in formal journals – instead publishing them directly on their website.
(After publishing these “rapid attribution” studies on their website, the team often write full papers for publication in formal journals, which are then peer reviewed.)
This means that rather than taking months or years to publish their research, the team can make their findings public just days or weeks after an extreme weather event occurs.
In 2021, the founders of the initiative – including Carbon Brief contributing editor Dr Friederike Otto, who is a senior lecturer in climate science at Imperial College London’s Grantham Institute – wrote a Carbon Brief guest post explaining why they founded WWA:
“By reacting in a matter of days or weeks, we have been able to inform key audiences with a solid scientific result swiftly after an extreme event has occurred – when the interest is highest and results most relevant.”
The guest post explains that, to conduct an attribution study, the WWA team first uses observed data to assess how rare the event is in the current climate – and how much this has changed over the observed record. This is communicated using a “return period” – the average frequency with which an event of a given magnitude would be expected to occur in a given climate.
For example, the WWA analysed the UK’s record-shattering heatwave of 2022, when the country recorded temperatures above 40C for the first time. They found that the maximum temperature seen in the UK on 19 July 2022 has a 1,000-year return period in today’s climate – meaning that even in today’s climate, 40C heat would only be expected, on average, once in a millennium.
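The conversion between a return period and a yearly probability is a simple reciprocal, as the sketch below shows. The “10 times more likely” factor used here is an illustrative placeholder, not a result from the UK study.

```python
# Return periods are reciprocal annual exceedance probabilities.
def annual_probability(return_period_years: float) -> float:
    return 1.0 / return_period_years

p_today = annual_probability(1_000)  # a one-in-1,000-year event today
print(f"Chance in any single year: {p_today:.3%}")  # 0.100%

# If warming had made such an event, say, 10 times more likely
# (an illustrative factor), the counterfactual return period follows:
p_counterfactual = p_today / 10
print(f"Return period without warming: {1 / p_counterfactual:,.0f} years")
```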
The authors then use climate models to carry out the “probabilistic” attribution study, to determine how much more intense, likely or long-lasting the event was as a result of climate change. They conclude by conducting “vulnerability and exposure” analysis, which often highlights other socioeconomic problems.
Sometimes, the authors conclude that climate change did not influence the event. For example, a 2021 rapid attribution study by WWA found that poverty, poor infrastructure and dependence on rain-fed agriculture were the main drivers of the ongoing food crisis in Madagascar, while climate change played “no more than a small part”.
Other groups are also conducting rapid attribution studies. For example, a group of scientists – including some WWA collaborators – recently launched a “rapid experimental framework” research project called ClimaMeter. The tool provides initial attribution results just hours after an extreme weather event takes place.
ClimaMeter focuses on the atmospheric circulation patterns that cause an extreme event – for example, a low-pressure system in a particular region. Once an event is defined, the scientists search the historical record to find events with similar circulation patterns to calculate how the intensity of the events has changed over time.
Can the impacts of extreme weather be linked to climate change?
A branch of attribution science called “impact attribution” – which aims to quantify the social, economic and/or ecological impacts of climate change on extreme weather events – is also gaining popularity. There are four main types of impact attribution, as shown in the graphic below.

1) Trend-to-trend impact attribution
The first method, called “trend-to-trend” impact attribution, assesses long-term trends in both the climate system and in “health outcomes”. This approach was used in a 2021 study on heat-related mortality around the world, which received extensive media attention.
The authors used data from 732 locations in 43 countries to identify relationships between temperature and mortality in different locations, known as “exposure-response functions”. This allowed them to estimate how many people would die in a given location, if temperatures reach a certain level.
The authors used these relationships to calculate heat-related mortality over 1991-2018 for each location under two scenarios – one with and one without human-caused climate change. The study concluded that 37% of “warm-season heat-related deaths” can be attributed to human-caused climate change.
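A minimal sketch of this logic is below, using an entirely hypothetical exposure-response function and made-up temperatures and death rates; real studies fit location-specific functions to observed data.

```python
# Hedged sketch of trend-to-trend impact attribution. The linear
# exposure-response function (ERF) and all numbers are hypothetical.
import numpy as np

def relative_risk(temps_c: np.ndarray) -> np.ndarray:
    """Hypothetical ERF: mortality risk rises 3% per degree above 20C."""
    return np.maximum(1.0, 1.0 + 0.03 * (temps_c - 20.0))

rng = np.random.default_rng(0)
baseline_deaths_per_day = 100.0  # assumed all-cause baseline

factual = rng.normal(24.0, 3.0, size=120)  # assumed warm-season temperatures
counterfactual = factual - 1.0             # same season, minus ~1C of warming

def heat_deaths(temps: np.ndarray) -> float:
    # Excess deaths relative to the risk at the temperature optimum
    return float(np.sum(baseline_deaths_per_day * (relative_risk(temps) - 1.0)))

d_with, d_without = heat_deaths(factual), heat_deaths(counterfactual)
print(f"Share of heat deaths attributable to warming: {(d_with - d_without) / d_with:.0%}")
```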
2) Event-to-event attribution
The second type of study is known as “event-to-event” attribution. In one study using this method, the authors used data on observed mortality rates to determine how many people died in Switzerland during the unusually warm summer of 2022.
They calculated how much climate change contributed to warming during that summer. They then ran a model to calculate the “hypothetical heat-related burden” that would have been seen during the summer without the warming influence of climate change.
Using this method, they estimate that 60% of the 623 heat-related deaths “could have been avoided in absence of human-induced climate change”.
3) Risk-based event attribution
“Risk-based” event impact attribution – which is demonstrated in a more recent study on the 2003 European heatwave – is the third type of impact attribution. This method combines probabilistic event attribution with resulting health outcomes.
When the paper was published, its lead author, Prof Dann Mitchell – a professor of climate science at the University of Bristol – explained the method to Carbon Brief:
“We have a statistical relationship between the number of additional deaths per degree of warming. This is specific to a certain city and changes a lot between cities. We use climate simulations to calculate the heat in 2003, and in 2003 without human influences. Then we compare the simulations, along with the observations.”
They find, for example, that in the summer of 2003, anthropogenic climate change increased the risk of heat-related mortality in London by around 20%. This means that out of the estimated 315 deaths in London during the heatwave, 64 were due to climate change.
4) Fractional attribution
In the final method, known as “fractional” attribution, the authors combine the results of two independent numbers – an estimation of the total damages caused by an extreme weather event, and a calculation of the proportion of the risk from an extreme weather event for which anthropogenic climate change is responsible, known as the “fraction of attributable risk” (FAR).
The authors of one study used this method to estimate the economic damages linked to Hurricane Harvey.

The authors calculate that the “fraction of attributable risk” for the rainfall from Harvey was around three-quarters – meaning that climate change was responsible for three-quarters of the risk of such intense rainfall.
Separately, the authors find that, according to best estimates, the hurricane caused damages of around US$90bn. From this, the authors conclude that US$67bn of the damages caused by the hurricane’s intense rainfall can be attributed to climate change.
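The arithmetic of the fractional method is compact enough to sketch directly. In the snippet below, the two probabilities are invented so that the FAR comes out at three-quarters; they are not the study’s actual numbers, and the damages figure is the article’s rounded best estimate.

```python
# Fractional attribution: FAR = 1 - p_natural / p_actual, then
# attributable damages = FAR x total damages. Probabilities are
# illustrative placeholders chosen to give FAR = 0.75.
p_actual = 0.012   # assumed likelihood of Harvey-scale rainfall with warming
p_natural = 0.003  # assumed likelihood without human-caused warming

far = 1 - p_natural / p_actual   # fraction of attributable risk
total_damages_bn = 90            # best-estimate damages, US$bn

print(f"FAR = {far:.2f}")
print(f"Attributable damages: ~US${far * total_damages_bn:.1f}bn")  # ~67.5
```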
A study on the 2010 Russian heatwave also used this method. The authors found that the heatwave was responsible for more than 55,000 deaths (pdf), and found an 80% chance that the extreme heat would not have occurred without climate warming. The study concludes that almost 45,000 of the deaths were attributable to human-caused climate change.
However, the fractional attribution method has received criticism. One paper argues that the method “inflates the impacts associated with anthropogenic climate change”, because it “incorrectly assumes” that the event has no impact unless it exceeds the threshold defined by the researchers.
Some of the authors of the Hurricane Harvey paper later wrote a paper advising caution in interpreting the results of FAR studies. They say:
“The fraction of attributable risk (FAR) method, useful in extreme weather attribution research, has a very specific interpretation concerning a class of events, and there is potential to misinterpret results from weather event analyses as being applicable to specific events and their impact outcomes…FAR is not generally appropriate when estimating the magnitude of the anthropogenic signal behind a specific impact.”
Expanding scope
Impact attribution is continuing to expand in scope. For example, studies are now being conducted to assess the impact of climate change on disease transmission.
In 2020, scientists quantified the influence of climate change on specific episodes of extreme ice loss from glaciers for the first time. They found that human-caused climate change made the extreme “mass loss” seen in glaciers in the Southern Alps, New Zealand, in 2018 at least 10 times more likely.
Scientists have also linked climate change to ecosystem shifts. One study focusing on temperature finds that the “extremely early cherry tree flowering” seen in Kyoto in 2021 was made 15 times more likely due to climate change.

Others go even further, linking weather extremes to societal impacts. For example, a 2021 study published in Scientific Reports says:
“By combining an extreme event attribution analysis with a probabilistic model of food production and prices, we find that climate change increased the likelihood of the 2007 co-occurring drought in South Africa and Lesotho, aggravating the food crisis in Lesotho.”
Meanwhile, Imperial College London’s Grantham Institute is working on an initiative to publish rapid impact attribution studies about extreme weather events around the world. Similar to WWA studies, these rapid studies will not be peer reviewed individually, but will be based on a peer-reviewed methodology.
Dr Emily Theokritoff – a research associate at Grantham, who is working on the initiative, tells Carbon Brief that it will be launched “in the near future”. She adds:
“The aim is to recharge the field, start a conversation about climate losses and damages, and help people understand how climate change is making life more dangerous and more expensive.”
How do scientists attribute ‘unprecedented’ events?
An attribution method known as the “storyline approach” or “conditional attribution” has become increasingly popular over the past decade – despite initially causing controversy in the attribution community.
In this approach, researchers first select an extreme weather event, such as a specific heatwave, storm or drought. They then identify the physical components, such as sea surface temperature, soil moisture and atmospheric dynamics, that led to the event unfolding in the way it did. This series of events is called a “storyline”.
The authors then use models to simulate this “storyline” in two different worlds – one in the world as we know it and one in a counterfactual world – for example, with a different sea surface temperature or CO2 level. By comparing the model runs, the researchers can draw conclusions about how much climate change influenced that event.
The storyline approach is useful for explaining the influence of climate change on the physical processes that contributed to the event. It can also be used to explore in detail how this event would have played out in a warmer (future) or cooler (pre-industrial) climate.
One study describes the storyline approach as an “autopsy”, explaining that it “gives an account of the causes of the extreme event”.
Prof Ted Shepherd, a researcher at the University of Reading, was one of the earliest advocates of the storyline attribution approach. At the EGU general assembly in Vienna in April 2024, Shepherd provided the opening talk in a session on storyline attribution.
He told the packed conference room that the storyline approach was born out of the need for a “forensic” approach to attribution, rather than a “yes/no” approach. He emphasised that extreme weather events have “multiple causes” and that the storyline approach allows researchers to dissect each of these components.
Dr Linda van Garderen is a postdoctoral researcher at Utrecht University and has carried out multiple studies using the storyline method. She tells Carbon Brief that, while traditional attribution typically investigates probability, the storyline approach analyses intensity.
For example, she led an attribution study using the storyline method which concluded that the 2003 European and 2010 Russian heatwaves would have been 2.5-4C cooler in a world without climate change.
She adds that it can make communication easier, telling Carbon Brief that “probabilities can be challenging to interpret in practical daily life, whereas the intensity framing of storyline studies is more intuitive and can make attribution studies easier to understand”.
Dr Nicholas Leach is a researcher at the University of Oxford who has conducted multiple studies using the storyline approach. He tells Carbon Brief that probabilistic attribution often produces “false negatives”, wrongly concluding that climate change did not influence an event.
This is because climate models have “biases and uncertainties” which can lead to “noise” – particularly when it comes to dynamical features such as atmospheric circulation patterns. Probabilistic attribution methods often end up losing the signal of climate change in this noise, he explains.
The storyline approach is able to avoid these issues more easily, he says. By focusing on the dynamics of one specific event, rather than a “broad class of events”, storyline studies can eliminate some of this noise, making it more straightforward to identify a signal.
Conversely, others have critiqued the storyline method for producing false positives, which wrongly claim that climate change influenced an extreme weather event.
The storyline approach has also been praised for its ability to attribute “unprecedented” events. In the EGU session on the storyline method, many presentations explored how the storyline method could be used to attribute “statistically impossible” extremes.
Leach explains that when a completely unprecedented extreme event occurs, statistical models often indicate that the event “shouldn’t have happened”. When running a probabilistic analysis using these models, Leach explains: “You end up with the present probability being zero and past probability being zero, so you can’t say a lot.”
He points to the Pacific north-west heatwave of 2021 as an example of this. This event was one of the most extreme regional heat events ever recorded globally, breaking some local high temperature records by more than 6C.

WWA conducted a rapid attribution study on the heatwave, using its probabilistic attribution method. The heatwave was “so extreme” that the observed temperatures “lie far outside the range” of historical observations, the researchers said.
Their assessment suggests that the heatwave was around a one-in-1,000-year event in today’s climate and was made at least 150 times more likely because of climate change.
Leach and his colleagues used the storyline method to attribute the same heatwave. The methods of this study will be discussed more in the following section.
Leach explains that using the storyline approach, he was able to consider the physics of the event, including an atmospheric river that coincided with the “heat dome” that was a key feature of the event. This helped him to represent the event well in his models. The study concluded that the heatwave was 1.3C hotter and eight times more likely as a result of climate change.
Many experts tell Carbon Brief there was initially tension in the attribution community between probabilistic and storyline advocates when the latter was first introduced. However, as the storyline method has become more mainstream, criticism has abated and many scientists are now publishing research using both techniques.
Van Garderen tells Carbon Brief that storyline attribution is “adding to the attribution toolbox”, rather than attempting to replace existing methods. She emphasises that probability-based and storyline attribution answer different questions, and that both are important.
How can weather forecasts be used in attribution studies?
Forecast attribution is the most recent major addition to the attribution toolbox. This method uses weather forecasts instead of climate models to carry out attribution studies. Many experts describe this method as sitting part-way between probabilistic and storyline attribution.
One benefit of using forecasts, rather than climate models, is that their higher resolution allows them to simulate extreme weather events in more detail. By using forecasts, scientists can also attribute events that have not yet happened.
The first use of “advance forecasted” attribution analysis (pdf) quantified the impact of climate change on the size, rainfall and intensity of Hurricane Florence before it made landfall in North Carolina in September 2018.
The authors, in essence, carried out the probabilistic attribution method, using two sets of short-term forecasts for the hurricane rather than large-scale climate models. The analysis received a mixed reaction. Stott told Carbon Brief at the time that it was “quite a cool idea”, but was highly dependent on being able to forecast such events reliably.
Dr Kevin Trenberth, distinguished senior scientist at the National Center for Atmospheric Research, told Carbon Brief in 2019 that the study was “a bit of a disaster”, explaining that the quality of the forecast was questionable for the assessment.
The authors subsequently published a paper in Science Advances reviewing their study “with the benefit of hindsight”. The authors acknowledged that the results were quite a way off what they had forecast. However, they also claimed to have identified what went wrong with their forecast analysis.
Problems with the “without climate change” model runs created a larger contrast against their real-world simulations, meaning the analysis overestimated the impact of climate change on the event, they said.
Nonetheless, the study did identify a quantifiable impact of climate change on Hurricane Florence, adding to the evidence from studies by other author groups.
This research team has since published more forecast-based attribution studies on hurricanes. One study used hindcasts – forecasts that start from the past and then run forward into the present – to analyse the 2020 hurricane season. The team then ran a series of “counterfactual” hindcasts over the same period, without the influence of human warming from sea surface temperatures.
They found that warmer waters increased three-hour rainfall rates and three-day accumulated rainfall for tropical storms by 10% and 5%, respectively, over the 2020 season.

Meanwhile, a 2021 study by a different team showed how it was possible to use traditional weather forecasts for attribution. The researchers, who penned a Carbon Brief guest post about their work, found that the European heatwave of February 2019 was 42% more likely for the British Isles and at least 100% more likely for France.
To conduct their study, the authors used a weather forecast model – also known as a “numerical weather prediction” model (NWP).
They explain that an NWP typically runs at a higher resolution than a climate model, meaning that it has more, smaller grid cells. This allows it to simulate processes that a climate model cannot, making NWPs “more suitable for studying the most extreme events than conventional climate models”, the authors argue.
More recently, Leach and his team carried out a forecast attribution study on the record-breaking Pacific north-west heatwave of 2021, years after the event took place.
The authors defined 29 June 2021 as the start of the event, as this is when the maximum temperature of the heatwave was recorded. They then ran their forecasts using a range of “lead times” – the number of days before the event starts that the model simulation is initialised.
The shortest lead time in this study was three days, meaning the scientists began running the model using the weather conditions recorded on 26 June 2021. The short lead time meant that they could tailor the model very closely to the weather conditions at the time and simulate the event itself very accurately.
By comparison, the longest lead times used in this study were 2-4 months. This means that the models were initialised in spring and, by the time they simulated the June heatwave, their simulation did not closely resemble the events that actually unfolded.
Leach tells Carbon Brief that by lengthening the lead time of the weather forecast, they can effectively “shift the dial” from storyline to probabilistic attribution. He explains:
“If you’re using a forecast that’s initialised really near to your event, then you’re kind of going down that storyline approach, by saying, ‘I want what my model is simulating to look really similar to the event I’m interested in’…
“The further back [in time] you go, the closer you get to the more probabilistic style of statements that are more unconditioned.”
This combination of storyline and probabilistic attribution allows the authors to draw conclusions both about how climate change affected the intensity and the likelihood of the heatwave. The authors estimate that the heatwave was 1.3C more intense and eight times more likely as a result of climate change.
More recently, Climate Central has produced a tool that uses temperature forecasts for the US over the coming days to calculate a “climate shift index”. This index gives the ratio of how common the forecast temperature is in today’s climate, compared to how likely it would be in a world without climate change.
The index runs from five to minus five. A result of zero indicates that climate change has no detectable influence, an index of five means that climate change made the temperature at least five times more likely and an index of minus five means that climate change made the temperature at least five times less likely.
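One simplified way such an index could be computed is sketched below. It follows the description above rather than Climate Central’s exact operational definition, which is an assumption here.

```python
# Simplified "climate shift index": the signed ratio of a temperature's
# likelihood in today's climate versus a no-climate-change counterfactual,
# capped at +/-5. This follows the article's description, not Climate
# Central's precise methodology.
def climate_shift_index(p_today: float, p_counterfactual: float) -> float:
    if p_today == p_counterfactual:
        return 0.0
    if p_today > p_counterfactual:  # warming made the temperature more likely
        return min(5.0, p_today / p_counterfactual)
    return -min(5.0, p_counterfactual / p_today)  # made it less likely

print(climate_shift_index(0.10, 0.02))  #  5.0 -> at least 5x more likely
print(climate_shift_index(0.03, 0.03))  #  0.0 -> no detectable influence
print(climate_shift_index(0.01, 0.05))  # -5.0 -> at least 5x less likely
```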
The tool can be used for attribution. For example, recent analysis by the group used the index to quantify how climate change has influenced the number of uncomfortably hot nights. It concluded:
“Due to human-caused climate change, 2.4 billion people experienced an average of at least two additional weeks per year where nighttime temperatures exceeded 25C. Over one billion people experienced an average of at least two additional weeks per year of nights above 20C and 18C.”
What are the applications of attribution science?
One often-touted application of attribution studies is to raise awareness about the role of climate change in extreme weather events. However, there are limited studies about how effective this is.
One study presents the results of focus group interviews with UK scientists, who were not working on climate change, in which participants were given attribution statements. The study concludes:
“Extreme event attribution shows significant promise for climate change communication because of its ability to connect novel, attention-grabbing and event-specific scientific information to personal experiences and observations of extreme events.”
However, the study identified a range of challenges, including “adequately capturing nuances”, “expressing scientific uncertainty without undermining accessibility of key findings” and difficulties interpreting mathematical aspects of the results.
In another experiment, researchers informed nearly 4,000 adults in the US that climate change had made the July 2023 heatwave in the US at least five times more likely. The team also shared information from Climate Central’s climate shift index. According to the study, both approaches “increased the belief that climate change made the July 2023 heatwave more likely and is making heatwaves in general more likely as well”.
Meanwhile, as the science of extreme weather attribution becomes more established, lawyers, governments and civil society are finding more uses for this evolving field.
For example, attribution is starting to play an important role in courts. In 2017, two lawyers wrote a Carbon Brief guest post stating “we expect that attribution science will provide crucial evidence that will help courts determine liability for climate change related harm”.
Four years later, the authors of a study on “climate litigation” wrote a Carbon Brief guest post explaining how attribution science can be “translated into legal causality”. They wrote:
“Attribution can bridge the gap identified by judges between a general understanding that human-induced climate change has many negative impacts and providing concrete evidence of the role of climate change at a specific location for a specific extreme event that already has led or will lead to damages.”
In 2024, around 2,000 Swiss women used an attribution study, alongside other evidence, to win a landmark case in the European Court of Human Rights. The women, mostly in their 70s, said that their age and gender made them particularly vulnerable to heatwaves linked to climate change. The court ruled that Switzerland’s efforts to meet its emissions targets had been “woefully inadequate”.

The 2024 European Geosciences Union conference in Vienna dedicated an entire session to climate change and litigation. Prof Wim Thiery – a scientist who was involved in many conference sessions on climate change and litigation – tells Carbon Brief that attribution science is particularly important for supporting “reparation cases”, in which vulnerable countries or communities seek compensation for the damages caused by climate change.
He tells Carbon Brief that seeing the “direct and tangible impact” of an attribution study in a court case “motivates climate scientists in engaging in this community”.
(Other types of science are also important in court cases related to climate change, he added. For example, “source attribution” identifies the relative contribution of different sectors and entities – such as companies or governments – to climate change.)
Dr Rupert Stuart-Smith, a research associate in climate science and the law at the University of Oxford’s Sustainable Law Programme, adds:
“We’re seeing a new evolution whereby communities are increasingly looking at impact-relevant variables. Think about inundated areas, lake levels, heatwave mortalities. These are the new target variables of attribution science. This is a new frontier and we are seeing that those studies are directly usable in court cases.”
He tells Carbon Brief that some cases “have sought to hold high-emitting corporations – such as fossil fuel or agricultural companies – liable for the costs of climate change impacts”. He continues:
“In cases like these, claimants typically need to show that climate change is causing specific harms affecting them and courts may leverage attribution or climate projections to adjudicate these claims. Impact attribution is particularly relevant in this context.”
Dr Delta Merner is a lead scientist at the Science Hub for Climate Litigation. She tells Carbon Brief that “enhanced source attribution for companies and countries” will be “critical” for holding major emitters accountable. She adds:
“This is an urgent time for the field of attribution science, which is uniquely capable of providing robust, actionable evidence to inform decision-making and drive accountability.”
Meanwhile, many countries’ national weather services are working on “operational attribution” – the regular production of rapid attribution assessments.
Stott tells Carbon Brief that the UK Met Office is operationalising attribution studies. For example, on 2 January 2024, it announced that 2023 was the second-warmest year on record for the UK, with an average temperature of 9.97C.
New methods are also being developed. For example, groups, such as the “eXtreme events: Artificial Intelligence for Detection and Attribution” (XAIDA) team, are researching the use of machine learning and artificial intelligence for attribution studies.
One recent attribution study uses a machine-learning approach to create “dynamically consistent counterfactual versions of historical extreme events under different levels of global mean temperature”. The authors estimate that the south-central North American heatwave of 2023 was 1.18-1.42C warmer because of global warming.
The authors conclude:
“Our results broadly agree with other attribution techniques, suggesting that machine learning can be used to perform rapid, low-cost attribution of extreme events.”
Other scientists are using a method called UNSEEN, which involves running models thousands of times to build much larger datasets, making it easier to derive accurate probabilities for rare, highly variable extremes.
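A toy version of the UNSEEN idea is sketched below, with synthetic Gumbel-distributed “model output” standing in for real simulations; the distribution and ensemble sizes are assumptions for illustration.

```python
# UNSEEN in miniature: pool many model realisations of the same period
# to estimate probabilities of events rarer than the observed record.
import numpy as np

rng = np.random.default_rng(1)

obs_years = 50
record = rng.gumbel(loc=30.0, scale=2.0, size=obs_years)  # "observed" maxima
unprecedented = record.max() + 1.0                        # beyond the record

# 100 ensemble members x 50 years = 5,000 plausible alternate realisations
pooled = rng.gumbel(loc=30.0, scale=2.0, size=(100, obs_years)).ravel()

p = (pooled > unprecedented).mean()
print(f"Pooled estimate: P = {p:.4f} (~1-in-{1 / p:.0f}-year event)")
```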
What are the next steps for attribution research?
The experts that Carbon Brief spoke to for this article have high hopes for the future of attribution science. For example, Stott says:
“Attribution science has great potential to improve the resilience of societies to future climate change, can help monitor progress towards the Paris goals of keeping global warming to well below 2C and can motivate progress in driving down emissions towards net-zero by the middle of this century.”
However, despite the progress made over the past two decades, there are still challenges to overcome. One of the key barriers in attribution science is a lack of high-quality observational data in low-income countries.
To carry out an attribution study, researchers need a long, high-quality dataset of observations from the area being studied. However, inadequate funding or political instability means that many developing countries do not have sufficient weather station data.
In a 2016 interview with Carbon Brief, Allen said that “right now there is obviously a bias towards our own backyards – north-west Europe, Australia and New Zealand.”
Many WWA studies in global-south countries mention the challenge of finding adequate data and sometimes this affects the results. A WWA study of the 2022 drought in west Africa’s Sahel region was unable to find the signal of climate change in the region’s rainfall pattern – in part, due to widespread uncertainties in the observational data.
Otto, who was an author on the study, explained at the time:
“It could either be because the data is quite poor or because we have found the wrong indices. Or it could be because there really is no climate change signal…We have no way of identifying which of these three options it is.”
Developing better observational datasets is an ongoing challenge. It is highlighted in much of the literature on attribution as an important next step for attribution science – and for climate science more widely. Merner tells Carbon Brief that scientists also need to work on developing “novel approaches for regions without baseline data”.

Meanwhile, many scientists expect the methods used in attribution science to continue evolving. The Detection and Attribution Model Intercomparison Project is currently collecting simulations, which will support improved attribution of climate change in the next set of assessment reports from the Intergovernmental Panel on Climate Change.
Mitchell says that, over the next decade, he thinks that “we will move away from the more generic attribution methods that have served us well to this point, and start developing and applying more targeted – and even more defensible – methods”.
In particular, he highlights the need for more specific methods for impact attribution – for example, studying the impacts of weather events on health outcomes, biodiversity changes or financial losses.
He continues:
“The interplay of different socioeconomic states and interventions with that of climate change can make these particularly difficult to study – but we are getting there with our more advanced, albeit computationally expensive methods, such as using weather forecast models as the foundation of our attribution statements.”
Stott tells Carbon Brief that incorporating impacts into attribution assessments is a “crucial area for development” in attribution science. He explains that impact attribution is “very relevant to the loss-and-damage agenda and further developments in attribution science are likely to include the ability to attribute the financial costs of storms”.
Stuart-Smith tells Carbon Brief that, “in the coming years, growing numbers of studies will quantify the economic burden of climate change and its effects on a broader range of health impacts, including from vector and water-borne diseases”.
Leach also tells Carbon Brief that it is “important for attribution to move their focus beyond physical studies and into quantitative impact studies to increase their relevance and utility in policy and the media”.
He adds:
“Utilising weather forecasts for attribution would fit neatly with this aim as those same models are already widely used by emergency managers and built into impact modelling frameworks.”
Similarly, Stott tells Carbon Brief that “forecast attribution shows great potential”. He explains that “progressing that science” will allow this method to be used to attribute more types of extreme weather with greater confidence.
Leach advocates for greater use of weather forecast models for all types of attribution. He says:
“Weather forecast models have demonstrated repeatedly over the past few years that they are capable of accurately representing even unprecedented weather extremes. Using these validated state-of-the-art models for attribution could bring an increase in confidence in the results.”
Many scientists also tell Carbon Brief about the importance of operationalising attribution. The weather services in many countries already have this in place. Stott tells Carbon Brief that groups in Japan, South Korea, Australia and the US are also “at various stages of developing operational attribution services”.
Meanwhile, Otto tells Carbon Brief that “the most important next step for attribution in my view is to really integrate the assessment of vulnerability and exposure into the attribution studies”. She adds:
“In order for attribution to truly inform adaptation it is essential though to go from attributing hazards, as we do now mainly, to disentangling drivers of disasters.”
Mitchell adds that he thinks attribution statements “are absolutely essential for [countries to make] national adaptation plans”.
Meanwhile, another study suggests that extreme event attribution studies could be used by engineers, along with climate projections, to assist climate adaptation for civil infrastructure.
Leach tells Carbon Brief that attribution could be useful in the insurance sector for similar reasons. He adds that many insurance sectors use the same forecasts in their catastrophe models that climate scientists use for forecast attribution, meaning that it should be straightforward to add attribution studies into their pipelines.
Analysis: EVs just outsold petrol cars in EU for first time ever
Sales of electric vehicles (EVs) overtook petrol cars in the EU for the first time in December 2025, according to new figures released by industry group the European Automobile Manufacturers’ Association (ACEA).
The figures show that registrations of battery EVs – sometimes referred to as BEVs, or “pure EVs” – reached 217,898, up 51% year-on-year from December 2024, as shown in the chart below.
Meanwhile, sales of petrol cars in the bloc fell 19% year-on-year, from 267,834 in December 2024 to 216,492 in December 2025.

Overall in 2025, EVs reached 17.4% of the market share in the bloc, up from 13.6% the previous year.
(EVs run purely on a battery that is charged from an external source. Plug-in hybrids have both a battery that can be charged and an internal combustion engine, whilst regular hybrids cannot be plugged in; they have a smaller battery that is charged by the engine or by braking.)
According to ACEA, 1,880,370 new battery-electric cars were registered last year, with the four biggest markets – Germany (+43.2%), the Netherlands (+18.1%), Belgium (+12.6%), and France (+12.5%) – accounting for 62% of registrations.
In a release setting out the figures, ACEA described this as “still a level that leaves room for growth to stay on track with the transition”.
Meanwhile, registrations of petrol cars fell by 18.7% across 2025, with all major markets seeing a decrease.
France saw the steepest decline in petrol registrations, at 32% year-on-year, followed by Germany (-21.6%), Italy (-18.2%) and Spain (-16%).
Overall, 2,880,298 new petrol cars were registered in 2025, with petrol’s market share dropping from 33.3% in December 2024 to 26.6% in December 2025.
Hybrid vehicles, which are entirely fuelled by petrol or diesel, remain the largest segment of the EU car market, with sales jumping 5.8% from 307,001 in December 2024 to 324,799 a year later, as shown in the chart below.
However, cars that can run on electricity – battery EVs and plug-in hybrids – are growing even faster, with sales up 51% and 36.7% in December 2025, respectively.

The registration figures follow the EU’s automotive package, released in December to “support the automotive sector’s efforts in the transition to clean mobility”.
It includes a proposed shift from banning the sale of new combustion-engine cars from 2035 to reducing their tailpipe emissions.
Under the proposals, the EU will target a 90% cut in carbon dioxide (CO2) emissions from 2021 levels by 2035, rather than all vehicle sales having to be zero-emissions.
If approved, the package would require that the remaining 10% of emissions be compensated through the use of low-carbon steel made in the EU or from e-fuels and biofuels.
This would allow for plug-in hybrids (PHEVs), “range extenders”, hybrids and pure internal combustion engine vehicles to “still play a role beyond 2035”.
There has been repeated pushback from the automotive sector in Europe against the introduction of “clean car rules”, which has led to targets being shifted more than once.
For example, the head of Stellantis, one of the largest car manufacturers in Europe, recently claimed that there was no “natural” demand for EVs.
Automakers have argued that EU targets for cleaner cars should be eased in the face of competition from Chinese producers and US tariffs.
ACEA figures show Volkswagen continued to claim the largest market share in the EU, accounting for 26.7% of new registrations in December, up from 25.6% a year earlier.
It was followed by Stellantis, Renault, Hyundai, Toyota and BMW.
EV giant Tesla saw its market share drop from 3.5% in December 2024 to 2.2% in December 2025. Over the course of 2025, the brand saw its market share in the EU fall 37.9% from 2024, following controversy around its chief executive, Elon Musk.
Meanwhile, Chinese EV brand BYD tripled its market share from 0.7% in December 2024 to 1.9% in December 2025.
DeBriefed 23 January 2026: Trump’s Davos tirade; EU wind and solar milestone; High seas hope
Welcome to Carbon Brief’s DeBriefed.
An essential guide to the week’s key developments relating to climate change.
This week
Trump vs world
TILTING AT ‘WINDMILLS’: At the World Economic Forum meeting in Davos, Switzerland, Donald Trump was quoted by Reuters as saying – falsely – that China makes almost all of the world’s “windmills”, but he had not “been able to find any windfarms in China”, calling China’s buyers “stupid”. The newswire added that China “defended its wind power development” at Davos, with spokesperson Guo Jiakun saying the country’s efforts to tackle climate change and promote renewable energy in the world are “obvious to all”.
SPEECH FACTCHECKED: The Guardian factchecked Trump’s speech, noting China has more wind capacity than any other country, with 40% of global wind generation in 2024 in China. See Carbon Brief’s chart on this topic, posted on BlueSky by Dr Simon Evans.
GREENLAND GRAB: Trump “abruptly stepped back” from threats to seize Greenland with the use of force or leveraging tariffs, downplaying the dispute as a “small ask” for a “piece of ice”, reported Reuters. The Washington Post noted that, while Trump calls climate change “a hoax”, Greenland’s described value is partly due to Arctic environmental shifts opening up new sea routes. French president Macron slammed the White House’s “new colonial approach”, emphasising that climate and energy security remain European “top priorities”, according to BusinessGreen.
Around the world
- EU MILESTONE: For the first time, wind and solar generated more electricity than fossil fuels in the EU last year, reported Reuters. Wind and solar generated 30% of the EU’s electricity in 2025, just above 29% from plants running on coal, gas and oil, according to data from the thinktank Ember covered by the newswire.
- WARM HOMES: The UK government announced a £15bn plan for rolling out low-carbon technology in homes, such as rooftop solar and heat pumps. Carbon Brief’s newly published analysis has all the details.
- BIG THAW: Braving weather delays that nearly “derail[ed] their mission”, scientists finally set up camp on Antarctica’s thawing Thwaites glacier, reported the New York Times. Over the next few weeks, they will deploy equipment to understand “how this gargantuan glacier is being corroded” by warming ocean waters.
- EVS WELCOME: Germany re-introduced electric vehicle subsidies, open to all manufacturers, including those in China, reported the Financial Times. Tesla and Volvo could be the first to benefit from Canada’s “move to slash import tariffs on made-in-China” EVs, said Bloomberg.
- SOUTHERN AFRICA FLOODS: The death toll from floods in Mozambique went up to 112, reported the African Press Agency on Thursday. Officials cited the “scale of rainfall” – 250mm in 24 hours – as a key driver, it added. Frontline quoted South African president Cyril Ramaphosa, who linked the crisis to climate change.
$307bn
The amount of drought-related damages worldwide per year – intensified by land degradation, groundwater depletion and climate change – according to a new UN “water bankruptcy” report.
Latest climate research
- A researcher examined whether the “ultra rich” could and should pay for climate finance | Climatic Change
- Global deforestation-driven surface warming increased by the “size of Spain” between 1988 and 2016 | One Earth
- Increasing per-capita meat consumption by just one kilogram a year is “linked” to a nearly 2% increase in embedded deforestation elsewhere | Environmental Research Letters
(For more, see Carbon Brief’s in-depth daily summaries of the top climate news stories on Monday, Tuesday, Wednesday, Thursday and Friday.)
Captured

For the first time since monitoring began 15 years ago, there were more UK newspaper editorials published in 2025 opposing climate action than those supporting it, Carbon Brief analysis found. The chart shows the number of editorials arguing for more (blue) and less (red) climate action between 2011 and 2025. Editorials that took a “balanced” view are not represented in the chart. All 98 editorials opposing climate action were in right-leaning outlets, while nearly all 46 in support were in left-leaning and centrist publications. The trend reveals the scale of the net-zero backlash in the UK’s right-leaning press, highlighting the rapid shift away from a political consensus.
Spotlight
Do the oceans hold hope for international law?
This week, Carbon Brief unpacks what a landmark oceans treaty “entering into force” means and, at a time of backtracking and breach, speaks to experts on the future of international law.
As the world tries to digest the US retreat from international environmental law, historic new protections for the ocean were quietly passed without the US on Saturday.
With little fanfare besides a video message from UN chief Antonio Guterres, a binding UN treaty to protect biodiversity in two-thirds of the Earth’s oceans “entered into force”.
What does the treaty mean and do?
The High Seas Treaty – formally known as the “biodiversity beyond national jurisdiction”, or “BBNJ” agreement – obliges countries to act in the “common heritage of humankind”, setting aside self-interest to protect biodiversity in international waters. (See Carbon Brief’s in-depth explainer on what the treaty means for climate change).
Agreed in 2023, it requires states to undertake rigorous impact assessments to rein in pollution and share benefits from marine genetic resources with coastal communities and countries. States can also propose marine protected areas to help the ocean – and life within it – become more resilient to “stressors”, such as climate change and ocean acidification.
“It’s a beacon of hope in a very dark place,” Dr Siva Thambisetty, an intellectual property expert at the London School of Economics and an adviser to developing countries at UN environmental negotiations, told Carbon Brief.
Who has signed the agreement?
Buoyed by a wave of commitments at last year’s UN Oceans conference in France, the High Seas Treaty has been signed by 145 states, with 84 nations ratifying it into domestic law.
“The speed at which [BBNJ] went from treaty adoption to entering into force is remarkable for an agreement of its scope and impact,” said Nichola Clark, from the NGO Pew Trusts, when ratification crossed the 60-country threshold for it to enter into force last September.
For a legally binding treaty, two years to enter into force is quick. The 1997 Kyoto Protocol – which the US rejected in 2001 – took eight years.
While many operative parts of the BBNJ underline respect for “national sovereignty”, experts say the treaty applies to waters outside national borders, giving states a reason to get on board, even though it has implications for the rest of the oceans.
What is US involvement with the treaty?
The US is not a party to the BBNJ’s parent treaty, the UN Convention on the Law of the Sea, nor a member of the International Seabed Authority (ISA), which oversees deep-sea mining.
This has meant that it cannot bid for permits to scour the ocean floor for critical minerals. China and Russia still lead the world in the number of deep-sea exploration contracts. (See Carbon Brief’s explainer on deep-sea mining).
In April 2025, the Trump administration issued an executive order to “unleash America’s offshore critical minerals and resources”, drawing a warning from the ISA.
This Tuesday, the Trump administration published a new rule to “fast-track deep-sea mining” outside its territorial waters without “environmental oversight”, reported Agence France-Presse.
Prof Lavanya Rajamani, an expert in international environmental law at the University of Oxford, told Carbon Brief that, while dealing with US unilateralism and “self-interest” is not new to the environmental movement, the way “in which they’re pursuing that self-interest – this time on their own, without any legal justification” has changed. She continued:
“We have to see this not as a remaking of international law, but as a flagrant breach of international law.”
While this is a “testing moment”, Rajamani believes that other states contending with a “powerful, idiosyncratic and unpredictable actor” are not “giving up on decades of multilateralism…they [are] just asking how they might address this moment without fundamentally destabilising” the international legal order.
What next for the treaty?
Last Friday, China announced its bid to host the BBNJ treaty’s secretariat in Xiamen – “a coastal hub that sits on the Taiwan Strait”, reported the South China Morning Post.
China and Brussels are currently the strongest contenders for the seat of global ocean governance: Chile’s rival hosting offer came just days before the country elected a far-right president.
To Thambisetty, preparatory BBNJ meetings in March can serve as an important “pocket of sanity” in a turbulent world. She concluded:
“The rest of us have to find a way to navigate the international order. We have to work towards better times.”
Watch, read, listen
OWN GOAL: For Backchannel, Zimbabwean climate campaigner Trust Chikodzo called for TotalEnergies to end its “image laundering” at the Africa Cup of Nations.
MATERIAL WORLD: In a book review for the Baffler, Thea Riofrancos followed the “unexpected genealogy” of the “energy transition” outlined in Jean-Baptiste Fressoz’s More and More and More: An All-Consuming History.
REALTY BITES: Inside Climate News profiled Californian climate policy expert Neil Matouka, who built a plugin to display climate risk data that real-estate site Zillow removed from home listings.
Coming up
- 26 January: International day of clean energy
- 27 January: India-EU summit, New Delhi
- 31 January: Submit inputs on food systems and climate change for a report by the UN special rapporteur on climate change
- 1 February: Costa Rica elections
Pick of the jobs
- British Antarctic Survey, boating officer | Salary: £31,183. Location: UK and Antarctica
- National Centre for Climate Research at the Danish Meteorological Institute, climate science leader | Salary: NA. Location: Copenhagen, with possible travel to Skrydstrup, Karup and Nuuk
- Mongabay, journalism fellows | Stipend: $500 per month for 6 months. Location: Remote
- Climate Change Committee, carbon budgets analyst | Salary: £47,007-£51,642. Location: London
DeBriefed is edited by Daisy Dunne. Please send any tips or feedback to debriefed@carbonbrief.org.
This is an online version of Carbon Brief’s weekly DeBriefed email newsletter. Subscribe for free here.
Q&A: What UK’s ‘warm homes plan’ means for climate change and energy bills
The UK government has released its long-awaited “warm homes plan”, detailing support to help people install electric heat pumps, rooftop solar panels and insulation in their homes.
It says up to 5m households could benefit from £15bn of grants and loans earmarked by the government for these upgrades by 2030.
Electrified heating and energy-efficient homes are vital for the UK’s net-zero goals, but the plan also stresses that these measures will cut people’s bills by “hundreds of pounds” a year.
The plan shifts efforts to tackle fuel poverty away from a “fabric-first” approach that starts with insulation, towards the use of electric technologies to lower bills and emissions.
Much of the funding will support people buying heat pumps, but the government has still significantly scaled back its expectations for heat-pump installations in the coming years.
Beyond new funding, there are also new efficiency standards for landlords that could result in nearly 3m rental properties being upgraded over the next four years.
In addition, the government has set out its ambition for scaling up “heat networks”, where many homes and offices are served by communal heating systems.
Carbon Brief has identified the key policies laid out in the warm homes plan, as well as what they mean for the UK’s climate targets and energy bills.
- Why do homes matter for UK climate goals?
- What is the warm homes plan?
- What is included in the warm homes plan?
- What does the warm homes plan mean for energy bills?
- What has been the reaction to the plan?
Why do homes matter for UK climate goals?
Buildings are the second-largest source of emissions in the UK, after transport. This is largely due to the gas boilers that keep around 85% of UK homes warm.
Residential buildings produced 52.8m tonnes of carbon dioxide equivalent (MtCO2e) in 2024, around 14% of the nation’s total, according to the latest government figures.
Fossil-fuel heating is by far the largest contributor to building emissions. There are roughly 24m gas boilers and 1.4m oil boilers on the island of Great Britain, according to the National Energy System Operator (NESO).
This reliance on gas – in homes and in its power system alike – has left the UK particularly exposed to the impact of the global energy crisis, which caused gas prices and energy bills to soar.
At the same time, the UK’s old housing stock is often described as among the least energy efficient in Europe. A third of UK households live in “poorly insulated homes” and cannot afford to make improvements, according to University College London research.
This situation leads to more energy being wasted, meaning higher bills and more emissions.
Given their contribution to UK emissions, buildings are “expected to be central” in the nation’s near-term climate goals, delivering 20% of the cuts required to achieve the UK’s 2030 target, according to government adviser the Climate Change Committee (CCC).
(Residential buildings account for roughly 70% of the emissions in the buildings sector, with the rest coming from commercial and public-sector buildings.)
Over recent years, Conservative and Labour governments have announced various measures to cut emissions from homes, including schemes to support people buying electric heat pumps and retrofitting their homes.
However, implementation has been slow. While heat-pump installations have increased, they are not on track to meet the target set by the previous government of 600,000 a year by 2028.
Meanwhile, successive schemes to help households install loft and wall insulation have been launched and then abandoned, meaning installation rates have remained low.
At the same time, the main government-backed scheme designed to lift homes out of fuel poverty, the “energy company obligation” (ECO), has been mired in controversy over low standards, botched installations and – according to a parliamentary inquiry – even fraud.
(The government announced at the latest budget that it was scrapping ECO.)
The CCC noted in its most recent progress report to parliament that “falling behind on buildings decarbonisation will have severe implications for longer-term decarbonisation”.
What is the warm homes plan?
The warm homes plan was part of the Labour party’s election-winning manifesto in 2024, sold at the time as a way to “cut bills for families” through insulation, solar and heat pumps, while creating “tens of thousands of good jobs” and lifting “millions out of fuel poverty”.
It replaces ECO, introduces new support for clean technologies and wraps together various other ongoing policies, such as the “boiler upgrade scheme” (BUS) grants for heat pumps.
The government officially announced the warm homes plan in November 2024, stating that up to 300,000 households would benefit from home upgrades in the coming year. However, the plan itself was repeatedly delayed.
In the spending review in June 2025, the government confirmed the £13.2bn in funding for the scheme pledged in the Labour manifesto, covering spending between 2025-26 and 2029-30.
The government said this investment would help cut bills by up to £600 per household through efficiency measures and clean technologies such as heat pumps, solar panels and batteries.
After scrapping ECO at the 2025 budget, the treasury earmarked an extra £1.5bn of funding for the warm homes plan over five years. Spread across those years, this is well below the £1bn annual budget for ECO, which was funded via energy bills, but the new funding is expected to carry lower administrative overheads.
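As a rough, illustrative check on how these two funding streams compare, the sketch below annualises them using only the figures reported above:

```python
# Back-of-the-envelope comparison of annualised funding, using figures cited above.
eco_annual_bn = 1.0   # ECO: roughly £1bn per year, funded via energy bills
extra_whp_bn = 1.5    # extra treasury funding for the warm homes plan
extra_whp_years = 5   # spread over five years

print(f"ECO: £{eco_annual_bn:.1f}bn per year")
print(f"Extra warm homes plan funding: £{extra_whp_bn / extra_whp_years:.1f}bn per year")
# ECO: £1.0bn per year
# Extra warm homes plan funding: £0.3bn per year
```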
In the foreword to the new plan, secretary of state Ed Miliband says that it will deliver the “biggest public investment in home upgrades in British history”. He adds:
“The warm homes plan [will]…cut bills, tackle fuel poverty, create good jobs and get us off the rollercoaster of international fossil fuel markets.”
Miliband argues in his foreword that the plan will “spread the benefits” of technologies such as solar to households that would otherwise be unable to afford them. He writes: “This historic investment will help millions seize the benefits of electrification.” Miliband concludes:
“This is a landmark plan to make the British people better off, secure our energy independence and tackle the climate crisis.”
What is included in the warm homes plan?
The warm homes plan sets out £15bn of investment over the course of the current parliament to drive uptake of low-carbon technologies and upgrade “up to” 5m homes.
A key focus of the plan is energy security and cost savings for UK households.
The government says its plan will “prioritise” investment in electrification measures, such as heat pumps, solar panels and battery storage. This is where most of the funding is targeted.
However, it also includes new energy-efficiency standards to encourage landlords to improve conditions for renters.
Some policies are notable by their absence, such as a target date for ending gas-boiler sales. The plan also states that, while it will consult on the use of hydrogen in heating homes, this is “not yet a proven technology” and therefore any future role would be “limited”.
New funding
Technologies such as heat pumps and rooftop solar panels are essential for the UK to achieve its net-zero goals, but they carry significant up-front costs for households. Plans for expanding their uptake therefore rely on government support.
Following the end of ECO in March, the warm homes plan will help fill the funding gap for energy-efficiency measures that the scheme’s closure is expected to leave.
As the chart below shows, a range of new measures under the warm homes plan – including a mix of grants and loans – as well as more funding for existing schemes, leads to an increase in support out to 2030.

One-third of the funding – £5bn – is aimed at low-income households, including social-housing tenants. This money will be delivered in the form of grants that could cover the full cost of upgrades.
The plan highlights solar panels, batteries and “cost-effective insulation” for the least energy-efficient homes as priority measures for this funding, with a view to lowering bills.
There is also £2.7bn for the existing boiler upgrade scheme, which will see its annual allocation increase gradually from £295m in 2025-26 to £709m in 2029-30.
This is the government’s measure to encourage better-off “able to pay” households to buy heat pumps, with grants of £7,500 towards the cost of replacing a gas or oil-fired boiler. For the first time, there will also be new £2,500 grants from the scheme for air-to-air heat pumps. (See: Heat pumps.)
A key new measure in the plan is £2bn for low- and zero-interest consumer loans, to help with the cost of various home upgrades, including solar panels, batteries and heat pumps.
Previous efforts to support home upgrades with loans have not been successful. However, innovation agency Nesta says the government’s new scheme could play a central role, with the potential for households buying heat pumps to save hundreds of pounds a year, compared to purchases made using regular loans.
The remaining funding over the next four years includes money assigned to heat networks and devolved administrations in Scotland, Wales and Northern Ireland, which are responsible for their own plans to tackle fuel poverty and household emissions.
Heat pumps
Heat pumps are described in the plan as the “best and cheapest form of electrified heating for the majority of our homes”.
The government’s goal is for heat pumps to “increasingly become the desirable and natural choice” for those replacing old boilers. At the same time, it says that new home standards will ensure that new-build homes have low-carbon heating systems installed by default.
Despite this, the warm homes plan scales back the previous government’s target for heat-pump installations in the coming years, reflecting the relatively slow increase in heat-pump sales. It also does not include a set date to end the sale of gas boilers.
The plan’s central target is for 450,000 heat pumps to be installed annually by 2030, including 200,000 in new-build homes and 250,000 in existing homes.
This is significantly lower than the previous target – originally set in 2021 under Boris Johnson’s Conservative government – to install 600,000 heat pumps annually by 2028.
Meeting that target would have meant installations increasing seven-fold in just four years, between 2024 and 2028. Now, installations only need to increase five-fold in six years.
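To put those multiples on a comparable footing, the sketch below converts each into the implied average annual growth rate. It assumes smooth compound growth, a simplification for illustration only – real installation rates rise unevenly:

```python
# Convert "X-fold growth over N years" into an implied average annual growth rate.
# Assumes smooth compound growth, purely for illustration.

def implied_annual_growth(multiple: float, years: int) -> float:
    return multiple ** (1 / years) - 1

old = implied_annual_growth(7, 4)  # old target: 600,000 a year by 2028
new = implied_annual_growth(5, 6)  # new target: 450,000 a year by 2030
print(f"Old target: ~{old:.0%} growth per year")  # ~63%
print(f"New target: ~{new:.0%} growth per year")  # ~31%
```

On this simplified view, the new target roughly halves the required annual pace of growth in installations.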
As the chart below shows, the new target is also considerably lower than the heat-pump installation rate set out in the CCC’s central net-zero pathway. That involved 450,000 installations in existing homes alone by 2030 – excluding new-build properties.

Some experts and campaigners questioned how the UK would remain on track for its legally binding climate goals given this scaled-back rate of heat-pump installations.
Additionally, Adam Bell, policy director at the thinktank Stonehaven, writes on LinkedIn that the “headline numbers for heat pump installs do not stack up”.
Heat pumps in existing homes are set to be supported primarily via the boiler upgrade scheme and – according to Bell – there is not enough funding for the 250,000 installations that are planned, despite an increased budget.
The government’s plan relies in part on the up-front costs of heat-pump installation “fall[ing] significantly”. According to Bell, the government may reduce the size of boiler upgrade scheme grants in the future, in the hope that costs will have fallen sufficiently.
Alternatively, the government may rely on driving uptake through its planned low-cost loans and the clean heat market mechanism, which requires heating-system suppliers to sell a growing share of heat pumps.
Rooftop solar
Rooftop solar panels are highlighted in the plan as “central to cutting energy bills”, by allowing households to generate their own electricity to power their homes and sell it back to the grid.
At the same time, rooftop solar is expected to make a “significant contribution” to the government’s target of hitting 45-47 gigawatts (GW) of solar capacity by 2030.
As it stands, there is roughly 5.2GW of solar capacity on residential rooftops.
Taken together, the government says the grants and loans set out in the warm homes plan could triple the number of homes with rooftop solar from 1.6m to 4.6m by 2030.
It says that this is “in addition” to homes that decide to install rooftop solar independently.
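A rough calculation combining the figures above indicates what that tripling could add towards the 2030 target. The average system size below is inferred from the reported totals and assumed to stay constant – an illustrative assumption, not a figure from the plan:

```python
# Rough estimate of 2030 rooftop solar capacity, if the number of homes triples.
# Assumes average system size stays constant -- an illustrative assumption.

current_gw = 5.2       # rooftop solar capacity on homes today
current_homes_m = 1.6  # millions of homes with rooftop solar today
target_homes_m = 4.6   # millions of homes by 2030, per the plan

avg_kw = current_gw / current_homes_m  # GW per million homes equals kW per home
implied_gw = target_homes_m * avg_kw
print(f"Average system size: ~{avg_kw:.2f}kW")                 # ~3.25kW
print(f"Implied 2030 rooftop capacity: ~{implied_gw:.0f}GW")   # ~15GW
print(f"Share of the 45-47GW target: ~{implied_gw / 46:.0%}")  # roughly a third
```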
Efficiency standards
The warm homes plan says that the government will publish its “future homes standard” for new-build properties, alongside necessary regulations, in the first quarter of 2026.
Alongside the plan, the government also published its intention to reform “energy performance certificates” (EPCs), the ratings that are supposed to inform prospective buyers and renters about how much their new homes will cost to keep warm.
The current approach to measuring performance for EPCs is “unreliable” and thought to inadvertently discourage heat pumps. It has faced long-standing calls for reform.
As well as funding low-carbon technologies, the warm homes plan says it is “standing up for renters” with new energy-efficiency standards for privately and socially rented homes.
Currently, private renters – who rely on landlords to invest in home improvements – are the most likely to experience fuel poverty and to live in cold, damp homes.
Landlords will now need to upgrade their properties to meet EPC ratings B and C across two new-style EPC metrics by October 2030. There are “reasonable exemptions” to this rule, including a cap of £10,000 on the amount landlords are required to spend per property.
In total, the government expects “up to” 1.6m homes in the private-rental sector to benefit from these improvements and “up to” 1.3m social-rent homes.
These new efficiency standards therefore cover three-fifths of the “up to” 5m homes helped by the plan.
The government also published a separate fuel poverty strategy for England.
Heat networks
The warm homes plan sets out a new target to more than double the amount of heating provided using low-carbon heat networks – up to 7% of England’s heating demand by 2035 and a fifth by 2050.
This involves an injection of £1.1bn for heat networks, including £195m per year out to 2030 via the green heat network fund, as well as “mobilising” the National Wealth Fund.
The plan explains that this will primarily benefit urban centres, noting that heat networks are “well suited” to serving large, multi-occupancy buildings and those with limited space.
Alongside the plan, the government published a series of technical standards for heat networks, including for consumer protection.
What does the warm homes plan mean for energy bills?
The warm homes plan could save “hundreds on energy bills” for households whose homes are upgraded, according to the UK government.
This is in addition to two changes announced in the budget in 2025, which are expected to cut energy bills for all homes by an average of £150 a year.
These were the decisions to bring ECO to an end when the current programme of work wraps up at the end of the financial year, and for the treasury to cover three-quarters of the cost of the “renewables obligation” (RO) for three years from April 2026.
Beyond this, households that take advantage of the measures outlined in the plan can expect their energy bills to fall by varying amounts, the government says.
The warm homes plan includes a number of case studies that detail how upgrades could affect energy bills for a range of households. For example, it notes that a two-bedroom semi-detached home in the social-rented sector could save £350 annually after receiving insulation and solar panels.
An owner-occupied three-bedroom home could save £450 annually if it adds solar panels and a battery through the consumer loans offered under the warm homes plan, it adds.
Similar analysis published by Nesta says that a typical household that invests in home upgrades under the plan could save £1,000 a year on its energy bill.
It finds that a household with a heat pump, solar panels and a battery, which uses solar and a “time-of-use” tariff, could see its annual energy bill fall by as much as £1,000 compared with continuing to use a gas boiler, from around £1,670 per year to £670, as shown in the chart below.

Ahead of the plan being published, there were rumours of a further “rebalancing” of energy bills to bring down the cost of electricity relative to gas. However, this idea did not come to fruition in the warm homes plan.
This would have involved reducing or removing some or all of the policy costs currently funded via electricity bills, by shifting them onto gas bills or into general taxation.
This would have made it relatively cheaper to use electric technologies such as heat pumps, acting as a further incentive to adopt them.
Nesta highlights that, in the absence of further action on policy costs, the electricity-to-gas price ratio is likely to stay at around 4.1 from April 2026.
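That ratio matters because it determines whether a heat pump’s efficiency advantage translates into lower running costs. The sketch below illustrates the arithmetic; the efficiency figures are typical values assumed for illustration, not numbers from the plan or from Nesta’s analysis:

```python
# Why the electricity-to-gas price ratio matters for heat-pump running costs.
# Efficiency figures below are typical assumed values, for illustration only.

price_ratio = 4.1        # electricity price / gas price per kWh, from April 2026
heat_pump_cop = 3.0      # assumed: ~3 units of heat per unit of electricity
boiler_efficiency = 0.9  # assumed: modern gas boiler efficiency

# Cost per unit of heat delivered, in multiples of the gas price per kWh
heat_pump_cost = price_ratio / heat_pump_cop  # ~1.37
boiler_cost = 1 / boiler_efficiency           # ~1.11

print(f"Heat pump: {heat_pump_cost:.2f}x the gas price per unit of heat")
print(f"Gas boiler: {boiler_cost:.2f}x the gas price per unit of heat")
print(f"Break-even heat-pump efficiency: {price_ratio * boiler_efficiency:.1f}")  # ~3.7
```

On these assumed numbers, a heat pump only undercuts a gas boiler on running costs once its seasonal efficiency exceeds roughly 3.7, which helps explain why campaigners continue to push for rebalancing.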
What has been the reaction to the plan?
Many of the commitments in the warm homes plan were welcomed by a broad range of energy industry experts, union representatives and thinktanks.
Greg Jackson, the founder of Octopus Energy, described it as a “really important step forward”, adding:
“Electrifying homes is the best way to cut bills for good and escape the yoyo of fossil fuel costs.”
Dhara Vyas, chief executive of the trade body Energy UK, said the government’s commitment to spend £15bn on upgrading home heating was “substantial” and would “provide certainty to investors and businesses in the energy market”.
On LinkedIn, Camilla Born, head of the campaign group Electrify Britain, said the plan was a “good step towards backing electrification as the future of Britain, but it must go hand in hand with bringing down the costs of electricity”.
However, right-leaning publications and politicians were critical of the plan, focusing on the proportion of solar panels sold in the UK that are manufactured in China.
According to BBC News, two-thirds (68%) of the solar panels imported to the UK came from China in 2024.
In an analysis of the plan, the Guardian’s environment editor Fiona Harvey and energy correspondent Jillian Ambrose argued that the strategy is “all carrot and no stick”, given that the “longstanding proposal” to ban the installation of gas boilers beyond 2035 has been “quietly dropped”.
Christopher Hammond, chief executive of UK100, a cross-party network of more than 120 local authorities, welcomed the plan, but urged the government to extend it to include public buildings.
The government’s £3.5bn public sector decarbonisation scheme, which aimed to electrify schools, hospitals and council buildings, ended in June 2025 and no replacement has been announced, according to the network.