Category: Research

Even ‘climate progressive’ nations fall far short of Paris Agreement targets

New research focusing on the UK and Sweden demonstrates just how far even ‘climate progressive’ nations are from meeting their international commitments to avoid dangerous climate change.

The researchers concluded that despite the UK and Sweden claiming to have world-leading climate legislation, their planned reductions in emissions will still lead to total emissions two to three times greater than their fair share of a Paris-compliant global carbon budget.

The annual rate at which emissions are expected to be cut is less than half of that required, with the scientists suggesting a minimum for the UK of 10% each year, starting in 2020. Similarly, the date for achieving a fully zero-carbon energy system should be around 2035, rather than 2050 as implied by the UK’s current ‘net-zero’ legislation.
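The arithmetic behind such annual-rate claims can be sketched with a toy calculation. The figures below are illustrative round numbers, not values from the study; the point is simply that cumulative emissions under shallow annual cuts end up roughly double those under 10% annual cuts over the same period.

```python
# Illustrative arithmetic only: compare cumulative emissions under two
# constant annual reduction rates. The starting emissions figure and the
# rates are hypothetical round numbers, not values from the paper.
def cumulative_emissions(start, annual_cut, years):
    """Sum annual emissions that fall by `annual_cut` (a fraction) each year."""
    total, e = 0.0, start
    for _ in range(years):
        total += e
        e *= (1 - annual_cut)
    return total

start = 350.0  # MtCO2/yr, hypothetical starting point
slow = cumulative_emissions(start, 0.035, 30)  # ~3.5%/yr cuts
fast = cumulative_emissions(start, 0.10, 30)   # 10%/yr cuts
print(f"cumulative at 3.5%/yr: {slow:.0f} Mt")
print(f"cumulative at 10%/yr:  {fast:.0f} Mt")
print(f"ratio: {slow / fast:.1f}x")
```

Under these invented numbers the shallow pathway emits about twice as much in total, which is the shape of the ‘factor of two’ gap the paper describes.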

The study, led by Professor Kevin Anderson of The University of Manchester, is published in the journal Climate Policy. The team of climate scientists asked how close these countries are to meeting the UN’s climate commitments if the ‘safe’ quantity of emissions (the global carbon budget) is shared fairly between ‘developing’ and ‘developed’ countries.

Professor Kevin Anderson draws a damning conclusion from the research: “Academics have done an excellent job in understanding and communicating climate science, but the same cannot be said in relation to reducing emissions. Here we have collectively denied the necessary scale of mitigation, running scared of calling for fundamental changes to both our energy system and the lifestyles of high-energy users. Our paper brings this failure into sharp focus.”

The Paris Agreement establishes an international covenant to reduce emissions in line with holding the increase in temperature to ‘well below 2°C and to pursue 1.5°C.’ Global modelling studies have repeatedly concluded that such commitments can be delivered through respective national government adjustments to contemporary society, principally price mechanisms driving technical change. However, as emissions have continued to rise, these models have come to increasingly rely on the extensive deployment of highly speculative negative emissions technologies (NETs).

John Broderick, an author from the UK’s Tyndall Centre for Climate Change Research, commented: “This work makes clear just how important issues of fairness are when dividing the global carbon budget between wealthier and poorer nations. It also draws attention to how a belief in the delivery of untested technologies has undermined the depth of mitigation required today.”

Isak Stoddard, the Swedish author on the paper said: “Our conservative analysis demonstrates just how far removed the rhetoric on climate change is from our Paris-compliant carbon budgets. For almost two decades we have deluded ourselves that ongoing small adjustments to business as usual will deliver a timely zero-carbon future for our children.”

Key insights from the paper ‘A factor of two: how the mitigation plans of ‘climate progressive’ nations fall far short of Paris-compliant pathways’ by Kevin Anderson, John F. Broderick and Isak Stoddard:

  • Without a belief in the successful deployment of planetary scale negative emissions technologies, double-digit annual mitigation rates are required of developed countries, from 2020, if they are to align their policies with the Paris Agreement’s temperature commitments and principles of equity.
  • Paris-compliant carbon budgets for developed countries imply full decarbonization of energy by 2035-40, necessitating a scale of change in physical infrastructure reminiscent of the post-Second World War Marshall Plan. This brings issues of values, measures of prosperity and socio-economic inequality to the fore.
  • The stringency of Paris-compliant pathways severely limits the opportunity for inter-sectoral emissions trading. Consequently aviation, as with all sectors, will need to identify policies to reduce emissions to zero, directly or through the use of zero carbon fuels.
  • The UK and Swedish governments’ emissions pathways imply a carbon budget of at least a factor of two greater than their fair contribution to delivering on the Paris Agreement’s 1.5-2°C commitment.

This article was originally posted on the University of Manchester website.
Cover photo by Lukas on Unsplash.
‘Doughnut economics’ theory adapted as a city-level planning approach

By Will Bugler

A new approach to urban planning and development has been launched today by the Doughnut Economics Action Lab, Circle Economy, C40 Cities and Biomimicry 3.8. The tool, aimed at urban planners and municipal decision makers, has been piloted in Amsterdam. The Amsterdam City Doughnut takes the global concept of the Doughnut and turns it into a tool for transformative action.

The ‘Doughnut economics’ framework for sustainable development was developed by Oxford economist Dr Kate Raworth in the Oxfam paper ‘A Safe and Just Space for Humanity’ and featured in her best-selling book Doughnut Economics: Seven Ways to Think Like a 21st-Century Economist. Shaped like a doughnut, the framework combines the concept of planetary boundaries with the complementary concept of social boundaries. It proposes judging the performance of an economy by the extent to which the needs of people are met without overshooting Earth’s ecological ceiling.

Since then, Dr Raworth and others have been working on how to ‘downscale the doughnut’. The Amsterdam City Doughnut represents a holistic approach to doing just that. Amsterdam was chosen in part because the city has already placed the Doughnut at the heart of its long-term vision and policymaking, and because it is home to the Amsterdam Donut Coalition, a network of inspiring change-makers who are already putting the Doughnut into practice in their city.

Applied at the scale of a city, the downscaled approach starts by asking: “How can our city be a home to thriving people in a thriving place, while respecting the wellbeing of all people and the health of the whole planet?”

To facilitate reflection on this question, the tool explores four interdependent topics, framed as questions and applied in this case to Amsterdam.

These questions translate to four ‘lenses’ of the City Doughnut, producing a new ‘portrait’ of the city from four interconnected perspectives. Drawing on the city’s current targets for the local lenses, as well as on the Sustainable Development Goals and the planetary boundaries for the global lenses, cities can compare desired outcomes for the city against its current performance [see the published tool for more].

New IDB-UNEP report uncovers barriers and opportunities to scale private sector investment in nature-based solutions to deliver climate resilient infrastructure

A collaborative project between the Inter-American Development Bank (IDB), the UN Environment Programme (UNEP), Acclimatise, and the UN Environment Programme World Conservation Monitoring Centre (UNEP-WCMC) has explored the barriers to, and opportunities for, increasing private-sector uptake of nature-based solutions (NbS) in the infrastructure sector in Latin America and the Caribbean.

While the value of Nature-based Solutions (NbS) to society is well understood, particularly by the conservation community, the adoption of NbS for sustainable infrastructure in Latin America and the Caribbean (LAC) remains low. At a time when infrastructure investments are crucial to keeping up with economic and population growth, it is especially vital that LAC explores multifunctional solutions, like NbS, to help build climate-resilient infrastructure in the face of a changing climate.

Four key findings emerged:

  1. NbS needs to be better mainstreamed into policy, legislation, and regulations
  2. Project developers in LAC require additional skills, methodologies, tools, and capacity to incorporate NbS into infrastructure projects
  3. Defining the business case is an important first step to build support and secure finance for NbS projects in LAC
  4. There is a need to improve the conditions and scalability of financial instruments suitable for NbS investment in LAC

The project collaborators agree – coordinated action by all those involved with infrastructure development, including policy makers, project developers and financial institutions, is needed to create the enabling conditions for private sector uptake of NbS.

You can access the report here

Acclimatise’s Amanda Rycerz authored the report and recently led a panel discussion, ‘Scaling Private Sector Uptake of Nature-Based Solutions for Climate Resilient Infrastructure’, at the United Nations Climate Change Conference (COP25). Additionally, she produced an infographic highlighting the benefits of adopting NbS.

Cover photo by Zeno Thysman on Unsplash
Beyond climate models: Climate adaptation in the face of uncertainty

By Erin Owain and Richard Bater

A recently published paper calls upon climatologists to build models on decision-relevant timescales to inform shorter-term, local decision making by policy makers.

Over the past few decades, significant scientific advancements in global climate models have revolutionised our understanding and perception of climate change. By mimicking the dynamics of the global climate system, climate models have enriched our knowledge and understanding of the knock-on impacts that changes to the climate system can have on the wider Earth system. Climate models, produced by over 20 centres around the world, have played a critical role in providing scientific evidence for decision makers to act to reduce emissions and adapt to projected impacts.

However, in recent years policy makers have placed demands on climate science to produce increasingly high-resolution climate projections to inform shorter-term, local decisions. The authors of a recently published paper argue that this is partly attributable to an over-estimation, on the part of decision makers, of the level of precision with which the current set of models are able to project future change.

As the authors note, “… the adaptation community should be aware that widely available climate change projections are overconfident and are advised to avoid seductive promises of information about future climate conditions at local scales and particular future dates”.  Additionally, ‘optimising’ decisions using such data, in the absence of rigorous contextualisation and evaluation, can represent poor adaptation practice, especially where inadequate, expensive, or inflexible adaptation measures become ‘locked-in’.

Decision-making timescales across the public and private sectors are often short relative to the timescales of climate projections. The paper draws attention to an over-reliance by decision makers on high-resolution climate projections derived from downscaled climate models. Additionally, it questions whether the demands placed on climate services to produce high-resolution climate projections are warranted given that decision-making does not always require such granularity, noting that: “The predominant focus on end-of-century projections neglects more pressing development concerns, which relate to the management of shorter-term risks and climate variability.” A shorter time horizon is often more relevant in lower income countries, which can be more vulnerable to climate shocks due to higher sensitivity and lower adaptive capacity.

A common approach to meet the demand for climate projections at the local level has been to downscale General Circulation Models (GCMs). Downscaling is a process of generating higher spatial and temporal-resolution data from lower-resolution data and is used to derive local-scale data able to inform short-term decision-making.
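As a hypothetical illustration of the simplest statistical variant, the widely used ‘delta’ (change-factor) method shifts a locally observed baseline by the change a coarse model projects. All numbers below are invented, and this sketch is not the specific method any study in the article uses:

```python
# A minimal sketch of the "delta" (change-factor) statistical downscaling
# method: apply the coarse model's projected change to a local observed
# baseline. All values are hypothetical illustration data.
def delta_downscale(local_obs, gcm_baseline, gcm_future):
    """Shift local observations by the GCM's projected change, element-wise."""
    deltas = [f - b for f, b in zip(gcm_future, gcm_baseline)]
    return [obs + d for obs, d in zip(local_obs, deltas)]

obs = [5.1, 6.0, 8.4]       # observed local temperatures (degC), 3 months
gcm_base = [4.0, 5.2, 7.9]  # GCM grid-cell baseline climatology
gcm_fut = [5.5, 6.8, 9.6]   # GCM projection for the same months
print(delta_downscale(obs, gcm_base, gcm_fut))
```

Note that the output is only as trustworthy as the GCM change signal it inherits, which is exactly the compounding of uncertainty the limitations below describe.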

However, the various methods of downscaling have limitations:

  • Uncertainties regarding the underlying GCM projection data can be compounded, as additional assumptions and approximations are introduced during model selection and processing.
  • Dynamical downscaling can give a false sense of spatial precision whilst relying on fewer models, whereas temporal downscaling risks portraying shorter-term projections (3 to 10 years) as akin to forecasts.
  • The error between observed and projected climate change for some parameters can be considerable at local scales, with observed change often being more severe than that projected.
  • Models can struggle to reliably represent seasonality, extreme values, and tipping points.

These factors mean that it is important to consider both ‘outlier’ models and future values that could exceed those projected by any of the climate models, whilst bearing in mind that some models are known to perform better in some regions than others.

Embracing uncertainty

While a common reflex has been to request such high-resolution climate data, in other areas decision makers across sectors are accustomed to acting under uncertainty, whether related to cyber-attacks, political instability, fluctuations in oil prices and exchange rates, disease epidemics or natural disasters. It is well understood that such eventualities cannot be predicted with high levels of certainty beyond the short term: as the paper also notes, “Often…detailed planning is possible without detailed climate change projections”.

On the other hand, integrating historical climate data with analysis of real-time data and short-term forecasting can be an effective, high-confidence guide to making robust decisions related to climate adaptation. As noted by the authors, there should be a focus by climatologists on building models on decision-relevant timescales, encouraging further dialogue or intermediation between climate science and end users.

Climate projection data remain an indispensable and scientifically sound guide to how climate is likely to change in the future. This paper, however, is a timely corrective to a tendency to overstate the precision of climate model outputs and to make resilience building efforts contingent on the ever-finer optimisation of climate models. A broad understanding of the direction and magnitude of change in given climate parameters, and their likely impacts for given users, can be adequate to identify and prioritise adaptation strategies and measures today. Decisions can be taken today that are robust to a range of climate scenarios, and low-regret, low-cost measures can be implemented that can be easily reversed in light of experience and new information.

In future, as the paper concludes, it is important that the climate services community refocuses attention on better assessing and translating the significance of projected change versus observed variability and trends. Moreover, whilst noting the resource implications this can carry, the community could improve the evaluation of climate models selected for use in climate risk analysis. In representing future climate change, it remains, as ever, imperative to consider and translate model reliability and uncertainty, and to convey the range of plausible future change.

Cover photo from Marco Dormino on Climate Visuals.
New CCC report highlights progress necessary to prepare for climate change

by Georgina Wade

The Committee on Climate Change’s 2019 Report to Parliament titled ‘Progress in preparing for climate change’ sets out their assessment of climate change preparations made in England and provides a first evaluation of the Government’s second National Adaptation Programme (NAP).

Declaring that the government has failed to increase adaptation policy ambition and implementation despite the increasing urgency of addressing the risks from climate change, the report finds that England is not prepared for even a 2°C rise in global temperature, let alone more extreme levels of warming.

For this report, the Committee introduced a new scoring system to give a simpler assessment of progress. The results showed that some sectors, including strategic roads, public water supply, and rail, have good plans in place that consider the long-term risks and opportunities from climate change, whereas much still needs to be done to improve the health, business and agricultural sectors. Of the 56 risks and opportunities identified in the UK’s Climate Change Risk Assessment, 21 have no formal actions in the NAP. Because of this, the CCC says it has “been unable to give high scores for managing risk to any of the sectors assessed”.

Additionally, the Committee addressed the preparation of the next UK Climate Risk Assessment (CCRA), due in 2022, and identified some specific recommendations for how this important programme of work can be improved. Highlighting that the need for action has never been clearer, the CCC’s message to the government is simple: Now, do it.

To download the report, click here

Cover photo by Dominik Lange on Unsplash.
Level complete: Could computer games help farmers adapt to climate change?

By Georgina Wade

Researchers in Sweden and Finland are pointing to computer games as a possible method of engaging farmers with scientific research and helping them adapt to climate change.

Through the development of an interactive web-based maladaptation game, researchers presented stakeholders with four agricultural challenges: precipitation, temperature increase/drought, longer growing seasons and increased risk of pests and weeds. For each challenge, players were asked to make a strategic decision based on the options given, and their choices were then combined to form a list of potential negative outcomes.

The research is presented in the article “Benefits and challenges of serious gaming – the case of ‘The Maladaptation Game’”, published in the journal Open Agriculture. Lead author Asplund believes the findings provide insight into cognitive behaviour.

“While we observed that the conceptual thinking of the game sometimes clashes with the players’ everyday experience and practice, we believe gaming may function as an eye-opener to new ways of thinking,” explains Asplund.

Asplund also suggests that games should be designed to include elements of thinking and sharing, which will stimulate reflection and discussion among stakeholders.

Access the game here.

Cover photo from Wikimedia Commons
Ten years ago, climate adaptation research was gaining steam. Today, it’s gutted

By Rod Keenan, University of Melbourne

Ten years ago, on February 7, 2009, I sat down in my apartment in central Melbourne to write a job application. All of the blinds were down, and the windows tightly closed. Outside it was 47°C. We had no air conditioning. The heat seeped through the walls.

When I stepped outside, the air ripped at my nose and throat, like a fan-forced sauna. It felt ominous. With my forestry training, and some previous experience of bad fire weather in Tasmania, I knew any fires that day would be catastrophic. They were. Black Saturday became Australia’s worst-ever bushfire disaster.

I was applying for the position of Director of the Victorian Centre for Climate Change Adaptation Research (VCCCAR). I was successful and started the job later that year.

The climate in Victoria over the previous 12 years had been harsh. Between 1997 and 2009 the state suffered its worst drought on record, and major bushfires in 2003 and 2006-07 burned more than 2 million hectares of forest. Then came Black Saturday, and the year after that saw the start of Australia’s wettest two-year period on record, bringing major floods to the state’s north, as well as to vast swathes of the rest of the country.

In Victoria alone, hundreds of millions of dollars a year were being spent on response and recovery from climate-related events. In government, the view was that things couldn’t go on that way. As climate change accelerated, these costs would only rise.

We had to get better at preparing for, and avoiding, the future impacts of rapid climate change. This is what we mean by the term “climate adaptation”.

Facing up to disasters

A decade after Black Saturday, with record floods in Queensland, severe bushfires in Tasmania and Victoria, widespread heatwaves and drought, and a crisis in the Murray-Darling Basin, it is timely to reflect on the state of adaptation policy and practice in Australia.

In 2009 the Rudd Labor government had taken up the challenge of reducing greenhouse gas emissions. With Malcolm Turnbull as opposition leader, we seemed headed for a bipartisan national solution ahead of the Copenhagen climate summit in December. Governments, meanwhile, agreed that adaptation was more a state and local responsibility. Different parts of Australia faced different climate risks. Communities and industries in those regions had different vulnerabilities and adaptive capacities and needed locally driven initiatives.

Led by the Brumby government in Victoria, state governments developed an adaptation policy framework and sought federal financial support to implement it. This included research on climate adaptation. The federal government put A$50 million into a new National Climate Change Adaptation Research Facility, based in Queensland, alongside the CSIRO Adaptation Flagship which was set up in 2007.

The Victorian Government invested A$5 million in VCCCAR. The state faced local risks: more heatwaves, floods, storms, bushfires and rising sea levels, and my colleagues and I found there was plenty of information on climate impacts. The question was: what can policy-makers, communities, businesses and individuals do in practical terms to plan and prepare?

Getting to work

From 2009 until June 2014, researchers from across disciplines in four universities collaborated with state and local governments, industry and the community to lay the groundwork for better decisions in a changing climate.

We held 20 regional and metropolitan consultation events and hosted visiting international experts on urban design, flood, drought, and community planning. Annual forums brought together researchers, practitioners, consultants and industry to share knowledge and engage in collective discussion on adaptation options. We worked with eight government departments, driving the message that adapting to climate change wasn’t just an “environmental” problem and needed responses across government.

All involved considered the VCCCAR a success. It improved knowledge about climate adaptation options and confidence in making climate decisions. The results fed into Victoria’s 2013 Climate Change Adaptation Plan, as well as policies for urban design and natural resource management, and practices in the local government and community sectors. I hoped the centre would continue to provide a foundation for future adaptation policy and practice.

Funding cuts

In the 2014 state budget the Napthine government chose not to continue funding the VCCCAR. Soon after, the Abbott federal government reduced the funding and scope of its national counterpart, and funding ended last year.

Meanwhile, CSIRO chief executive Larry Marshall argued that climate science was less important than the need for innovation and turning inventions into benefits for society. Along with other areas of climate science, the Adaptation Flagship was cut, its staff let go or redirected. From a strong presence in 2014, climate adaptation has become almost invisible in the national research landscape.

In the current chaos of climate policy, adaptation has been downgraded. There is a national strategy but little high-level policy attention. State governments have shifted their focus to energy, investing in renewables and energy security. Climate change was largely ignored in developing the Murray-Darling Basin Plan.

Despite this lack of policy leadership, many organisations are adapting. Local governments with the resources are addressing their particular challenges, and building resilience. Our public transport now functions better in heatwaves, and climate change is being considered in new transport infrastructure. The public is more aware of heatwave risks, and there is investment in emergency management research, but this is primarily focused on disaster response.

Large companies making long-term investments, such as Brisbane Airport, have improved their capacity to consider future climate risks. There are better planning tools and systems for business, and the finance and insurance sectors are seriously considering these risks in investment decisions. Smart rural producers are diversifying, using their resources differently, or shifting to different growing environments.

Struggling to cope

But much more is needed. Old buildings and cooling systems are not built to cope with our current temperatures. Small businesses are suffering, but few have capacity to analyse their vulnerabilities or assess responses. The power generation system is under increasing pressure. Warning systems have improved but there is still much to do to design warnings in a way that ensures an appropriate public reaction. Too many people still adopt a “she’ll be right” attitude and ignore warnings, or leave it until the last minute to evacuate.

In an internal submission to government in 2014 we proposed a Victorian Climate Resilience Program to provide information and tools for small businesses. Other parts of the program included frameworks for managing risks for local governments, urban greening, building community leadership for resilience, and new conservation approaches in landscapes undergoing rapid change.

Investment in climate adaptation pays off. Small investments now can generate payoffs of 3-5:1 in reduced future impacts. A recent business round table report indicates that carefully targeted research and information provision could save state and federal governments A$12.2 billion and reduce the overall economic costs of natural disasters (which are projected to rise to A$23 billion a year by 2050) by more than 50%.

Ten years on from Black Saturday, climate change is accelerating. The 2030 climate forecasts made in 2009 have come true in half the time. Today we are living through more and hotter heatwaves, longer droughts, uncontrollable fires, intense downpours and significant shifts in seasonal rainfall patterns.

Yes, policy-makers need to focus on reducing greenhouse emissions, but we also need a similar focus on adaptation to maintain functioning and prosperous communities, economies and ecosystems under this rapid change. It is vital that we rebuild our research capacity and learn from our past experiences, to support the partnerships needed to make climate-smart decisions.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Cover photo by CSIRO (CC BY 3.0): A destroyed property at Kinglake after the ‘Black Saturday’ bushfires.
Why the science of extreme weather attribution matters

By Peter Stott

As the Earth’s climate warms, people face mounting threats from rising seas, and more intense and frequent storms, heatwaves, fires, and droughts. When these events hit, people want to understand whether they are connected to climate change.

The science of attribution can do this. It links observed changes in climate to natural and human-induced causes. In recent years, it has amassed a wealth of evidence to show convincingly that these changes are dominated by the effects of human-induced greenhouse gas emissions. As the IPCC concluded in its Fifth Assessment Report: “Human influence on the climate system is clear”.

Attribution science has been key to demonstrating the need for action to reduce emissions in order to avoid the worst effects of climate change. Societies can use such information to help them adapt to the inevitable changes in the climate coming our way, and ensure that suitable regulatory and legislative frameworks are put in place.

A developing science

The realisation that individual weather events could be linked to climate change came in 2003 when a devastating European heatwave was estimated to have killed more than 70,000 people. Myles Allen, professor of geosystem science at Oxford University, proposed the concept of event attribution, arguing that it would be possible to calculate the increased risk of a particular event due to climate change. The following year, I published a paper in Nature, co-authored by Myles and another Oxford colleague, Daithi Stone, which showed that human-induced climate change had very likely more than doubled the risk of such a heatwave.

This science has since burgeoned. Climate scientists have studied a wide range of weather events around the world, including heatwaves, heavy rainfall events, tropical cyclones and droughts.

Event attribution science can now deliver robust assessments of very recent events.

China has also been a leader in this field, thanks in part to the Climate Science for Service Partnership – China project, a collaborative partnership between UK and Chinese scientists. Some of the results have been published in the annual reports, “Assessing Extreme Events from a Climate Perspective”, which attribute events from the previous year. For example, a collaborative China-UK study led by Ying Sun of the China Meteorological Administration has shown that anthropogenic influence roughly doubled the chances of the extreme rainfall in south-eastern China in June 2017, when heavy floods affected more than 10 million people, with 38 dead and about 800,000 people forced to relocate.

The rapid development of attribution science has raised three key questions concerning its future potential. How reliable is it? What aspects of the science need to be improved? And how quickly and routinely can results be produced?

How reliable is event attribution science?

The basis of all event attribution assessments is a model. This is needed to calculate the counterfactual situation in which human activities had not changed the climate. A typical approach is to run a model many times to simulate the current climate in which greenhouse gas concentrations are at today’s elevated levels, taking account of natural and anthropogenic influences on the climate. The results are then compared with alternative simulations in which the climate model includes only natural factors, such as changes in solar output or the climatic effects of volcanic eruptions. These models also include natural climatic variability, such as changes associated with the El Niño Southern Oscillation phenomenon by which temperatures in the Pacific Ocean vary and influence global surface temperatures.
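The comparison described above boils down to estimating the probability of an event in each ensemble and taking their ratio. A toy sketch of that probability-ratio logic, using synthetic random draws in place of real model ensembles and a hypothetical event threshold:

```python
# Toy illustration of the probability-ratio approach: count threshold
# exceedances in a "factual" (all forcings) ensemble and a "counterfactual"
# (natural-only) ensemble. The ensembles are synthetic Gaussian draws,
# not real model output, and the 0.8-unit warm shift is invented.
import random

random.seed(42)
threshold = 2.0  # hypothetical heatwave-index threshold defining "the event"

natural = [random.gauss(0.0, 1.0) for _ in range(100_000)]  # natural-only world
factual = [random.gauss(0.8, 1.0) for _ in range(100_000)]  # world as observed

p_nat = sum(x > threshold for x in natural) / len(natural)
p_act = sum(x > threshold for x in factual) / len(factual)
print(f"P(event | natural only)  = {p_nat:.4f}")
print(f"P(event | all forcings)  = {p_act:.4f}")
print(f"probability ratio        = {p_act / p_nat:.1f}")
```

Even a modest shift in the distribution multiplies the probability of exceeding a fixed extreme threshold several-fold, which is why statements like “more than doubled the risk” can emerge from relatively small mean changes.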

But can these models reliably represent observed reality? In 2016, an independent expert panel of meteorologists and statisticians was convened by the National Academies of Sciences (NAS) in the United States to assess the capability of event attribution. They concluded that “It is now often possible to make and defend quantitative statements about the extent to which human-induced climate change has influenced either the magnitude or the probability of occurrence of specific types of event or event classes”.

The NAS report found that confidence is greatest for extreme events related to an aspect of temperature, being highest for extreme heat and cold events, followed by hydrological drought and heavy precipitation. They found lowest confidence for attribution of severe convective storms and extratropical cyclones.

The reason for lower confidence is related to the ability of models to represent the processes involved in the formation of the extreme events. Whereas climate models can typically represent changes and variability in temperature over large regions very reliably, they can struggle with other types of event, for example, representing the intensity of rainfall in severe convective storms. This is because they do not have the spatial and temporal resolution to resolve the processes involved.

Source: NAS, 2016

What needs to be improved?

Improving event attribution means improving our understanding of climate processes and their representation in climate models. This includes increasing the spatial resolution of models so that they resolve a wider range of weather processes. It also includes improving the capability to compare models with observations, for example by assessing the ability of models to replicate key features of the evolution of weather events over many occurrences of such events.

As the National Academies of Sciences report pointed out, improvement will also come from the further development of long observational records. It’s also important to frame the attribution question correctly. An attribution study might for example consider how climate change has affected a particular flood in the presence of El Niño. This would require the counterfactual model simulations to also include an El Niño by specifying the pattern of sea surface temperatures associated with the phenomenon. This may give different results to a study that evaluates the effects of climate change on flood risk irrespective of whether there was an El Niño or not. Both types of attribution study may have value but both need to be clearly communicated to avoid confusion.
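The difference between the two framings can be made concrete with a toy simulation (all numbers here are invented for illustration): conditioning the counterfactual on an El Niño being under way generally yields a different probability ratio than averaging over all years.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulations: seasonal rainfall (mm) depends on both
# human-induced warming and the ENSO state.
n = 50_000
el_nino = rng.random(n) < 0.3  # ~30% of simulated years are El Niño years

def simulate_rainfall(warming_effect):
    base = rng.normal(100.0, 15.0, n)              # internal variability
    return base + warming_effect + 12.0 * el_nino  # El Niño adds rain here

factual = simulate_rainfall(10.0)        # world with human-induced warming
counterfactual = simulate_rainfall(0.0)  # natural-only world

flood = 140.0  # rainfall threshold defining the "flood" event

# Unconditional framing: effect of warming on flood risk across all years
pr_unconditional = (factual > flood).mean() / (counterfactual > flood).mean()

# Conditional framing: effect of warming given that an El Niño is under way
pr_conditional = (factual[el_nino] > flood).mean() / \
    (counterfactual[el_nino] > flood).mean()

print(f"probability ratio, unconditional: {pr_unconditional:.1f}")
print(f"probability ratio, given El Niño: {pr_conditional:.1f}")
```

The two ratios differ because the conditional study fixes the ENSO state in both worlds, while the unconditional study lets it vary; this is why clear communication of the framing matters.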

How quickly and routinely can results be produced?

During an extreme weather event there is often considerable public and media interest in the link with human-induced climate change. Event attribution science can now deliver robust assessments of very recent events, at least for extreme temperature events. These can draw on peer-reviewed methodologies but each individual analysis does not necessarily need to go through such a lengthy process any more than an individual weather forecast needs to.

There is potential for attribution assessments to become part of the regular production of climate services, by complementing climate monitoring and prediction with regular updates on how climate change is altering the probability and magnitude of recent extreme weather events.

However, caution will be needed in which types of weather event are incorporated into such activities. As the National Academies of Sciences report pointed out, confidence for different types of weather event differs. Low confidence events such as severe convective storms will still be studied in peer-reviewed publications, and it is likely to be some time before such events are routinely included in regular climate service assessments. But as climate science develops, and as climate models improve, a wider range of extreme events will be robustly and regularly attributed to natural and human-induced causes.

Moving forward

Attribution science has developed the capability to assess the extent to which extreme weather events are linked to climate variability and change. Scientific uncertainties still remain and it is not possible to make robust attribution statements about all extreme weather events. But it is clear that such events are increasing in frequency and intensity globally. It is also increasingly possible to draw robust conclusions about the extent to which the risks from some extreme events, including large-scale long-lasting temperature-related events, have been affected by human-induced emissions. This information could be of great value for informing climate mitigation, adaptation and litigation.

This article was originally published on China Dialogue and is shared under a Creative Commons license.

Cover photo by Hello Lightbulb on Unsplash
Met Office: Climate change made 2018 UK summer heatwave ‘30 times more likely’

By Daisy Dunne, Carbon Brief

This year’s summer heatwave, which saw temperature records broken across the UK, was made up to 30 times more likely by climate change, a new assessment says.

A preliminary study by scientists at the Met Office Hadley Centre finds that the extreme heat experienced by the UK this year had around a 12% chance of occurring. In a world without climate change, it would have had a 0.5% chance, according to the results.

The influence of climate change on the odds of the 2018 summer heatwave is the highest recorded for a study of this kind looking at extreme events in the UK, the study’s author tells Carbon Brief at the UN’s 24th Conference of the Parties (COP24) in Katowice, Poland.

And, by 2050, the chances of such a heatwave occurring could reach 50%, the scientist adds. “With continued emissions, we’ll eventually make it impossible to adapt.”

Feeling the heat

This year’s summer heatwave dominated front pages in the UK – with all-time temperature records broken in, among other places, Belfast (29.5C), Glasgow (31.9C) and Porthmadog, Wales (33C).

The new analysis suggests that such extreme heat was made around 30 times more likely by human-caused climate change.

The results are “surprising”, says study author Prof Peter Stott, who leads on climate monitoring and attribution at the Met Office Hadley Centre. Speaking to Carbon Brief at COP24, he says:

“This is a piece of scientific evidence showing that this is not just chance; we’re not just unlucky. We’re reaping the results of our emissions.

“If you look right back at global temperatures, it’s effectively impossible to have the temperatures that we’re having now without human-induced climate change. Zooming in to a region like the UK, this is probably the highest I’ve seen in that context.”

Climate change chiefly heightens the risk of heatwaves by raising global temperatures, but the 2018 heat could have also been influenced by “unusual” patterns of weather in the atmosphere, he adds:

“This is largely dominated by rising temperatures. It really is as simple as that. Where we are now, you need relatively unusual circulation patterns to get to such elevated temperatures – but, as we go on, weather patterns which bring warmer temperatures will be less rare.”

Warming’s fingerprint

The new research is the latest in what are known as “single-event attribution” studies. These aim to identify the influence that human-caused climate change does – or does not – have on episodes of extreme weather.

(In 2017, Carbon Brief produced a global map of the results of more than 140 attribution studies.)

For this analysis, scientists used climate models to compare the chances of this year’s summer heatwave happening in today’s world to a hypothetical world without human-caused climate change. Stott explains:

“There are now many models which have, in their simulations, all the forcings on climate – so, increasing greenhouse gas concentrations and other human factors, as well as natural factors, such as volcanic eruptions and solar variability.

“We can basically look into those models and then zoom in over the UK and look at the odds of that extreme weather happening in the UK – and then compare that with the same models, but when they only include natural forcings.”

For the study, the researchers defined a “summer heatwave event” as the average temperature increase experienced across the entire season (June to August), compared with a 1901-1930 baseline period.
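In code, that event definition is simply a seasonal-mean anomaly relative to the baseline (illustrative numbers below, not the study’s actual data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: JJA (June-August) seasonal-mean temperatures in °C.
baseline_jja = rng.normal(14.9, 0.5, size=30)  # one value per year, 1901-1930
jja_2018 = 16.6                                # seasonal mean for summer 2018

# The "event" is the 2018 seasonal mean's anomaly over the baseline average
anomaly = jja_2018 - baseline_jja.mean()
print(f"2018 JJA anomaly vs 1901-1930: {anomaly:+.2f} °C")
```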

The research has not yet been published in a scientific journal, but the methods used are peer-reviewed, Stott says.

Falling odds

The results suggest that the 2018 summer heatwave had a 12% chance of occurring. In other words, in today’s climate, this sort of heatwave is expected roughly once every eight years.

However, in a world without human-caused climate change, the heatwave had around a 0.5% likelihood of occurring – meaning this kind of event would only occur around once in every 200 years.
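The arithmetic linking these figures is straightforward: an annual probability p implies an average return period of 1/p years, and dividing the two probabilities gives the change in likelihood. The central model estimates give a ratio of about 24, while the comparison with the historical record that Stott describes gives the headline figure of roughly 30.

```python
# Reproducing the simple arithmetic behind the quoted figures.
p_with_warming = 0.12   # chance of the heatwave in today's climate
p_natural_only = 0.005  # chance in a world without human-caused warming

# An annual probability p implies an average return period of 1/p years
return_today = 1 / p_with_warming     # ~8 years
return_natural = 1 / p_natural_only   # 200 years

# Probability ratio from the model estimates
pr_model = p_with_warming / p_natural_only  # 24

# Probability ratio from the historical record: three such summers in the
# last 20 years vs once in roughly 200 pre-industrial years
pr_historical = (3 / 20) / (1 / 200)  # 30

print(return_today, return_natural, pr_model, pr_historical)
```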

The findings of the study seem to correspond to historical records of heatwaves in the UK, Stott says:

“If you’re looking at high summer temperatures in the UK, then 2003, 2006 and 2018 were all actually neck and neck. That’s three times in the last 20 years. If you look back at the pre-1850s – an estimate of pre-industrial temperatures – it happened once, in 1826. So, once in 200 years versus three times in 20 years – that’s roughly 30 times [more].”

The research follows in the footsteps of another attribution study published earlier this year. That analysis by scientists at the World Weather Attribution network found that, across northern Europe, the 2018 summer heatwave was made up to five times more likely by climate change.

The difference in results likely arises from differences in methods and scope, Stott says. The previous analysis focused on six countries in northern Europe, but did not include the UK.

In addition, the previous study focused on how climate change could have influenced a three-day spike in temperatures, whereas the new analysis looks at temperatures across the summer season, Stott says.

Last week, the Met Office published its 2018 climate projections. Among its findings, it reported that summers as hot as in 2018 could be expected every other year by the middle of the century. Stott says:

“What we’re already experiencing is a forecast of what could happen – but in spades. With continued emissions, we’ll eventually make it impossible to adapt.”

This article originally appeared on Carbon Brief and is shared under a Creative Commons license.

Cover photo from Wikimedia Commons (public domain): Outdoor events at The Overture, a free three-day festival to mark the reopening of Southbank Centre’s Royal Festival Hall, attended by over a quarter of a million people.
Once-eradicated mosquito-related diseases may return to Europe thanks to climate change

By Will Bugler

Diseases including malaria, yellow fever, Zika virus and dengue fever could return to Europe, according to the largest ever study of the mosquito evolutionary tree. The study investigates mosquito evolution over the last 195 million years and suggests that climate change today could provide favourable conditions for mosquito-borne diseases to spread in areas where they had previously been eradicated.

The research from the Milner Centre for Evolution at the University of Bath, University of York and China Agricultural University, shows that the rate at which new species of mosquitoes evolve generally increases when levels of atmospheric carbon dioxide are higher. This is a concern because the greater the number of mosquito species, the more potential exists for new ways of transmitting disease, and perhaps for new variants of those diseases.

“It is important to look at the evolution of the mosquito against climate change because mosquitoes are responsive to CO2 levels,” explained Dr Katie Davis, from the University of York’s Department of Biology. “Atmospheric CO2 levels are currently rising due to changes in the environment that are connected to human activity, so what does this mean for the mosquito and human health?

“Despite some uncertainties, we can now show that mosquito species are able to evolve and adapt to climate change in high numbers. With increased speciation, however, comes the added risk of disease increase and the return of certain diseases in countries that had eradicated them or never experienced them before.”

Chufei Tang, formerly at the Milner Centre for Evolution and now at the China Agricultural University, said: “Rising atmospheric CO2 has been proven to influence various kinds of organisms, but this is the first time such an impact has been found in insects.”

More research is needed to understand what climate change means for the future of the mosquito and the work will contribute to further discussions about the value of the mosquito to the ecosystem and how to manage the diseases they carry.

Tang et al (2018) “Elevated atmospheric CO2 promoted speciation in mosquitoes (Diptera, Culicidae)” is published in Communications Biology, DOI: 10.1038/s42003-018-0191-7.

Cover photo by U.S. Air Force/Nicholas J. De La Peña (public domain)