Category: Research

Beyond climate models: Climate adaptation in the face of uncertainty

By Erin Owain and Richard Bater

A recently published paper calls upon climatologists to build models on decision-relevant timescales to inform shorter-term, local decision making by policy makers.

Over the past few decades, significant scientific advancements in global climate models have revolutionised our understanding and perception of climate change. By mimicking the dynamics of the global climate system, climate models have enriched our knowledge and understanding of the knock-on impacts that changes to the climate system can have on the wider Earth system. Climate models, produced by over 20 centres around the world, have played a critical role in providing scientific evidence for decision makers to act to reduce emissions and adapt to projected impacts.

However, in recent years policy makers have placed growing demands on climate science to produce increasingly high-resolution climate projections to inform shorter-term, local decisions. The authors of a recently published paper argue that this is partly attributable to an over-estimation, on the part of decision makers, of the level of precision with which the current set of models are able to project future change.

As the authors note, “… the adaptation community should be aware that widely available climate change projections are overconfident and are advised to avoid seductive promises of information about future climate conditions at local scales and particular future dates”.  Additionally, ‘optimising’ decisions using such data, in the absence of rigorous contextualisation and evaluation, can represent poor adaptation practice, especially where inadequate, expensive, or inflexible adaptation measures become ‘locked-in’.

Decision-making timescales across the public and private sectors are often short relative to the timescales of climate projections. The paper draws attention to an over-reliance by decision makers on high-resolution climate projections derived from downscaled climate models. Additionally, it questions whether the demands placed on climate services to produce high-resolution climate projections are warranted given that decision-making does not always require such granularity, noting that: “The predominant focus on end-of-century projections neglects more pressing development concerns, which relate to the management of shorter-term risks and climate variability.” A shorter time horizon is often more relevant in lower income countries, which can be more vulnerable to climate shocks due to higher sensitivity and lower adaptive capacity.

A common approach to meet the demand for climate projections at the local level has been to downscale General Circulation Models (GCMs). Downscaling is a process of generating higher spatial and temporal-resolution data from lower-resolution data and is used to derive local-scale data able to inform short-term decision-making.
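To make the idea concrete, one of the simplest statistical downscaling techniques is the “delta change” method: the coarse model’s projected change signal is added to a high-resolution local observational baseline. The numbers below are invented for illustration, not taken from any model or station.

```python
# Minimal sketch of "delta change" statistical downscaling.
# All values are illustrative placeholders.

# Monthly mean temperatures (deg C) observed at a local station
# (this is the high-resolution information)
local_baseline = [4.2, 4.8, 7.1, 9.6, 12.9, 15.8]

# Projected change for the enclosing coarse GCM grid cell, per month (deg C)
gcm_delta = [1.1, 1.0, 0.9, 0.8, 0.8, 0.9]

# Downscaled projection: local detail plus the large-scale change signal
downscaled = [obs + d for obs, d in zip(local_baseline, gcm_delta)]

print([round(t, 1) for t in downscaled])
# -> [5.3, 5.8, 8.0, 10.4, 13.7, 16.7]
```

Note that the sketch makes visible exactly the limitation the paper flags: the apparent local precision comes from the observations, while the model contributes only a coarse change signal.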

However, the various methods of downscaling have limitations:

  • Uncertainties regarding the underlying GCM projection data can be compounded, as additional assumptions and approximations are introduced during model selection and processing.
  • Dynamic downscaling can give a false sense of spatial precision whilst relying on fewer models, whereas temporal downscaling can risk mis-portraying shorter-term projections (3 to 10 years) as being akin to forecasts.
  • The error between observed and projected climate change for some parameters can be considerable at local scales, with observed change often being more severe than that projected.
  • Models can struggle to reliably represent seasonality, extreme values, and tipping points.

These factors mean that it is important to consider both ‘outlier’ models and future values that could exceed those projected by any of the climate models, whilst bearing in mind that some models are known to perform better in some regions than in others.

Embracing uncertainty

While a common reflex has been to request such high-resolution climate data, in other areas decision makers across sectors have long been accustomed to acting in a context of uncertainty, whether related to cyber-attacks, political instability, fluctuations in oil prices and exchange rates, disease epidemics or natural disasters. It is well understood that such eventualities cannot be predicted with high levels of certainty beyond the short-term: as the paper notes, “Often…detailed planning is possible without detailed climate change projections”.

On the other hand, integrating historical climate data with analysis of real-time data and short-term forecasting can be an effective, high-confidence guide to making robust decisions related to climate adaptation. As the authors note, climatologists should focus on building models on decision-relevant timescales, encouraging further dialogue and intermediation between climate science and end users.

Climate projection data remain an indispensable and scientifically sound guide to how climate is likely to change in the future. This paper, however, is a timely corrective to a tendency to overstate the precision of climate model outputs and to make resilience building efforts contingent on the ever-finer optimisation of climate models. A broad understanding of the direction and magnitude of change in given climate parameters, and their likely impacts for given users, can be adequate to identify and prioritise adaptation strategies and measures today. Decisions can be taken today that are robust to a range of climate scenarios, and low-regret, low-cost measures can be implemented that can be easily reversed in light of experience and new information.

In future, as the paper concludes, it is important that the climate services community refocuses attention on better assessing and translating the significance of projected change versus observed variability and trends. Moreover, whilst noting the resource implications this can carry, the community could improve the evaluation of climate models selected for use in climate risk analysis. In representing future climate change, it remains, as ever, imperative to consider and translate model reliability and uncertainty, and convey the range of plausible future change.

Cover photo from Marco Dormino on Climate Visuals.
New CCC report highlights progress necessary to prepare for climate change

by Georgina Wade

The Committee on Climate Change’s 2019 Report to Parliament titled ‘Progress in preparing for climate change’ sets out their assessment of climate change preparations made in England and provides a first evaluation of the Government’s second National Adaptation Programme (NAP).

Declaring that the government has failed to increase adaptation policy ambition and implementation despite the increasing urgency of addressing the risks from climate change, the report finds that England is not prepared for even a 2°C rise in global temperature, let alone more extreme levels of warming.

For this report, the Committee introduced a new scoring system to give a simpler assessment of progress. The results showed that some sectors, including strategic roads, public water supply, and rail have good plans in place that consider the long-term risks and opportunities from climate change, whereas much still needs to be done to improve the health, business and agricultural sectors. Of the 56 risks and opportunities identified in the UK’s Climate Change Risk Assessment, 21 have no formal actions in the NAP. Because of this, the CCC deems that they have “been unable to give high scores for managing risk to any of the sectors assessed”.

Additionally, the Committee addressed the preparation of the next UK Climate Risk Assessment (CCRA), due in 2022, and identified some specific recommendations for how this important programme of work can be improved. Highlighting that the need for action has never been clearer, the CCC’s message to the government is simple: Now, do it.

To download the report, click here

Cover photo by Dominik Lange on Unsplash.
Level complete: Could computer games help farmers adapt to climate change?

By Georgina Wade

Researchers in Sweden and Finland are pointing to computer games as a possible method of engaging farmers with scientific research and helping them adapt to climate change.

Through the development of an interactive web-based maladaptation game, researchers tested stakeholders with four agricultural challenges: precipitation, temperature increase/drought, longer growing seasons and increased risk of pests and weeds. For each challenge, players were asked to make a strategic decision from the options given; these decisions were then combined to form a list of potential negative outcomes.

The research is presented in the article “Benefits and challenges of serious gaming – the case of ‘The Maladaptation Game’”, published in the journal Open Agriculture. Lead researcher Asplund believes the findings provide insight into cognitive behaviour.

“While we observed that the conceptual thinking of the game sometimes clashes with the players’ everyday experience and practice, we believe gaming may function as an eye-opener to new ways of thinking,” explains Asplund.

Asplund also suggests that games should be designed to include elements of thinking and sharing, which will stimulate reflection and discussion among stakeholders.

Access the game here.

Cover photo from Wikimedia Commons
Ten years ago, climate adaptation research was gaining steam. Today, it’s gutted

By Rod Keenan, University of Melbourne

Ten years ago, on February 7, 2009, I sat down in my apartment in central Melbourne to write a job application. All of the blinds were down, and the windows tightly closed. Outside it was 47℃. We had no air conditioning. The heat seeped through the walls.

When I stepped outside, the air ripped at my nose and throat, like a fan-forced sauna. It felt ominous. With my forestry training, and some previous experience of bad fire weather in Tasmania, I knew any fires that day would be catastrophic. They were. Black Saturday became Australia’s worst-ever bushfire disaster.

I was applying for the position of Director of the Victorian Centre for Climate Change Adaptation Research (VCCCAR). I was successful and started the job later that year.

The climate in Victoria over the previous 12 years had been harsh. Between 1997 and 2009 the state suffered its worst drought on record, and major bushfires in 2003 and 2006-07 burned more than 2 million hectares of forest. Then came Black Saturday, and the year after that saw the start of Australia’s wettest two-year period on record, bringing major floods to the state’s north, as well as to vast swathes of the rest of the country.

In Victoria alone, hundreds of millions of dollars a year were being spent on response and recovery from climate-related events. In government, the view was that things couldn’t go on that way. As climate change accelerated, these costs would only rise.

We had to get better at preparing for, and avoiding, the future impacts of rapid climate change. This is what we mean by the term “climate adaptation”.

Facing up to disasters

A decade after Black Saturday, with record floods in Queensland, severe bushfires in Tasmania and Victoria, widespread heatwaves and drought, and a crisis in the Murray-Darling Basin, it is timely to reflect on the state of adaptation policy and practice in Australia.

In 2009 the Rudd Labor government had taken up the challenge of reducing greenhouse gas emissions. With Malcolm Turnbull as opposition leader, we seemed headed for a bipartisan national solution ahead of the Copenhagen climate summit in December. Governments, meanwhile, agreed that adaptation was more a state and local responsibility. Different parts of Australia faced different climate risks. Communities and industries in those regions had different vulnerabilities and adaptive capacities and needed locally driven initiatives.

Led by the Brumby government in Victoria, state governments developed an adaptation policy framework and sought federal financial support to implement it. This included research on climate adaptation. The federal government put A$50 million into a new National Climate Change Adaptation Research Facility, based in Queensland, alongside the CSIRO Adaptation Flagship which was set up in 2007.

The Victorian Government invested A$5 million in VCCCAR. The state faced local risks: more heatwaves, floods, storms, bushfires and rising sea levels, and my colleagues and I found there was plenty of information on climate impacts. The question was: what can policy-makers, communities, businesses and individuals do in practical terms to plan and prepare?

Getting to work

From 2009 until June 2014, researchers from across disciplines in four universities collaborated with state and local governments, industry and the community to lay the groundwork for better decisions in a changing climate.

We held 20 regional and metropolitan consultation events and hosted visiting international experts on urban design, flood, drought, and community planning. Annual forums brought together researchers, practitioners, consultants and industry to share knowledge and engage in collective discussion on adaptation options. We worked with eight government departments, driving the message that adapting to climate change wasn’t just an “environmental” problem and needed responses across government.

All involved considered the VCCCAR a success. It improved knowledge about climate adaptation options and confidence in making climate decisions. The results fed into Victoria’s 2013 Climate Change Adaptation Plan, as well as policies for urban design and natural resource management, and practices in the local government and community sectors. I hoped the centre would continue to provide a foundation for future adaptation policy and practice.

Funding cuts

In the 2014 state budget the Napthine government chose not to continue funding the VCCCAR. Soon after, the Abbott federal government reduced the funding and scope of its national counterpart, and funding ended last year.

Meanwhile, CSIRO chief executive Larry Marshall argued that climate science was less important than the need for innovation and turning inventions into benefits for society. Along with other areas of climate science, the Adaptation Flagship was cut, its staff let go or redirected. From a strong presence in 2014, climate adaptation has become almost invisible in the national research landscape.

In the current chaos of climate policy, adaptation has been downgraded. There is a national strategy but little high-level policy attention. State governments have shifted their focus to energy, investing in renewables and energy security. Climate change was largely ignored in developing the Murray-Darling Basin Plan.

Despite this lack of policy leadership, many organisations are adapting. Local governments with the resources are addressing their particular challenges, and building resilience. Our public transport now functions better in heatwaves, and climate change is being considered in new transport infrastructure. The public is more aware of heatwave risks, and there is investment in emergency management research, but this is primarily focused on disaster response.

Large companies making long-term investments, such as Brisbane Airport, have improved their capacity to consider future climate risks. There are better planning tools and systems for business, and the finance and insurance sectors are seriously considering these risks in investment decisions. Smart rural producers are diversifying, using their resources differently, or shifting to different growing environments.

Struggling to cope

But much more is needed. Old buildings and cooling systems are not built to cope with our current temperatures. Small businesses are suffering, but few have capacity to analyse their vulnerabilities or assess responses. The power generation system is under increasing pressure. Warning systems have improved but there is still much to do to design warnings in a way that ensures an appropriate public reaction. Too many people still adopt a “she’ll be right” attitude and ignore warnings, or leave it until the last minute to evacuate.

In an internal submission to government in 2014 we proposed a Victorian Climate Resilience Program to provide information and tools for small businesses. Other parts of the program included frameworks for managing risks for local governments, urban greening, building community leadership for resilience, and new conservation approaches in landscapes undergoing rapid change.

Investment in climate adaptation pays off. Small investments now can generate payoffs of 3-5:1 in reduced future impacts. A recent business round table report indicates that carefully targeted research and information provision could save state and federal governments A$12.2 billion and reduce the overall economic costs of natural disasters (which are projected to rise to A$23 billion a year by 2050) by more than 50%.

Ten years on from Black Saturday, climate change is accelerating. The 2030 climate forecasts made in 2009 have come true in half the time. Today we are living through more and hotter heatwaves, longer droughts, uncontrollable fires, intense downpours and significant shifts in seasonal rainfall patterns.

Yes, policy-makers need to focus on reducing greenhouse emissions, but we also need a similar focus on adaptation to maintain functioning and prosperous communities, economies and ecosystems under this rapid change. It is vital that we rebuild our research capacity and learn from our past experiences, to support the partnerships needed to make climate-smart decisions.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Cover photo by CSIRO (CC BY 3.0): A destroyed property at Kinglake after the ‘Black Saturday’ bushfires.
Why the science of extreme weather attribution matters

By Peter Stott

As the Earth’s climate warms, people face mounting threats from rising seas, and more intense and frequent storms, heatwaves, fires, and droughts. When these events hit, people want to understand whether they are connected to climate change.

The science of attribution can do this. It links observed changes in climate to natural and human-induced causes. In recent years, it has amassed a wealth of evidence to show convincingly that these changes are dominated by the effects of human-induced greenhouse gas emissions. As the IPCC concluded in its Fifth Assessment Report: “Human influence on the climate system is clear”.

Attribution science has been key to demonstrating the need for action to reduce emissions in order to avoid the worst effects of climate change. Societies can use such information to help them adapt to the inevitable changes in the climate coming our way, and ensure that suitable regulatory and legislative frameworks are put in place.

A developing science

The realisation that individual weather events could be linked to climate change came in 2003 when a devastating European heatwave was estimated to have killed more than 70,000 people. Myles Allen, professor of geosystem science at Oxford University, proposed the concept of event attribution, arguing that it would be possible to calculate the increased risk of a particular event due to climate change. The following year, I published a paper in Nature, co-authored by Myles and another Oxford colleague, Daithi Stone, which showed that human-induced climate change had very likely more than doubled the risk of such a heatwave.

This science has since burgeoned. Climate scientists have studied a wide range of weather events around the world, including heatwaves, heavy rainfall events, tropical cyclones and droughts.

Chinese scientists have also been leading in this field, thanks in part to the Climate Science for Service Partnership – China project, a collaborative partnership between UK and Chinese scientists. Some of the results have been published in the annual reports, “Assessing Extreme Events from a Climate Perspective”, which attribute events from the previous year. For example, a collaborative China-UK study led by Ying Sun of the China Meteorological Administration has shown that anthropogenic influence roughly doubled the chances of the extreme rainfall in south-eastern China in June 2017, when heavy floods affected more than 10 million people, with 38 dead and about 800,000 people forced to relocate.

The rapid development of attribution science has raised three key questions concerning its future potential. How reliable is it? What aspects of the science need to be improved? And how quickly and routinely can results be produced?

How reliable is event attribution science?

The basis of all event attribution assessments is a model. This is needed to calculate the counter-factual situation in which human activities had not changed the climate. A typical approach is to run a model many times to simulate the current climate in which greenhouse gas concentrations are at today’s elevated levels, taking account of natural and anthropogenic influences on the climate. The results are then compared with alternative simulations in which the climate model includes only natural factors, such as changes in solar output or the climatic effects of volcanic eruptions. These models also include natural climatic variability, such as changes associated with the El Niño Southern Oscillation phenomenon by which temperatures in the Pacific Ocean vary and influence global surface temperatures.
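The factual-versus-counterfactual comparison described above can be sketched in a few lines. The distributions, ensemble sizes and threshold below are synthetic stand-ins rather than real model output; the point is only the mechanics of turning two ensembles into a risk ratio and a fraction of attributable risk (FAR).

```python
import random

random.seed(42)

# Synthetic stand-ins for two large model ensembles (illustrative only):
# seasonal temperature anomalies (deg C) from many simulated years.
N = 20_000
factual = [random.gauss(0.9, 0.5) for _ in range(N)]         # all forcings
counterfactual = [random.gauss(0.0, 0.5) for _ in range(N)]  # natural only

threshold = 1.5  # magnitude of the observed extreme event (deg C anomaly)

# Exceedance probability in each simulated "world"
p1 = sum(x > threshold for x in factual) / N         # with human influence
p0 = sum(x > threshold for x in counterfactual) / N  # without human influence

risk_ratio = p1 / p0  # how many times more likely the event has become
far = 1 - p0 / p1     # fraction of the risk attributable to human influence

print(f"P(event | all forcings) = {p1:.4f}")
print(f"P(event | natural only) = {p0:.4f}")
print(f"risk ratio = {risk_ratio:.1f}, FAR = {far:.2f}")
```

In real studies the two ensembles come from the model experiments described above, and the uncertainty in these probabilities is itself quantified; this sketch omits that step.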

But can these models reliably represent observed reality? In 2016, an independent expert panel of meteorologists and statisticians was convened by the National Academies of Sciences (NAS) in the United States to assess the capability of event attribution. They concluded that “It is now often possible to make and defend quantitative statements about the extent to which human-induced climate change has influenced either the magnitude or the probability of occurrence of specific types of event or event classes”.

The NAS report found that confidence is greatest for extreme events related to an aspect of temperature, being highest for extreme heat and cold events, followed by hydrological drought and heavy precipitation. They found lowest confidence for attribution of severe convective storms and extratropical cyclones.

The reason for lower confidence is related to the ability of models to represent the processes involved in the formation of the extreme events. Whereas climate models can typically represent changes and variability in temperature over large regions very reliably, they can struggle with other types of event, for example, representing the intensity of rainfall in severe convective storms. This is because they do not have the spatial and temporal resolution to resolve the processes involved.

Source: NAS, 2016

What needs to be improved?

The aim of event attribution is to improve our understanding of climate processes and their representation in climate models. This includes increasing the spatial representation of models so they incorporate a wider range of weather processes. It also includes improving the capability to compare models with observations, for example by assessing the ability of models to replicate key features of the evolution of weather events over many occurrences of such events.

As the National Academies of Sciences report pointed out, improvement will also come from the further development of long observational records. It’s also important to frame the attribution question correctly. An attribution study might for example consider how climate change has affected a particular flood in the presence of El Niño. This would require the counterfactual model simulations to also include an El Niño by specifying the pattern of sea surface temperatures associated with the phenomenon. This may give different results to a study that evaluates the effects of climate change on flood risk irrespective of whether there was an El Niño or not. Both types of attribution study may have value but both need to be clearly communicated to avoid confusion.

How quickly and routinely can results be produced?

During an extreme weather event there is often considerable public and media interest in the link with human-induced climate change. Event attribution science can now deliver robust assessments of very recent events, at least for extreme temperature events. These can draw on peer-reviewed methodologies but each individual analysis does not necessarily need to go through such a lengthy process any more than an individual weather forecast needs to.

There is potential for attribution assessments to become part of the regular production of climate services, by complementing climate monitoring and prediction with regular updates on how climate change is altering the probability and magnitude of recent extreme weather events.

However, caution will be needed in which types of weather event are incorporated into such activities. As the National Academies of Sciences report pointed out, confidence for different types of weather event differs. Low confidence events such as severe convective storms will still be studied in peer-reviewed publications, and it is likely to be some time before such events are routinely included in regular climate service assessments. But as climate science develops, and as climate models improve, a wider range of extreme events will be robustly and regularly attributed to natural and human-induced causes.

Moving forward

Attribution science has developed the capability to assess the extent to which extreme weather events are linked to climate variability and change. Scientific uncertainties still remain and it is not possible to make robust attribution statements about all extreme weather events. But it is clear that such events are increasing in frequency and intensity globally. It is also increasingly possible to draw robust conclusions about the extent to which the risks from some extreme events, including large-scale long-lasting temperature-related events, have been affected by human-induced emissions. This information could be of great value for informing climate mitigation, adaptation and litigation.

This article was originally published on China Dialogue and is shared under a Creative Commons license.

Cover photo by Hello Lightbulb on Unsplash
Met Office: Climate change made 2018 UK summer heatwave ‘30 times more likely’

By Daisy Dunne, Carbon Brief

This year’s summer heatwave, which saw temperature records broken across the UK, was made up to 30 times more likely by climate change, a new assessment says.

A preliminary study by scientists at the Met Office Hadley Centre finds that the extreme heat experienced by the UK this year had around a 12% chance of occurring. In a world without climate change, it would have had a 0.5% chance, according to the results.

The influence of climate change on the odds of the 2018 summer heatwave is the highest recorded for a study of this kind looking at extreme events in the UK, the study’s author tells Carbon Brief at the UN’s 24th Conference of the Parties (COP24) in Katowice, Poland.

And, by 2050, the chances of such a heatwave occurring could reach 50%, the scientist adds. “With continued emissions, we’ll eventually make it impossible to adapt.”

Feeling the heat

This year’s summer heatwave dominated front pages in the UK – with all-time temperature records broken in, among other places, Belfast (29.5C), Glasgow (31.9C) and Porthmadog, Wales (33C).

The new analysis suggests that such extreme heat was made around 30 times more likely by human-caused climate change.

The results are “surprising”, says study author Prof Peter Stott, who leads on climate monitoring and attribution at the Met Office Hadley Centre. Speaking to Carbon Brief at COP24, he says:

“This is a piece of scientific evidence showing that this is not just chance; we’re not just unlucky. We’re reaping the results of our emissions.

“If you look right back at global temperatures, it’s effectively impossible to have the temperatures that we’re having now without human-induced climate change. Zooming in to a region like the UK, this is probably the highest I’ve seen in that context.”

Climate change chiefly heightens the risk of heatwaves by raising global temperatures, but the 2018 heat could have also been influenced by “unusual” patterns of weather in the atmosphere, he adds:

“This is largely dominated by rising temperatures. It really is as simple as that. Where we are now, you need relatively unusual circulation patterns to get to such elevated temperatures – but, as we go on, weather patterns which bring warmer temperatures will be less rare.”

Warming’s fingerprint

The new research is the latest in what are known as “single-event attribution” studies. These aim to identify the influence that human-caused climate change does – or does not – have on episodes of extreme weather.

(In 2017, Carbon Brief produced a global map of the results of more than 140 attribution studies.)

For this analysis, scientists used climate models to compare the chances of this year’s summer heatwave happening in today’s world to a hypothetical world without human-caused climate change. Stott explains:

“There are now many models which have, in their simulations, all the forcings on climate – so, increasing greenhouse gas concentrations and other human factors, as well as natural factors, such as volcanic eruptions and solar variability.

“We can basically look into those models and then zoom in over the UK and look at the odds of that extreme weather happening in the UK – and then compare that with the same models, but when they only include natural forcings.”

For the study, the researchers defined a “summer heatwave event” as the average temperature increase experienced across the entire season (June to August), when compared to a baseline period of 1901-1930.

The research has not yet been published in a scientific journal, but the methods used are peer-reviewed, Stott says.

Falling odds

The results suggest that the 2018 summer heatwave had a 12% chance of occurring. In other words, in today’s climate, this sort of heatwave is likely to happen every eight years.

However, in a world without human-caused climate change, the heatwave had around a 0.5% likelihood of occurring – meaning this kind of event would only occur once in every 245 years.
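The quoted chances convert to return periods and a risk ratio by simple reciprocals. A quick check with the rounded figures (the study’s unrounded values yield the “once in 245 years” and “up to 30 times” numbers quoted here):

```python
# Annual event probabilities as quoted in the text (rounded)
p_today = 0.12     # today's climate
p_natural = 0.005  # world without human-caused climate change

return_period_today = 1 / p_today      # ~8 years between events
return_period_natural = 1 / p_natural  # ~200 years between events
risk_ratio = p_today / p_natural       # ~24x with these rounded inputs

print(f"today: once every {return_period_today:.0f} years")
print(f"natural-only: once every {return_period_natural:.0f} years")
print(f"risk ratio: {risk_ratio:.0f}")
```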

The findings of the study seem to correspond to historical records of heatwaves in the UK, Stott says:

“If you’re looking at high summer temperatures in the UK, then 2003, 2006 and 2018 were all actually neck and neck. That’s three times in the last 20 years. If you look back at the pre-1850s – an estimate of pre-industrial temperatures – it happened once, in 1826. So, once in 200 years versus three times in 20 years – that’s roughly 30 times [more].”

The research follows in the footsteps of another attribution study published earlier this year. That analysis by scientists at the World Weather Attribution network found that, across northern Europe, the 2018 summer heatwave was made up to five times more likely by climate change.

The difference in results likely arises from differences in methods and scope, Stott says. The previous analysis focused on six countries in northern Europe, but did not include the UK.

In addition, the previous study focused on how climate change could have influenced a three-day spike in temperatures, whereas the new analysis looks at temperatures across the summer season, Stott says.

Last week, the Met Office published its 2018 climate projections. Among its findings, it reported that summers as hot as in 2018 could be expected every other year by the middle of the century. Stott says:

“What we’re already experiencing is a forecast of what could happen – but in spades. With continued emissions, we’ll eventually make it impossible to adapt.”

This article originally appeared on Carbon Brief and is shared under a Creative Commons license.

Cover photo from Wikimedia Commons (public domain): Outdoor events at The Overture, a free three-day festival to mark the reopening of Southbank Centre’s Royal Festival Hall, attended by over a quarter of a million people.
Once-eradicated mosquito-related diseases may return to Europe thanks to climate change

By Will Bugler

Diseases including malaria, yellow fever, zika virus and dengue fever could return to Europe, according to the largest ever study of the mosquito evolutionary tree. The study investigates mosquito evolution over the last 195 million years and suggests that climate change today could provide favourable conditions for mosquito-borne diseases to spread in areas where they had been previously eradicated.

The research from the Milner Centre for Evolution at the University of Bath, University of York and China Agricultural University, shows that the rate at which new species of mosquitoes evolve generally increases when levels of atmospheric carbon dioxide are higher. This is a concern because the greater the number of mosquito species, the more potential there is for new ways of transmitting disease, and perhaps for new variants of those diseases.

“It is important to look at the evolution of the mosquito against climate change because mosquitoes are responsive to CO2 levels,” explained Dr Katie Davis, from the University of York’s Department of Biology. “Atmospheric CO2 levels are currently rising due to changes in the environment that are connected to human activity, so what does this mean for the mosquito and human health?

“Despite some uncertainties, we can now show that mosquito species are able to evolve and adapt to climate change in high numbers. With increased speciation, however, comes the added risk of disease increase and the return of certain diseases in countries that had eradicated them or never experienced them before.”

Chufei Tang, formerly at the Milner Centre for Evolution and now at the China Agricultural University, said “The rising atmospheric CO2 has been proven to influence various kinds of organisms, but this is the first time such impact has been found on insects.”

More research is needed to understand what climate change means for the future of the mosquito and the work will contribute to further discussions about the value of the mosquito to the ecosystem and how to manage the diseases they carry.

Tang et al (2018) “Elevated atmospheric CO2 promoted speciation in mosquitoes (Diptera, Culicidae)” is published in Communications Biology, DOI: 10.1038/s42003-018-0191-7. Click here to access the study.

Cover photo by U.S. Air Force/Nicholas J. De La Peña (public domain)
New approach reveals ocean warming more than previously thought

By Georgina Wade

A study published in Nature suggests that oceans are warming far faster than the previous estimates laid out by the Intergovernmental Panel on Climate Change.

Using a new approach that derives ocean temperatures by measuring carbon dioxide and oxygen levels in the atmosphere, the study found that between 1991 and 2016 the oceans warmed an average of 60 percent more per year than the panel’s official estimates.

If proven accurate, the new temperature estimates could be an indication that global warming has exceeded conservative estimates and is more in line with predicted worst-case scenarios.

Led by Laure Resplandy, a biogeochemical oceanographer at Princeton University, the research finds that the IPCC’s measurements for observed ocean heat were too low.

“Their estimates overlap with previous estimates, but it’s aligned with some of the higher estimates,” she said.  “It’s not like completely changing our understanding of what the ocean might be taking up – it’s a new type of measurement that’s weighing in toward the higher end of that.”

Although her work differs from the IPCC’s findings, Resplandy emphasised that it does not undermine the IPCC’s dire warning of only 12 years to limit climate change catastrophe: “it doesn’t change the results,” she said. “What it does is that it makes it harder to get there.”

Problems found in the study’s calculations since publication do not invalidate its methodology. In a note added to the original news release, co-author Ralph Keeling wrote:

 “I am working with my co-authors to address two problems that came to our attention since publication. These problems, related to incorrectly treating systematic errors in the O2 measurements and the use of a constant land O2:C exchange ratio of 1.1, do not invalidate the study’s methodology or the new insights into ocean biogeochemistry on which it is based. We expect the combined effect of these two corrections to have a small impact on our calculations of overall heat uptake, but with larger margins of error.  We are redoing the calculations and preparing author corrections for submission to Nature.”

Cover photo by Victor Carvalho on Unsplash
Parts of the world could be facing multiple climate-related crises at once by 2100

By Elisa Jiménez Alonso

A newly published study in Nature Climate Change finds that risks posed by climate change will be so wide-ranging by the end of this century that some parts of the world could face up to six climate-related crises simultaneously.

The paper analyses a range of climate hazards including heatwaves, wildfires, sea level rise, hurricanes, flooding, drought and water shortages. Many of these problems are already being felt around the world. This year alone, several severe flood events occurred in countries from Japan and Nigeria to the United States, and in summer a heatwave led to temperature records across the Northern Hemisphere. Just last week, California experienced some of the worst wildfires in its history.

According to the paper, under current greenhouse gas emission scenarios the situation will get much worse. By 2100, large parts of the world – especially coastal areas in the tropics – might experience up to six simultaneous climate-related crises. Lead author Camilo Mora, of the University of Hawaii, described the prospect as “like a terror movie that is real.”

Global map of cumulative climate hazards. The main map shows the cumulative index of climate hazards: the sum of the rescaled change in all hazards between 1955 and 2095. Smaller maps indicate the difference for each individual hazard over the same period. Individual hazards were rescaled to between −1 and 1; negative values indicate a decrease in the given hazard, whereas positive values represent an increase relative to the 1950s baseline. The largest value in the cumulative index was six (that is, cumulatively, the equivalent of the largest change in six climate hazards occurred for a single cell). Plots are based on RCP 8.5; results for all three mitigation scenarios are provided in Supplementary Figs. 1–3. An interactive data visualization and time-series animations accompany the paper.

According to the authors, the largest losses of human life during extreme climate events will occur in developing nations, while developed nations will mostly be impacted by high economic losses – a trend that already holds today. To accompany the paper, ESRI developed an interactive map to visualize the findings of the study; even under the most optimistic emission scenario, it is clear that adaptation and resilience building are a dire necessity almost everywhere on the globe. “We see that climate change is literally redrawing the lines on the map and revealing the threats that our world faces at every level,” said Dawn Wright, ESRI’s chief scientist.

The paper represents a multidisciplinary effort by 23 authors, who reviewed over 3,000 papers on the effects of climate change, identifying close to 500 ways in which these effects could impact human physical and mental health, food security, water availability, infrastructure and many other aspects of life.

This study is yet another urgent reminder that inaction on climate change mitigation and adaptation will come at far too high a cost, not just economically but also in terms of human lives.

Mora, C., Spirandelli, D., Franklin, E., Lynham, J., Kantar, M., Miles, W., Smith, C., Freel, K., Moy, J., Louis, L., Barba, E., Bettinger, K., Frazier, A., Colburn IX, J., Hanasaki, N., Hawkins, E., Hirabayashi, Y., Knorr, W., Little, C., Emanuel, K., Sheffield, J., Patz, J. and Hunter, C. (2018). Broad threat to humanity from cumulative climate hazards intensified by greenhouse gas emissions. Nature Climate Change. Access the article by clicking here (paywall).

Cover photo by Eoghan Rice – Trócaire / Caritas (CC BY 2.0): Debris lines the streets of Tacloban, Leyte island. This region was the worst affected by typhoon Hayan in 2013, causing widespread damage and loss of life.
Ocean warming may be faster than thought

By Tim Radford

Science knows that ocean warming is occurring. A big challenge now is to work out how quickly the temperature is rising.

The seas are getting hotter – and researchers have thought again about just how much faster ocean warming is happening. They believe that in the last 25 years the oceans have absorbed at least 60% more heat than previous global estimates by the UN’s Intergovernmental Panel on Climate Change (IPCC) had considered.

And they calculate this heat as equivalent to 150 times the world’s annual electricity generation.

“Imagine if the ocean was only 30 feet (10m) deep,” said Laure Resplandy, a researcher at the Princeton Environment Institute in the US. “Our data show that it would have warmed by 6.5°C every decade since 1991. In comparison, the estimate of the last IPCC assessment report would correspond to a warming of only 4°C every decade.”

The oceans cover 70% of the Blue Planet, but take up about 90% of all the excess energy produced as the Earth warms. If scientists can put a precise figure to this energy, then they can make more precise guesses about the surface warming to come, as humans continue to burn fossil fuels, release greenhouse gases such as carbon dioxide into the atmosphere, and drive up the planetary thermometer.

“There will have to be an even more drastic shutdown of fossil fuel investment and an even faster switch to renewable sources of energy”

At the academic level, this is the search for a factor known to climate researchers as climate sensitivity: the way the world responds to ever-increasing ratios of greenhouse gas in the atmosphere.

At the human level, this plays out as ever-greater extremes of heat, drought and rainfall, with ever-higher risks of catastrophic storm or flood, or harvest failure, and ever-higher tallies of human suffering.

Comprehensive global measurements of ocean temperature date only from 2007, when a network of robot sensors began delivering continuous data about the top half of the ocean basins.

Dr Resplandy and her colleagues report in the journal Nature that they used a sophisticated approach based on very high-precision measurements of levels of oxygen and carbon dioxide in the air.

Gases released

Both gases are soluble, and the oceans are becoming more acidic as the seas absorb ever-greater levels of carbon dioxide. But as seas warm, they also become less able to hold their dissolved gases, and release them into the atmosphere.

This simple consequence of atmospheric physics meant that the researchers could use what they call “atmospheric potential oxygen” to arrive at a new way of measuring the heat the oceans must have absorbed over time.

They used the standard unit of energy: the joule. Their new budget for heat absorbed each year between 1991 and 2016 is 13 zettajoules. A zettajoule is a 1 followed by 21 zeroes – the kind of magnitude astronomers tend to use.
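The “150 times annual electricity generation” comparison can be sanity-checked from the 13-zettajoule figure. A rough check, assuming world electricity generation of about 24,000 TWh per year (a ballpark figure for the mid-2010s, not taken from the paper):

```python
TWH_TO_JOULES = 3.6e15   # one terawatt-hour expressed in joules
ZETTAJOULE = 1e21        # one zettajoule in joules

ocean_heat_per_year = 13 * ZETTAJOULE                # the study's annual estimate
electricity_per_year = 24_000 * TWH_TO_JOULES        # assumed global generation

ratio = ocean_heat_per_year / electricity_per_year
print(round(ratio))  # 150
```

Under that assumption the annual ocean heat uptake does indeed come out at roughly 150 times global electricity output, consistent with the figure quoted in the article.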

That the oceans are warming is no surprise: this has been obvious from the crudest comparison of old naval data with modern surface checks, and for years some researchers argued that ever-higher ocean temperatures could account for the so-called slowdown in global warming in the first dozen years of this century.

Challenging achievement

The new finding counts first as an academic achievement: there is now a more precise thermometer reading, and new calculations can begin.

One of the researchers, Ralph Keeling of the Scripps Institution of Oceanography, said: “The result significantly increases the confidence we can place in estimates of ocean warming and therefore help reduce uncertainty in the climate sensitivity, particularly closing off the possibility of very low climate sensitivity.”

But the result also suggests that internationally agreed attempts to hold planetary warming to a maximum of just 2°C – and the world has already warmed by around 1°C in the last century – become more challenging.

It means that there will have to be an even more drastic shutdown of fossil fuel investment and an even faster switch to renewable sources of energy such as sun and wind power.

Tim Radford, a founding editor of Climate News Network, worked for The Guardian for 32 years, for most of that time as science editor. He has been covering climate change since 1988.

This article was originally published on Climate News Network.

Cover photo by Giga Khurtsilava on Unsplash