Category: Earth Observation & Climate Data

‘Eternal’ Swiss snow is melting faster

By Paul Brown

Scientists say stretches of “eternal” Swiss snow are melting faster than 20 years ago, with serious impacts for water supply and tourism.

Parts of Europe’s alpine mountain chain are undergoing accelerating melting, as the “eternal” Swiss snow thaws ever faster, threatening both the skiing industry and the nation’s water supply.

Over a period of only 22 years, thousands of satellite images have provided irrefutable evidence that an extra 5,200 square kilometres of the country are now snow-free, compared with the decade 1995-2005.

Researchers from the University of Geneva and the United Nations Environment Programme used data from four satellites that have been constantly photographing the Earth from space. The resulting record, published as the Swiss Data Cube, uses Earth observations to give a comprehensive picture of the country’s snow cover and much else besides, including crops grown and forest cover.

It is the loss of snow cover that most disturbs the scientists. What they call “the eternal snow zone” still covered 27% of Swiss territory in the years from 1995 to 2005. Ten years later it had fallen to 23% – a loss of 2,100 sq km.

The eternal snow line marks the part of Switzerland above which the snow never used to melt in summer or winter. It is also defined as the area where any precipitation year-round has an 80-100% chance of being snow.

“We have stored the equivalent of 6,500 images covering 34 years, a feat that only an open data policy has made possible”

Other parts of the country, including the Swiss Plateau (about 30% of Switzerland’s area), the Rhone Valley, the Alps and the Jura mountains are also losing snow cover, adding up to the 5,200 sq km total. These areas, below the eternal snow line, have until now usually had lying snow in the winter.

The study was launched in 2016 on behalf of Switzerland’s Federal Office for the Environment. Knowing the extent of snow cover and its retreat is essential for developing public policies, the researchers say.

Beyond the economic issues linked to the threat to ski resorts – a familiar area of concern, heightened by this latest research, as many of them now face shortened seasons or outright abandonment – other problems such as flood risk and water supply are coming to the fore. Snow stores water in the winter for release in spring and summer, for both agriculture and drinking water.

Currently the increasing loss of ice from glaciers in the summer is making up for the missing snow, but previous work by scientists has shown that in the future, when glaciers disappear altogether, Switzerland could face a crisis.

The researchers have relied on the information available from the Data Cube to establish what is happening on the peaks. By superimposing repeated pictures of the same place over one another they have been able to observe small changes over time.
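
To give a flavour of how stacked satellite imagery can reveal such changes, the sketch below classifies snow in each scene with the widely used Normalised Difference Snow Index (NDSI) and compares per-pixel snow frequency between two periods, using the article’s 80% threshold for “eternal” snow. It is a minimal illustration, not the Geneva team’s actual pipeline; the input files, band stacks and thresholds are assumptions.

```python
import numpy as np

# Hypothetical inputs: two co-registered stacks of band data
# (epochs x height x width), one for the green band, one for shortwave IR.
green_t0, swir_t0 = np.load("green_1995_2005.npy"), np.load("swir_1995_2005.npy")
green_t1, swir_t1 = np.load("green_2005_2017.npy"), np.load("swir_2005_2017.npy")

def snow_fraction(green, swir, ndsi_threshold=0.4):
    """Fraction of epochs in which each pixel is snow-covered.

    Uses the Normalised Difference Snow Index (NDSI); values above
    ~0.4 are conventionally classified as snow.
    """
    ndsi = (green - swir) / (green + swir + 1e-9)  # avoid division by zero
    return (ndsi > ndsi_threshold).mean(axis=0)    # per-pixel snow frequency

# "Eternal snow" proxy: pixels snow-covered in 80-100% of observations.
eternal_t0 = snow_fraction(green_t0, swir_t0) >= 0.8
eternal_t1 = snow_fraction(green_t1, swir_t1) >= 0.8

pixel_area_km2 = (10 * 10) / 1e6   # 10 m x 10 m pixels, as in the article
lost_km2 = np.logical_and(eternal_t0, ~eternal_t1).sum() * pixel_area_km2
print(f"Eternal-snow area lost: {lost_km2:.0f} km^2")
```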

Wealth of data

The data was made freely available to researchers. One of them, Grégory Giuliani, said: “We have stored the equivalent of 6,500 images covering 34 years, a feat that only an open data policy has made possible. If we had had to acquire these images at market value, more than 6 million Swiss francs would have been invested.

“Knowing that each pixel of each image corresponds to the observation of a square of 10 by 10 meters, we have 110 billion observations today. It is inestimable wealth for the scientific community.”

Apart from snow cover, scientists are worried about many other changes taking place in Switzerland because of climate change. They already know that glaciers are melting at record speeds and that plants, birds and insects are heading further up the mountains, but there is much else to be gleaned from the new database.

The Data Cube offers the possibility of studying vegetation, the evolution and rotation of agricultural areas, urbanisation and even water quality, as satellite images can be used to monitor three essential indicators in lakes and rivers: suspended particles, whether organic or mineral; chlorophyll content; and surface temperature.
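
As a flavour of how such queries look in practice, here is a hedged sketch against the open-source Open Data Cube Python API, the framework underpinning the Swiss Data Cube. The product name, bounding box and band ratio are illustrative assumptions, not the actual Swiss catalogue:

```python
import datacube

# Hypothetical query against an Open Data Cube deployment such as the
# Swiss Data Cube; product name and coordinates are illustrative only.
dc = datacube.Datacube(app="lake-water-quality")

ds = dc.load(
    product="s2_l2a_swiss",            # assumed Sentinel-2 surface-reflectance product
    x=(6.10, 6.30), y=(46.35, 46.55),  # roughly part of Lake Geneva
    time=("2017-06-01", "2017-09-01"),
    measurements=["green", "red"],
    output_crs="EPSG:2056",            # Swiss LV95 grid
    resolution=(-10, 10),              # 10 m pixels, as in the article
)

# A simple green/red band ratio is one common first-pass proxy for
# chlorophyll content in optically complex inland waters.
chl_proxy = ds.green / ds.red
print(chl_proxy.mean(dim=("x", "y")).values)  # one value per acquisition date
```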

The data are freely accessible, not only to scientists worldwide but also to the public, making it easy to compare data for specific areas of the territory at different times. “Our ambition is that everyone should be able to navigate freely in Swiss territory to understand its evolution”, said Grégory Giuliani.


Paul Brown, a founding editor of Climate News Network, is a former environment correspondent of The Guardian newspaper, and still writes columns for the paper.

This article was originally published on Climate News Network.

Cover photo by Steve Evans/Flickr (CC BY-NC 2.0)
Radiant Earth releases its open Earth imagery platform

Radiant Earth Foundation announced last week the release of its new open Earth imagery platform, aimed at helping policymakers, researchers, journalists, and others use satellite images to understand and serve their communities.

The platform offers instant, secure, and free access to Earth observation data, helping the global development community apply the data to real-world problems.

Currently, more than 600 Earth observation satellites orbit the planet, measuring global changes in real time; these measurements can, in turn, lead to better-informed interventions and investments from the public and private sectors.

While the growing market for Earth observation data is often highly fragmented and cost-prohibitive, Radiant Earth Foundation’s platform brings together billions of dollars’ worth of satellite imagery and makes it available to the global development community. User-friendly analytical tools and support allow a range of users to consume and analyse the data in their everyday work. The platform also carries non-imagery data, such as air quality, population, and weather statistics.

Radiant Earth Foundation’s platform is now available to the public at app.radiant.earth through secure self-sign-up or integrated social sign-on via Twitter, Facebook, GitHub, or Google accounts.

Radiant Earth Foundation will host a webinar on September 26, 2018, at 11 a.m. EDT to demonstrate the platform’s unique features to users. To attend the webinar, please register here: http://bit.ly/REFPlatfromWebinar.


Cover photo by NASA.
Hurricane Lane Approaches Hawaii

By Kathryn Hansen, NASA Earth Observatory

Multiple threatening tropical cyclones spun over the Pacific Ocean in August 2018. In the northwest Pacific basin, typhoons Soulik and Cimaron took aim at Japan and the Korean Peninsula. Then Hurricane Lane lined up in the tropical Pacific for an encounter with the Hawaiian Islands.

At 10:45 a.m. Hawaii Standard Time (20:45 Universal Time) on August 21, 2018, the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Terra satellite acquired this natural-color image of Hurricane Lane. Around that time, Lane was a powerful category 4 hurricane with maximum sustained winds of 250 kilometers (155 miles) per hour. The storm’s center was 925 kilometers (575 miles) south-southeast of Honolulu. By that evening, Lane intensified to a category 5 storm.

Direct hits on the Hawaiian Islands are rare, but plenty of storms get close. Hurricanes Madeline and Lester threatened the islands in August 2016, but both storms weakened and passed without a direct hit.

The evolution and track of Hurricane Lane until 22 August. Credit: NASA

The exact track that Hurricane Lane will take remains to be seen. Forecasts from the Central Pacific Hurricane Center called for the storm’s center to curve northwest then north-northwest, bringing it “very close to or over the main Hawaiian Islands” from August 23 through 25. According to the National Weather Service in Honolulu, only one other category 5 hurricane in database records passed within 560 kilometers (350 miles) of Hawaii.

The westward path of Lane’s track from August 17 to August 22 is shown above. The track is overlaid on a map of sea surface temperatures in the tropical Pacific Ocean on August 21, 2018. Temperature data were compiled by Coral Reef Watch, which blends observations from the Suomi NPP, MTSAT, Meteosat, and GOES satellites and computer models.

The map highlights sea surface temperatures of 27.8°C (82°F), a threshold that scientists generally believe to be warm enough to fuel a hurricane. According to the Central Pacific Hurricane Center, water temperatures along the forecasted track (not shown) were expected to stay between 27°C and 28°C, which is “warm enough to support a major hurricane.”
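
In practice, that threshold is just a mask over a gridded SST field. A minimal sketch, assuming a hypothetical pre-downloaded temperature grid in degrees Celsius:

```python
import numpy as np

# Hypothetical gridded sea-surface-temperature field in degrees Celsius,
# e.g. a Coral Reef Watch-style blended product loaded from file.
sst = np.load("sst_2018-08-21.npy")           # shape: (lat, lon)

HURRICANE_SST_THRESHOLD_C = 27.8              # threshold cited in the article

fuel_mask = sst >= HURRICANE_SST_THRESHOLD_C  # True where waters can fuel a hurricane
print(f"{100 * fuel_mask.mean():.1f}% of grid cells exceed {HURRICANE_SST_THRESHOLD_C} C")
```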


NASA Earth Observatory images by Lauren Dauphin and Joshua Stevens, using MODIS data from LANCE/EOSDIS Rapid Response, sea surface temperature data from Coral Reef Watch, cloud data from the NASA-NOAA GOES project, and storm track information from Unisys. Story by Kathryn Hansen.

This article originally appeared on NASA Earth Observatory and can be accessed by clicking here.

New report: practical guidance for using climate information for climate resilient water management

A new paper released by the Action on Climate Today (ACT) programme shows how climate information can be used effectively to inform decisions related to climate resilient water management (CRWM). The paper provides practical recommendations on how best to use and integrate climate information into decision-making processes, coupled with case studies showing what this looks like in a variety of different contexts. The paper argues that while using the best available climate information can help decision-makers go beyond business-as-usual practices in water management, good decisions can be made even in the absence of good climate information and data.

Since 2014 the ACT programme has been actively working in five South Asian countries to help national and sub-national governments mainstream adaptation to climate change into development planning and delivery systems. As part of that work, the programme is introducing CRWM into the water resources management and agriculture sectors. As presented in an earlier learning paper, “Climate-Resilient Water Management: An operational framework from South Asia”, one major factor in taking CRWM beyond business-as-usual approaches is using the best available climate information and data.

CRWM needs to be informed by reliable information about physical exposure and social vulnerability to climate shocks and stresses in order to create a comprehensive narrative of the impact that climate extremes, uncertainty, and variability can have on water resources management. This requires combining different types of climate information. ACT’s new paper seeks to inform government agencies and individual officials, practitioners and donors, researchers and wider civil society on:

  • How to understand the role of climate information in producing analysis, including a typology of different types of climate information; and
  • How to best use climate information to inform and guide the policy-making processes.

Based on experience and learning from ACT projects, the paper presents 10 key recommendations for integrating climate information into water resources management. This is targeted at those seeking to design and implement CRWM programmes and initiatives, to help overcome some of the critical challenges to accessing and using climate information.

Climate change is already impacting the water cycle. In particular, climate change is thought to be making the monsoon more erratic and unpredictable, decreasing the number of rainfall days while increasing their intensity.[1] Additionally, climate change is projected to increase the frequency and severity of both floods and droughts.[2] At the same time, in South Asia, as in much of the world, water demand is increasing and accelerating in response to population growth, urbanisation, increased industrial demand, and the relatively high dependence on agriculture for livelihoods. The latter is especially problematic as rising temperatures and less rainfall decrease soil moisture, forcing farmers to water their crops more. Changes in the hydrologic cycle, coupled with increased water demand, will have manifold impacts on food and livelihood security, agriculture and urbanisation, industrialisation and, hence, the economy at large. As a result, the South Asian water resources sector needs to plan for climate change.

Click here to access the full ACT learning paper “Using climate information for Climate-Resilient Water Management: Moving from science to action” and a learning brief.


[1] Loo, Y., Billa, L., and Singh, A. (2015). Effect of climate change on seasonal monsoon in Asia and its impact on the variability of monsoon rainfall in Southeast Asia. Geoscience Frontiers, Volume 6, Issue 6, 817-823.  https://www.sciencedirect.com/science/article/pii/S167498711400036X

[2] Kundzewicz, Z.W., L.J. Mata, N.W. Arnell, P. Döll, P. Kabat, B. Jiménez, K.A. Miller, T. Oki, Z. Sen and I.A. Shiklomanov, 2007: Freshwater resources and their management. Climate Change 2007: Impacts, Adaptation and Vulnerability. Contribution of Working Group II to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, M.L. Parry, O.F. Canziani, J.P. Palutikof, P.J. van der Linden and C.E. Hanson, Eds., Cambridge University Press, Cambridge, UK, 173-210. https://www.ipcc.ch/pdf/assessment-report/ar4/wg2/ar4-wg2-chapter3.pdf

Cover photo by Dr Michel Royon/Wikimedia (public domain).
Met Office finds climate concerns for the UK

The Met Office this week published its 4th annual State of the UK Climate report, which summarises the UK’s weather and climate through the calendar year 2017 and sets out the historical context for a number of essential climate variables. It provides an accessible, authoritative and up-to-date assessment of UK climate trends, variations and extremes, based on the previous year’s observational datasets. For the first time, the report has been published as a special issue of the International Journal of Climatology, the Royal Meteorological Society’s journal of climate science.

Here are some key facts from the report:

  • 2017 was the 5th warmest year over land in a record dating back to 1910.
  • In contrast to summer 2018, UK summers have been notably wetter over the most recent decade, with a 20% increase in rainfall compared to 1961-1990.
  • Above-average temperatures from February to June, and also in October, helped secure 2017’s high temperature ranking, whilst the second half of the year saw temperatures nearer to average.
  • Nine of the ten warmest years for the UK have occurred since 2002, and the top ten have all occurred since 1990. The Central England Temperature series, which extends back to 1659, shows that the 21st century (since 2001) has so far been warmer than the previous three centuries.
  • For the UK as a whole, rainfall in 2017 was close to average, but with large regional differences. Much of Highland Scotland and lowland England was considerably dry, whilst west Wales, north-west England, and parts of south-west and north-east Scotland saw wetter conditions.

Additional facts are available in the infographic published alongside the report:

Photo by Met Office, 2018

Download the full report here: https://rmets.onlinelibrary.wiley.com/doi/abs/10.1002/joc.5798


Cover photo by giografiche/Pixabay.
NOAA Hurricane Season Forecast 2018: 75% chance of near or above normal season

By Elisa Jiménez Alonso

Forecasters from the National Oceanic and Atmospheric Administration’s (NOAA) Climate Prediction Center (CPC) predict a 35 per cent chance of an above-normal season, a 40 per cent chance of a near-normal season, and a 25 per cent chance of a below-normal season for the upcoming Atlantic hurricane season, which extends from June 1 to November 30. NOAA will update this outlook in early August, prior to the peak of the season.

In terms of storms, this means there is a 70 per cent chance of 10 to 16 named storms (winds of 63 km/h or higher) forming, of which 5 to 9 could become hurricanes (winds of 120 km/h or higher), including 1 to 4 major hurricanes (category 3, 4 or 5; winds of 179 km/h or higher). For context, an average hurricane season produces 12 named storms, of which 6 become hurricanes, including 3 major hurricanes.

Two of the main factors driving this outlook are the possibility of a weak El Niño developing and near-average sea surface temperatures in the Atlantic Ocean and Caribbean Sea. Both are set against a backdrop of atmospheric and oceanic conditions that are conducive to hurricane development and have been producing stronger hurricane seasons since 1995.

Hurricane track and intensity forecasts are incredibly important for risk management and preparedness. After 2017’s devastating Atlantic hurricane season, many communities, especially in the Caribbean, still find themselves in very vulnerable situations.


Listen to our latest podcast with Angela Burnett, author of the Irma Diaries, who witnessed Hurricane Irma first-hand and collected survivor stories from the British Virgin Islands to shed light on the urgency of building back better and building resilience:

Cover photo by NOAA: NOAA’s GOES-16 satellite (now GOES-East) captured this infrared/visible image of Hurricane Harvey on August 25, 2017.
Low awareness of climate risks hampering resilience of critical infrastructure sectors claims new study

Low levels of awareness of climate risks and the availability of climate services are significant barriers to climate adaptation in the electricity sector, according to new research from Germany. However, the research also finds that the underlying market opportunity for climate services remains strong.

Damage to critical infrastructure, or its destruction or disruption by natural disasters, for example, would have a significant negative impact on the security of the EU and the well-being of its citizens. Focussing on the German electricity sector, the report found that stakeholders claimed to need seasonal forecasts and decadal predictions, the latter aligning closely with energy companies’ time frames for strategic planning. Despite this, demand for climate services from the sector is currently low.

The report found that four major barriers prevented the uptake of climate services:

  1. low awareness of the climate-related risks,
  2. low awareness of the benefits climate services can provide,
  3. mismatches between the required and the available timescales and spatial resolution of data, and
  4. a lack of trust in the reliability of data.

To overcome these hurdles, the report recommends that considerable work be done in the first instance to increase the visibility of the climate services industry and show how it can contribute to the climate resilience of key sectors. It proposes that a ‘Climate Service Provider Store’ be created to provide information about where appropriate climate service providers are available.

Additionally, the case study recommends that work continue to ensure that seasonal and decadal forecasts become ever more accurate, and that regional cooperation between industry networks and climate services providers be strengthened.

The case study was led by the non-profit research organization HZG under the MArket Research for a Climate Services Observatory (MARCO) programme of which Acclimatise is a proud partner. MARCO, a 2-year project coordinated by European Climate-KIC, hopes that research such as this will help to remove the barriers to the growth of the climate services industry across Europe.


Download the full case study “Critical Energy Infrastructures” here.

Download an infographic highlighting the key findings of the case study here.

Cover photo from pxhere (public domain).
India’s new disaster database to help improve country’s preparedness

By Devika Singh

In highly disaster-prone India, a new database is meant to improve disaster risk management, especially on the local level, saving lives and avoiding huge economic losses.

India’s proactive engagement in disaster management planning and preparation has evolved from a focus on relief and rescue to building resilience. The first country to publicly announce its National Disaster Management Plan (NDMP) only one year after becoming a signatory to the Sendai Framework for Disaster Risk Reduction, India is now working on a National Disaster Database that will launch by 2020.

The recently announced database will collect local level statistical data on natural disasters and their impacts. This includes disaggregated data on mortality rates, service interruption, economic losses, damages to infrastructure, and more. The database will provide the basis for government decision-making, from fund allocations and investments, to actions to reduce losses.

Aggregated national-level, or even state-level, data on disasters and their impacts are of little use in planning and decision-making processes for a country of India’s proportions. The challenges faced by people in the wake of a disaster are best assessed using local data, which gives decision-makers insight into local-level impacts, helping them distribute funds and personnel effectively while implementing locally relevant measures to minimise losses.

Having experienced the third-largest number of disasters globally in 2016 (after China and the USA), India was home to a whopping 50 per cent of the global population affected by disasters in 2015-2016 and has a total projected annual average loss of US$9,824 million. According to UNESCAP, disaster risk reduction interventions yield an estimated return of between four and seven times the investment. For the Asia-Pacific region, an annual investment of US$2.3-4 billion could reduce the annual average loss of US$160 billion by 10%. This new database will help Indian authorities better target disaster risk management investments and thus improve their results.
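
Those UNESCAP figures hang together: cutting a US$160 billion annual loss by 10% with US$2.3-4 billion a year of spending implies exactly the four-to-seven-times return quoted. A minimal check, using only the numbers above:

```python
# Back-of-the-envelope check of the UNESCAP figures cited above.
annual_loss_bn = 160                   # projected annual average loss, US$ billions
avoided_bn = 0.10 * annual_loss_bn     # a 10% reduction => US$16 bn avoided per year
investment_low_bn, investment_high_bn = 2.3, 4.0

print(f"Return: {avoided_bn / investment_high_bn:.1f}x to {avoided_bn / investment_low_bn:.1f}x")
# -> Return: 4.0x to 7.0x, matching the "four and seven times" estimate
```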

Rajesh Sharma, a disaster risk information and application specialist at the United Nations Development Programme (UNDP) explains that through the database practitioners “will be able to understand what is happening in their own district and then they can take the preventive actions needed or request further information. Then we really have resilient development.”

Sharma is further encouraging the government to make the National Disaster Database open access as, “this will demand more action from all the stakeholders and I think it will raise more awareness in the long run, leading to an overall improvement in the use of disaster risk information… It could help focus minds and attention on disasters that otherwise go unnoticed, but have a huge knock-on effect for local communities.”


Cover photo by McKay Savage (CC BY 2.0): People going about their daily lives commuting to work or school just after heavy morning rains flooded the streets.
Africa’s 20,000 weather station plan

By Sarah Wild

supported by The Rockefeller Foundation Bellagio Center

Farmers across Africa need weather data and climate projections to put food on people’s plates. But often the data does not exist or is inaccessible. The same goes for scientists trying to model how the climate is changing, and how diseases spread, for the planet’s second-largest continent.

Only 10 out of Africa’s 54 countries offer adequate meteorological services, the World Bank estimates, with fewer than 300 of its weather stations meeting the World Meteorological Organisation’s observation standards.

The Bank estimates it would take US$1 billion to modernise key meteorological infrastructure.

An investment of that scale is prohibitive in countries where funds for basic services are often missing. But there may be a way around it: the Trans-African HydroMeteorological Observatory (TAHMO) hopes to expand the continent’s weather-watching capacity for a fraction of the cost.

The not-for-profit organisation has an ambitious plan to deploy a weather station every 30 km. This would translate into 20,000 individual stations supplying weather data to information-thirsty scientists, companies, and governments. It estimates it can do this for US$50 million, with an additional US$12 million a year for maintenance.

It is thrilling to be in a position to transform the culture of African climate observations and the capacity for scientific discovery. –John Selker

So far, about 500 weather stations — each a white cylinder about the length of a forearm, paired with a shoebox-sized data logger — have been mounted on poles around the continent.

“It is thrilling to be in a position to transform the culture of African climate observations and the capacity for scientific discovery,” says John Selker, TAHMO co-director and a professor of hydrology at Oregon State University in the United States. Climate data is simply scarce, he says; and where it exists, scientists often have to pay for it and jump through bureaucratic hurdles to get it.

Selker and colleague Nick van de Giesen from Delft University in the Netherlands established the observatory in 2010. The idea was seeded in 2005, when the two were performing an experiment in Ghana and needed rainfall data. “I asked where we could get [the data] from and [our colleague] laughed, and said there was no way we could get that data from anyone,” Selker recalls.

He had been working on electronic measurement methods and knew it was possible to develop sensors that could generate the data.

The continent’s climate is one of the world’s most understudied, according to the Future Climate for Africa report. And the authors of a World Bank report say the paucity of data from synoptic weather stations “inevitably results in poorer-quality numerical guidance and forecasts in those regions”. Calibration of the sensors used in surface observations is very important, they explain, but in practice few are calibrated to internationally accepted standards.

100 MB/year

TAHMO is trying to fill this gap with its network of weather stations. The non-profit teamed up with US-German company Meter Group to develop a weather station that measures rainfall, temperature, solar radiation, pressure, and wind speed, among other variables. The stations have no moving parts, meaning they are less likely to break and require maintenance.

Each station produces about 100 MB of weather data a year, transmitted via a SIM card in the data logger to a central database.
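
Scaled to the full planned network, those volumes stay modest, as this rough calculation (using only the station count and per-station volume quoted in this article) suggests:

```python
# Aggregate data volume if the full 20,000-station network is deployed,
# using the ~100 MB per station per year figure quoted above.
mb_per_station_per_year = 100
planned_stations = 20_000

total_tb_per_year = mb_per_station_per_year * planned_stations / 1_000_000
print(f"~{total_tb_per_year:.0f} TB of weather data per year network-wide")  # ~2 TB/year
```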

The data is available free of charge to scientists and local meteorological facilities, but companies wanting access must pay. Selker says that, so far, TAHMO has signed memorandums of understanding with 18 African countries, receiving permission to operate the network within their jurisdictions.

“It has been a massive challenge to convince African countries to make their data freely available,” he says.

Countries are suspicious of doing this because of ideas of national sovereignty and commercial value, according to Selker. “Their weather agencies have been under the impression that they can make a lot of money out of their data, even though no one has.”

The US National Oceanic and Atmospheric Administration makes its data freely available, and this is one of the main reasons why US climate and weather patterns are better understood than those on the African continent, according to Selker. “You can’t do science on proprietary data — it doesn’t help the building of scientific understanding.”

However, there is a reason African countries’ meteorological agencies charge for their data: many survive through selling it to make up for a continent-wide lack of investment in weather infrastructure, says Mark New, director of the African Climate and Development Initiative at the University of Cape Town.

TAHMO funds its operations, and the free provision to scientists and local facilities, by selling its data to companies. Technology company IBM is its biggest client: it channels the data into its subsidiary, The Weather Company, to use in its weather modelling. TAHMO also sells data to seed producers, insurers, farming companies, and consulting firms.

Intermediaries, whether they are businesses or non-profits, need weather data before they can develop the data-based services that farmers need.

Selker estimates the value of the data to the African economy at about US$100 billion a year, through industries such as insurance. “If we could insure those crops, farmers could take bigger risks,” he says. “As long as governments keep those data secured, that economic opportunity is missed.”

Ben Schaap, research lead at the Global Open Data for Agriculture and Nutrition, says there are pros and cons to keeping weather data secure. If it is free, companies can more easily use it to innovate and create services; if the data is protected, the data provider can more easily create a business model around it — but this could make using the data less economically viable for other businesses.

The public-private partnership model under which TAHMO operates is a departure from the open-access national data provision services such as NASA, or the bite-sized donor-funded projects that aim to boost climate data capacity. Selker says the African landscape is littered with the detritus of donor-funded projects that survived for the duration of the project and then fell into disrepair when the funding dried up.

“Those other projects were just that — projects,” adds Selker. “Instead, we’re an enterprise.”

However, for TAHMO’s data to benefit small-scale, poor farmers, businesses would need to know what they need and find it worthwhile to develop services for them, says Schaap.

According to Julio Araujo, research coordinator for Future Climate For Africa, there’s ongoing debate on the best model for delivering climate services, and TAHMO is one to watch. “Selling services to African countries… is an improvement over the status quo,” he says.

Data hunger

Scientists welcome the idea of free and easily accessible data, especially for areas where conflict and poor governance mean it simply does not exist, according to New.

John Kioli, chairman of Kenya’s climate change working group, welcomed the possibility of free on-the-ground data. “We need data for research and also forecasting. It is important to have reliable data and, if possible, data that is coming from Africa.”

Scientists have already started publishing articles using TAHMO data, says Selker. To access it, they simply need to apply and declare research use. “We’re sending data to scientists in the United Kingdom, US, Europe, and Africa. We’re seeing that hunger and demand for African climate data in the African science community too,” he says.

The data could help answer scientific questions about how the African climate as a whole is changing, but also tackle uncertainties around how this change plays out at the city level.

“There are many possibilities,” says Marieke de Groen, TAHMO’s regional coordinator for southern Africa, who oversees eight stations in Johannesburg. She says weather data is essential to get a better understanding of water needs and water flows, which have implications for storm water management and how groundwater is recharged.

For a large city like Johannesburg, where it may be raining in one part while the sun shines in another, the effects of extreme weather events will not be uniform across the city, explains De Groen. “You have to get a picture of the distribution. Satellite and radar data help, but this is from the ground up.”


This article was supported by The Rockefeller Foundation Bellagio Center. For nearly 60 years the Bellagio Center has supported individuals working to improve the lives of poor and vulnerable people globally through its conference and residency programs, and has served as a catalyst for transformative ideas, initiatives, and collaborations.

This article was originally published on SciDev.Net. Read the original article.

Cover photo by NASA (public domain): Dust storm off the west coast of northern Africa.

Do satellites hold the answer to reporting greenhouse gases?

By Jane Burston

In 2002, the European satellite Envisat gave the world its first operational orbiting sensor for detecting atmospheric greenhouse gases from space. Since then, dozens of missions have launched or are being planned around the world – including, most recently, one by the US NGO the Environmental Defense Fund – each promising increasingly precise and comprehensive ways of monitoring greenhouse gases from orbit.

Improved greenhouse gas monitoring is important if we are to meet the goals of the Paris Agreement on climate change. Monitoring is needed for many reasons: to report emissions to the UN Framework Convention on Climate Change (UNFCCC); to determine where to focus reduction efforts; and to track the success of policy interventions.

However, satellites are not yet capable of replacing our current systems for reporting emissions. Instead, they are complementary, adding to our knowledge about the factors causing emissions – and how those emissions affect the climate.

Hard to measure

At a sub-national level, cities and regions are increasing monitoring efforts. Thousands of cities and regions have voluntarily signed up to reducing their emissions through platforms like those provided by CDP, ICLEI and C40.

Many industrial companies also want to measure emissions on their sites, driven by the need to comply with regulation and, in the case of gas extraction and transport, to ensure they are not losing a valuable commodity.

Greenhouse gases are hard to measure: there are multiple natural and anthropogenic sources at all scales which are unpredictably distributed in space and time. In the oil-and-gas sector, for example, gas pipelines can be tens of thousands of kilometres long, some facilities are in remote areas making accessibility a challenge, and emissions might be sporadic – occurring when specific processes are running rather than continuously.

In the agriculture sector, sources of emissions include livestock, which are small points of geographically dispersed emissions, and rice paddies, which stretch across large amounts of land in tropical regions and emit variably, depending on the time of year and the weather.

Currently, these emission sources are measured using ground-based techniques, but these are by no means perfect. Covering all the world’s sources with sensors is financially and practically impossible. Regional data from aircraft and networks of tall towers are expensive and labour intensive to maintain.

Satellite solution?

On the face of it, satellites could overcome these issues. The same sensor could provide global coverage, ensuring consistent measurements and overcoming the issues of inaccessibility and cost.

Satellite capabilities have already come a long way since the debut measurements of Envisat 16 years ago. The European SCIAMACHY sensor on board Envisat was constrained by a large field of view (30 km by 60 km), suitable only for studying regional or global distribution of greenhouse gases in the atmosphere. In comparison, Europe’s latest satellite capable of monitoring methane, Sentinel-5P, provides greater spatial resolution, with a field of view of 7 km by 7 km.
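
Expressed as footprint area, the gain is striking. A quick, illustrative comparison of the two fields of view quoted above:

```python
# Footprint areas of the two sensors discussed above.
sciamachy_km2 = 30 * 60   # SCIAMACHY on Envisat: 1,800 km^2 per sounding
sentinel5p_km2 = 7 * 7    # Sentinel-5P: 49 km^2 per sounding

print(f"Sentinel-5P's footprint is ~{sciamachy_km2 / sentinel5p_km2:.0f}x smaller")  # ~37x
```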

Global coverage, however, is not as simple as it seems.

Current satellite instruments rely on sunlight to measure greenhouse gases, meaning they cannot operate at night or see through clouds. This is a particular problem for tropical regions, which are big sources and sinks of greenhouse gases due to expanding wetlands, an abundance of rice paddies and extensive forest-burning to provide land for farming.

A satellite’s orbit also restricts where and when it can make a measurement. Most satellites pass over the same area of ground every two weeks or so, restricting the ability to provide repeat measurements of a single location over a short amount of time.

Solving problems

Future satellite designs, orbits and the use of constellations – groups of satellites working together – could get around some of these problems.

NASA’s planned GeoCARB mission will operate in geostationary orbit, meaning it will travel at the same speed as the Earth rotating below it, allowing it to stay above the same point on the Earth’s surface – in this case over North and South America. This ability to stare at a point for longer than other satellites, along with flexible scanning patterns and a small footprint, will enable the satellite to make cloud-free measurements over most of the Americas at least once per day. However, this comes at the expense of global coverage, which can only be overcome by launching constellations at enormous cost.
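
The “same speed as the Earth” condition fixes the orbit at a single altitude, which follows from Kepler’s third law. A minimal sketch of that standard textbook calculation (generic physical constants, not GeoCARB mission parameters):

```python
import math

# Geostationary altitude from Kepler's third law: the orbital period must
# equal one sidereal day, which fixes the orbital radius.
GM_EARTH = 3.986004418e14      # m^3/s^2, Earth's standard gravitational parameter
SIDEREAL_DAY_S = 86_164.1      # one sidereal day in seconds

radius_m = (GM_EARTH * SIDEREAL_DAY_S**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (radius_m - 6_378_137) / 1000   # subtract Earth's equatorial radius

print(f"Geostationary altitude: {altitude_km:,.0f} km")   # ~35,786 km
```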

Another approach is for a satellite to produce its own source of light instead of relying on the sun. The planned Franco-German MERLIN satellite design includes an active laser source, allowing it to measure in darkness. Unfortunately, the power of the planned laser confines measurements to a thin “pencil-line” track, limiting timely global sampling. Additionally, the instrument cannot see through clouds, so it will not boost monitoring in the tropics or in other cloudy regions.

Arguably, the biggest issue is that current satellite technology is not sensitive enough to detect many sources of emissions. In order to deliver quantification of greenhouse gases, resolutions on the order of metres – not kilometres – will be required.

Commercial satellites are coming onto the market claiming capabilities for making point-source greenhouse gas measurements down to the sub-50 metres level. If proven, this technology could be game-changing for facility-scale monitoring. However, currently no publicly available validation activities have taken place. Until this happens, the precision and useful applications of these satellites cannot be established.

This means that industrial facilities will need to rely on ground-based measurements or aerial flyovers for some time to come.

Direct quantification of emissions using satellite data is also not currently robust or granular enough to contribute to UNFCCC reporting.

Complementary roles

Even though they lack the capability to detect and quantify individual sources, satellites can still play two roles in emissions reporting.

The first is to improve data about the activities that affect emissions, and to what extent, rather than measuring greenhouse gas emissions directly. For example, satellite images can show where deforestation is happening. This is very different to measuring the amount of CO2 taken up by, or the greenhouse gases emitted from, a forest. Activity data – such as deforestation rates – is only part of the puzzle, but it is a significant part.

The second is a potential longer-term role. The UNFCCC guidelines allow countries to supplement their “bottom-up” emissions inventory reporting, which works from facility-by-facility information, with “top-down” measurements of atmospheric concentrations – a useful “sense check” on the inventory.

The UK already makes these measurements using a network of tall towers, such as Ridge Hill in Herefordshire and Mace Head in Ireland. As capabilities improve in the future, it is possible these atmospheric measurements could be aided by satellites.

As well as reporting emissions, the data provided by satellites, such as Envisat, have also proved a key tool for academic study. The data might be too coarse for quantifying individual sources of emissions, but the regional and global scale measurements have given scientists a better understanding of the carbon cycle and improved modelling of atmospheric dispersion of emissions. Sentinel-5P, for example, aims to improve our understanding of chemical processes occurring in the atmosphere and how these are linked to our climate.

Not a replacement

Do satellites hold the answer to comprehensive round-the-clock monitoring of greenhouse gases and their sources? Not yet.

Satellites are certainly not yet capable of replacing components of the current ground-based monitoring and inventory system. However, seeing satellites as a direct replacement for ground-based monitoring may be missing the point.

We will always need to use a variety of data sources; combining them can yield the most accurate and comprehensive results. Continued development and complementary use of satellites, alongside other techniques, will provide valuable data that will ultimately help improve our understanding of emissions and allow us to eventually get on top of reducing them.


Jane Burston is head of energy and environment at the UK’s National Physical Laboratory. This article originally appeared on Carbon Brief and is shared under a Creative Commons license.

Cover photo by Alex Gindin on Unsplash.