Category: Earth Observation & Climate Data

New report: practical guidance for using climate information for climate resilient water management

A new paper released by the Action on Climate Today (ACT) programme shows how climate information can be used effectively to inform decisions related to climate resilient water management (CRWM). The paper provides practical recommendations on how best to use and integrate climate information into decision-making processes, coupled with case studies showing what this looks like in a variety of contexts. The paper argues that while using the best available climate information can help decision-makers go beyond business-as-usual practices in water management, good decisions can be made even in the absence of good climate information and data.

Since 2014 the ACT programme has been actively working in five South Asian countries to help national and sub-national governments mainstream adaptation to climate change into development planning and delivery systems. As part of that work, the programme is introducing CRWM into the water resources management and agriculture sectors. As presented in an earlier learning paper, “Climate-Resilient Water Management: An operational framework from South Asia”, one major factor in taking CRWM beyond business-as-usual approaches is using the best available climate information and data.

CRWM needs to be informed by reliable information about physical exposure and social vulnerability to climate shocks and stresses in order to create a comprehensive narrative of the impact that climate extremes, uncertainty, and variability can have on water resources management. This requires combining different types of climate information. ACT’s new paper seeks to inform government agencies and individual officials, practitioners and donors, researchers and wider civil society on:

  • How to understand the role of climate information in producing analysis, including a typology of different types of climate information; and
  • How to best use climate information to inform and guide the policy-making processes.

Based on experience and learning from ACT projects, the paper presents 10 key recommendations for integrating climate information into water resources management. This is targeted at those seeking to design and implement CRWM programmes and initiatives, to help overcome some of the critical challenges to accessing and using climate information.

Climate change is already impacting the water cycle. In particular, climate change is thought to be making the monsoon more erratic and unpredictable, and decreasing the number of rainfall days while, at the same time, increasing their intensity.[1] Additionally, climate change is projected to increase the frequency and severity of both floods and droughts.[2] At the same time, in South Asia, as in much of the world, water demand is increasing and accelerating in response to population growth, urbanisation, increased industrial demand, and the relatively high dependence on agriculture for livelihoods. The latter is especially problematic as rising temperatures and less rainfall decrease soil moisture, forcing farmers to water their crops more. Changes in the hydrologic cycle, coupled with increased water demand, will have manifold impacts on food and livelihood security, agriculture and urbanisation, industrialisation and, hence, the economy at large. As a result, the South Asian water resources sector needs to plan for climate change.

Click here to access the full ACT learning paper “Using climate information for Climate-Resilient Water Management: Moving from science to action” and a learning brief.


[1] Loo, Y., Billa, L., and Singh, A. (2015). Effect of climate change on seasonal monsoon in Asia and its impact on the variability of monsoon rainfall in Southeast Asia. Geoscience Frontiers, Volume 6, Issue 6, 817-823.  https://www.sciencedirect.com/science/article/pii/S167498711400036X

[2] Kundzewicz, Z.W., L.J. Mata, N.W. Arnell, P. Döll, P. Kabat, B. Jiménez, K.A. Miller, T. Oki, Z. Sen and I.A. Shiklomanov, 2007: Freshwater resources and their management. Climate Change 2007: Impacts, Adaptation and Vulnerability. Contribution of Working Group II to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, M.L. Parry, O.F. Canziani, J.P. Palutikof, P.J. van der Linden and C.E. Hanson, Eds., Cambridge University Press, Cambridge, UK, 173-210. https://www.ipcc.ch/pdf/assessment-report/ar4/wg2/ar4-wg2-chapter3.pdf

Cover photo by Dr Michel Royon/Wikimedia (public domain).
Met Office finds climate concerns for the UK

The Met Office published its 4th annual State of the UK Climate report this week, giving a summary of the UK’s weather and climate through the calendar year 2017, alongside the historical context for a number of essential climate variables. It provides an accessible, authoritative and up-to-date assessment of UK climate trends, variations and extremes based on the previous year’s observational datasets. For the first time, the report is being published as a special issue of the International Journal of Climatology, the Royal Meteorological Society’s journal of climate science.

Here are some key facts from the report:

  • 2017 was the 5th warmest year over land in a record dating back to 1910.
  • In contrast to summer 2018, UK summers have been notably wetter over the most recent decade, with a 20% increase in rainfall compared to 1961-1990.
  • Above average temperatures from February to June, and also in October, helped position 2017’s high temperature ranking, whilst the second half of the year saw temperatures nearer to average.
  • Nine of the ten warmest years for the UK have occurred since 2002, and the top ten have all occurred since 1990. The Central England Temperature series, which extends back to 1659, shows that the 21st century (since 2001) has so far been warmer than the previous three centuries.
  • For the UK as a whole, rainfall in 2017 was close to average, but with large regional differences. Much of Highland Scotland and lowland England was considerably dry, whilst west Wales, north-west England, and parts of south-west and north-east Scotland saw wetter conditions.

Additional facts are available in the infographic published alongside the report.

Photo by Met Office, 2018

Download the full report here: https://rmets.onlinelibrary.wiley.com/doi/abs/10.1002/joc.5798


Cover photo by giografiche/Pixabay.
NOAA Hurricane Season Forecast 2018: 75% chance of near or above normal season

By Elisa Jiménez Alonso

Forecasters from the National Oceanic and Atmospheric Administration’s (NOAA) Climate Prediction Center (CPC) predict a 35 per cent chance of an above-normal season, a 40 per cent chance of a near-normal season, and a 25 per cent chance of a below-normal season for the upcoming Atlantic hurricane season, which extends from June 1 to November 30. Prior to the peak of the season, in early August, NOAA will provide an update to this outlook.

In terms of storms, this means that there is a 70 per cent chance of 10 to 16 named storms (winds of 63 km/h or higher) forming, of which 5 to 9 could become hurricanes (winds of 120 km/h or higher), including 1 to 4 major hurricanes (category 3, 4 or 5; with winds of 179 km/h or higher). For context, average hurricane seasons tend to produce 12 named storms, of which 6 become hurricanes, which includes 3 major hurricanes.
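As a rough illustration of how the wind-speed thresholds quoted above translate into storm categories, here is a minimal Python sketch. It uses only the figures in this paragraph (63 km/h for a named storm, 120 km/h for a hurricane, 179 km/h for a major hurricane); the function name and structure are illustrative, not NOAA’s.

```python
def classify_storm(max_sustained_wind_kmh: float) -> str:
    """Classify a tropical cyclone using the wind-speed thresholds quoted
    in the outlook above (values in km/h); illustrative only."""
    if max_sustained_wind_kmh >= 179:
        return "major hurricane (category 3-5)"
    elif max_sustained_wind_kmh >= 120:
        return "hurricane"
    elif max_sustained_wind_kmh >= 63:
        return "named storm"
    return "below named-storm strength"

# Example: a storm with 150 km/h sustained winds counts as a hurricane,
# but not yet a major hurricane.
print(classify_storm(150))  # -> "hurricane"
```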

Two of the main factors driving this outlook are the possibility of a weak El Niño developing and near-average sea surface temperatures in the Atlantic Ocean and Caribbean Sea. However, these factors are set against atmospheric and oceanic conditions that are conducive to hurricane development and that have been producing stronger hurricane seasons since 1995.

Hurricane track and intensity forecasts are incredibly important for risk management and preparedness. After 2017’s devastating Atlantic hurricane season, many communities, especially in the Caribbean, still find themselves in very vulnerable situations.


Listen to our latest podcast with Angela Burnett, author of the Irma Diaries, who witnessed Hurricane Irma first hand and collected survivor stories from the British Virgin Islands to shed light on the urgency of building back better and building resilience.

Cover photo by NOAA: NOAA’s GOES-16 satellite (now GOES-East) captured this infrared/visible image of Hurricane Harvey on August 25, 2017.
Low awareness of climate risks hampering resilience of critical infrastructure sectors claims new study

Low levels of awareness of climate risks and the availability of climate services are significant barriers to climate adaptation in the electricity sector, according to new research from Germany. However, the research also finds that the underlying market opportunity for climate services remains strong.

Damage to critical infrastructure, or its destruction or disruption by, for example, natural disasters, can have a significant negative impact on the security of the EU and the well-being of its citizens. Focussing on the German electricity sector, the report found that stakeholders in the sector reported needing seasonal forecasts and decadal predictions, the latter aligning closely with energy companies’ time frames for strategic planning. Despite this, there is currently a low level of demand for climate services from the sector.

The report found that four major barriers prevented the uptake of climate services:

  1. low awareness of the climate-related risks,
  2. low awareness of the benefits climate services can provide,
  3. mismatches between the required and the available timescales and spatial resolution of data, and
  4. a lack of trust in the reliability of data.

To overcome these hurdles, the report recommends that considerable work be done in the first instance to increase the visibility of the climate services industry and of how it can contribute to the climate resilience of key sectors. It proposes that a ‘Climate Service Provider Store’ be created to provide information about where appropriate climate service providers are available.

Additionally, the case study recommends that work continues to ensure that seasonal and decadal forecasts become ever more accurate and that regional cooperation between industry networks and climate services providers is strengthened.

The case study was led by the non-profit research organization HZG under the MArket Research for a Climate Services Observatory (MARCO) programme of which Acclimatise is a proud partner. MARCO, a 2-year project coordinated by European Climate-KIC, hopes that research such as this will help to remove the barriers to the growth of the climate services industry across Europe.


Download the full case study “Critical Energy Infrastructures” here.

Download an infographic highlighting the key findings of the case study here.

Cover photo from pxhere (public domain).
India’s new disaster database to help improve country’s preparedness

By Devika Singh

In highly disaster-prone India, a new database is meant to improve disaster risk management, especially on the local level, saving lives and avoiding huge economic losses.

India’s proactive engagement in disaster management planning and preparation has evolved from a focus on relief and rescue to building resilience. The first country to publicly announce its National Disaster Management Plan (NDMP) only one year after becoming a signatory to the Sendai Framework for Disaster Risk Reduction, India is now working on a National Disaster Database that will launch by 2020.

The recently announced database will collect local level statistical data on natural disasters and their impacts. This includes disaggregated data on mortality rates, service interruption, economic losses, damages to infrastructure, and more. The database will provide the basis for government decision-making, from fund allocations and investments, to actions to reduce losses.

Aggregated national-level, or even state-level, data on disasters and their impacts are of little use in planning and decision-making processes for a country of India’s proportions. The challenges faced by people in the wake of a disaster are best assessed using local data. This will give decision-makers insights into local-level impacts, helping them distribute funds and personnel effectively while implementing locally relevant measures to minimise losses.

Having experienced the third largest number of disasters globally in 2016 (after China and the USA), India was home to a whopping 50 per cent of the global population affected by disasters in 2015-2016 and has a total projected annual average loss of US$9,824 million. According to UNESCAP, disaster risk reduction interventions have an estimated return of between four and seven times the investment. For the Asia-Pacific region, an annual investment of $2.3-4 billion could reduce the annual average loss of $160 billion by 10%. This new database will help Indian authorities better target disaster risk management investments and thus improve their results.
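A quick back-of-the-envelope check shows how the UNESCAP figures above fit together; the sketch below uses only the numbers quoted in this paragraph, and everything else is simple arithmetic.

```python
# Back-of-the-envelope check of the UNESCAP figures cited above.
annual_loss_bn = 160.0                              # annual average loss (US$ bn)
loss_reduction = 0.10                               # 10% reduction from DRR measures
investment_low_bn, investment_high_bn = 2.3, 4.0    # annual investment range (US$ bn)

avoided_loss_bn = annual_loss_bn * loss_reduction   # US$16 bn avoided per year
return_low = avoided_loss_bn / investment_high_bn   # ~4x
return_high = avoided_loss_bn / investment_low_bn   # ~7x

print(f"Avoided losses: ~US${avoided_loss_bn:.0f} bn per year")
print(f"Implied return: {return_low:.1f}x to {return_high:.1f}x the investment")
```

The implied return of roughly four to seven times the investment matches the estimate quoted in the text.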

Rajesh Sharma, a disaster risk information and application specialist at the United Nations Development Programme (UNDP) explains that through the database practitioners “will be able to understand what is happening in their own district and then they can take the preventive actions needed or request further information. Then we really have resilient development.”

Sharma is further encouraging the government to make the National Disaster Database open access as, “this will demand more action from all the stakeholders and I think it will raise more awareness in the long run, leading to an overall improvement in the use of disaster risk information… It could help focus minds and attention on disasters that otherwise go unnoticed, but have a huge knock-on effect for local communities.”


Cover photo by McKay Savage (CC BY 2.0): People going about their daily lives commuting to work or school just after heavy morning rains flooded the streets.
Africa’s 20,000 weather station plan

By Sarah Wild

supported by The Rockefeller Foundation Bellagio Center

Farmers across Africa need weather data and climate projections to put food on people’s plates. But often the data does not exist or is inaccessible. The same goes for scientists trying to model how the climate is changing, and how diseases spread, for the planet’s second-largest continent.

Only 10 out of Africa’s 54 countries offer adequate meteorological services, the World Bank estimates, with fewer than 300 of its weather stations meeting the World Meteorological Organisation’s observation standards.

The Bank estimates it would take US$1 billion to modernise key meteorological infrastructure.

An investment of that scale is prohibitive in countries where funds for basic services are often missing. But there may be a way around it: the Trans-African HydroMeteorological Observatory (TAHMO) hopes to expand the continent’s weather-watching capacity for a fraction of the cost.

The not-for-profit organisation has an ambitious plan to deploy a weather station every 30 km across the continent. This would translate into 20,000 individual stations supplying weather data to information-thirsty scientists, companies, and governments. It estimates it can do this for US$50 million, with an additional US$12 million a year for maintenance.
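For a sense of scale, the per-station costs implied by these figures can be worked out directly; the sketch below is purely illustrative arithmetic based on the numbers quoted above, not TAHMO’s own budget breakdown.

```python
# Rough per-station costs implied by the TAHMO figures quoted above.
n_stations = 20_000
deployment_cost_usd = 50_000_000       # US$50 million total
annual_maintenance_usd = 12_000_000    # US$12 million per year

per_station_deployment = deployment_cost_usd / n_stations       # ~US$2,500
per_station_maintenance = annual_maintenance_usd / n_stations    # ~US$600 per year

print(f"Deployment: ~US${per_station_deployment:,.0f} per station")
print(f"Maintenance: ~US${per_station_maintenance:,.0f} per station per year")
# For comparison, the World Bank estimates around US$1 billion would be
# needed to modernise key meteorological infrastructure.
```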

It is thrilling to be in a position to transform the culture of African climate observations and the capacity for scientific discovery. –John Selker

So far, about 500 weather stations — each a white cylinder about the length of a forearm, with a shoebox-sized data logger — have been mounted on poles around the continent.

“It is thrilling to be in a position to transform the culture of African climate observations and the capacity for scientific discovery,” says John Selker, TAHMO co-director and a professor of hydrology at Oregon State University in the United States. Climate data is simply scarce, he says; and where it exists, scientists often have to pay for it and jump through bureaucratic hurdles to get it.

Selker and colleague Nick van de Giesen from Delft University in the Netherlands established the observatory in 2010. The idea was seeded in 2005, when the two were performing an experiment in Ghana and needed rainfall data. “I asked where we could get [the data] from and [our colleague] laughed, and said there was no way we could get that data from anyone,” Selker recalls.

He had been working on electronic measurement methods and knew it was possible to develop sensors that could generate the data.

The continent’s climate is one of the world’s most understudied, according to the Future Climate for Africa report. And the authors of a World Bank report say the paucity of data from synoptic weather stations “inevitably results in poorer-quality numerical guidance and forecasts in those regions”. Calibration of the sensors used in surface observations is very important, they explain, but in practice few are calibrated to internationally accepted standards.

100 MB/year

TAHMO is trying to fill this gap with its network of weather stations. The non-profit teamed up with US-German company Meter Group to develop a weather station that measures rainfall, temperature, solar radiation, pressure, and wind speed, among other variables. The stations have no moving parts, meaning they are less likely to break and require maintenance.

Each station produces about 100MB of weather data a year, transmitted via a SIM card in the data logger to a central database.

This data is available for free for scientists and local meteorological facilities, but companies wanting to access it need to pay. Selker says that, so far, TAHMO has signed memorandums of understanding with 18 African countries and received their permission to operate the network within their jurisdiction.

“It has been a massive challenge to convince African countries to make their data freely available,” he says.

Countries are suspicious of doing this because of ideas of national sovereignty and commercial value, according to Selker. “Their weather agencies have been under the impression that they can make a lot of money out of their data, even though no one has.”

The US National Oceanic and Atmospheric Administration makes its data freely available, and this is one of the main reasons why US climate and weather patterns are better understood than those on the African continent, according to Selker. “You can’t do science on proprietary data — it doesn’t help the building of scientific understanding.”

However, there is a reason African countries’ meteorological agencies charge for their data: many survive through selling it to make up for a continent-wide lack of investment in weather infrastructure, says Mark New, director of the African Climate and Development Initiative at the University of Cape Town.

TAHMO funds its operations, and the free provision to scientists and local facilities, by selling its data to companies. Technology company IBM is its biggest client: it channels the data into its subsidiary, The Weather Company, to use in its weather modelling. TAHMO also sells data to seed producers, insurers, farming companies, and consulting firms.

Intermediaries, whether they are businesses or non-profits, need weather data before they can develop the data-based services that farmers need.

Selker estimates the value of the data to the African economy at about US$100 billion a year, through industries such as insurance. “If we could insure those crops, farmers could take bigger risks,” he says. “As long as governments keep those data secured, that economic opportunity is missed.”

Ben Schaap, research lead at the Global Open Data for Agriculture and Nutrition, says there are pros and cons to keeping weather data secure. If it is free, companies can more easily use it to innovate and create services; if the data is protected, the data provider can more easily create a business model around it — but this could make using the data less economically viable for other businesses.

The public-private partnership model under which TAHMO operates is a departure from the open-access national data provision services such as NASA, or the bite-sized donor-funded projects that aim to boost climate data capacity. Selker says the African landscape is littered with the detritus of donor-funded projects that survived for the duration of the project and then fell into disrepair when the funding dried up.

“Those other projects were just that — projects,” adds Selker. “Instead, we’re an enterprise.”

However, for TAHMO’s data to benefit small-scale, poor farmers, businesses would need to know what they need and find it worthwhile to develop services for them, says Schaap.

According to Julio Araujo, research coordinator for Future Climate For Africa, there’s ongoing debate on the best model for delivering climate services, and TAHMO is one to watch. “Selling services to African countries… is an improvement over the status quo,” he says.

Data hunger

Scientists welcome the idea of free and easily accessible data, especially for areas where conflict and poor governance mean it simply does not exist, according to New.

John Kioli, chairman of Kenya’s climate change working group, welcomed the possibility of free on-the-ground data. “We need data for research and also forecasting. It is important to have reliable data and, if possible, data that is coming from Africa.”

Scientists have already started publishing articles using TAHMO data, says Selker. To access it, they simply need to apply and declare research use. “We’re sending data to scientists in the United Kingdom, US, Europe, and Africa. We’re seeing that hunger and demand for African climate data in the African science community too,” he says.

The data could help answer scientific questions about how the African climate as a whole is changing, but also tackle uncertainties around how this change plays out at the city level.

“There are many possibilities,” says Marieke de Groen, TAHMO’s regional coordinator for southern Africa, who oversees eight stations in Johannesburg. She says weather data is essential to get a better understanding of water needs and water flows, which have implications for storm water management and how groundwater is recharged.

For a large city like Johannesburg, where it may be raining in one part while the sun shines in another, the effects of extreme weather events will not be uniform across the city, explains De Groen. “You have to get a picture of the distribution. Satellite and radar data help, but this is from the ground up.”


This article was supported by The Rockefeller Foundation Bellagio Center. For nearly 60 years the Bellagio Center has supported individuals working to improve the lives of poor and vulnerable people globally through its conference and residency programs, and has served as a catalyst for transformative ideas, initiatives, and collaborations.

This article was originally published on SciDev.Net. Read the original article.

Cover photo by NASA (public domain): Dust storm off the west coast of northern Africa.

Do satellites hold the answer to reporting greenhouse gases?

By Jane Burston

In 2002, the European satellite Envisat gave the world its first operational orbiting sensor for detecting atmospheric greenhouse gases from space. Since then, dozens of missions have launched or are being planned around the world – including most recently by US NGO, the Environmental Defense Fund – each promising increasingly precise and comprehensive ways of monitoring greenhouse gases from orbit.

Improved greenhouse gas monitoring is important if we are to meet the goals of the Paris Agreement on climate change. Monitoring is needed for many reasons: to report emissions to the UN Framework Convention on Climate Change (UNFCCC); to determine where to focus reduction efforts; and to track the success of policy interventions.

However, satellites are not yet capable of replacing our current systems for reporting emissions. Instead, they are complementary, adding to our knowledge about the factors causing emissions – and how those emissions affect the climate.

Hard to measure

At a sub-national level, cities and regions are increasing monitoring efforts. Thousands of cities and regions have voluntarily signed up to reducing their emissions through platforms like those provided by CDP, ICLEI and C40.

Many industrial companies also want to measure emissions on their sites, driven by the need to comply with regulation and, in the case of gas extraction and transport, to ensure they are not losing a valuable commodity.

Greenhouse gases are hard to measure: there are multiple natural and anthropogenic sources at all scales which are unpredictably distributed in space and time. In the oil-and-gas sector, for example, gas pipelines can be tens of thousands of kilometres long, some facilities are in remote areas making accessibility a challenge, and emissions might be sporadic – occurring when specific processes are running rather than continuously.

In the agriculture sector, sources of emissions include livestock, which are small points of geographically dispersed emissions, and rice paddies, which stretch across large amounts of land in tropical regions and emit variably, depending on the time of year and the weather.

Currently, these emission sources are measured using ground-based techniques, but these are by no means perfect. Covering all the world’s sources with sensors is financially and practically impossible. Regional data from aircraft and networks of tall towers are expensive and labour intensive to maintain.

Satellite solution?

On the face of it, satellites could overcome these issues. The same sensor could provide global coverage, ensuring consistent measurements and overcoming the issues of inaccessibility and cost.

Satellite capabilities have already come a long way since the debut measurements of Envisat 16 years ago. The European SCIAMACHY sensor on-board Envisat was constrained by a large field of view (30km by 60km), suitable only for studying regional or global distribution of greenhouse gases in the atmosphere. In comparison, Europe’s latest satellite capable of monitoring methane, Sentinel-5P, provides greater spatial resolution, with a field of view of 7km by 7km.
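The gain in spatial detail between those two instruments can be made concrete with a simple area comparison; the sketch below uses only the fields of view quoted above and is illustrative arithmetic, not a statement about the instruments’ measurement performance.

```python
# Simple footprint comparison using the fields of view quoted above.
sciamachy_footprint_km2 = 30 * 60   # 1,800 km^2 per observation
sentinel5p_footprint_km2 = 7 * 7    # 49 km^2 per observation

ratio = sciamachy_footprint_km2 / sentinel5p_footprint_km2
print(f"One SCIAMACHY footprint covers ~{ratio:.0f} Sentinel-5P footprints")
# -> roughly 37, i.e. more than an order of magnitude finer spatial detail
```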

Global coverage, however, is not as simple as it seems.

Current satellite instruments rely on sunlight to measure greenhouse gases, meaning they cannot operate at night or see through clouds. This is a particular problem for tropical regions, which are big sources and sinks of greenhouse gases due to expanding wetlands, an abundance of rice paddies and extensive forest-burning to provide land for farming.

A satellite’s orbit also restricts where and when it can make a measurement. Most satellites pass over the same area of ground every two weeks or so, restricting the ability to provide repeat measurements of a single location over a short amount of time.

Solving problems

Future satellite designs, orbits and the use of constellations – groups of satellites working together – could get around some of these problems.

NASA’s planned GeoCARB mission will operate in geostationary orbit, meaning it will travel at the same speed as the Earth rotating below it, allowing it to stay above the same point on the Earth’s surface, over North and South America. This ability to stare at a point for longer than other satellites, along with flexible scanning patterns and a small footprint, will enable the satellite to make cloud-free measurements over most of the Americas at least once per day. However, this comes at the expense of global coverage, which can only be overcome by launching constellations at enormous cost.

Another approach is for a satellite to produce its own source of light instead of relying on the sun. The planned Franco-German MERLIN satellite design includes its own active laser source, allowing it to measure in darkness. Unfortunately, the power of the planned laser limits the areas over which measurements can be taken to a thin “pencil-line” track that will limit timely global sampling. Additionally, the instrument cannot see through clouds, so will not boost monitoring in the tropics or in other cloudy regions.

Arguably, the biggest issue is that current satellite technology is not sensitive enough to detect many sources of emissions. In order to deliver quantification of greenhouse gases, resolutions on the order of metres – not kilometres – will be required.

Commercial satellites are coming onto the market claiming capabilities for making point-source greenhouse gas measurements down to the sub-50 metres level. If proven, this technology could be game-changing for facility-scale monitoring. However, currently no publicly available validation activities have taken place. Until this happens, the precision and useful applications of these satellites cannot be established.

This means that industrial facilities will need to rely on ground-based measurements or aerial flyovers for some time to come.

Direct quantification of emissions using satellite data is also not yet robust or granular enough to contribute to UNFCCC reporting.

Complementary roles

Even though they lack the capability to detect and quantify individual sources, satellites can still play two roles in emissions reporting.

The first is to improve data about those activities taking place that can affect emissions, and to what extent, rather than measuring the greenhouse gas emissions directly. For example, satellite images can enable us to see where deforestation is happening. This is very different to measuring the amount of CO2 taken up by, or the greenhouse gases emitted from, a forest. Activity data – such as deforestation rates – is only part of the puzzle, but it is a significant part.

The second is a potential longer-term role. The UNFCCC guidelines allow countries to supplement their “bottom-up” emissions inventory reporting, which works by using facility-by-facility information, with “top-down” measurements and reporting of atmospheric concentrations, which provide a useful “sense check” on the inventory.

The UK already makes these measurements using a network of tall towers, such as Ridge Hill in Herefordshire and Mace Head in Ireland. As capabilities improve in the future, it is possible these atmospheric measurements could be aided by satellites.

As well as reporting emissions, the data provided by satellites, such as Envisat, have also proved a key tool for academic study. The data might be too coarse for quantifying individual sources of emissions, but the regional and global scale measurements have given scientists a better understanding of the carbon cycle and improved modelling of atmospheric dispersion of emissions. Sentinel-5P, for example, aims to improve our understanding of chemical processes occurring in the atmosphere and how these are linked to our climate.

Not a replacement

Do satellites hold the answer to comprehensive round-the-clock monitoring of greenhouse gases and their sources? Not yet.

Satellites are certainly not yet capable of replacing components of the current ground-based monitoring and inventory system. However, seeing satellites as a direct replacement for ground-based monitoring may be missing the point.

We will always need to use a variety of data sources; combining them can yield the most accurate and comprehensive results. Continued development and complementary use of satellites, alongside other techniques, will provide valuable data that will ultimately help improve our understanding of emissions and allow us to eventually get on top of reducing them.


Jane Burston is head of energy and environment at the UK’s National Physical Laboratory. This article originally appeared on Carbon Brief and is shared under a Creative Commons license.

Cover photo by Alex Gindin on Unsplash.
Half of Earth’s satellites restrict use of climate data

Mariel Borowitz, Georgia Institute of Technology

Scientists and policymakers need satellite data to understand and address climate change. Yet data from more than half of unclassified Earth-observing satellites is restricted in some way, rather than shared openly.

When governments restrict who can access data, or limit how people can use or redistribute it, that slows the progress of science. Now, as U.S. climate funding is under threat, it’s more important than ever to ensure that researchers and others make the most of the collected data.

Why do some nations choose to restrict satellite data, while others make it openly available? My book, “Open Space,” uses a series of historical case studies, as well as a broad survey of national practices, to show how economic concerns and agency priorities shape the way nations treat their data.

The price of data

Satellites can collect comprehensive data over the oceans, arctic areas and other sparsely populated zones that are difficult for humans to monitor. They can collect data consistently over both space and time, which allows for a high level of accuracy in climate change research.

For example, scientists use data from the U.S.-German GRACE satellite mission to measure the mass of the land ice in both the Arctic and Antarctic. By collecting data on a regular basis over 15 years, GRACE demonstrated that land ice sheets in both Antarctica and Greenland have been losing mass since 2002. Both lost ice mass more rapidly since 2009.

Satellites collect valuable data, but they’re also expensive, typically ranging from US$100 million to nearly $1 billion per mission. They’re usually designed to operate for three to five years, but quite often continue well beyond their design life.

Many nations attempt to sell or commercialize data to recoup some of the costs. Even the U.S. National Oceanic and Atmospheric Administration and the European Space Agency – agencies that now make nearly all of their satellite data openly available – attempted data sales at an earlier stage in their programs. The U.S. Landsat program, originally developed by NASA in the early 1970s, was turned over to a private firm in the 1980s before later returning to government control. Under these systems, prices often ranged from hundreds to thousands of dollars per image.

In other cases, agency priorities prevent any data access at all. As of 2016, more than 35 nations have been involved in the development or operation of an Earth observation satellite. In many cases, nations with small or emerging space programs, such as Egypt and Indonesia, have chosen to build relatively simple satellites to give their engineers hands-on experience.

Since these programs aim to build capacity and demonstrate new technology, rather than distribute or use data, data systems don’t receive significant funding. Agencies can’t afford to develop data portals and other systems that would facilitate broad data access. They also often mistakenly believe that demand for the data from these experimental satellites is low.

If scientists want to encourage nations to make more of their satellite data openly available, both of these issues need to be addressed.

Promoting access

Since providing data to one user doesn’t reduce the amount available for everyone else, distributing data widely will maximize the benefits to society. The more that open data is used, the more we all benefit from new research and products.

In my research, I’ve found that making data freely available is the best way to make sure the greatest number of people access and use it. In 2001, the U.S. Geological Survey sold 25,000 Landsat images, a record at the time. Then Landsat data was made openly available in 2008. In the year following, the agency distributed more than 1 million Landsat images.

For nations that believe demand for their data is low, or that lack resources to invest in data distribution systems, economic arguments alone are unlikely to spur action. Researchers and other user groups need to raise awareness of the potential uses of this data and make clear to governments their desire to access and use it.

Intergovernmental organizations like the Group on Earth Observations can help with these efforts by connecting research and user communities with relevant government decision-makers. International organizations can also encourage sharing by providing nations with global recognition of their data-sharing efforts. Technical and logistical assistance – helping to set up data portals or hosting foreign data in existing portals – can further reduce the resource investment required by smaller programs.

Promise for future

Satellite technology is improving rapidly. I believe that agencies must find ways to take advantage of these developments while continuing to make data as widely available as possible.

Satellites are collecting more data than ever before. Landsat 8 collected more data in its first two years of operation than Landsat 4 and 5 collected over their combined 32-year lifespan. The Landsat archive currently grows by a terabyte a day.

This avalanche of data opens promising new possibilities for big data and machine learning analyses – but that would require new data access systems. Agencies are embracing cloud technology as a way to address this challenge, but many still struggle with the costs. Should agencies pay commercial cloud providers to store their data, or develop their own systems? Who pays for the cloud resources needed to carry out the analysis: agencies or users?
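To give a feel for the storage side of this question, here is a very rough cost sketch applied to the terabyte-a-day archive growth mentioned above. The per-gigabyte price below is an assumed placeholder, not any provider’s actual rate; only the growth figure comes from the text.

```python
# Illustrative storage-cost sketch for an archive growing ~1 TB per day.
# ASSUMPTION: the per-GB-month price is a hypothetical placeholder, not a
# quote from any cloud provider; only the growth rate comes from the text.
growth_tb_per_day = 1.0
assumed_price_per_gb_month = 0.02   # US$, hypothetical

new_data_gb_per_year = growth_tb_per_day * 1_000 * 365
# Cost of keeping one year's worth of new data in storage for a full year:
annual_cost_usd = new_data_gb_per_year * assumed_price_per_gb_month * 12

print(f"~{new_data_gb_per_year / 1_000:.0f} TB added per year")
print(f"~US${annual_cost_usd:,.0f} per year to store that year's new data")
```

Even under these toy assumptions, the storage bill grows cumulatively year on year, which is why the question of who pays (agencies or users) matters.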

Satellite data can contribute significantly to a wide range of areas – climate change, weather, natural disasters, agricultural development and more – but only if users can access the data.


Mariel Borowitz, Assistant Professor of International Affairs, Georgia Institute of Technology. This article was originally published on The Conversation. Read the original article.

Cover photo by Tim Mossholder on Unsplash.
Satellites detect climate-change-driven sea level rise acceleration

By Elisa Jiménez Alonso

A study released earlier this year shows that sea level rise is already happening, and it is accelerating.

Gathering and evaluating satellite data from different missions, like Jason-3 and TOPEX/Poseidon, a team of researchers led by University of Colorado Boulder professor Steve Nerem found that in the last 25 years sea levels have risen a total of 7 centimetres. However, the rate at which this happened was not constant but increased over time.

Using data from the Gravity Recovery and Climate Experiment, also called GRACE, the scientists determined that the acceleration is caused by global warming. More than half of the observed sea level rise is due to thermal expansion, meaning that as ocean water gets warmer it expands and its level rises. The rest can be attributed to melting ice from Greenland’s and Antarctica’s ice sheets.

The observed acceleration has the potential to “double the total sea level rise by 2100 as compared to projections that assume a constant rate – to more than 60 centimetres instead of about 30,” according to Nerem.
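To see why a small acceleration roughly doubles the projected rise, here is a minimal sketch comparing a constant-rate extrapolation with an accelerating one. The rate and acceleration values are round illustrative assumptions chosen to be broadly consistent with the figures quoted in the text (about 7 cm over 25 years, and roughly 30 cm versus 60 cm by 2100), not the study’s exact estimates.

```python
# Linear vs. accelerating sea-level extrapolation to 2100 (illustrative only).
# ASSUMPTIONS: rate and acceleration are round numbers broadly consistent
# with the figures in the text, not the study's published values.
rate_mm_per_yr = 3.0        # ~7 cm over the 25-year satellite record
accel_mm_per_yr2 = 0.08     # small but persistent acceleration
years = 2100 - 2018

constant_rate_rise_mm = rate_mm_per_yr * years
accelerating_rise_mm = rate_mm_per_yr * years + 0.5 * accel_mm_per_yr2 * years**2

print(f"Constant rate:      ~{constant_rate_rise_mm / 10:.0f} cm by 2100")
print(f"With acceleration:  ~{accelerating_rise_mm / 10:.0f} cm by 2100")
# The accelerating case comes out roughly double the constant-rate case,
# matching the qualitative point made in the text.
```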

The projections from this new study also align with those from the climate models used by the Intergovernmental Panel on Climate Change (IPCC). Using Earth observation data, the scientists have found evidence that validates those model projections, providing a “data-driven assessment of sea level change that does not depend on the climate models.” The IPCC models show sea level rise of between 52 and 98 centimetres by the end of this century under a business-as-usual scenario in which current emissions are not reduced.

For coastal cities, such projections are a worrying signal as they could cause unprecedented problems and flooding during high tides and storm surges. Even though the number might not sound like much, the records set in Boston Harbor during this year’s ‘bomb cyclone’ and the regular inundations in Miami during king tides are occurring with the seven-centimetre sea level rise from the past century.


S. Nerem, B. D. Beckley, J. T. Fasullo, B. D. Hamlington, D. Masters, G. T. Mitchum (2018). Climate-change–driven accelerated sea-level rise. Proceedings of the National Academy of Sciences Feb 2018, 201717312; DOI: 10.1073/pnas.1717312115

Cover photo by NOAA (public domain): Artistic rendering of Jason-3 satellite mission.
Using satellites to track the retreat of Antarctica’s glaciers

By Dr. Hannes Konrad

The long history of the Antarctic Ice Sheet is one of ups and downs. Over millions of years it has responded to ever changing ocean and air temperatures and snowfall as its glaciers have retreated and advanced.

Unlike the long-gone ice sheets that covered large parts of North America and Europe, Antarctica still holds huge volumes of ice even in today’s warm conditions. Its ice also forms by far the largest reservoir of fresh water on the planet – enough to raise global sea levels by about 60 metres if it were to melt.

In a new study, my colleagues and I provide the most complete assessment of Antarctica’s outer glaciers to date, showing that between 2010 and 2016 they lost an area of ice equivalent to the size of Greater London.

A natural frontier of the Antarctic Ice Sheet

Antarctica’s rocky landscape is covered by around 25 million cubic kilometres of ice. It has hundreds – if not thousands – of glaciers oozing their way from the continent’s centre out towards its coast.

While these glaciers are many kilometres thick towards the centre of Antarctica, they are much thinner at their lowest end – sometimes called the “toe” or “snout”. As a result, the ones that make it to the ocean will often lift off the ground and float on the surface of the water to form ice shelves.

You can try this yourself with ice cubes and a glass of water. While most ice cubes will float, if the water level is low enough, the larger ones can actually rest on the bottom of the glass.

The distinction between “grounded” and “floating” ice is an important one for glaciologists, because it is only where ice is resting firmly on the ground that changes in its size affect global sea levels. This deduction – based on the principle discovered by the Greek mathematician Archimedes – is the same one that dictates that adding an ice cube to a glass of water will raise the level, but that if the ice cube then melts, the water won’t rise any further.

Grounded ice is also held back by friction with the rock beneath. In contrast, floating ice glides freely over the water and can gain speed much more easily. The exact point between the floating and grounded parts of a glacier is the so-called “grounding line”, a natural frontier of Antarctica’s glaciers.

Small changes to the glacier can affect how much is lying on the seabed. Returning to the example of an ice cube sitting at the bottom of a glass of water, just a tiny amount of melting (or adding a bit more water) can be enough for it to lift up and begin to float.
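The grounded-versus-floating distinction can be written as a simple hydrostatic (Archimedes) flotation check: an ice column floats once the water is deep enough to support its weight. The sketch below is a minimal illustration using standard approximate densities; the function and numbers are illustrative, not taken from the study.

```python
# Simple hydrostatic flotation check for an ice column (illustrative).
# Densities are approximate textbook values.
RHO_ICE = 917.0        # kg/m^3
RHO_SEAWATER = 1028.0  # kg/m^3

def is_grounded(ice_thickness_m: float, water_depth_m: float) -> bool:
    """Return True if an ice column of the given thickness rests on the bed.

    The column floats only if the water depth exceeds the flotation depth,
    i.e. the draft the ice needs in order to displace its own weight in
    seawater.
    """
    flotation_depth = ice_thickness_m * RHO_ICE / RHO_SEAWATER
    return water_depth_m < flotation_depth

# Example: a 1,000 m thick column needs ~892 m of water to float,
# so in 850 m of water it is still grounded, but in 950 m it floats.
print(is_grounded(ice_thickness_m=1000, water_depth_m=850))   # True  (grounded)
print(is_grounded(ice_thickness_m=1000, water_depth_m=950))   # False (floating)
```

This is also why a small amount of thinning (or a little extra water) is enough to lift ice off the bed, as described above.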

So keeping a close eye on glacier grounding lines is a critical aspect of monitoring the health of Antarctica’s glaciers.

Thousands of years of retreat

But pinpointing and tracking grounding lines is easier said than done.

Found at the sea bed, grounding lines are buried underneath huge amounts of ice – often hundreds of metres thick. This means they cannot be identified by simply looking at the ice from above or indeed from any other angle.

There are, however, a number of indirect methods that scientists can use. For example, rising sea levels from melting ice sheets in the northern hemisphere and a warming climate have forced Antarctic glacier grounding lines to retreat since the end of the last ice age around 12,000 years ago. This retreat has left traces in the sediments deposited on the seafloor, from which we can reconstruct how quickly Antarctic ice has melted.

Based on this type of analysis, it appears that the Antarctic grounding line has retreated by an average of around 25 metres per year in active areas over the past thousand years, with occasional periods of more rapid retreat reaching several hundred metres per year.

In recent decades, developments in Earth observations from satellites have provided another tool with which to monitor changes in the Earth’s ice.

For example, scientists use satellites to detect where the ice begins to float by tracking movements due to the influence of ocean tides. Grounding line motion is then detected by comparing two or more of the grounding line positions observed from space.

Using this data, scientists have recorded grounding line retreat as fast as two kilometres per year in places in Antarctica. (It is worth noting that although this is faster than the pace of retreat recorded since the last ice age, satellites are better able to capture short bursts of rapid retreat than methods using seabed sediments.)

Grounding lines today

In our new Nature Geoscience study we use another approach, which relates ice thickness changes to grounding line movement. As an ice shelf thins, more of it lifts off the seabed and begins to float, pulling the grounding line inland. Likewise, if an ice shelf gets thicker, more of it sinks to the seafloor and the grounding line advances.
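A heavily simplified sketch of that relationship is shown below: for a given geometry, a small amount of thinning moves the point where the ice first reaches flotation inland. The single “effective slope” used here is an assumption for illustration only, not the method of the study; it simply shows how one metre of thinning can translate into retreat of order 100 metres when slopes are gentle.

```python
# Illustrative link between ice-shelf thinning and grounding-line retreat.
# ASSUMPTION: a single constant "effective slope" stands in for the real
# bed and ice geometry; this is a cartoon of the idea, not the study's method.
def retreat_from_thinning(thinning_m: float, effective_slope: float) -> float:
    """Horizontal grounding-line retreat (m) implied by a given thinning (m).

    With a gentle effective slope, the flotation point must move a long way
    inland before the ice is again thick enough to rest on the bed.
    """
    return thinning_m / effective_slope

# An effective slope of ~0.009 reproduces the ~110 m of retreat per metre
# of thinning reported later in the text for the fastest-flowing glaciers.
print(f"{retreat_from_thinning(1.0, 0.009):.0f} m of retreat per 1 m of thinning")
```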

Using data from the European Space Agency’s CryoSat-2 satellite, we quantify grounding-line movement along a third of Antarctica’s grounding lines, covering 61 glaciers. This is three times the length and four times the number of glaciers mapped in previous surveys.

The animation below gives an overview of how we detected grounding line retreat using satellite measurements of ice thickness.

Animation illustrating how horizontal motion of glacier grounding lines is detected using satellite measurements of their elevation change. Credit: Konrad et al. (2018)

Our results show that most grounding lines across Antarctica are in retreat. This is particularly the case in West Antarctica where 22% of grounding lines are retreating at a faster rate than the average speed when the Earth was emerging from the last ice age (25 metres per year). This compares with 3% of grounding lines in East Antarctica and 10% of those at the Antarctic Peninsula.

Overall, the continent lost almost 1,500 square kilometres of “grounded” ice between 2010 and 2016. That’s an area the size of Greater London. For the fastest-flowing glaciers, their grounding lines are retreating by around 110 metres for every metre that the glacier thins near the grounding line.

You can see the changes in the map below, which shows the pace of grounding line retreat (red) and advance (blue) between 2010 and 2016. Most of the glaciers in rapid retreat are along the “English Coast” from the tip of the Peninsula to the Ross Sea.

Grounding line retreat in some of these areas is faster than one kilometre a year. However, it appears that the notorious Pine Island Glacier – one of the fastest retreating glaciers around the turn of the millennium – has now stabilised, showing that extreme speeds are not necessarily sustained. The neighbouring Thwaites Glacier, however, is retreating just as quickly as it has been since at least the 1990s.

In East Antarctica we generally see slower motion, split roughly equally between retreating and advancing grounding lines, and we see that the grounding lines of neighbouring glaciers can easily move in opposite directions.

Map showing rates of grounding line migration and their coincidence with ocean conditions around Antarctica between 2010 and 2016. Seabed temperatures taken from NOAA’s World Ocean Atlas 2013, Volume 1: Temperature. Grounding line locations are from Rignot et al. (2013). Credit: Konrad et al. (2018).

Warm ocean water gnawing at the ice

So what is causing this retreat around the Antarctic ice sheet?

Oceanographic studies have found that warm ocean water is melting the undersides of Antarctica’s ice shelves, gnawing away at the ice and eventually forcing these glaciers to give way.

It shows how intertwined the various parts of our planet are and how easily things can get out of balance. In this sense, detecting grounding line motion, or changes in Antarctic glaciers in general, often teaches lessons about the oceans (or other parts of the Earth system) too, and vice versa.

Scientists know that the Antarctic Ice Sheet has been both bigger and smaller than today in the past. So it is, in principle, not surprising that it is now moving slowly but steadily away from its present extent.

For coastal communities and low lying islands, though, the path down towards a smaller Antarctic Ice Sheet is worrying as the future pace and scale of decline and related sea level rise remain uncertain.


Dr Hannes Konrad was formerly a research fellow in the School of Earth and Environment at the University of Leeds and now works at the Alfred Wegener Institute for Polar and Marine Research in Germany.

Konrad, H. et al. (2018) Net retreat of Antarctic glacier grounding lines, Nature Geoscience, doi:10.1038/s41561-018-0082-z

This article was originally published on Carbon Brief and is shared under a Creative Commons license.

Cover photo by Matt Palmer on Unsplash.