In the years to come, many cities can expect to face disastrous climate stresses and shocks, and one would think that these cities would be rushing to implement mitigation and adaptation strategies. Yet most urban residents are only dimly aware of the risks involved, because their cities’ mayors, managers and councils have not been collecting or analysing the right kinds of information.
As more governments adopt strategies to reduce greenhouse gas (GHG) emissions, cities everywhere need to get better at collecting and interpreting climate data. More than 11,000 cities have signed up to a global covenant to tackle climate change and manage the transition to clean energy, and many aim to achieve net-zero emissions before their national counterparts do. However, virtually all of them lack the basic tools for measuring progress.
Closing this gap has become urgent, because climate change is already disrupting cities around the world. Cities on almost every continent are being ravaged by heat waves, fires, typhoons and hurricanes. Coastal cities are being battered by severe flooding connected to sea-level rise. Some megacities and their sprawling peripheries are being reconsidered altogether, as in the case of Indonesia’s $34 billion plan to move its capital from Jakarta to Borneo by 2024.
What is worse is that, while many subnational governments are setting ambitious new green targets, over 40 per cent of cities, home to some 400 million people, still have no meaningful climate-preparedness strategy. Coverage is even lower in Africa and Asia, the regions where an estimated 90 per cent of all urbanisation over the next three decades is expected to occur.
We know that climate-preparedness plans are closely related to investment in climate action, including the adoption of nature-based solutions and efforts to enhance systemic resilience. But strategies alone are not enough. We also need to scale up data-driven monitoring platforms. Powered by satellites and sensors, these systems can track temperatures inside and outside of buildings, alert city-dwellers to air quality issues, and provide high-resolution readings of specific GHGs and pollutants (such as carbon dioxide and nitrogen dioxide) as well as particulate matter.
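To make concrete what "alerting city-dwellers to air quality issues" involves, the sketch below shows how a raw particulate-matter reading can be turned into the familiar Air Quality Index. It is a minimal illustration, not the code of any platform mentioned here; it uses the US EPA's pre-2024 breakpoint table for 24-hour PM2.5.

```python
# Minimal illustration: converting a raw PM2.5 reading (in µg/m³) into
# the US EPA Air Quality Index, the kind of derived metric a city
# monitoring platform would publish to residents.

# (C_low, C_high, I_low, I_high) breakpoints for 24-hour PM2.5,
# per the EPA's pre-2024 AQI tables.
PM25_BREAKPOINTS = [
    (0.0, 12.0, 0, 50),
    (12.1, 35.4, 51, 100),
    (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200),
    (150.5, 250.4, 201, 300),
    (250.5, 500.4, 301, 500),
]

def pm25_to_aqi(concentration: float) -> int:
    """Linearly interpolate a PM2.5 concentration onto the AQI scale."""
    for c_low, c_high, i_low, i_high in PM25_BREAKPOINTS:
        if c_low <= concentration <= c_high:
            return round(
                (i_high - i_low) / (c_high - c_low) * (concentration - c_low)
                + i_low
            )
    raise ValueError("concentration outside AQI range")

print(pm25_to_aqi(35.0))  # a 35 µg/m³ reading falls in the "moderate" band
```

A sensor network need only stream concentrations; the index, and any threshold-based alert, is computed downstream from tables like this one.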
Technology companies are the first movers in this market. For example, Google’s Environmental Insights Explorer aggregates data on building and transportation-related emissions, air quality, and solar potential for municipal officials. Projects such as Climate Watch, Project AirView, Project Sunroof, and the Surface Particulate Matter Network are providing city analysts with historical data, tracking car pollution and methane leaks, and even helping individual users determine the solar power potential of their homes.
It is worth remembering that many climate data initiatives in the private sector have been built on the back of large-scale, publicly-supported programmes. The best-known source of climate data is NASA, which uses satellite observations together with chemical dispersion and meteorological models to track emissions and predict the movement of pollutants. Similarly, the United States National Oceanic and Atmospheric Administration tracks wildfires and smog, among many other things, and issues data-based forecasts through its National Centers for Environmental Prediction. In Europe, the Copernicus Atmosphere Monitoring Service generates five-day forecasts based on its tracking of aerosols, atmospheric pollutants, GHGs, and ultraviolet (UV) index readings.
Google Earth became a staple resource for climate data when it organised and made good use of more than four decades’ worth of historical imagery and data drawn primarily from public sources. Given that the private sector has been capitalising on these data for years, cities no longer have an excuse for not doing the same. One easily accessible source of city-level data is the World Meteorological Organization’s Global Air Quality Forecasting and Information System, which tracks everything from dust storms to fire and smoke pollution. Another is the United Nations Environment Programme’s Global Environment Platform, which provides high-resolution forecasts.
Some pioneering cities have already started to work with smaller data vendors such as Plume Labs, which crowdsources air quality data through locally-distributed sensors. But while access to data is essential, so too are the methods to make these data useful. Data sets tend to be fragmented across platforms, and even when urban leaders agree that the climate emergency warrants their attention, extracting insight from the detailed data remains a daunting challenge.
Cities are generating a chorus of climate data, but have yet to teach it to sing in tune.
To build a harmonious climate data ecosystem, cities need an accessible platform that consolidates disparate metrics. Data also need to be streamlined and standardised to improve the monitoring of inputs, outputs, outcomes and impact. Better data management will improve decision-making and empower ordinary citizens, potentially fostering collaboration and positive competition among cities. Public, private and philanthropic partnerships can have a catalytic effect, as was the case when cities such as Amsterdam, Bristol, Chicago and Los Angeles started joining forces with the SecDev Group, a research and innovation firm, to create an interactive dashboard that tracks city vulnerability.
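The streamlining and standardisation described above can be as simple as mapping every feed onto a shared schema with a single unit. The sketch below is a hypothetical illustration, not any real platform's API: the feed layouts and field names are invented, and the ppb-to-µg/m³ conversion assumes the standard ideal-gas conditions of 25 °C and 1 atm.

```python
# Hypothetical sketch: standardising air-quality records from different
# sources into one schema with a single unit (µg/m³). The feed layouts
# and field names are invented for illustration.

MOLAR_MASS_G_MOL = {"no2": 46.0055, "o3": 48.0, "so2": 64.066}
MOLAR_VOLUME_L_MOL = 24.45  # ideal-gas molar volume at 25 °C, 1 atm

def to_ug_m3(value: float, unit: str, pollutant: str) -> float:
    """Convert a gas concentration to µg/m³ (ppb assumes 25 °C, 1 atm)."""
    if unit == "ug/m3":
        return value
    if unit == "ppb":
        return value * MOLAR_MASS_G_MOL[pollutant] / MOLAR_VOLUME_L_MOL
    raise ValueError(f"unknown unit: {unit}")

def standardise(record: dict) -> dict:
    """Map one source-specific record onto a common schema."""
    return {
        "city": record["city"],
        "pollutant": record["pollutant"],
        "value_ug_m3": round(
            to_ug_m3(record["value"], record["unit"], record["pollutant"]), 2
        ),
    }

# Two feeds reporting the same pollutant in different units:
feeds = [
    {"city": "Amsterdam", "pollutant": "no2", "value": 20.0, "unit": "ppb"},
    {"city": "Bristol", "pollutant": "no2", "value": 37.6, "unit": "ug/m3"},
]
for record in feeds:
    print(standardise(record))
```

Once every source lands in the same schema and unit, the readings become directly comparable across cities — which is precisely what dashboards and city-to-city benchmarking depend on.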
There are, however, some risks to consolidating and standardising climate data for cities. When global technology vendors flood the market, they can curb local innovation in data collection and analysis. Moreover, by focusing too much on a small set of metrics for every city, we run the risk of Goodhart’s Law — once a measure becomes a target, people start to game it. Vehicle-emissions targets, for example, have at times led manufacturers to produce cars designed to pass emissions tests, rather than cars that actually pollute less.
When climate data are more centralised, there could also be greater incentives for political and corporate interests to skew them in their favour through lobbying and other means. Policymakers will need to ensure that any potentially sensitive or individualised data are kept private and protected, and that datasets and the algorithms they feed avoid reproducing structural biases and discrimination.
Most of these hazards can be identified early and avoided through experimentation, with cities pursuing unique strategies and promising new metrics. But unless cities scale up their monitoring and data collection systems, they will have little chance of delivering on their climate targets. Better analysis can help drive increased awareness about climate risks, optimise responses, and ensure that mitigation and adaptation strategies are more equitable.
We cannot manage the climate crisis until we measure it, and we cannot measure it until we can collect and analyse the right information.
Robert Muggah, a co-founder of the Igarapé Institute and the SecDev Group, is a member of the World Economic Forum’s Global Future Council on Cities of Tomorrow and an adviser to the Global Risks Report. Carlo Ratti, Director of the Senseable City Lab at MIT, is Co-Founder of the international design and innovation office Carlo Ratti Associati.
Copyright: Project Syndicate, 2022