In August 2010, I struck a hopeful tone in wondering whether climate data could be an agent of change. Climate data gathered and released by the National Oceanic and Atmospheric Administration (NOAA) and the National Aeronautics and Space Administration (NASA) “could spur better decisions and a more informed society,” I proposed.

In February 2010, the Obama administration proposed a climate service that would publish climate data in the same way that NOAA provides weather information. The data would be collected by satellites focused upon Earth’s surface instead of the stars, and published at Climate.gov, a “climate services portal” that launched that year.

I’m not at all convinced that Climate.gov or the data on it has made a significant impact on public opinion or attitudes about climate change, or has improved understanding of the underlying science.

According to the Pew Research Center, a majority of Americans believe that the Earth is getting warmer. There is, however, a significant partisan gap with respect to whether human activity is the cause of that change, with just 44% of American adults telling Pew that they support that position.

To say that’s out of step with scientific consensus on global warming would be a gross understatement, with some 97% of climate scientists supporting the theory that human activity is behind climate change. Over the past year, that consensus in peer-reviewed science journals has been even clearer, with just one author among the 9,136 who contributed 2,259 articles dissenting. Accordingly, the US Department of Defense has drawn up a climate change adaptation roadmap (PDF).

In March 2014, the world’s largest scientific society, the American Association for the Advancement of Science (AAAS), made a rare intervention into a public policy debate when it released “What We Know,” an 18-page report and website intended to raise public awareness of global warming and urge action to manage the resulting risks.

Years after seeing Climate.gov go live, I can’t help but be somewhat chastened by my past optimism about the impact of any single website on public opinion. After all, while politics, ideology, and education all clearly matter to what people believe, it also appears that public attitudes may be at the mercy of the weather: what people see outside of their window heavily influences their thinking.

I don’t think I’m alone in doubting the efficacy of a new website, given historic lows in trust in US government, the aforementioned divide on this issue, and the damage to the Obama administration’s reputation on technology after the troubled launch of HealthCare.gov in October 2013.

Such attitudes could lead the general public to read headlines last week about the White House introducing a new climate data website and wonder why it would matter to the larger debate, much less humanity’s ability to adapt to significant changes in the environment. Can the White House really “battle climate change” with yet another “new website”?

There are already tremendous websites focused on explaining climate change and the science behind it to the general public, backed up with reams of scientific data, research, and analysis, including NASA’s climate site, NOAA’s climate site, the Natural Resources Defense Council, and RealClimate.

Given all of that context, why should anyone think the Climate Data Initiative will be any different?

Put simply, this is about web services, not websites. It’s about making huge amounts of archived open government data available to humanity, from satellites to the next generation of sensors in city buses. It’s about encouraging the biggest tech companies in the world to use the data in their professional and consumer-facing services, not publicizing a .gov website.

How? First, a few key facts: the White House and the General Services Administration launched a new climate data community at a subdomain of Data.gov, not a new website.

This makes a lot of sense, as does publicizing it. Climate data is distributed across multiple agencies and isn’t always clearly labeled or highlighted. The new community aggregates government data from across agencies and provides tools to analyze it, maps to visualize it, resources to learn more, and challenges to encourage third-party developers to use the data. So far, there doesn’t appear to be new data among the 83 datasets, but the administration indicates that more is coming.
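Because the community lives on Data.gov, developers can reach its catalog programmatically through the CKAN search API that Data.gov exposes, rather than scraping a website. A minimal sketch of that approach follows; the endpoint and field names match CKAN’s standard `package_search` action, but the JSON payload below is an illustrative stand-in for a real response, not actual catalog contents:

```python
import json
from urllib.parse import urlencode

# Data.gov's catalog exposes the standard CKAN search API.
CKAN_SEARCH = "https://catalog.data.gov/api/3/action/package_search"

def climate_search_url(query="climate", rows=5):
    """Build a CKAN package_search URL for climate-related datasets."""
    return CKAN_SEARCH + "?" + urlencode({"q": query, "rows": rows})

def dataset_titles(payload):
    """Extract dataset titles from a CKAN package_search response."""
    return [pkg["title"] for pkg in payload["result"]["results"]]

# Illustrative stand-in for a real response (structure follows CKAN's schema;
# the dataset titles here are hypothetical).
sample = json.loads("""
{"success": true,
 "result": {"count": 2,
            "results": [{"title": "Coastal Flooding Exposure"},
                        {"title": "Sea Level Rise Projections"}]}}
""")

print(climate_search_url())
print(dataset_titles(sample))
```

Fetching `climate_search_url()` with any HTTP client would return JSON in the shape of `sample`, which is what makes the “web services, not websites” framing concrete: the same catalog that powers the .gov pages is queryable by anyone’s software.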

As I reported last week, the Obama administration launched this effort to increase community resilience against climate change with commitments from some of the biggest tech companies in the world, including Google, Microsoft, Esri, and Intel, to apply the data in their various software platforms and tools. The World Bank also published an excellent new field guide for the “Open Data for Resilience Initiative,” intended to scale it around the globe. (Disclosure: I was consulted during the research behind its production.)

“The reduction of disaster risk must be heavily embedded in all of our development efforts,” said Rachel Kyte, special envoy for climate change at the World Bank, speaking at the White House. “We must move from a culture of response to a culture of preparation and resilience. Open and transparent data provides the basis for dialogue and political discussion.”

Second, the launch includes universities, nonprofits, and a long roster of technology companies working with the federal government. While the involvement of Microsoft Research, Intel, and Google got a lot of coverage by technology media, along with “the cloud” and “big data,” the stories and jargon may obscure what’s happening: the online publication of vast amounts of data collected by the US federal government about the Earth’s climate, for humanity to use to understand how the planet is changing.

On that count, it’s worth noting that Google will donate one petabyte of storage to host climate data and 50 million hours of processing time on the Google Earth Engine. If you aren’t familiar with Earth Engine, head over to Time Lapse and see what 30+ years of government satellite data looks like, visualized over time.

The data that powers Earth Engine comes from 40+ years of Landsat satellite collections (USGS/NASA); it has been used to power an interactive timelapse of the planet from 1984 to 2012, the first high-resolution global map of deforestation, and a deforestation alert system.

What really matters, in other words, is the government data that the Obama administration is releasing, not a new website or section of one.

This isn’t just another effort to publish government data online and stand up a new .gov to promote it, hoping that some magic will take place online to transform it into insight. While there’s clearly private sector demand for climate data, as evidenced by the Climate Corporation’s success in digging in government data dirt or MapBox’s work with satellite imagery, the political will to release more of it for reuse is a welcome addition.

By priming the pump for the climate data’s reuse by these technology giants, the thinking goes, the costs of digitization, structuring, publication, and hosting are being borne by private sector companies, which have an incentive to do so given the market opportunities for their products. Doing so serves the government’s much broader goals of improving the resilience of communities affected by changes in the climate and improving public understanding.

Opening up government data for private sector innovation is far from a new or unproven idea, as the histories of weather data and global positioning system (GPS) data highlight. More recently, open health data shows similar promise, along with other sectors. McKinsey estimates that open data could add more than $3 trillion in economic value annually. That research supports the efforts the Obama administration has made to have government data cataloged and published for use in the broader economy.

While it’s possible to describe the White House publicizing scientific data that shows decades of changes to the Earth’s surface as a “crowdsourcing campaign to prove climate change problems,” that’s not what I took away from the climate announcement. For one, the White House is clear in its public assessment of the issue.

“Climate change is a monumental economic and security challenge,” said White House advisor John Podesta, speaking at the White House this week. “It’s real, driven by human activity and happening now.”

For another, it would be a mistake to frame this week’s events in purely political terms, though naturally that’s what some outlets did. This is not simply another effort to convince the public that the Earth’s climate is changing, or that the activity of humans is responsible for that shift, but to provide the means for communities to measure risk and start allocating resources to mitigate it. Maps and data visualizations of projected changes in water levels, desertification, and deforestation are powerful tools for teaching and projection. Such tools can also be used by emergency managers before, during, and after severe weather events.

As I’ve written elsewhere, new open government data releases and the application programming interfaces (APIs) that distribute them are quickly becoming fundamental public infrastructure for digital government in the 21st century. Increasingly, government data looks like a public good, substantially enabled and extended by the private sector. Some close observers have posited to me that such data releases are an example of the private-collective model of innovation, a concept Eric von Hippel and Georg von Krogh coined in 2003, where the costs of getting the data out are borne by the private sector. With enough investment to lower the cost of access to such data, the value derived for the public may mirror that of open source software.

The role of NOAA is not only to gather data but to transform it into actionable information, or “environmental intelligence,” said Kathryn Sullivan, Under Secretary of Commerce for Oceans and Atmosphere and NOAA Administrator, at the White House this past week.

Climate data about wind, rain, and snow gives heads of state and households alike the foresight to look, think, and analyze ahead of time, she observed. NOAA collects 20 terabytes of such data every day, about 2 terabytes of which are published as “feedstock,” she said, creating value and insight at Esri, the Weather Channel, and many other organizations.

“Those two terabytes are feedstock,” said Sullivan. “Imagine if we can get the other 18 through the door.”