From optimizing electricity grids to predicting weather patterns, artificial intelligence and big data could play a major role in decarbonizing the global economy. But without new frameworks and rules of the road, digital technologies could end up doing more harm than good for the climate.
LONDON – Long before the real-world effects of climate change became so abundantly obvious, the data painted a bleak picture – in painful detail – of the scale of the problem. For decades, carefully collected data on weather patterns and sea temperatures were fed into models that analyzed, predicted, and explained the effects of human activities on our climate. And now that we know the alarming answer, one of the biggest questions we face in the next few decades is how data-driven approaches can be used to overcome the climate crisis.
Data and technologies like artificial intelligence (AI) are expected to play a central role. But that will happen only if we make major changes in data management, moving away from the commercial proprietary models that currently predominate in large developed economies. While the digital world might seem like a climate-friendly one (it is better to Zoom to work than to drive there), digital and internet activity already accounts for around 3.7% of total greenhouse-gas (GHG) emissions – about the same as air travel. In the United States, data centers account for around 2% of total electricity use.
The figures for AI are much worse. According to one estimate, the process of training a single machine-learning algorithm emits a staggering 626,000 pounds (284,000 kilograms) of carbon dioxide – five times the lifetime emissions of the average car, and 60 times those of a transatlantic flight. With the rapid growth of AI, these emissions are expected to rise sharply. And blockchain, the technology behind Bitcoin, is perhaps the worst offender of all. On its own, Bitcoin mining (the computing process used to verify transactions) leaves a carbon footprint roughly equivalent to that of New Zealand.
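As a quick sanity check, the pound-to-kilogram conversion in that estimate is easy to verify; the emissions figure itself comes from the cited study, and only the standard conversion factor is assumed here:

```python
# Verify that 626,000 pounds of CO2 is roughly 284,000 kilograms.
POUNDS_PER_KG = 2.20462  # standard avoirdupois conversion factor

emissions_lb = 626_000
emissions_kg = emissions_lb / POUNDS_PER_KG

# About 283,950 kg, which rounds to the 284,000 kg quoted in the text.
print(round(emissions_kg))
```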