The Climate Emergency is the biggest existential risk to our planet and to us. Until the start of this year, the concentration of greenhouse gases in the atmosphere was rising unchecked. But over the past few months we have all seen the dramatic effect that the restrictions of the Covid-19 pandemic have had on the environment: China, for example, has seen a 25% reduction in carbon emissions and a 50% reduction in nitrogen oxides. At the same time, Artificial Intelligence has come to the fore in the fight against the virus, predicting its spread and helping to make sense of the plethora of research being carried out. But these two things, the benefit to the environment and the use of AI, are not unrelated. In fact, AI is becoming a significant environmental polluter in its own right.
To give one example of AI’s power-hungriness, researchers recently developed a robot hand that used AI to solve a Rubik’s Cube. The achievement was impressive, but that single effort reportedly consumed about 2.8 gigawatt-hours of electricity, roughly the output of a whole nuclear power station running for three hours.
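As a rough sanity check on that comparison, the arithmetic can be sketched in a few lines. The ~1 GW plant output below is an illustrative assumption, not a figure from the original report:

```python
# Back-of-envelope check: is 2.8 GWh really about one nuclear power
# station running for three hours? The plant output is an assumed,
# typical figure (~1 GW), used only for illustration.

TRAINING_ENERGY_GWH = 2.8   # reported energy use of the Rubik's Cube project
PLANT_OUTPUT_GW = 1.0       # assumed output of one nuclear power station

hours_equivalent = TRAINING_ENERGY_GWH / PLANT_OUTPUT_GW
print(f"Equivalent to one ~1 GW plant running for {hours_equivalent:.1f} hours")
```

At an assumed 1 GW, the reported 2.8 GWh works out to a little under three hours of output, which is consistent with the comparison in the text.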
That is just one example, but AI is being applied across many domains, from image recognition and prediction to driving vehicles. As those applications become more complex, the training the AI needs becomes more demanding, and the energy required to run the computers that do the training grows accordingly.
Training a powerful AI algorithm, and particularly the fine-tuning it requires, can mean running many computers for days or even weeks. The field of AI called Natural Language Understanding (NLU), which tries to make sense of text, has seen some big advances recently, but those advances have come at a price. Researchers at UMass Amherst found that training a single large NLU model could emit as much carbon as five cars over their whole lifetimes, including the emissions from building them.
With the environment near the front of everyone’s minds, AI experts themselves are becoming increasingly concerned about the impact their work has on it. Many academics, for example, are now transparent about how much energy their algorithms consume. That practice should spread to enterprise applications as well: business cases should account for the ‘energy debt’ of any algorithms used, both in dollars and in environmental impact. What we all want to avoid is solving one problem with AI while creating another, potentially bigger, one in the process.
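To make the ‘energy debt’ idea concrete, here is a minimal sketch of how a business case might cost it out. Every input figure below (training energy, electricity price, grid carbon intensity) is a hypothetical assumption for illustration, not a number from this article:

```python
# Hypothetical energy-debt estimate for an AI business case.
# All three input figures are illustrative assumptions.

training_energy_kwh = 50_000      # assumed energy to train and fine-tune the model
price_per_kwh = 0.12              # assumed electricity price in dollars
grid_kg_co2_per_kwh = 0.4         # assumed grid carbon intensity

dollar_cost = training_energy_kwh * price_per_kwh
co2_tonnes = training_energy_kwh * grid_kg_co2_per_kwh / 1000  # kg -> tonnes

print(f"Energy debt: ${dollar_cost:,.0f} and {co2_tonnes:.1f} tonnes of CO2")
```

Even this crude estimate makes both sides of the debt visible, the dollar line for the finance team and the carbon line for the sustainability report, which is the point of including it in the business case.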