Should the hidden carbon footprint of AI be regulated, Gabriele Röger?
Text: Gabriele Röger
Is a legal framework needed to regulate energy-intensive artificial intelligence applications? A debate between a computer scientist and an environmental economist.
Much of the huge progress made by artificial intelligence (AI) can be attributed to machine-learning techniques, foremost among them the method known as deep learning. Deep learning is the process whereby a neural network is trained using a very large set of examples. In the case of speech recognition, for instance, these examples might take the form of an extensive corpus of recorded speech and the corresponding text.
On an intuitive level, neural networks are inspired by the human brain: the training process either reinforces or weakens the connections between individual neurons. In other words, a neural network is a mathematical model with parameters that are adjusted by training so as to accurately reflect the examples provided. As a result of this training, in the application stage the system is able to process similar data with relatively low expenditure of time and energy: a smart speaker understands a spoken request to play a particular song, for example.
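To make the idea of parameters being adjusted to fit examples more concrete, here is a minimal, purely illustrative sketch in Python of training a single artificial neuron by gradient descent; the toy data, learning rate, and number of iterations are assumptions made for this example, not anything described in the article.

```python
# Minimal illustrative sketch (assumed toy example): training a single
# artificial "neuron" by gradient descent. Deep-learning systems adjust
# millions or billions of such parameters, which is where the energy
# cost of the training stage arises.
import math

# Toy training examples (inputs x, desired output y) -- purely illustrative.
examples = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
            ((1.0, 0.0), 1.0), ((1.0, 1.0), 1.0)]

w = [0.0, 0.0]   # connection weights ("strength" of each input connection)
b = 0.0          # bias term
rate = 0.5       # learning rate (assumed value)

def predict(x):
    # Weighted sum of the inputs passed through a sigmoid activation.
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z))

# Training stage: repeatedly nudge the parameters to better fit the examples.
for _ in range(5000):
    for x, y in examples:
        error = predict(x) - y
        w[0] -= rate * error * x[0]
        w[1] -= rate * error * x[1]
        b    -= rate * error

# Application stage: a single cheap evaluation per query.
print(round(predict((1.0, 0.0))))  # -> 1
```

A modern deep-learning model repeats this kind of parameter update across vast datasets, which is why the training stage, unlike the application stage, is so energy-intensive.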
The amount of energy consumed during the training stage, however, is considerable, and a few years from now could make up a sizeable portion of global energy consumption. This sits awkwardly with the goal of significantly reducing carbon emissions in order to combat climate change. So is regulation the answer?
Regulating only the use of AI would achieve little, since, as noted above, the application stage consumes comparatively little energy. A more comprehensive regulatory approach, however, would be difficult to implement, as the highly energy-intensive training stage takes place in data centers that can be located almost anywhere in the world.
While the amount of energy consumed is virtually invisible to the end user, the companies involved have a clear interest in minimizing their energy costs and, for reputational reasons, in using renewable energy sources. Amazon and Google, for instance, already operate wind and solar farms to power their data centers. Moreover, the high utilization rates that cloud computing makes possible improve the energy efficiency of these centers.
Nevertheless, it remains vital to reduce this energy consumption further. There are good reasons to hope that technological developments will themselves contribute here, in particular increasingly energy-efficient learning algorithms and dedicated hardware. This topic has received growing attention in the scientific community in recent years.
Researchers are currently exploring ways to measure and compare the environmental cost of algorithms, which will hopefully soon become a key criterion in the evaluation of new approaches, alongside factors such as runtime and accuracy. On the hardware side, development is shifting away from the graphics processors used in the past toward dedicated chips such as tensor processing units, which perform the same calculations faster while consuming less energy.
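As a rough illustration of what such an environmental-cost metric might look like, the following back-of-the-envelope sketch estimates the energy and CO2 footprint of a single training run; every figure in it (power draw, cluster size, duration, data-center overhead, grid carbon intensity) is an assumed placeholder, not a measurement.

```python
# Illustrative back-of-the-envelope estimate (all figures are assumptions)
# of the carbon footprint of one training run, in the spirit of the
# environmental-cost metrics researchers are developing.

accelerator_power_kw = 0.3   # assumed average draw per accelerator (kW)
num_accelerators = 64        # assumed size of the training cluster
training_hours = 72          # assumed duration of the training run
pue = 1.2                    # assumed data-center power usage effectiveness
grid_co2_kg_per_kwh = 0.4    # assumed carbon intensity of the local grid

energy_kwh = accelerator_power_kw * num_accelerators * training_hours * pue
co2_kg = energy_kwh * grid_co2_kg_per_kwh

print(f"Energy: {energy_kwh:.0f} kWh, CO2: {co2_kg:.0f} kg")
# With these assumed figures: roughly 1,659 kWh and about 664 kg of CO2.
```

Reporting a figure of this kind alongside runtime and accuracy would make the trade-off described here directly visible when new approaches are evaluated.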
Yet these approaches target only the energy use of AI applications themselves. At the same time, artificial intelligence is already contributing to energy savings in a host of other areas: energy consumption in buildings, for example, can be dramatically reduced by taking into account information such as usage habits or weather forecasts. In the renewables sector, AI can help with forecasting energy availability and stabilizing the grid.
In general terms, many AI applications are geared toward more efficient use of resources and are therefore likely to make a growing contribution to improving our carbon footprint. This benefit must be weighed against the energy consumed by the technology itself. Overall, the savings brought about by artificial intelligence can be expected to outweigh its additional energy consumption by a significant margin. With this in mind, and quite apart from the difficulty of implementing it, regulation might actually have a negative impact in terms of climate targets.
Gabriele Röger is a postdoctoral researcher in the Artificial Intelligence research group at the Department of Mathematics and Computer Science. She deals primarily with automated planning and search in large state spaces.