People often compare the artificial intelligence industry to the oil industry: once data is extracted, it can be refined into something of great value. The analogy holds in another respect as well: like oil, deep learning carries an environmental cost.
In a new research report, scientists at the University of Massachusetts assessed the life cycle of several large-scale artificial intelligence models. They found that training a single AI model can release 284 tonnes of carbon dioxide, roughly five times the lifetime emissions of an average American car, from manufacture to scrapping.
AI researchers have long questioned the environmental impact of these giant computing systems. "To many of us this pollution seemed abstract and vague, but the numbers don't lie," warns computer scientist Carlos Gómez-Rodríguez, who was not involved in the study.
"Other researchers and I have discussed this issue before; now we can confirm it is a real problem."
The new study examined only natural-language processing (NLP), the branch of AI concerned with teaching machines to handle human language.
Over the past two years, the NLP research community has reached several notable milestones in tasks such as sentence completion, accurate translation, and other benchmark tests. OpenAI's GPT-2 system is famous for writing articles that are hard to distinguish from the real thing.
But those advances come at a cost. Language models need vast amounts of input data, and processing that information requires a great deal of energy.
The study focused on four models that have marked important milestones in the field: the Transformer, ELMo, BERT, and GPT-2.
In testing, each system was run on a single GPU for a day to sample its energy consumption. That figure was then multiplied by the model's total training time to get the final result.
The energy consumed was then converted into an equivalent amount of CO2 emissions. The final figure was startling: 284 tonnes of carbon dioxide.
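The extrapolation described above, sampling a model's power draw, scaling by total training time, and converting to CO2, is essentially a back-of-the-envelope calculation. Here is a minimal Python sketch of that kind of estimate; the default values for datacenter overhead (PUE) and grid carbon intensity are illustrative assumptions, not figures taken from the study.

```python
def estimate_co2_kg(avg_power_watts, training_hours,
                    pue=1.58, kg_co2_per_kwh=0.433):
    """Rough CO2 estimate for a single training run.

    avg_power_watts -- sampled draw of the GPU and supporting hardware
    pue             -- power usage effectiveness: datacenter overhead (assumed)
    kg_co2_per_kwh  -- grid carbon intensity (assumed, rough US average)
    """
    kwh = (avg_power_watts / 1000.0) * training_hours * pue
    return kwh * kg_co2_per_kwh

# e.g. one 250 W GPU running for 24 hours:
print(round(estimate_co2_kg(250, 24), 2))  # → 4.1 (kg of CO2)
```

A multi-week run on dozens of GPUs scales this linearly, which is how a single large training job can reach tonnes rather than kilograms.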
They found that computational cost, and the carbon footprint that comes with it, grows with the size of the model, then explodes when extra accuracy is squeezed out of the final result. The accuracy-boosting step in question, called neural architecture search, tunes a model's design through repeated trial runs, and it is extremely energy-intensive for little added benefit.
Skip that step, and the most expensive model, BERT, would "only" emit 6,350 kg of carbon dioxide.
The 284-tonne figure corresponds to the Transformer system trained with neural architecture search.
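Taking the article's own figures, the architecture-search run dwarfs ordinary training by well over an order of magnitude; a quick sanity check:

```python
# Figures as reported in the article, in kg of CO2
transformer_with_nas = 284_000  # Transformer trained with neural architecture search
bert_without_nas = 6_350        # most expensive single model without that step

ratio = transformer_with_nas / bert_without_nas
print(round(ratio))  # → 45, i.e. roughly a 45-fold difference
```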
Worse still, the researchers found that this worrying figure should be read only as a lower bound. "Training a single machine learning model is the minimum amount of work you can do," said Emma Strubell, who led the new study.
In practice, AI researchers either develop new systems from scratch or adapt existing ones to handle more types of data; both processes demand many additional rounds of training and fine-tuning.
To estimate the carbon emitted over a full cycle of AI research and development, Strubell and colleagues drew on their own records: the building and testing of 4,789 machine learning models over a six-month period. Converted into CO2 emissions, the total comes to 35,380 kg, more than 35 tonnes.
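The six-month total implies a modest per-run average, which underlines that the cost comes from the sheer volume of runs rather than from any single one; a quick check using the article's numbers:

```python
total_kg = 35_380  # CO2 over the six-month development cycle (article figure)
runs = 4_789       # models built and tested in that period (article figure)

avg_kg_per_run = total_kg / runs
print(round(avg_kg_per_run, 1))  # → 7.4 (kg of CO2 per run, on average)
```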
These numbers are worrying in themselves, and all the more so as the modern technology industry looks to lean ever harder on AI. Machine learning systems can perform many useful tasks, and large corporations and research institutes stand to benefit greatly from data analysis.
Sooner or later, though, we will have to ask: is it worth it?
The new results also point to another problem: producing a significant research result means feeding these data-hungry machines enormous resources, a burden that falls largely on the shoulders of academics.
"The academy is unable to keep up with the new trend in artificial intelligence system training, graduates are particularly affected, because we do not possess great computing power."Strubell said.
"This is a fair issue between academics in the academy and AI researchers".
Strubell and colleagues hope fellow researchers will take note of the report and work on improving existing hardware and writing more efficient AI algorithms.
The human brain processes information miraculously well without requiring much energy; the big question is how to build a machine with similar capabilities.
Source: MIT Technology Review