

Industry concern: how much power is needed to train ChatGPT


Shulou (Shulou.com) 11/24 Report --

According to a March 10 report, AI has re-emerged as a hot topic in the technology industry and is expected to revolutionize trillion-dollar industries ranging from retail to pharmaceuticals. But every new chatbot or image generator consumes a great deal of electricity, which means the technology may be responsible for large amounts of greenhouse gas emissions that in turn worsen global warming.

Microsoft, Google and ChatGPT maker OpenAI all rely on cloud computing, which uses thousands of chips in servers in massive data centers around the world to train the AI algorithms known as models and to analyze data that helps them "learn" how to perform tasks. ChatGPT's success has prompted other companies to launch their own AI systems and chatbots, or to develop products built on large AI models.

AI uses more energy than other forms of computing; training a single model can consume more electricity than 100 American households use in a year. Yet while the AI industry is growing very fast, it is not transparent enough for anyone to know for certain how much electricity it uses in total or how much carbon it emits. Carbon emissions also vary widely depending on the type of power plant supplying the electricity: data centers drawing on coal or natural gas have significantly higher emissions than those powered by solar or wind.

Although researchers have tallied the carbon emissions from creating individual models, and some companies have provided data on their energy use, there is no overall estimate of the technology's total electricity consumption. Sasha Luccioni, a researcher at the AI company Hugging Face, wrote a paper quantifying the carbon emissions of her company's model BLOOM, a competitor to OpenAI's GPT-3. She has also tried to estimate the carbon emissions of OpenAI's chatbot ChatGPT from the limited public data available.

Researchers such as Luccioni say more transparency is needed around the electricity use and emissions of AI models. With that information, governments and companies could decide whether using GPT-3 or other large models to study cancer treatments or preserve indigenous languages is worth the cost.

Greater transparency may also bring more scrutiny, and the cryptocurrency industry offers a cautionary tale. According to the Cambridge Bitcoin Electricity Consumption Index, Bitcoin has been criticized for consuming as much electricity each year as Argentina. That voracious demand for power prompted New York to pass a two-year moratorium on permits for cryptocurrency miners powered by fossil fuels.

GPT-3 is a single, general-purpose AI program that generates language and can be put to many different uses. A research paper published in 2021 showed that training GPT-3 consumed 1.287 gigawatt-hours of electricity, roughly the amount 120 American households use in a year. That training also produced 502 metric tons of carbon, equivalent to the annual emissions of 110 American cars. And that is the cost of training just one program, or "model."
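As a quick sanity check on those comparisons, here is a minimal back-of-envelope sketch; the per-household and per-car baselines are assumed values based on commonly cited U.S. averages, not figures from the article.

```python
# Back-of-envelope check of the GPT-3 training figures cited above.
# Assumptions (not from the article): an average U.S. household uses
# roughly 10,600 kWh of electricity per year, and an average U.S.
# passenger car emits roughly 4.6 metric tons of CO2 per year.

TRAINING_ENERGY_KWH = 1.287e6        # 1.287 gigawatt-hours, expressed in kWh
TRAINING_EMISSIONS_TONS = 502        # metric tons of carbon reported for training

HOUSEHOLD_KWH_PER_YEAR = 10_600      # assumed U.S. average
CAR_TONS_PER_YEAR = 4.6              # assumed U.S. average

households = TRAINING_ENERGY_KWH / HOUSEHOLD_KWH_PER_YEAR
cars = TRAINING_EMISSIONS_TONS / CAR_TONS_PER_YEAR

print(f"Energy equivalent: ~{households:.0f} U.S. households for a year")
print(f"Emissions equivalent: ~{cars:.0f} U.S. cars for a year")
# Prints roughly 121 households and 109 cars, consistent with the
# "120 households" and "110 cars" comparisons in the text.
```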

And although the upfront power cost of training an AI model is huge, researchers have found that in some cases it accounts for only about 40 percent of the electricity the model ultimately consumes, with the rest used once the model is deployed. Models are also getting bigger and bigger: OpenAI's GPT-3 uses 175 billion parameters, or variables, while its predecessor GPT-2 used only 1.5 billion.

OpenAI is already working on GPT-4, and models must be retrained regularly to keep them abreast of current events. "If you don't retrain the model, it may not even know what COVID-19 is," said Emma Strubell, a professor at Carnegie Mellon University and one of the first researchers to study AI's energy use.

Another point of reference comes from Google, where researchers found that AI training accounts for 10% to 15% of the company's total electricity consumption, which was 18.3 terawatt-hours in 2021. That implies Google's AI consumes about 2.3 terawatt-hours a year, roughly the annual electricity consumption of all the households in Atlanta.
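The 2.3-terawatt-hour figure follows directly from the two numbers in that paragraph; a minimal sketch of the arithmetic:

```python
# Arithmetic behind the Google estimate cited above: 10-15% of the
# company's reported 18.3 TWh of electricity use in 2021.

TOTAL_TWH_2021 = 18.3
AI_SHARE_LOW, AI_SHARE_HIGH = 0.10, 0.15

low = TOTAL_TWH_2021 * AI_SHARE_LOW
high = TOTAL_TWH_2021 * AI_SHARE_HIGH
midpoint = (low + high) / 2

print(f"AI electricity use: {low:.2f}-{high:.2f} TWh (midpoint ~{midpoint:.1f} TWh)")
# Prints 1.83-2.75 TWh with a midpoint of ~2.3 TWh, matching the
# figure quoted in the text.
```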

Technology giants make net-zero commitments

Although AI models keep getting bigger, in many cases AI is also improving so that it runs more efficiently. The largest cloud computing companies in the United States, including Microsoft, Google and Amazon, have all pledged to cut carbon emissions or reach net zero. Google said in a statement that it would achieve net-zero emissions across all its businesses by 2030, with the goal of running its offices and data centers entirely on carbon-free energy. Google is also using AI to improve the energy efficiency of its data centers, with the software directly controlling the cooling systems in those facilities.

OpenAI also pointed to work it has done to improve the efficiency of the ChatGPT API, helping customers cut electricity consumption and prices. "We take our responsibility to stop and reverse climate change very seriously, and we think a lot about how to make the best use of our computing power," an OpenAI spokesperson said. "OpenAI runs on Azure, and we work closely with Microsoft's teams to improve the efficiency of running large language models and reduce carbon emissions."

Microsoft noted that it is buying renewable energy and taking other steps toward its previously announced goal of reaching net-zero emissions by 2030. "As part of our commitment to create a more sustainable future, Microsoft is investing in research to measure the energy use and carbon impact of AI, while working to improve the efficiency of large systems in both training and application," the company said in a statement.

Roy Schwartz, a professor at the Hebrew University of Jerusalem, worked with a Microsoft team to measure the carbon footprint of a large AI model. "Obviously, these companies are reluctant to disclose what model they are using and how much carbon it emits," he said.

There are ways to make AI run more efficiently. Because AI training can be done at any time, developers and data centers can schedule it for hours when electricity is cheaper or in surplus, making their operations greener, says Ben Hertz-Shargel of the energy consultancy Wood Mackenzie. Companies that train their models during hours of surplus electricity can then use that as a selling point to show they are environmentally friendly.
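As an illustration of the scheduling idea Hertz-Shargel describes, here is a minimal, hypothetical sketch: the hourly carbon-intensity values and the four-hour job length are invented for the example, and a real deployment would pull a forecast from the local grid operator instead.

```python
# Hypothetical sketch of carbon-aware scheduling: given an hourly
# forecast of grid carbon intensity (gCO2 per kWh), pick the
# contiguous window with the lowest average intensity to run a
# deferrable training job. The forecast values below are invented
# for illustration only.

from typing import List, Tuple

def best_window(forecast: List[float], job_hours: int) -> Tuple[int, float]:
    """Return (start_hour, average_intensity) of the cleanest window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast) - job_hours + 1):
        avg = sum(forecast[start:start + job_hours]) / job_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# 24 invented hourly carbon-intensity values (gCO2/kWh); in practice
# these would come from a grid operator's or utility's forecast.
forecast = [450, 430, 410, 390, 350, 300, 260, 240, 230, 250, 280, 320,
            360, 400, 420, 440, 460, 470, 450, 420, 400, 380, 370, 360]

start, avg = best_window(forecast, job_hours=4)
print(f"Start the 4-hour job at hour {start} (avg ~{avg:.0f} gCO2/kWh)")
```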

Most data centers use graphics processing units (GPUs) to train AI models, and these are among the most power-hungry components the chip industry makes. According to a report released by Morgan Stanley analysts earlier this month, large models require tens of thousands of GPUs, with training runs lasting from weeks to months.
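For a sense of scale, here is a rough order-of-magnitude sketch under purely hypothetical assumptions: 10,000 GPUs drawing about 300 watts each for a one-month run. None of these numbers come from the article or the Morgan Stanley report.

```python
# Rough order-of-magnitude estimate of training energy for a large
# GPU fleet. All inputs are hypothetical assumptions for illustration.

NUM_GPUS = 10_000          # assumed fleet size ("tens of thousands")
WATTS_PER_GPU = 300        # assumed average draw per GPU
TRAINING_DAYS = 30         # assumed one-month training run

energy_kwh = NUM_GPUS * (WATTS_PER_GPU / 1000) * TRAINING_DAYS * 24
print(f"~{energy_kwh / 1e6:.1f} GWh for the run")
# ~2.2 GWh: the same order of magnitude as the GPT-3 training figure
# cited earlier, which is why GPU counts and run lengths matter so much.
```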

One of the bigger mysteries in AI is the total amount of carbon emissions associated with the chips used. Nvidia, the largest GPU maker, says its chips are faster and generally more efficient when it comes to AI tasks.

"using GPU to accelerate AI is faster and more efficient than using CPU," Nvidia said in a statement. "it is generally 20 times more efficient for some AI workloads and 300 times more efficient for large language models that are essential to generative artificial intelligence."

Luccioni said that although Nvidia has disclosed its direct and indirect energy-related emissions, the company has not released further details. She believes that when Nvidia does share that information, we may find that its GPUs consume about as much electricity as a small country. "This may drive people crazy," she said.
