Training GPT-3 produced greenhouse gas emissions comparable to burning 50,000+ gallons of gasoline. This is concerning, isn't it? So, today let us learn the facts about energy consumption in AI models and why greener Generative AI matters for a more sustainable world.
How Much Energy Does ChatGPT Use Per Day?
The energy usage of the GPT-3 model per day is around 2,916 kWh.
When you enter a prompt in a conversational AI model like ChatGPT (based on the Generative Pre-trained Transformer, or GPT), the model dynamically generates a response that mimics human writing. This process requires high-performance computing resources, including powerful CPUs, GPUs, or specialized AI accelerators like TPUs.
To handle large-scale deployments and high volumes of user interactions, GPT-3, with its 175 billion parameters, may be run across distributed computing infrastructure. All of this consumes energy, because every request is ultimately served by physical hardware in data centers.
We asked ChatGPT itself to determine its daily consumption but did not get a credible answer.
So, we have to estimate the energy consumption another way. According to this research paper, training the GPT-3 model required the equivalent of 405 GPU-years on Nvidia V100 GPUs, which are known to consume around 300 W of power each. In simpler terms, it would take approximately 405 V100 GPUs working simultaneously for one year to complete the training process.
Using this figure, the training energy of GPT-3 works out to 300 W (V100 power draw) * 24 hours per day * 365 days per year * 405 GPU-years = 1,064,340 kWh, or approximately 1,064 MWh.
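The arithmetic above can be sketched in a few lines of Python, using only the figures already quoted in this article (405 V100 GPU-years at roughly 300 W per GPU):

```python
# Sketch of the article's training-energy estimate for GPT-3,
# assuming 405 V100 GPU-years at ~300 W per GPU (figures from the text).

GPU_POWER_W = 300          # approximate V100 power draw
GPU_YEARS = 405            # V100 GPU-years reported for training GPT-3
HOURS_PER_YEAR = 24 * 365

# Total training energy in kWh, then MWh
energy_kwh = GPU_POWER_W * HOURS_PER_YEAR * GPU_YEARS / 1000
energy_mwh = energy_kwh / 1000

print(f"Training energy: {energy_kwh:,.0f} kWh (~{energy_mwh:,.0f} MWh)")
```

This reproduces the roughly 1,064 MWh cited above; swapping in a different GPU power draw or GPU-year count gives the estimate for other hardware.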
This resulted in an emission of approximately 460 metric tons of CO2 equivalent.
Now, using this Greenhouse Gas Equivalencies Calculator, that figure is equivalent to:
- Greenhouse gas emissions from 1,179,951 miles driven by an average gasoline-powered passenger vehicle, or
- CO2 emissions from 58 homes' energy use for one year
Per-Day Energy Consumption of GPT-3
The energy usage of GPT-3 per day is 300 W (V100 power draw) * 24 hours per day * 405 GPUs = 2,916 kWh approximately.
That is equal to about 1.3 metric tons of CO2 equivalent, which is similar to the CO2 emissions from:
- 1413 pounds of coal burned or
- 153,446 smartphones charged
Note: These figures may vary as ChatGPT releases new versions over time or there could be updates in methodologies and variables.
How Much Energy Does ChatGPT Consume Per Month?
The energy usage of GPT-3 per month can be calculated as:
300 W (V100 power draw) * 24 hours per day * 30 days (on average) * 405 GPUs = 87,480 kWh approximately. This is equivalent to:
- The energy use of 4 homes for one year
- Greenhouse gas emissions from 97,013 miles driven by an average gasoline-powered passenger vehicle
- CO2 emissions from 3,717 gallons of diesel fuel consumed
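The per-day and per-month figures follow from the same formula as the annual one, just with fewer hours. A quick sketch, using the 405-GPU, 300 W assumptions from this article:

```python
# Daily and monthly energy for 405 V100 GPUs at ~300 W each
# (assumptions taken from the figures in the text).

GPU_POWER_W = 300
NUM_GPUS = 405

def energy_kwh(hours):
    """Energy in kWh for all GPUs running the given number of hours."""
    return GPU_POWER_W * NUM_GPUS * hours / 1000

per_day = energy_kwh(24)         # 2,916 kWh
per_month = energy_kwh(24 * 30)  # 87,480 kWh

print(f"Per day: {per_day:,.0f} kWh, per month: {per_month:,.0f} kWh")
```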
Given this information, the natural next question is OpenAI ChatGPT's energy use per query. However, it is not advisable to simply divide the energy consumed for training ChatGPT (1,064 MWh) by the number of queries in a single day: a trained model serves queries over many days, months, or years, making it challenging to accurately track the number of ChatGPT queries over time.
Also Read: ChatGPT’s Thirsty Data Centers are draining Water Resources
How Much Electricity or Energy does AI Consume?
The current AI energy usage trends suggest that the technology may consume up to 29.3 TWh of electricity annually. This is equivalent to the total annual electricity consumption of the entire country of Ireland.
By 2027, global AI-related electricity usage could surge to 85.4–134.0 TWh annually, mainly driven by newly manufactured servers. This range is comparable to the annual electricity consumption of countries like the Netherlands, Argentina, and Sweden.
How to Make Generative AI Greener
You can follow these tips to make Generative AI greener:
1. Leverage Existing Large Generative Models Rather than Creating New Ones: Companies other than big vendors and cloud providers rarely need to create large models of their own. Those vendors already have the data and vast computing power in the cloud, so everyone else can skip the hassle of building one. Reusing existing models reduces the need for extensive training, which consumes significant computational resources.
2. Implement Energy-Conserving Computational Methods: Processing data locally on tiny microcontrollers can be orders of magnitude more power-efficient than transferring data to external servers. For example, use techniques like sparsity regularization, low-precision arithmetic, and TinyML for data processing to reduce the computational workload.
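As an illustration of one of these techniques, here is a minimal sketch of low-precision (int8) quantization. The function names and values are hypothetical; production systems use frameworks such as TensorFlow Lite or ONNX Runtime rather than hand-rolled code like this:

```python
# Illustrative sketch of int8 quantization (low-precision arithmetic).
# Names and values are hypothetical, for explanation only.

def quantize_int8(weights):
    """Map float weights to int8 range [-127, 127] with one scale factor."""
    scale = max(abs(w) for w in weights) / 127
    return [round(w / scale) for w in weights], scale

def dequantize(q_weights, scale):
    """Recover approximate float weights from the int8 values."""
    return [q * scale for q in q_weights]

weights = [0.82, -1.27, 0.05, 0.4]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# Storing int8 takes 1 byte per weight instead of 4 (float32):
# a 4x memory reduction, and int8 math is cheaper on most hardware.
```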
3. Apply a Large Model Only When It Offers Real Value: Avoid using unnecessarily large models for tasks where smaller models suffice. Before resorting to machine learning or artificial intelligence at all, developers should explore alternative solutions through research and analysis. Only when a large model provides substantial value should it be considered, ensuring that resources are used efficiently to address the problem at hand.
4. Be Selective About When You Use Generative AI: Exercise caution in employing generative AI, especially for tasks like generating blog posts or amusing stories, which may not warrant the heavy computational resources required. Avoid deploying generative models for tasks where simpler, traditional methods are sufficient, thereby reducing unnecessary energy consumption.
5. Promote Renewable Energy Sources for Cloud Providers and Data Centers: Support initiatives that power AI infrastructure and operations with renewable energy, to minimize the carbon intensity of AI and software.
6. Re-Use Resources: Reuse technology whenever possible, and recycle materials from older components such as laptops and processors into newer ones; this minimizes the environmental impact of resource extraction.
7. Include AI Activity in Your Carbon Monitoring: Track the carbon footprint associated with your AI activities, including both training and inference. Companies should publish this monitoring data so customers can make informed decisions about AI-related engagements. Emissions calculations rely on data from suppliers and processing firms, such as research labs and AI service providers like OpenAI.
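The core of such carbon accounting is a simple conversion: energy used multiplied by the carbon intensity of the grid that supplied it. A minimal sketch, where the intensity value is a hypothetical grid average rather than a measured figure:

```python
# Minimal sketch of folding AI energy use into carbon accounting:
# emissions = energy (kWh) * grid carbon intensity (kg CO2e per kWh).
# 0.4 kg/kWh is a hypothetical grid average, not a measured figure.

GRID_INTENSITY_KG_PER_KWH = 0.4

def co2e_kg(energy_kwh):
    """Estimated emissions in kg CO2e for the given energy use."""
    return energy_kwh * GRID_INTENSITY_KG_PER_KWH

# e.g. one week of inference on a single 300 W accelerator
weekly_kwh = 300 * 24 * 7 / 1000   # 50.4 kWh
print(f"~{co2e_kg(weekly_kwh):.1f} kg CO2e")
```

Real deployments should use the actual carbon intensity reported by their grid or cloud provider, which varies widely by region and time of day.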
8. Explore Smart Algorithm Techniques: Invest in research and development of energy-efficient algorithms for generative AI. Algorithms are like recipes: they guide software and hardware (the chefs and kitchen appliances, respectively) in interpreting visuals and audio to derive insights. Just as a streamlined recipe simplifies cooking, an efficient algorithm optimizes data processing, reducing the workload on software and hardware components. Lower computing requirements mean greener AI practices.
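One concrete example of such a technique is magnitude pruning, where near-zero weights are dropped so that sparsity-aware hardware can skip the corresponding computations. A hypothetical sketch, with made-up threshold and weight values:

```python
# Hypothetical sketch of magnitude pruning, one "smart algorithm" technique:
# weights near zero are removed so hardware can skip those computations.

def prune(weights, threshold=0.1):
    """Zero out weights whose magnitude falls below the threshold."""
    return [w if abs(w) >= threshold else 0.0 for w in weights]

weights = [0.9, -0.02, 0.15, 0.005, -0.6]
sparse = prune(weights)
# 2 of 5 weights are zeroed; a sparsity-aware kernel then does ~40% less work
```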
9. Invest in Carbon Assessment: For instance, companies could partner with organizations like EARTHLY to assess their carbon footprint using industry standards, or use packages and online tools such as CodeCarbon, Green Algorithms, and ML CO2 Impact to estimate the emissions of their code at runtime. These services provide customized proposals for carbon-mitigation strategies, allowing companies to monitor their environmental impact through the platform.
Also See: Australia-based Neara’s AI Technology to Protect Utilities from Extreme Weather
Google Search Energy Consumption Per Year
Google’s energy usage surged to 22.29 TWh in the year 2022, up from 12.7 TWh in 2019.
However, Google has undertaken many initiatives to address its energy consumption. It has committed to powering its operations with 100% renewable energy, investing in projects such as wind and solar farms as well as carbon offset projects to compensate for its emissions. It is also focusing on improving the energy efficiency of its data centers and facilities.
The graph below shows Google's energy consumption for the years 2018 to 2022.
We embrace technological advancements, but it is essential to understand the energy footprint of AI models like OpenAI's ChatGPT, which handles millions of queries each day. That footprint depends on factors like the size of the model, the complexity of its tasks, and the hardware used for computation. To help create a more sustainable world, you can also check out the positive and negative environmental impacts of artificial intelligence.