To test its approach, Hugging Face estimated the emissions of its own large language model, BLOOM. The estimate covered the energy used to train the model on a supercomputer, the energy needed to manufacture the supercomputer's hardware and maintain its computing infrastructure, and the energy required to run BLOOM once it had been deployed. The researchers used a software tool called CodeCarbon to track BLOOM's carbon emissions over a period of 18 days.
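CodeCarbon is an open-source Python library that estimates the energy drawn by the CPUs, GPUs, and RAM running a piece of code and converts it into CO2-equivalent emissions using the local grid's carbon intensity. As a rough sketch of how such tracking can be wired around a workload (the train() function and project name below are illustrative placeholders, not BLOOM's actual training code):

    from codecarbon import EmissionsTracker

    def train():
        # Placeholder for the real training workload.
        ...

    tracker = EmissionsTracker(project_name="bloom-style-demo")
    tracker.start()                # begin sampling power draw in the background
    train()
    emissions_kg = tracker.stop()  # returns estimated kilograms of CO2-equivalent
    print(f"Estimated emissions: {emissions_kg:.4f} kg CO2eq")

By default the tracker also writes a detailed report to an emissions.csv file, which is how figures like the ones below can be logged over a multi-day run.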

According to Hugging Face, BLOOM's training alone resulted in 25 metric tons of carbon emissions, a figure that roughly doubled once the researchers factored in the emissions from manufacturing the hardware, running the broader computing infrastructure, and operating BLOOM after its launch.
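In other words, the training-only number is just one term in a life-cycle sum. A sketch of that accounting, using illustrative placeholder values (not the paper's actual breakdown) chosen only to show how a training-only figure can roughly double:

    # All values are illustrative placeholders, NOT Hugging Face's published numbers.
    emissions_tonnes = {
        "training_electricity":   25.0,  # the training-only figure quoted above
        "hardware_manufacturing": 11.0,  # embodied emissions of GPUs, servers, etc. (assumed)
        "idle_infrastructure":    10.0,  # cooling, networking, idle nodes (assumed)
        "deployment":              4.0,  # running the model after launch (assumed)
    }
    total = sum(emissions_tonnes.values())
    print(f"Life-cycle total: {total:.0f} tonnes CO2eq")  # -> 50 tonnes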

Fifty metric tons of carbon emissions is the equivalent of around 60 flights between London and New York, yet it is still less than the emissions associated with other LLMs of the same size. That is partly because BLOOM was trained on a French supercomputer powered mostly by nuclear energy, which does not produce carbon emissions; models trained in parts of the US and China, where energy grids rely more heavily on fossil fuels, would likely be more polluting.

According to Hugging Face, running the deployed model emits around 19 kilograms of carbon dioxide per day, similar to the emissions from driving about 54 miles in an average new car.
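The flight and driving comparisons above are straightforward unit conversions. As a quick sanity check, assuming roughly 0.85 metric tons of CO2 per passenger on a one-way London-New York flight and roughly 0.35 kilograms of CO2 per mile for an average new car (both emission factors are assumptions for illustration, not figures from the study):

    FLIGHT_TONNES_PER_PASSENGER = 0.85  # assumed: one-way London-New York, per passenger
    CAR_KG_PER_MILE = 0.35              # assumed: average new passenger car

    print(50 / FLIGHT_TONNES_PER_PASSENGER)  # ~59 flights for 50 tonnes
    print(19 / CAR_KG_PER_MILE)              # ~54 miles for 19 kg per day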

OpenAI's GPT-3 and Meta's OPT were estimated to emit more than 500 and 75 metric tons of carbon dioxide, respectively, during training. GPT-3's outsized emissions can be partly explained by the fact that it was trained on older, less efficient hardware. There is no standardized way to measure carbon emissions, however, and these figures are based on external estimates or limited data released by the companies, so they remain uncertain.

The researchers said their goal was to account for a larger part of the life cycle in order to help the artificial intelligence community get a better idea of its impact on the environment.

Emma Strubell, an assistant professor in the School of Computer Science at Carnegie Mellon University, wrote a 2019 paper on the impact of artificial intelligence on the climate. She was not involved in the new research.

According to Strubell, the new paper presents the most thorough, honest, and knowledgeable analysis of the carbon footprint of a large ML model to date.