
Innovations in AI sustainability

Generative artificial intelligence (AI) has recently attracted a lot of attention. It is an artificial intelligence algorithm that is capable of generating different types of content, including text, images, videos or other data, for example in response to a user query. Examples of generative AI include Gemini, ChatGPT, AlphaCode, Midjourney and DALL-E. Generative AI is being used in a wide range of technologies, including software development, finance, healthcare, entertainment and many more. Its use is estimated to significantly increase in the coming years. 

There are various concerns surrounding the use of generative AI, one of which relates to the energy usage of AI algorithms. According to Goldman Sachs, a ChatGPT query needs nearly 10 times as much electricity to process as a Google search. Analysts estimate that the power demand of data centres, which house the processing infrastructure used to run and train generative AI algorithms, will grow by 160% by 2030. To put this into context, data centres worldwide presently consume 1-2% of overall power – this is likely to increase to 3-4% by the end of the decade, a two-fold increase. This will undoubtedly also affect the carbon emissions of these data centres, which may more than double between 2022 and 2030.

Generative AI offers great possibilities to solve a variety of the world’s problems, including accelerating the energy transition and the race to meet net zero targets. However, the energy requirements of utilising these revolutionary algorithms may partially offset the benefits. Just as generative AI stemmed from innovation in computer science, innovation has a role to play in ensuring AI is sustainable and its environmental footprint is managed. Generally, AI use involves two stages: training, in which models learn and develop by digesting vast amounts of data; and inference, in which trained AI algorithms are applied to solve real-world problems. According to industry estimates, the training stage is responsible for about 20% of the environmental footprint, while inference accounts for the lion’s share at about 80%.

With this information in hand, researchers are exploring actionable steps we can take to make AI more sustainable. For example, power usage can be capped during inference, reducing power consumption with only a small trade-off in the time taken to complete AI tasks. Another evolving area is time-shifting AI workloads to align with periods of lower energy demand, or planning larger projects for cooler months, to reduce the electricity data centres use for cooling.
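As a toy illustration of the time-shifting idea, the sketch below picks the start hour that minimises the average grid carbon intensity for a batch AI job. The 24-hour forecast values are entirely made up for illustration; a real scheduler would pull live figures from a grid operator's forecast service.

```python
# Hypothetical sketch of time-shifting an AI workload: choose the
# start hour that minimises average grid carbon intensity over the
# job's duration. Forecast numbers below are illustrative only.

forecast_g_co2_per_kwh = [
    320, 310, 290, 270, 250, 240,  # 00:00-05:00 (overnight lull)
    260, 300, 380, 420, 430, 410,  # 06:00-11:00 (morning ramp)
    350, 330, 340, 360, 400, 450,  # 12:00-17:00
    470, 460, 430, 400, 370, 340,  # 18:00-23:00 (evening peak easing)
]

def best_start_hour(forecast, job_hours):
    """Return the start hour minimising the average carbon intensity
    over a contiguous job lasting `job_hours` hours."""
    windows = [
        sum(forecast[h:h + job_hours]) / job_hours
        for h in range(len(forecast) - job_hours + 1)
    ]
    return min(range(len(windows)), key=windows.__getitem__)

start = best_start_hour(forecast_g_co2_per_kwh, job_hours=3)
print(f"Schedule 3-hour batch job at {start:02d}:00")  # here: 04:00
```

With these illustrative numbers, the scheduler lands the job in the overnight lull, when demand (and typically carbon intensity) is lowest.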

Innovations are also being explored in the data centres themselves to reduce energy usage. For example, Microsoft has been exploring submerged data centres for several years; a shipping-container-sized data centre was retrieved from the water in 2020 after two years spent studying the performance and reliability of its servers. While underwater data centres may reduce energy demands for cooling, accessibility and maintenance are still issues to be resolved. Another avenue for research involves data centre waste heat recovery. The promise here is great, with one white paper estimating that in Germany alone more than 13 billion kWh of electricity per year are being converted to heat and mostly released unused into the environment. Using this waste heat to warm swimming pools or greenhouses, specifically for vertical farming, could present a win-win scenario for sustainability.

While AI innovators are protecting their research via the IP system at an accelerating rate, the next wave of AI innovation may not be in the algorithms themselves or their use cases, but how to make them more sustainable. Protecting these innovations will help innovators maximise the commercial value of their R&D, and ensure the path to net-zero continues moving forward.

 



Tags

artificial intelligence, energy & environment, sustainability