
Power surge

It is clear to Iron Mountain Data Centers that by far the greatest challenge in supporting generative AI is the huge surge in power loads. Generative AI models use graphics processing unit (GPU) chips, which require 10 to 15 times the energy of a traditional CPU. Many models have billions of parameters and require fast, efficient data pipelines during their training phase, which can take months to complete. ChatGPT 3.5, for instance, has 175 billion parameters and was trained on more than 500 billion words of text.

Training a ChatGPT 3.5-scale model requires 300 to 500MW of power. Currently, a typical data centre requires 30 to 50MW. One of IMDC's larger campuses, in Northern Virginia, has capacity for 10 data centres; the entire power load of that campus would be needed to train ChatGPT 3.5. While LLMs are definitely at the most power-hungry end of the generative AI boom, every generative model IMDC has worked with has processor and power needs that grow exponentially, doubling or tripling each year.
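To make the comparison concrete, the arithmetic above can be multiplied out directly. The short Python sketch below does so using only the figures quoted in this section; the variable names and the five-year growth horizon are illustrative assumptions, not figures supplied by IMDC.

# Back-of-envelope check of the figures quoted above. All numbers are the
# estimates cited in this section; variable names are illustrative only.

TRAINING_POWER_MW = (300, 500)     # estimated power to train a ChatGPT 3.5-scale model
TYPICAL_DC_POWER_MW = (30, 50)     # power load of a typical data centre today
CAMPUS_DATA_CENTRES = 10           # data-centre capacity of the Northern Virginia campus

# Total campus capacity at the low and high end of the typical range.
campus_low = CAMPUS_DATA_CENTRES * TYPICAL_DC_POWER_MW[0]    # 300 MW
campus_high = CAMPUS_DATA_CENTRES * TYPICAL_DC_POWER_MW[1]   # 500 MW
print(f"Campus capacity: {campus_low}-{campus_high} MW")
print(f"Training demand: {TRAINING_POWER_MW[0]}-{TRAINING_POWER_MW[1]} MW")
# The two ranges coincide: training one such model would absorb the whole campus.

# Power needs that double or triple each year, normalised to 1.0 today
# (a five-year horizon is assumed here purely for illustration).
for factor in (2, 3):
    trajectory = [factor ** year for year in range(5)]
    print(f"Growth at x{factor} per year: {trajectory}")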
Forecasting the power requirements of generative AI over time is hard to do with any accuracy, but most analysts agree that it will ramp up current requirements hugely. If one estimates current data centre compound growth at a relatively modest