In 2023, the world has been captivated by the growing potential of artificial intelligence and the opportunity it holds for the future of technology. From industry-specific chatbots to niche AI capabilities, the fascination of tech enthusiasts and industry leaders alike with these intelligent systems has been unmatched.
But what’s really going on behind the scenes? Recently, users of one of the most popular AI models - ChatGPT - were met with an error message on the site, reading ‘ChatGPT is at capacity.’ Developed by OpenAI, ChatGPT is a language model designed to generate ‘human-like’ responses, articles, essays and stories from user-written prompts and questions. A user could give the instruction ‘write an article about the stock market’, for example, and be handed a fully fleshed-out, grammatically correct piece on that very topic.
Naturally, this tool was becoming increasingly popular with the general public, not just for its ability to assist in written communications, but also for the novelty of having a computer literally do your homework for you. Therefore, when ChatGPT suddenly announced it had hit capacity, it made waves online and raised some big questions about the sustainability of AI itself.
What Does ‘Hitting Capacity’ Mean?
Every piece of technology, every website and every online tool requires a certain amount of processing power to help it run, with servers located in massive data centres across the country. Whilst the exact locations of these data centres are undisclosed, they’re likely to be placed strategically - to allow as many users as possible to access the software without suffering long loading times. Currently, OpenAI sources its servers through Microsoft Azure - one of several big-name companies offering outsourced data centres to technology companies for a fee. Brands such as Google, Amazon and Microsoft have a significant number of data centres and servers available, and these servers can be accessed with a simple click of a button - meaning that it isn’t a lack of resources causing ChatGPT to hit capacity.
Instead, ChatGPT rose to popularity far faster than anyone could have imagined, reaching over 100 million active users in just two months - marking it as the fastest-growing application in history, easily surpassing the figures of Facebook, Instagram and TikTok. From a business perspective, this means that the product is in demand, and could easily be monetised and privatised for its growing audience. Announcing that the product had ‘hit capacity’ means users will seek alternative ways to access the service and will be willing to pay for memberships and subscriptions to keep using the AI.
What The Popularity of ChatGPT Means for Power Usage
ChatGPT’s unprecedented success has been a milestone for AI technologies, with alternative software and tools being developed and launched every single day - designed to follow in its footsteps. More AI software will require more power, more data centres, more equipment and more storage facilities, demanding increasing levels of electricity to maintain the services being rolled out. High-speed servers will need to be cooled, internet connections will need to be maintained, generators will need to run - and all will require serious power to keep the tools running smoothly.
In 2020, it was reported that data centres had consumed between 1% and 2% of global electricity demand, using between 196 terawatt hours (TWh) and 400 TWh in that year alone. These services are consuming an enormous amount of power, with figures that are likely to keep on growing.
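The quoted figures can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below assumes a global electricity demand of roughly 22,000 TWh in 2020 - an illustrative figure, as published estimates vary by source:

```python
# Back-of-the-envelope check: what share of global electricity demand do the
# quoted data-centre figures represent? The 22,000 TWh global figure is an
# assumption for illustration; published 2020 estimates vary by source.
GLOBAL_DEMAND_TWH = 22_000

for data_centre_twh in (196, 400):
    share_pct = data_centre_twh / GLOBAL_DEMAND_TWH * 100
    # Both ends of the range land close to the reported 1-2% band
    print(f"{data_centre_twh} TWh is {share_pct:.1f}% of assumed global demand")
```

Running this prints shares of roughly 0.9% and 1.8%, which is broadly consistent with the 1-2% range reported.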
What does this mean for sustainability?
It’s impossible to deny that the use of AI is going to rise in popularity, and so the demand on energy resources is going to increase exponentially too. So where does sustainability fit into this industry-spanning growth?
In 2008, the European Commission’s Joint Research Centre introduced a ‘Code of Conduct’ to review and regulate the efficiency of global data centres, setting out a list of best practices to help ensure that eco-friendly, sustainable and energy-efficient processes take precedence in these energy-heavy industries. By 2017, over 450 data centres had signed up and taken significant steps to reduce the energy consumption of their servers - by sourcing alternative cooling processes, introducing solar panels and switching to green energy providers across their centres.
Microsoft Azure, the springboard for ChatGPT, has also taken action towards a more sustainable digital operation, through water-positive initiatives, net-zero goals and a pledge to be powered by 100% renewable energy by 2025.
But is it enough?
The sheer scale of potential for chatbots, AI models and data is monumental, and the growth of these individual products is unmatched - but is the industry growing faster than the research can manage? Do these efforts towards sustainability make enough of a dent in the overall energy consumption of worldwide data centres and technology giants?
At Clean Energy Capital, we believe that green, clean energy should be the baseline for every industry, and should be as accessible, affordable and attainable as the unsustainable alternatives. Our services can help data centres (as well as other large power users) across the country find more sustainable solutions for powering their sites, and reduce the overall cost of managing and maintaining a large technological space.
‘The explosion in externally located computing power since the inception of cloud has put significant strain on regional grid networks. Developers and operators are looking increasingly to behind-the-meter power solutions to provide growth in areas with minimal power capacity. Virtual/AI, for example, Chat GPT, will continue to apply this pressure.’
Sam Dight - CEC, Director
More industries and businesses need to make the commitment to following eco-friendly practices and finding renewable, sustainable ways of delivering their products in 2023 - and Clean Energy Capital can help.