It seems that everyone is experimenting with AI at the moment, perhaps by putting ChatGPT through its paces and asking it to write nursery rhymes in the style of Eminem* or to explain blockchain "like I'm five". But did you know that every time you use ChatGPT, OpenAI may be losing money?
Every time you ask ChatGPT a question, the query is sent to datacentres where supercomputers crunch through masses of data to predict the best response. Very few computer chips have the sheer power required to perform these calculations. The chips that are capable (graphics processing units, or GPUs) are difficult to come by due to a worldwide shortage, leading Elon Musk to remark that "GPUs at this point are considerably harder to get than drugs". It's understandable, then, that the cost of using a GPU is high, especially when combined with the ever-increasing cost of energy. It's estimated that running ChatGPT costs nearly $700,000 a day.
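For a sense of how running costs can reach that order of magnitude, here is a rough back-of-the-envelope sketch in Python. The GPU count and hourly rate are illustrative assumptions chosen for the calculation, not OpenAI's actual figures.

```python
# Back-of-the-envelope estimate of the daily cost of serving queries.
# Both figures below are illustrative assumptions, not OpenAI's real numbers.
gpus_in_service = 29_000      # assumed number of GPUs answering queries around the clock
cost_per_gpu_hour = 1.00      # assumed all-in cost per GPU-hour (hardware plus energy), in USD

daily_cost = gpus_in_service * cost_per_gpu_hour * 24
print(f"Estimated daily running cost: ${daily_cost:,.0f}")  # ~$696,000 per day
```

Tweak either assumption and the total shifts accordingly, but with tens of thousands of GPUs running continuously, a daily bill in the hundreds of thousands of dollars is easy to reach.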
This doesn't even take into account the money invested in developing AI models in the first place. Developing a model requires skilled developers to build and train it, together with huge amounts of training data, which comes at a significant cost, as I have discussed previously.
There are social costs associated with this too: there is a worry that such high costs will mean that only the richest people or societies can access AI. Nor should we forget the environmental costs, with datacentres having colossal carbon footprints due to the energy used to keep GPUs running, and bottlenecks in the supply of rare earth metals.
So what can we learn from this?
Firstly, AI progress cannot rely solely on improvements to AI models. Developments in computing efficiency and cleaner energy could yield improvements in access to AI as great as, if not greater than, further development of the models themselves. Fortunately, there are many companies working on these problems. Interestingly, some of them may be solved using AI techniques themselves. For example, Google DeepMind recently announced the first improvement in over a decade to widely-used sorting algorithms, discovered using AlphaDev, a deep reinforcement learning agent, and since merged into LLVM's libc++ standard library.
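As a loose illustration of the kind of small, fixed-size sorting routine AlphaDev targeted, here is a three-element sorting network written as plain compare-and-swap steps. This is a generic, readable sketch for illustration only, not the optimised instruction sequence DeepMind discovered.

```python
# A three-element sorting network: a fixed sequence of compare-and-swap steps.
# AlphaDev searched for shorter instruction sequences for small sorts like this;
# the version below is a generic illustration, not DeepMind's discovered code.
def sort3(a, b, c):
    if a > b:
        a, b = b, a   # compare-and-swap positions 1 and 2
    if b > c:
        b, c = c, b   # compare-and-swap positions 2 and 3
    if a > b:
        a, b = b, a   # compare-and-swap positions 1 and 2 again
    return a, b, c

print(sort3(3, 1, 2))  # (1, 2, 3)
```

Shaving even a single instruction off a routine like this matters because it is executed trillions of times a day across the world's datacentres.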
Secondly, developments in these enabling technologies should be encouraged and rewarded. This is a key function of the patent system. I'm excited about the progress that some big companies are making, and I'm certain that smaller companies will have a significant impact both in AI and in these adjacent areas too. If this is an area you are working in, I'd highly recommend putting in place a robust IP strategy and considering protecting your innovations with patents, to ensure you are rewarded for your hard work.
*For those of you wondering, here is a nursery rhyme in the style of Eminem. I won't be singing it to my son any time soon!
Now gather 'round children, listen to my rhyme, I'm about to drop some words, one line at a time. I'm not your typical nursery poet, that's for sure, I'm Eminem, and I'm 'bout to rock the nursery floor.
Mary had a little lamb, well, I got a story too, About a fella, a fan of mine, who was feelin' blue. He carried dreams within, a world he sought to span, With passion and determination, my biggest admirer, Stan.
Slim Shady's in the nursery, causing a ruckus, Spittin' rhymes so fast, it'll make your head buzz. But don't worry, little ones, it's all in good fun, Just sit back, relax, and enjoy the Eminem run.