With AI investment arguably exceeding $1 trillion over the last few years, many people are concerned about a bubble. Unlike the constant doomsayers claiming ‘AI has peaked’ a week after the latest improvement shipped, the bubble worry is at least worth taking seriously. But frankly, I don’t think we are in much of a bubble.
Anthropic has revenues around $7 billion a year at this point. Google has been profitable for a long time. OpenAI has raised the most money relative to its revenue, but it has 700 million weekly active users. Once they roll out ads and monetization, the economics will change quite a lot.
Amazon ran at little to no profit for roughly its first two decades. OpenAI has only been a for-profit corporation for three years. A ton of money is being spent, and that alone feels like a lot of other bubbles we’ve had in the last few decades. So let’s go through the logic of further investment.
Let’s say you are investing in racks of Nvidia chips. What determines your return on investment? Mainly utilization and the margin you charge cloud users. If there is a lot of demand for AI inference, your utilization will be high. If your utilization is high, you can raise prices and increase your margin. It boils down to demand for inference and, to an extent, training.
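To make that concrete, here is a minimal back-of-the-envelope sketch of the rack economics. Every number in it (capex, GPU count, prices, operating cost) is a made-up assumption for illustration, not a real market figure.

```python
# Toy ROI model for a GPU rack. All numbers are hypothetical.

def annual_rack_roi(capex, gpu_count, utilization, price_per_gpu_hour, cost_per_gpu_hour):
    """Simple annual ROI: net revenue from sold GPU-hours divided by upfront cost."""
    hours_per_year = 24 * 365
    sold_hours = gpu_count * hours_per_year * utilization
    net_margin = price_per_gpu_hour - cost_per_gpu_hour
    return sold_hours * net_margin / capex

# A hypothetical 72-GPU rack costing $3M, rented at $2.50/GPU-hour
# with $0.60/GPU-hour of operating cost:
for utilization in (0.3, 0.5, 0.7, 0.9):
    roi = annual_rack_roi(3_000_000, 72, utilization, 2.50, 0.60)
    print(f"utilization {utilization:.0%}: {roi:.1%} annual return")
```

The point of the toy model is that utilization, i.e. demand, dominates: the same rack swings from barely covering its cost of capital to a very attractive return purely based on how much inference people want to buy.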
So an investment in racks of Nvidia chips depends on demand for AI inference to be profitable, which means we have to ask: will AI inference demand go up or down? What could make demand go down? What could make it go up?
An example of something that makes inference demand go up is the invention of chain-of-thought ‘reasoning’ AI models. These models spend more inference compute to produce the same amount of higher-quality output. If you were buying racks of Nvidia chips and you heard about chain of thought, you would try to double your order.
Something that might make inference demand go down is the original DeepSeek announcement. They managed to make a competitive model using far fewer training resources than anyone else, and we had a minor stock market crash in reaction.
Here is my basic argument for why investment in racks of chips is a good idea at this time.
The ‘smarter’ models get, the more demand for AI there will be.
The more ways we figure out how to compose LLMs, the more demand for AI there will be.
We’re currently in an immense competition between 5+ frontier AI labs to improve LLM-based AI to the absolute limit, and at this point we are seeing improvements month over month. When AI was useless, prior to GPT-3, inference demand was very low. Today hundreds of millions of people use ChatGPT each week. The better models get, the more people want to use them.
Next, we are seeing more and more ways to compose models. Claude Code takes a single human command and splits it into subtasks, each of which turns into its own smaller inference tasks. One human prompt might result in a dozen AI-generated prompts: instead of one inference call, you get a dozen inference calls serving a single human request. This approach lets AI serve demands that weren’t possible for LLMs before, while consuming even more inference.
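Here is a hypothetical sketch of that fan-out pattern. This is not Claude Code’s actual implementation; `call_llm` is a stand-in stub for whatever model API you use, and the subtask count is invented for illustration.

```python
def call_llm(prompt: str) -> str:
    # Stub: a real system would call a model API here.
    return f"<response to: {prompt[:40]}>"

def handle_request(user_prompt: str) -> tuple[str, int]:
    """One human request fans out into many inference calls."""
    calls = 0

    # 1. One planning call: split the request into subtasks.
    plan = call_llm(f"Break this into subtasks: {user_prompt}")
    calls += 1
    subtasks = [f"{plan} / subtask {i}" for i in range(4)]  # pretend planning found 4 subtasks

    # 2. Two calls per subtask: one to do the work, one to check it.
    results = []
    for task in subtasks:
        results.append(call_llm(f"Do: {task}"))
        calls += 1
        results.append(call_llm(f"Verify: {task}"))
        calls += 1

    # 3. One final call to merge everything into a single answer.
    answer = call_llm(f"Combine these results: {results}")
    calls += 1
    return answer, calls

answer, calls = handle_request("add a retry flag to the CLI")
print(calls)  # -> 10 inference calls for one human prompt
```

Ten model calls for one human prompt: composition multiplies inference demand on top of whatever raw model improvements do.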
Basically, the smarter the output of AI, the more value humans get out of it, and the more demand there will be for inference. Even if we make efficiency improvements in inference, that should simply increase demand: if you are making a profit per request, you will increase the number of requests you make as the price per request goes down.
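A toy model of that last claim, with a made-up distribution of request values: suppose every potential request has some dollar value to its user, and a request only gets made when its value exceeds the price.

```python
import random

random.seed(0)
# Made-up assumption: 100k potential requests whose value to the user
# is exponentially distributed with a mean of $0.02 per request.
values = [random.expovariate(1 / 0.02) for _ in range(100_000)]

# A request is worth making only if its value exceeds its price, so
# every price cut unlocks requests that were previously unprofitable.
for price in (0.04, 0.02, 0.01, 0.005):
    served = sum(1 for v in values if v > price)
    print(f"price ${price:.3f}: {served:,} requests made")
```

In this toy model, every halving of the price per request roughly doubles (or more) the number of requests worth making, which is exactly why efficiency gains can raise rather than lower total inference demand.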
What’s the other side of this bet? Well, if AI stops getting smarter, a lot of companies are going to make far lower returns on investment than they hoped. But I think that is a really stupid bet to make. You don’t bet against further improvements in AI when improvements are coming out month over month.
In conclusion, as long as LLM performance continues to improve, we aren’t in an AI bubble; once gains start to slow, that defense no longer holds. My view is that if we get to a point where improvements arrive more slowly than once a year, we will have hit the plateau in LLM-based AI. But for now we are seeing month-over-month improvements in AI performance. I don’t think we are in a bubble.
