I’ve been researching this a bit… I’ve come to the conclusion that there is no AI bubble. In fact, we’re only just getting started down this road. Unless there’s some massive 100x efficiency breakthrough in training AI and inference, the entire world is going to be building seemingly endless AI data centers (and the normal compute kind, e.g. for stuff like AWS, Google/YouTube, Meta, banks) for at least a decade. Probably a little longer (12-15 years before demand levels out).
Everyone thinks that “AI data center” means ChatGPT, Claude, Gemini, etc., but there’s 10,000x more demand for AI than those services. Think: pharmaceutical companies trying to predict protein structures, scientists (and big agriculture!) trying to model the weather, and other businesses trying to automate stuff. Not just software; robots and things like conveyor belts.
Another example: Ever use one of those self-checkouts that’s mostly just a camera pointing down, where you place the stuff you’re purchasing? That uses AI too.
Having said that, there is a great big bubble in AI: OpenAI, specifically. That will definitely pop one day. And hopefully, the DRAM bullshit will go along with it.
Yeah, the LLM and picture-generation bubble will burst, but that isn’t “AI”; it’s a tiny subset of tasks that happen to be easy to train for because the companies involved have helped themselves to all of the text and images created by humanity.
The other uses of AI are harder to train because we don’t have centuries’ worth of robotic-motion data or a YouTube’s worth of folded-protein data. Those are the uses that will have the most impact in the future, as they are developed.
LLMs are the only thing that is hyped. The other models and applications already existed back when ChatGPT first hit the public, and they haven’t had any special breakthrough that would explain exponential growth in investment or a sudden need for compute power. Language models had that with the transformer architecture; everything else just develops iteratively.
The bubble we see now is because of language models. We can try to conflate it with other deep models and call it all AI, but that doesn’t change the fact that the generative models are the only ones requiring these resources while still looking for a problem to solve.
Depends on when the AI bubble pops.
Those other things aren’t the bubble, though. The bubble is about generative AI, not other machine learning methods.
LLMs are a bubble, AI is not.
I agree that it’s not all the ChatGPT type, but considering that even Nvidia was hyping it up as war tech, I think this is a bit of wishful thinking.
Waiting for the “waiting for the AI bubble to pop” bubble to pop at this point.