Data centers existing makes sense, but this specific aggressive AI data center buildout (with special-purpose hardware) doesn’t: the two AI companies you mentioned, OpenAI and Anthropic, aren’t making a profit, and they don’t appear to have a viable path to one. OpenAI claims it’ll be wildly profitable in just a few years, but they don’t go into how.
“aren’t making a profit” gets into the mess that is bookkeeping, and that’s a giant rabbit hole people actively avoid because it’s easier to get angry at stupidity than at complex malfeasance.
But what makes something an “AI data center” outside of the branding?
The reality is that it’s a shit ton of computers connected to a really fast internet connection. Preferably through a properly managed set of switches, but you do you. And that’s also why we still mostly use GPUs for “AI” rather than highly specialized hardware (although nvidia DID just buy groq a few months back…): they might do linear algebra on quarter-precision floats REALLY well, but they do linear algebra on single- and double-precision floats pretty well too. And the CPUs and mobos (mostly optimized for moving data around to feed said GPUs) are no slouches either.
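For what it’s worth, the “it’s all just linear algebra” point is easy to demo: the same matrix multiply runs at any precision, and only the element type (and thus throughput and accuracy) changes. A minimal CPU/NumPy sketch, but the same holds for GEMMs on GPU hardware:

```python
import numpy as np

# The same matrix multiply at three precisions -- the operation is
# identical; only the dtype (and thus speed/accuracy) differs.
rng = np.random.default_rng(0)
a64 = rng.standard_normal((256, 256))
b64 = rng.standard_normal((256, 256))

ref = a64 @ b64  # float64 "ground truth"

for dtype in (np.float32, np.float16):
    c = a64.astype(dtype) @ b64.astype(dtype)
    err = np.max(np.abs(c.astype(np.float64) - ref))
    print(dtype.__name__, "max abs error vs float64:", err)
```

Lower precision trades accuracy for throughput, which is exactly the bet the “AI-optimized” parts of these chips make, but the hardware still happily does the higher-precision version of the same operation.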
Which is what most of these companies are planning for. OpenAI is, arguably, really fucking stupid. Anthropic, meanwhile, has shown decent signs of “diversifying,” as it were. And nvidia… if we lived in a world where they could get enough RAM, I think they would be fine. As it stands… Jensen (and a LOT of people) are kinda fucked, and I expect to see a hard pivot over the next 12 months.
Because if we banned ALL generative AI tomorrow? The people who think you can’t use a computer without installing litellm first are gonna be fucked. But everyone else will just put other workloads on there and be… “fine” is a strong word but they won’t go bankrupt. And the data centers themselves will still be incredibly valuable.
I wish GPUs in AI data centers (or worse, the ones purchased and not installed yet) were more general-purpose than they appear to be. That’s the part that makes them AI data centers: the optimized hardware.
I do agree things are complex. And I like reading about the intricacies of that complexity. The overall picture is still a pretty bad one, though.