512GB of RAM lets you run very capable AI models locally; people have run a large quantized DeepSeek on one, for example. I believe they cost around $10k? Running a full DeepSeek or a similarly sized model on anything else out there would cost a lot more.
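That 512GB figure checks out with simple arithmetic. A quick back-of-envelope sketch (my own numbers, assuming DeepSeek-V3/R1's roughly 671B parameter count, and ignoring KV-cache and runtime overhead):

```python
PARAMS = 671e9  # approx. DeepSeek-V3/R1 parameter count (assumption)

def model_size_gb(params: float, bits_per_weight: int) -> float:
    """Approximate weight memory in GB for a given quantization level."""
    return params * bits_per_weight / 8 / 1e9

for bits in (16, 8, 4):
    print(f"{bits}-bit weights: ~{model_size_gb(PARAMS, bits):,.0f} GB")
```

At full 16-bit precision the weights alone are well over a terabyte, and even 8-bit (~671GB) overflows 512GB, but a 4-bit quantization (~336GB) fits with room to spare for the KV cache.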
The problem is that while they can run inference on these models, they aren't great for training them.
Apple keeps improving memory bandwidth, though, and future generations might be decent at training as well, assuming Metal keeps improving too; right now it still isn't as good as CUDA.
Edit: $10k before the RAM price apocalypse, anyway.