☆ Yσɠƚԋσʂ ☆

  • 1.75K Posts
  • 1.7K Comments
Joined 6 years ago
Cake day: January 18th, 2020


  • Completely agree, and now it’s just hip to say how much you hate AI. This kind of performative action doesn’t really accomplish anything, but it lets people feel good about themselves and gain social acceptance. Actually building an alternative takes work. The whole Linux analogy is very apt here because we’ve always had alternatives to corporate offerings, but most people don’t want to invest the time into learning how to use them.


  • I’d argue it’s inevitable for the simple reason that the whole AI-as-a-service business model is a catch-22. Current frontier models aren’t profitable, and all the current service providers live off VC funding. But if models become cheap enough to be profitable, then they’re cheap enough to run locally too. And there’s little reason to expect that models aren’t going to continue being optimized going forward, so we’re going to hit an inflection point where local becomes the dominant paradigm.

    We’ve seen the pendulum swing between mainframe and personal computer many times before. I expect this will be no different.




  • Seems to me there’s a huge amount of incentive for Chinese companies to pursue these things, since China isn’t investing in massive data centre build-outs the way the US is, and their chips are still behind. Another major application is in robotics, where on-device resources are inherently limited. The only path forward there currently is making the software side more efficient. It also looks like Chinese companies are embracing the whole open-weights approach and treating models as shared infrastructure rather than something to be monetized directly.

    And local models have been improving at a really fast pace in my opinion. Stuff like Qwen 3.5 is not even comparable to the best models you could run locally a year ago.



  • Right, so far no American company has managed to make any actual profit from selling LLMs as a service, and the cost of operating the data centres is literally an order of magnitude higher than the revenue they pull in. And the kicker is that if models get efficient enough to bring the costs down, then they become efficient enough to run locally. So the whole business model fundamentally doesn’t make sense. Either it’s too expensive to operate, or nobody will want to use it as a service because running your own gives you privacy and flexibility.




  • Exactly, and a lot of big companies in the US are heavily reliant on Chinese models already. For example, Airbnb uses Qwen because they can self-host it and customize it, Cursor built their latest Composer model on top of Kimi, and so on. There are far more companies using these tools than making them, so while open models hurt companies that want to sell them as a service, they’re lowering the cost for everyone else.




  • Do elaborate. The tech industry has swung between mainframe and personal computer many times over the years. When new tech appears, it initially requires a huge amount of computing power to run. But over time people figure out how to optimize it, hardware matures, and it becomes possible to run this stuff locally. I don’t see why this tech should be any different.