- cross-posted to:
- technology@lemmy.ml
- technology@lemmy.world
Huang says ‘The AGI said you need to give me your money. The AGI said it, I don’t make the rules, you guys.’
Sounds like a pump & dump.
Yes and I just figured out how to levitate too.
What a moron
I’ll remember this in 12 months when 50% of new LLMs are still getting the “I want to wash my car, the car wash is 100 yards from my house, should I walk or drive?” question wrong. I mean, they’ll add that one to the training data so they’ll probably be getting that one right by then, but there will always be examples of why THESE THINGS AREN’T INTELLIGENT.
I use LLMs quite a bit, they’re neat tools and helpful, but anyone who tries to say they’re AGI is either an idiot or just too heavily invested in the bubble.
I think I’ve shit a gold brick, Jensen what’s your address, I’ll mail it to you for confirmation.
What are its thoughts on black leather jackets
I was thinking, what is the closest thing humans have made to an LLM? Basically, it’s a corporation. Words go round and round and at some point decisions are made. Yet how good are the guardrails for corporate activities? We have been struggling for generations to stop corporations being amoral or feral, and they have people at the top who can be held accountable, at least in principle. So what of AIs which reflect both the best and worst human responses, but do so chaotically? Putting that tech in charge of anything is foolish at best.