

AI is like a circular saw. Are circular saws useful?
Of course.
Can you cut your entire hand off if you don’t use it correctly? Absolutely.


I’m willing to bet you could convince a subset of those people that such an LLM is in fact the second coming of Christ. So not just some tool “approved” by God, but that the LLM is God itself.
Then (in the minds of the “true believers”) whatever it says becomes unquestionable. And then whoever is pulling the strings behind the scenes can commit whatever atrocities they desire.


I’m familiar with TempleOS — but it really doesn’t have any applicability here. It’s just something written by a guy with some mental illness who thought God was telling him what he wanted in an Operating System. But even for the faithful it’s just a tool — like how a temple itself may be an important holy place, but isn’t itself worshipped by the people who use it. Nobody considers a church to actually be their God.
That’s vastly different from an LLM that purports to be itself divine. We can set up an LLM that actually claims to be the second coming of Jesus, and there will be people who do whatever it tells them to out of belief. If you pull in enough people slowly enough, over enough years, to build up a cult following, and abuse them just enough to keep them in line, you’ll be able to tell them to do all sorts of truly atrocious things, and some subset will in fact go through with them.
And yes, people can do that already (see Jim Jones, David Koresh, or any other cult leader who convinced their followers to kill themselves and their families), but an LLM could have a vastly larger reach around the globe. The LLM itself may not need to become Skynet; one or two bad actors behind the scenes of a “divine” LLM might be enough to bring down humanity all by themselves.


Mark my words: within our lifetimes we’re going to see a group of people worship an AI as “divine”. And you won’t be able to convince them otherwise. An AI-centric cult is all but inevitable at this point, and it will be self-reinforcing.
Sure — as with every tool. Hammers are great for many things, but don’t do all that well driving screws. Money is one of the most used tools humans have ever devised, but you can’t use it for everything.
AI in coding may only be good for a finite set of situations, but that set is massive. You’re dealing with formal languages whose output can be mechanically checked for validity (in the sense that it will produce a working program, not in the sense that the program will actually function the way the user intends). That’s a much less open-ended scenario than something like AI-generated video, so it’s easier for AI to excel at, especially for non-novel algorithms.
But if you use it like an idiot, you’re going to get burned, and this guy was an idiot who doesn’t understand what he’s doing, or the practices and tooling the software industry has built up over the last few decades. AI shouldn’t be touching your production environment at all. And it shouldn’t have to: code needs to live in a version-controlled source repository (and be backed up so you’re unlikely to ever lose it), deployment needs to be fully scripted and able to rebuild your environments from scratch (from code all the way to production), and developers and development tools (AI tools included) should only have access to development environments, never production.
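To make that concrete, here’s a minimal sketch of the environment-separation idea. Everything in it (the DEPLOY_ENV and DATABASE_URL_* variable names, the guard function) is hypothetical; it just illustrates the principle that the process running an agent should never hold production credentials in the first place.

    import os

    # Hypothetical guard: only hand an AI agent (or any automated tool)
    # credentials for non-production environments. The variable names
    # here are illustrative, not from any real platform or library.

    ALLOWED_ENVS = {"dev", "staging"}

    def get_agent_db_url() -> str:
        """Return a database URL the agent may use, or refuse loudly."""
        env = os.environ.get("DEPLOY_ENV", "dev").lower()
        if env not in ALLOWED_ENVS:
            raise PermissionError(
                f"Refusing to expose '{env}' credentials to an automated agent"
            )
        # Secrets are stored per environment; the production secret simply
        # isn't present where the agent runs, so it can't be leaked or used.
        return os.environ[f"DATABASE_URL_{env.upper()}"]

The exact mechanism matters far less than the property it enforces: the agent can wipe a dev database and you lose nothing you can’t rebuild, because production was never reachable from where it runs.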
So unless you’re a total dumbass, an AI agent (or even a shitty human developer) should never have the kind of access to do what happened here. They violated some pretty basic principles of software development and got burned. This guy sawed his own hand off because he misused the tools to take a bunch of shortcuts, without building in any backups or reproducibility. The AI isn’t the proximal fault here; trusting it when you have no way to rebuild your environment after things go wrong is, and that’s 100% on the human sitting at the keyboard (PEBKAC).