There are people who can’t properly function without an LLM. And it’s not just a few: a good many people have decided to hand their reasoning over to a chatbot so they don’t have to do it themselves.
And I feel sympathy towards all of them, except the ones who appeared on the Jimmy Fallon Show to promote helplessness as a lifestyle.
It’s because people get misled by the “agent”, assuming there’s something actually intelligent at the other end, able to act like they would, just… automated.
This is because the advertising for LLMs presents them as if they were intelligent.
LLMs are being promoted as a tool that can do anything, even though the only thing they do well is output text that resembles human patterns. It’s a hammer, and they’re pretending everything is a nail.
I think it’s worse: it’s laziness. It’s easier to ask a machine to do the job for you. And since the result looks mostly OK, people keep doing it.