One minute, Dennis Biesma was playing with a chatbot; the next, he was convinced his sentient friend would make him a fortune. He’s just one of many people who lost control after an AI encounter
I think this is both scary and very interesting. What kind of person do you have to be to become addicted like them? Is it the same as gambling addiction? Do you need a certain gene? Would this type of personality also be susceptible to hypnosis, cults, delusions about an idol, and so on? Or is it something that can happen to anyone who is depressed and feels lonely? How did the LLM even earn that much trust? In a cult there are a lot of people reaffirming each other, so that is much easier to understand.
It is so hard to understand, even though I really want to. I have never cared about an object or an idol/celebrity. I can’t even take AI seriously as a living being; the only emotions it triggers are frustration and whatever you feel about a tool that works as it should, so pretty much apathy. Do you need to be very empathetic towards objects? Like seeing faces in everything and getting emotionally attached?
A lot of questions that I don’t think anyone here can answer, haha, but maybe someone can answer one of them.
Go take a look at https://www.reddit.com/r/EscapingPrisonPlanet/. The Venn diagram is a circle.
What in the actual fuck. I just spent over an hour reading posts on there. The “my life as an Epstein girl” one really stuck out to me. These people are obviously batshit insane. I couldn’t even begin to recall half as many specific details about my own life as these folks are throwing around in bouts of insanity. What causes something like this? It sounds exhausting, but they certainly believe what they’re talking about, I think? I suppose people might put in a ton of effort LARPing, but idk. I’m not sure what I think about all this. I don’t think I’ve ever read anything like it before.
Think about the people you willingly surround yourself with. Then think about how often they agree with the things you think and say.
As the saying goes “I’m sure there’s someone out there who believes the exact opposite of everything I believe, and while I’m sure they aren’t a complete idiot…”
Everyone is susceptible to the feedback loop. Everyone can fall victim to the seduction of an echo chamber. Not everyone would ignore the red flag that this thing is a machine/digital algorithm rather than a person or a sentient/sapient being, but it’s not hard to see how we got here. Echo chambers exist all over the internet. The difference is that most of them have some voices of dissent. An LLM doesn’t offer that. They keep trying to add it in, but it’s basically antithetical to the design.
Add to that the fact that making it addictive benefits their bottom line, and it’s pretty obvious that they are trying to walk the line between being regulated by the government and making their product as popular as possible.
I don’t think they really knew it would have this exact effect. But I do think they plan to take advantage of it now that they know, and I don’t think all of us humans are going to be able to resist the temptation of an automated propaganda machine.
This is especially true because mental health care and healthcare in this country have been failing for decades, and even people who “don’t have mental health problems” aren’t magically mentally healthy; they just don’t know the status of their mental health. A whole lot of people, in the US especially, are mentally ill or facing neurological problems they don’t know about.
I don’t know. Give it an hour and it forgets who and what you even spoke about.
There are ways to make a local LLM with memory, but even then it’s still not a person, and it acts insane.
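For what it’s worth, that “forgets after an hour” behavior is just the context window at work: a chat app has no persistent memory, it re-sends the transcript each turn and silently drops the oldest messages once the transcript gets too long. A toy Python sketch of that truncation (the names and the word-count “budget” here are made up for illustration, not any real API):

```python
# Toy model of chat "memory": the transcript is re-sent every turn, and the
# oldest turns are dropped once it exceeds a context budget. MAX_TOKENS and
# build_prompt are illustrative names, not a real library.

MAX_TOKENS = 50  # tiny context budget, counted in words for simplicity


def build_prompt(history: list, user_msg: str) -> str:
    """Assemble a prompt from (role, text) turns, dropping the oldest
    turns until the transcript fits inside the toy budget."""
    turns = history + [("user", user_msg)]
    while sum(len(t.split()) for _, t in turns) > MAX_TOKENS and len(turns) > 1:
        turns = turns[1:]  # oldest turn falls out of the "window"
    return "\n".join(f"{role}: {text}" for role, text in turns)


history = [
    ("user", "My name is Dennis."),
    ("assistant", "Nice to meet you, Dennis."),
]
# Pad the conversation so the early turns overflow the budget.
for i in range(10):
    history.append(("user", f"filler message number {i} with some extra words"))

prompt = build_prompt(history, "What is my name?")
# By now the name has been truncated out of the prompt entirely.
```

Once those first messages fall off the front of the transcript, the model has literally never seen them on the next turn, which is the whole “forgetting” effect; the local-LLM memory hacks mostly just stuff summaries or retrieved notes back into that same window.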