A bill under consideration in New York would provide a private right of action, allowing people to file lawsuits against chatbot owners who violate the law.
LLMs and chatbots should not be giving medical advice. You’re afraid of the private healthcare system, not of losing access to the jankiest band-aid fix for its failures.
Neither should Wikipedia or Google. So I guess by your logic nobody should search or learn about medical conditions on a computer.
You know damn well there’s an important difference: the unearned confidence these bots project, which has been a key problem since this whole thing started.
The line between medical advice and personal research is pretty freaking gray, so if you ban medical advice, does that also ban talking to LLMs about anything that is medical-adjacent?
Does medical-adjacent mean personal disabilities? Drug-related interests? Pet health? Stretches? Pain support?
Anything that falls under “Health, Wellness, and Fitness”?
…etc
It’s a slippery slope, and we don’t need to be sliding down it.
People are so vicious over this tech they would rather have disabled poor people with cancer suffer and die under inadequate care than do anything about the inadequate care. Ban the tech, but let this all go on.
If you are perfectly able and well, you can ignore all advice that isn’t perfect.
The perspective they seem to lack is frightening. The empathy they refuse to extend is massive. This is ableism.
Tech companies are bad, but use of tech will cure and ease cancer, HIV, and chronic disease. Bring on the downvotes.
“Would rather have disabled people with cancer suffer and die…”
My guy, that’s not a lack of LLM access, it’s a completely fucked US healthcare system that forces people onto the internet because they can’t get what they need from the state, you goofy-ass weirdo.
Well, yes, of course, but restricting access to information machines doesn’t exactly help much either.
Do hallucinating LLMs, which have done such things as convince a child to commit suicide, really count as “information machines”? The Mayo Clinic website might take a whole extra braincell to read through, but at least it’ll be written properly.
I mean, the fact that you consider these programs to have enough credibility to be called “information machines” is exactly why they’re so potentially dangerous.