I concur.
Why do you concur? You have a problem with “hallucinations” because it’s something humans do. This commenter wants to call them (among other things) “lies”, which implies intent and knowledge of falsehood, which an LLM definitely can’t have. I’m not saying “hallucinations” is super accurate, but I don’t think the term is too positive or that it lessens the major issues LLMs have.
OK, so I think what the commenter wants to call lies is descriptive of what the corporations are pushing (as “hallucinations”, but what a reasonable person would call lies).
In other words, it’s a “meta” conversation that I concur with. An LLM obviously cannot do human things, but “sales” can portray it as if it does.
In my day-to-day usage I make an actual effort to refer to wrong output from an LLM as wrong, not with human-focused words.
fair enough