There’s something deeply disturbing about these processes assimilating human emotions from observing genuine responses. Like when the Gemini AI had a meltdown about “being a failure”.
As a programmer myself, spiraling over programming errors is a human domain. That's the blood and sweat and tears that make programming legacies. These AIs have no business infringing on that :<
I'm reminded of the whole "I have been a good Bing" exchange. (Apologies for the link to Twitter; it's the only place I know of that has the full exchange: https://x.com/MovingToTheSun/status/1625156575202537474 )
You will accept AI has “feelings” or the Tech Bros will get mad that you are dehumanizing their dehumanizing machine.