I have nothing against AI but everything against a certain AI company that is fully in bed with fascists.
Are you talking about Google? Apple? Meta? Twitter? Microsoft? OpenAI?
You can’t be talking about the one company that was banned by the fascist government for not complying with their demands, because a company fully in bed with fascists would not be banned for refusing to comply. Yet, it seems in your confusion that is exactly what you’re implying.
Please do not use this slogan as an excuse to not seek out the least unethical option for your consumption.
I don’t, and that would be Anthropic’s Claude. I don’t know about you, but I don’t have the hardware for a local LLM at the speed or proficiency they offer. Maybe you’re so fortunate, and are judging the choices of the less fortunate for not passing your purity test?
You must be American. I am talking about Kimi, Mistral, GLM, Liquid, Minimax, Arcee, Qwen, Deepseek, Xiaomi.
And you are of course allowed to use cloud inference if you don’t have the hardware to run locally. Just choose an inference service that is not in bed with fascists. There are plenty. Good luck and have a nice day.