A person would have the agency to ask, "Why do you think it's fake?"
Why would it have to? Both the AI and the person doing the task already know to do whatever task is put in front of them. It's one of a hundred photos for all either of them knows.
You are inventing context and instructions that don't exist. The situation is that both are doing whatever task is presented to them. A human who asked would fail and be removed; they failed order number one.
You could also set up a situation where the AI and the human were both capable of asking. The AI won't do what it's not asked to do; that's the comprehension it lacks.
When people use a conversational tool, they expect it to act human, which it INTENTIONALLY DOES, but without the sanity of a real human.
It's not a conversational tool when you present it with a specific task….
Do you not understand even the basic premise of how AI works?
When we are talking about LLM chatbots, they have a conversational interface. I am not talking about other types of machine learning. I don't have time to keep responding.
Then you are making up your own conversation instead of following the thread?
The person presented a specific task to an AI, so where does a chatbot come in? You seem to be confused about what AI is, and that's what I pointed out, thanks for making it clear.
They are asking ChatGPT. If you think that interface is not conversational, let me know how I can help you.
Seriously? A chatbot is one function of an AI, not the other way around. So when you give the AI a different task or set of instructions, it's no longer the chatbot anymore; it's whatever function is needed for that task.
I weep for humanity if you're any indication of the general education on AI….
If you ask it to create an image, are you seriously expecting it to have a conversation and point out where you messed up? That's not how any of this works lmfao. "Hey, I need to point out that ducks don't have scales, and the sky isn't green." No, it does what it's asked. But now suddenly it's different? Why?
Please see my above comments.