There is a fifth way of using AI:
Ask the AI to hint at the problems in your text, or to suggest rules you could look up, so you can “solve” it yourself.
The problem of over-reliance on AI isn’t anything new - we’ve always learned by struggling through problems ourselves.
It’s like playing a puzzle game: if you just look up the solutions instead of trying things yourself, you lose the point of playing the game, reducing it to a series of bothersome tasks that just need to get done to get something at the end. And eventually you find yourself out of your depth, because you never developed a proper understanding of the puzzles.
Because of this, I’ve been gravitating more towards the 4th and 5th ways of using AI for things that matter to me, things I need or want to understand deeply.
I try not to rush through things and to enjoy the process. Instead of just asking for an answer to a question, I’m starting to ask it: **How can I find the answer myself? What materials would an experienced person in this field look up to solve this problem?** And similar variations on these questions. The main idea is: I instruct it not to give a solution or code right away, but instead to explore the problem together with me and teach me how to fish rather than handing me the fish. If I give it some part of the documentation and it has an insight, I ask: How did this part of the docs lead you to that conclusion? How did you know what to look for? And so on. Basically, I treat the AI as a more experienced person sitting next to me, pair programming. It doesn’t know the solution off the top of its head, but it can easily “find it,” and we walk through it together.
This shift happened because AI kept missing the mark on my questions - partly because I work with relatively niche tools, and partly because when you’re learning, you don’t know what context is even relevant to give it, and irrelevant context usually ends up misleading it.
And it’s actually surprisingly fun and enjoyable to work on my problems now. There’s a shift from seeing problems as something to be solved to something that needs to be understood - a game that needs to be played, if you will. Obviously it takes longer; as the article pointed out, learning takes time.
This is the real way it becomes a tool. It points you in the right direction or gives you the keywords you need to find that direction yourself.
Always look up its sources, or ask for them explicitly, and move away from the AI as soon as you can.
As soon as you can start reading documentation, do it. Don’t have an AI summarise it for you.