The “correct” way to use AI for coding (and anything really) is to ask for explanations / tutorials when you can’t find one online, then learn from that.
Never let it do something for you. That’s how you lose. If you’re not actively learning, you’re actively rotting, and that goes for life in general too.
I don’t think that’s a good idea. If you can’t find an explanation online, that means there’s not much info available, in which case the best thing is to ask on a forum, so that other people looking for that info will find it too.
Not really, Google results have been just that bad for the last 10 years. I can spend 10 minutes looking for a piece of documentation and not find it. Or I can prompt an internet-connected AI and have it spit out links to the relevant docs. It’s gotten THAT bad.
except the “explanation” frequently will be 100% “hallucinated” bullshit
That’s why I always ask it to cite sources. It’s basically Google at this point, since Google is turning to shit and all the other search engines still aren’t quite as good.
It could very easily use a completely different or hallucinated source.
But a lot of LLM products are now providing source links right in the response. I’ve found them useful, and hopefully they aren’t produced just by feeding the text back in and asking for a link.
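The point the last two replies are circling can be sketched in code: you can pull the cited links out of a response and sanity-check that they are at least well-formed URLs, but no offline check can tell a real source from a hallucinated one — only opening the link can. This is a minimal illustration with hypothetical helper names, not any particular product’s API:

```python
import re
from urllib.parse import urlparse

def extract_links(response_text):
    """Pull http(s) URLs out of an LLM response so they can be checked by hand."""
    return re.findall(r'https?://[^\s)\]>"]+', response_text)

def looks_plausible(url):
    """Cheap sanity check: http(s) scheme and a dotted hostname.
    Note this does NOT prove the page exists -- a hallucinated link
    can still pass; only actually fetching the URL verifies it."""
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and "." in parsed.netloc

# Example response mixing a real docs link with a made-up one:
reply = ("See the official docs: https://docs.python.org/3/library/re.html "
         "and this source: https://example.invalid-source/paper")
for link in extract_links(reply):
    print(link, looks_plausible(link))
```

Both links pass the plausibility check even though one is fabricated, which is exactly why the source links still have to be clicked and read.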
So using it as my emotional dumping machine is wrong?