Four months ago I asked if and how people used AI here in this community (https://lemmy.world/post/37760851).
Many people said they didn’t use it, or only used it occasionally for consulting. But in those four months AIs have evolved a lot, so I wonder: are there people who still don’t use AI daily for programming?
Doesn’t really do anything for me. It doesn’t feel to me like it has changed all that much.
Sometimes I use it to translate to and from Japanese and English, does that count?
The great Prof. Edsger Dijkstra explained much better than I ever could why trying to program computers using a natural language is a truly poor idea, in this now classic essay of his from 1978, On the foolishness of “natural language programming”:
https://www.cs.utexas.edu/~EWD/transcriptions/EWD06xx/EWD667.html
I don’t. Personally, I don’t believe that AI-assisted coding has a future, and I don’t like the quality of what it produces. Either we achieve AGI or whatever the finance bros are hyping up this week, or we don’t. If we do, then AI can write code without a human in the loop, so “vibe coding” is dead. If we don’t, then AI code stays where it’s at right now, which is garbage quality. After a few years of vibe coding disasters, demand for human coding will increase, and my skills will be even more valuable than before all this craziness. In that case, letting those skills atrophy today would be the wrong move.
And if I’m wrong? Well, if AI code generation makes it so anyone who doesn’t know how to code can do it… then surely I’ll be able to figure it out? My existing skills wouldn’t hurt.
I use it as an overconfident rubber duck to bounce ideas and solutions off of in my code, but I don’t let it write for me. I don’t want the skills I’ve practiced to atrophy.
I don’t and never will. I’m one of the only people at my ~450 person company that doesn’t use an LLM.
I don’t, it’s not better than simply thinking about things myself. There isn’t institutional pressure to use it and if there was I would simply lie and not use it.
More like a manual. Google has become really shitty for complex queries; LLMs can find relevant keywords and documents much more reliably. Granted, if you ask questions about niche libraries it hallucinates functions quite often, so I never ask it to write full pieces of code, but just use it more like a stepping stone.
I find it amusing how shamelessly it lies about its hallucinations though. When I point out that a certain function it makes up does not exist, the answer is always something of the form “Sorry, you are right, that function existed before version X / that function existed in some of the online documentation” etc, lol. It is like a halluception. If you ask it to find some links regarding these old versions or documentation, they also somehow don’t exist anymore.
Yeah. I prefer not externalizing my ability to think.
I am simply not interested. I enjoy writing code. Writing prompts is another task entirely.
I imagine that one day I might ask an AI to teach me how something works, but not to write the code for me. Today, I sometimes have to slog through poorly written documentation, or off-topic Stack Exchange posts, to figure something out. It might be easier using an LLM for that, I guess.
I imagine that if I only cared about getting something working as fast as possible I might use one some day.
But today is not that day.
I don’t, and probably never will. A whole bunch of reasons:
- The current state of affairs isn’t going to last forever; at some point the fact that nobody’s making money with this is going to catch up, a lot of companies providing these services are going to disappear and what remains will become prohibitively expensive, so it’s foolish to risk becoming dependent on them.
- If I had to explain things in natural language all the time, I would become useless for the day before lunch. I’m a programmer, not a consultant.
- I think even the IntelliSense in recent versions of Visual Studio is sometimes too smart for its own good, making careless mistakes more likely. AI would turn that up to 11.
- I have little confidence that people, including myself, would actually review the generated code as thoroughly as they should.
- Maintaining other people’s code takes a lot more effort than code you wrote yourself. It’s inevitable that you end up having to maintain something someone else wrote, but why would you want all the code you maintain to be that?
- The use-cases that people generally agree AI is good at, like boilerplate and setting up projects, are all things that can be done quickly without relying on an inherently unreliable system.
- Programming is entirely too fun to leave to computers. Most of your time isn’t even spent on writing code to begin with, so I don’t really get the psychology of denying yourself the catharsis of writing the code yourself after coming up with a solution.
You wrote this all a lot better than I could have, but to expand on your second point: I have no desire whatsoever to have a “conversation” (nay, argument) with a machine to try and convince/coerce/deceive/brow-beat (delete as appropriate) it into maybe doing what I wanted.
I don’t want to deal with this grotesque “tee hee, oopsie” personality that every company seems to have bestowed on these awful things when things go awry, I don’t want its “suggestions”. I code, computer does. End of transaction.
People can call me a luddite at this point and I’ll wear that badge with pride. I’ll still be here, understanding my data and processes and writing code to work with them, long after (as you say) you’ve been priced out of these tools.
I only use it when I’m learning something very new and very dense and that’s to stand up an example based on a context I’m interested in or already familiar.
It helps me identify parts of the docs to focus on more quickly.
Otherwise no, I’m getting better without tools.
Never used it to write my code. Others have given great reasons, which resonate with me, but the biggest one for me is that I enjoy writing code and designing programs. Why would I outsource one of the things I love to do? It’s really that simple for me.
Please, continue to “use AI daily”. Rot your brain, see if I care.
If my competitors want to shoot themselves in the foot that’s fine by me, I won’t stop them.
I use it for scripts or for esoteric error messages or problems I’m having in my dev environment.
I can’t be bothered to understand a specific error message that I’ve never seen before because of an update or whatever.
So getting it to explain errors to me is handy.
I always review the LLM’s process and the resulting changes it suggests (including searching for what it’s done if I don’t get it).
It’s essentially a context-aware search engine. Actual coding and problem solving? I enjoy that.
I have stopped using it, because the skill atrophy kicked in and I don’t want to turn into someone chatting with a bot every day.
I work as a software developer and over the last months, I slipped into a habit of letting ChatGPT write more and more code for me. It’s just so easy to do! Write a function here, do some documentation there, do all of the boilerplate for me, set up some pre-commit hooks, …
Two weeks ago I deleted my OpenAI account and forced myself to write all code without LLMs, just as I did before. Because there is one very real problem with excessive AI usage in software development: skill atrophy.
I was actively losing knowledge. Sometimes I had to look up the easiest things (like built-in JavaScript functions) that I was definitely able to use off the top of my head just a year ago. I turned from being an actual developer into someone chatting with a machine. I slowly lost the fun in coding, because I outsourced the problem-solving aspects that gave me a dopamine boost to the AI. I basically became a glorified copypaster.










