Serious question. We had a perfectly serviceable word, yet everyone decided to shift. Is it just that it’s shorter to type?
If so, I feel for your colleagues trying to parse your code when all your variables use abbreviations.
I remember it from the 2000s. The internet was up and we could all share our own pet names for things or just make up a new word. mIRC and newsgroups were big on that. Weird times, but cool for sure. Some of it stuck. I still use the exclamation ‘Blargh’ occasionally.
The first time I remember it was when someone from California asked me to ‘code summin’ for their daughter. I had to make sure I understood what they were asking of me before I agreed. 2002, maybe.
Could have been worse, it could have turned into “apping”.
Stop it! You’ll go blind.
Don’t give them any ideas!
“Ohh, I worked as an executioner at Microsoft back in 2005. Kinda fun doing executioning.”
Oh I’d LOVE to work as executioner at Microsoft!
Code was in use for the set of instructions used to control a computer in 1946, and became a verb by 1986. Programming dates from 1945 as a first use in regard to computers, meaning “cause to be automatically regulated in a prescribed way.”
Now the funny thing is the noun ‘program’, in regard to computers, in 1945 meant “series of coded instructions which directs a computer in carrying out a specific task”.
So if we really work through the etymology a bit, coded instructions came first, then program/programming, then code and coding; though certainly ‘encoding’ would have been used before programming, given the definition of ‘coded instructions’.
So… Blame Ada Lovelace for not coming up with something catchy like ‘lacing’ which would have been far more camp (and much more accurate to the gender of early programmers).
And be grateful that we didn’t start calling it “apping”, even though the term “program” is effectively extinct these days.
For me apps are things that are fullscreen-only (on phones, Windows 8 apps, GNOME 3+), while programs are small CLI things or complex ones with discoverable GUIs where you have more control over the UI and placement.
Don’t know what tiling window systems would make of programs by the above definition.
this is awesome, thanks!
Still more acceptable, in my opinion, than going from “using” to “leveraging”…
I think I noticed it widespread in the mid 2010s. Maybe around the same time that DIY and various hobby/handicrafts became just “making”.
I still remember when I bought a tool off someone and was chatting about what I was going to do with it, and they declared out of the blue: “I am a maker!” I just had to end the conversation as quickly and politely as possible. I don’t know what the point of that statement was, but it made them sound a bit unhinged.
That said, I’m sure the term ‘coding’ was in use before, but more as a sub-activity that ‘computer programmers’ or ‘software engineers’ might do as part of their job. Maybe ‘coder’ and ‘coding’ became more popular with the spread of the term ‘agile’ into bullshitting-consultant / middle-management culture; I think that’s when some people started using the term as an excuse for skipping the ‘design’ and ‘engineering’ parts of any complex project.
I ended up very happy I didn’t go into software once I heard about the hell of agile and Jira. Like, I can do this all in Python… what are y’all talking about?
Agile just sounds like a stupid modeling method. Has anyone ever said without irony “we’ll get it better next sprint”? Really? You want better software as a software company? Thank god I was sitting down.
Of course you fucking are. If your company is hoping this will make things better, it may as well die. The whole point is control, not output, efficiency, or quality.
Yep. I’m sure it’s fine in small, competent teams with a workflow that they’re comfortable with.
But any large organisation I’ve worked in involves clueless middle managers and teams with some people who really should move on. Agile seems to make managers’ incompetence less obvious and makes them less accountable. It seems to remove control over the workflow, the cost base, and the quality of the product/service.
It also gives them this great universal technique for problem solving. “We found that Task X is not being done right.” OK, we’ll create a job title: “X doer”. Appoint a person who doesn’t really know what X is, but neither does the interviewer. Make this person go to ‘stand-ups’ and assign them some Jiras. Three months later, they wonder why X is still not being done and why their new hire has already left.
The root cause, for sure, is incompetent management, not the methods. But their project method seems to protect them and impede actual improvement.
This is a guess, but I feel like it was around the time that most coding was done for things that weren’t explicitly “programs”, like web design CSS/JavaScript and smartphone “apps”.
Yea this vibes.
For me, it felt like coding was a more attractive term for people who weren’t “proper” computer science and “engineering” types, who weren’t confident that they knew what “program” or even “algorithm” meant, as they were working things out as they went.
I’d guess that as computing involved more and more people from this non-standard background, coding became the preferred term. I certainly encountered people uncomfortable with my casual use of “algorithm” because it triggered their imposter syndrome, and my pointing out that they write algorithms in the code they write all the time certainly didn’t help.
Also anyone writing scripts, or even just using stuff like AWS Lambda / functions as a service, etc. etc.
Programming could also refer to lower-paid jobs operating machines, like a CNC programmer, or to things like radio programming.
So the real computer software people started using the term “software engineering”. But that’s too long, so ‘coding’.
IIRC it kinda started getting used interchangeably in the ’60s, spawned from the term/name ‘Fortran Automatic Coding System’… I’d definitely need to hit my search engine of choice and do some digging, however. As for mass public usage, sometime 2010-ish? With all that learn-to-code and code.org stuff coming into the mainstream.
Cool. I’m drunk and don’t care to do any digging, so if you’d like to do that on my behalf …
It’s probably predominantly because of the switch to mobile computing / smartphones / web being dominant, and everyone referring to programs there as “apps” / applications.
i.e. if you write a mobile app with a function-as-a-service backend, you will never compile what someone would refer to as a “program”, so calling yourself a “programmer” (as in, someone who makes programs) feels inaccurate and not a helpful description for people. “Coder” (as in, someone who writes code) is vaguer in terms of the type of code you write and more accurate in terms of what you spend your time producing.
I think the window for having that debate was some time around 1992.
I started as a CS major in 1997, and the term was not used.
In your specific circles.
The CS department at the University of Washington with all sorts of tech companies starting up? I mean, sure, if you want to believe your timeline, you’re free to feel that way, but claiming this was standard by 1992 is ludicrous to me.
People at the University of Washington don’t refer to soda pop the same way as people at Berkley, or at MIT, or at Oxford. Why would they all have had the exact same term for writing software?
Edit: I’m being argumentative; I honestly have no idea what term was common then. At that point most people I knew referred to it as “computer stuff”.
Since you’ve admitted to being argumentative, I’ll point out that you misspelled Berkeley. I was accepted there into EECS, though neither MIT nor Oxford. As it turns out, if you don’t apply, they don’t notice you.
I described myself as a coder for a while starting somewhere around '03 or '04 IIRC.