Excellent point, ineffectively articulated. I agree with the sentiment, but I do have to argue with some of the author’s points (because it’s fun, and it’s okay to do things just for the hell of it).
I wrote my MSc on The Metaverse. Learning to build VR stuff was fun, but a complete waste of time. There was precisely zero utility in having gotten in early… But I’m struggling to think of anyone who has earned anything more than bragging rights by being first.
You’re your own counterexample. You got to experience the metaverse when it was still alive, which you wouldn’t have if you had waited for just a few years. And you got a Masters Degree out of it, not just bragging rights.
But I’m struggling to think of anyone who has earned anything more than bragging rights by being first. Some early investors made money
So you’re not struggling that much if you can start off the next sentence with an example of people who earned more than just bragging rights.
But I’m struggling to think of anyone who has earned anything more than bragging rights by being first. Some early investors made money - but an equal and opposite number lost money.
This grossly overestimates the ratio of successes to failures. You’re muuuuuuch more likely to lose money on the gamble of The Next Big Thing than win; for every HTTP there’s a Gopher and Usenet and a dozen others that all look the same from the outside looking in.
For every HTML 2.0 you might have tried, you were just as likely to have got stuck in the dead-end of Flash.
Flash is a terrible example, it ran its lifecycle already, sure, but it was HUGE back in the day. And people benefited from using it; some of my favorite animators and gamedevs cut their teeth on Flash, people’s work got recognized by a global audience, people landed jobs, Flash made it onto cable TV channels, people still light up at the mention of Homestar Runner to this day. People also made money, sure, but there are more benefits to playing with tech than “it makes money happen.”
Which brings me to my final gripe: this is all framed as if the only benefit of a technology is if it’s productive or profitable. When you discuss your favorite show with friends, are you considering whether the conversation can be converted into capital? When you watch a beautiful sunset, do you fret over whether the clouds will help you achieve your quarterly goals? Out on dates with your SOs, do you have to take a break in the bathroom to worry whether the evening is meeting KPIs?
Sometimes the benefit of things is just having the experience, instead of treating it as a means to an end. Yeah, don’t let the FOMO ruin your day, but maybe take some time to play around with a doomed technology before it becomes abandoned and the community ceases to be. Maybe you’ll become a recognized expert, maybe you’ll learn some valuable lessons you can transfer to tech with more longevity, or maybe you’ll just have fun.
And honestly, what’s the fucking point of living, working, grinding, and suffering, if not for the fun in between it all?
If this tech is as amazing as you say it is, I’ll be able to pick it up and become productive on a timescale of my choosing, not yours.
While I agree, my boss sadly thinks otherwise.
I think it all stems from an idea that in a new field you can grab a niche and be a ~~monopoly~~ successful business if you move fast enough, which inherently implies nobody believes in ~~fair competition~~ give to Caesar what belongs to Caesar.
While my org is also pushing for this, I use it only at work.
For every HTML 2.0 you might have tried, you were just as likely to have got stuck in the dead-end of Flash.
This one hurt. I had to deal with a decade-plus-old piece of tech debt, from when they fucking killed Flash, before I could move on to new projects.
I sympathize with the point of the article, but if someone’s seriously citing Flash, which had widespread success for a run of about 15 years before being overtaken by later developments (driven in part by a billionaire with an axe to grind), as a short-lived “dead end” that was best avoided, then how long do they think is a sensible amount of time to wait to see if something’s worth spending time and effort? Nothing remains on top forever.
Well, I’m still waiting for that new language everyone is talking about to mature enough to be really useful. It’s not even 50 years old yet, that C++ lang.
Flash had its use. I think a better analogy for me is web frameworks.
I remember in the mid-2000s there seemed to be a new one every week. “LOL, you aren’t using Ruby on Rails? Peasant!” “LOL, you aren’t using Django? Peasant!”
Still seems to be the case with Electron, React, Node, blah blah blah.
Running to stand still.
“LOL, you aren’t using Django? Peasant!”
… I’m working on learning Django to get a job… should I stop? What should I use instead?
My webserver I’ve had for a while supports basically that.
… I’m working on learning Django to get a job… should I stop? What should I use instead?
As the other comment said, Django isn’t going anywhere. I’d not start a new project in it, but it’ll be with us for a long time.
For a more modern (and better) Python stack, which is also definitely here to stay, I’d look at FastAPI with SQLAlchemy.
Django isn’t going anywhere. The point is not to jump on the latest fad, which Django isn’t.
I’m the wrong person to ask. My goto language is older than I am and hasn’t had a meaningful change since I was born.
cobol? fortran? c? assembly? so many options
C.
I exaggerate a bit. C99 lets you declare variables anywhere inside a block, not just at the top.
Which still got me into an argument with a coworker who wanted me to declare every variable at the top of the block “in case” we port the code to a compiler that doesn’t support it.
C99 was 20 years old at that point.
Newer versions of C have generics “support” (C11’s `_Generic` selection expression), but I haven’t seen it in the wild yet.
I wonder what the last programming language will be…
COBOL.
Fortran has a 2018 release. Assembly is tied to the CPU, so I assume it changes with every iteration.
I just consider its origin to be ancient…
That’s irrelevant if it’s updated frequently.
I feel like the web framework question has stabilized in recent years. React and Node (not a web framework, but in a similar boat) are stable and common, and Angular and a few others are good alternatives.
I hear they’re changing the language these things run on from JavaScript to TypeScript.
No thanks to the hamster wheel.
I mean, TypeScript just compiles down to JavaScript. I’m also generally against a million frameworks, but JavaScript to TypeScript is an easy move.
Isn’t it also opt-in? If you don’t annotate a type it just defaults to `any`, which is unchecked like standard JS.
Yup
As far as I know, it is. Type safety is optional but very useful sometimes
TS is a superset of JS, though. You can still write normal JS and probably won’t even have to change the file extension or anything; literally zero change.
At least for Bitcoin it made some level of sense.
With AI?
If it really gets that good that everyone needs to use it, I can just ask AI how to use AI like a pro.
Or I can just tell AI to use AI for me (isn’t that what OpenClaw is?)
Author likely isn’t talking about generic chatbot use, but about agentic systems that are being pushed for automating everything, to the point where you don’t have to touch your keyboard or move the cursor anymore.
Yeah, it takes 0 skill. People thinking they’re Paul Allen because they typed a prompt is humorous. It’s often the types who have absolutely no clue how computers work that are obsessed with LLM slop.
Bitcoin makes sense when the government prints 40% more currency and inflates assets, like during Covid.
Good point, although I was referring to the “don’t get left behind” notion specifically.
If it really gets that good that everyone needs to use it, I can just ask AI how to use AI like a pro.
As a pro who is forced to use AI, I did this a lot when I was learning how to use it.
Or I can just tell AI to use AI for me
Yep. Subagents are a very normal part of AI tools
Misread the title as “being left handed”, came into the comments to make a “Well, that’s just sinister!” joke and was highly confused by the comments - and the article - until I noticed my mistake. lol
That’s a very funny joke though, so I can see why you felt compelled to make this comment
Articulates the conclusion I’ve come to.
The idea that fetuses today will never catch up to a technology touted as easy enough for a dog to use is absurd.
If this stuff is so amazing, then I’ll pick it up when it matures.
It frankly doesn’t solve any problems I have. PR approvals are my bottleneck, not writing code.
And as more people outsource their thinking to LLMs, the less intelligent they are on their own.
Hope you have enough tokens and the server isn’t returning 404.