The whole industry needs to be rebuilt from the foundations. GRTT with a grading ring that tightly controls resources (including, but not limited to, RAM) as the fundamental calculus, instead of whatever JS happens to stick to the Chrome codebase and machine code spewed by your favorite C compiler.
Had to install (an old one mind you, 2019) Visual Studio on Windows…
…
…
First it’s like 30GB, what the hell?? It’s an advanced text editor with a compiler and some …
Crashed a little less than what I remember 🥴😁
Visual Studio is the IDE. VS Code is the text editor.
My PC is 15 times faster than the one I had 10 years ago. It’s the same old PC but I got rid of Windows.
“Let them eat ram”
Everything bad people said about web apps 20+ years ago has proved true.
It’s like, great, now we have consistent cross-platform software. But it’s all bloated, slow, and only “consistent” with itself (if even). The world raced to the bottom, and here we are. Everything is bound to lowest-common-denominator tech. Everything has all the disadvantages of client-server architecture even when it all runs (or should run) locally.
It is completely fucking insane how long I have to wait for lists to populate with data that could already be in memory.
But at least we’re not stuck with Windows-only admin consoles anymore, so that’s nice.
All the advances in hardware performance have been used to make it faster (more to the point, “cheaper”) to develop software, not faster to run it.
And that us poors still on limited bandwidth plans get charged for going over our monthly quotas because everything has to be streamed or loaded from the cloud instead of installed (or at least cached) locally.
I’m dreading when poorly optimized vibe coding works its way into mainstream software and creates a glut of technical debt. Performance is gonna plummet over the next 5 years, just wait.
Already happening with Windows. Also supposedly with Nvidia GPU drivers, and some AMD execs are pushing for the same now.
If only bad people weren’t the ones who said it, maybe we would have listened 😔
I almost started a little rant about Ignaz Semmelweis before I got the joke. :P
Bloated electron apps are what makes Linux on desktop viable today at all, but you guys aren’t ready for that conversation.
Yes, in that the existence of bloated electron apps tends to cause web apps to be properly maintained, as a side effect.
But thankfully, we don’t actually have to use the Electron version to benefit.
I can only think of a couple Electron apps I use, and none that are important or frequently used.
Uhhh like what?
Note, I don’t know how comprehensive this wiki list is, just quick research
https://en.wikipedia.org/wiki/List_of_software_using_Electron
From those, I’m only currently using a handful.
balenaEtcher, Discord, Synergy, and Obsidian
The viability of linux isn’t dependent on them though
Agreed. I wasn’t the one who claimed that.
I hate that our expectations have been lowered.
2016: “oh, that app crashed?? Pick a different one!”
2026: “oh, that app crashed again? They all crash, just start it again and cross your toes.”
I’m starting to develop a conspiracy theory that MS is trying to make the desktop experience so terrible that everyone switches to mobile devices, such that they can be more easily spied on.
I bought a desktop PC for a little over 2k in late 2011, and still use it. I’m a back-end developer, and certainly I would like to be able to upgrade my 16 GB RAM to 32 GB in an affordable way.
Other than that, it’s perfectly fine. IDE, a few docker containers, works.
And modern gaming is a scam anyway. Realistic graphics do not increase fun, they just eat electricity and our money. Retro gaming or not at all.
Imagine how things would be if they were built to be maintained for 15+ years.
2011 means it’s probably DDR3, which is still fairly affordable
wow, you are right! I didn’t bother to check this whole time of needless suffering, but for what I earn with it in less than an hour I could probably buy 2x8 GB DDR3, lol!
It just seemed a fair assumption that it would be insanely expensive …
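If anyone else wants to check before buying, dmidecode reports the installed module size, type, and speed (needs root; works on most Linux boxes):
sudo dmidecode --type memory | grep -E 'Size:|Type:|Speed:'   # look for lines like "Type: DDR3"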
The same? Try worse. Most devices have seen input latency going up. Most applications have a higher latency post input as well.
Switching from an old system with old UI to a new system sometimes feels like molasses.
I work in support for a SaaS product and every single click on the platform takes a noticeable amount of time. I don’t understand why anyone is paying any amount of money for this product. I have the FOSS equivalent of our software in a test VM and it’s far more responsive.
Except for KDE. At least compared to cinnamon, I find KDE much more responsive.
AI-generated code will make things worse. They are good at providing solutions that generally give the correct output, but the code they generate tends to be shit by final-product standards.
Though perhaps performance will improve since at least the AI isn’t limited by only knowing JavaScript.
I still have no idea what it is, but over time my computer, which has KDE on it, gets super slow and I HAVE to restart. Even if I close all applications it’s still slow.
It’s one reason I’ve been considering upgrading from 6 cores and 32 GB to 16 and 64.
Have you gone through settings and disabled unnecessary effects, indexing and such? With default settings it can get quite slow but with some small changes it becomes very snappy.
Have you tried disabling the file indexing service? I think it’s called Baloo?
Usually it doesn’t have too much overhead, but in combination with certain workflows it could be a bottleneck.
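Easy to test, too; balooctl is the control tool (this assumes Plasma 5; on Plasma 6 it’s balooctl6):
balooctl status    # is the indexer running, and how big has its index grown?
balooctl suspend   # pause indexing temporarily to see if things speed up
balooctl disable   # turn file indexing off entirely
balooctl purge     # optionally delete the existing index afterwards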
An upgrade isn’t likely to help. If KDE is struggling on 6@32, you have something going on, and 16@64 is only going to make it last twice as long before choking.
Wait till it’s slow,
then check your RAM/CPU in top and the disk in iotop; hammering the disk/CPU (or a bad disk/SSD) can make KDE feel slow (disk checks below).
plasmashell --replace &   # restarts the shell, dumping and reloading plasmashell’s widgets/panels
See if you got a lot of RAM/CPU back or it’s running well; if so, it might be a bad widget or panel.
if it’s still slow,
kwin_x11 --replace &   # on X11
or
kwin_wayland --replace &   # on Wayland
This dumps everything and restarts the compositor/window manager, reinitializing the graphics stack.
If that makes it better, you’re likely looking at a graphics driver issue
I’ve seen some stuff where going to sleep and coming out degrades perf
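For the disk part, iotop plus smartmontools can rule out a dying drive (the device path is just an example; adjust for your system):
sudo iotop -o                   # only show processes actually doing I/O right now
sudo smartctl -a /dev/nvme0n1   # SMART health/error report for the drive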
Hmm, I haven’t noticed high CPU usage, but usually it only leaves me around 500MB actually free RAM, basically the entire rest of it is either in use or cache (often about 15 gigs for cache). Turning on the 64 gig swapfile usually still leaves me with close to no free RAM.
I’ll see if it’s slow already when I get home, I restarted yesterday. Then I’ll try the tricks you suggested. For all I know maybe it’s not even KDE itself.
Root and home are on separate NVMe drives and there’s a SATA SSD for misc non-system stuff.
GPU is nvidia 3060ti with latest proprietary drivers.
The PC does not sleep at all.
To be fair I also want to upgrade to speed up Rust compilation when working on side projects and because I often have to store 40-50 gigs in tmpfs and would prefer it to be entirely in RAM so it’s faster to both write and read.
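Side note, in case it helps: a tmpfs can be sized explicitly instead of the default half-of-RAM, and its pages spill to swap under memory pressure; the mount point here is just an example:
sudo mkdir -p /mnt/scratch
sudo mount -t tmpfs -o size=48G tmpfs /mnt/scratch   # RAM-backed, swappable scratch space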
Don’t let me stop you from upgrading, that’s got loads of upsides. Just suspecting you still have something else to fix before you’ll really get to use it :)
It CAN be OK to have very low free RAM if it’s used up by buffers/cache (that memory is freeable). If buff/cache gets below about 3GB on most systems, you’ll start to struggle.
If you have 16GB, it’s running low, and you can’t account for it in top, you have something leaking somewhere.
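A quick way to sanity-check that (the “available” column is what matters, not “free”):
free -h                          # available ≈ free + reclaimable buffers/cache
ps aux --sort=-%mem | head -15   # biggest memory consumers, to spot a leaker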
I want to avoid building React Native apps.
Windows 11 is the slowest Windows I’ve ever used, by far. Why do I have to wait 15-45 seconds to see my folders when I open explorer? If you have a slow or intermittent Internet connection it’s literally unusable.
Even Windows 10 is literally unusable for me. When pressing the windows key it literally takes about 4 seconds until the search pops up, just for it to be literally garbage.
Found out about this while watching “Halt and Catch Fire” (AMC’s effort to recreate the magic of Mad Men, but on the computer).
In 1982 Walter J. Doherty and Ahrvind J. Thadani published, in the IBM Systems Journal, a research paper that set the requirement for computer response time to be 400 milliseconds, not 2,000 (2 seconds) which had been the previous standard. When a human being’s command was executed and returned an answer in under 400 milliseconds, it was deemed to exceed the Doherty threshold, and use of such applications were deemed to be “addicting” to users.
if it only occurs hours or days after boot, try killing the startmenuexperiencehost process. that’s what I was doing until I switched to linux
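for anyone still on Windows in the meantime, that’s roughly the line below; Windows respawns the process on its own, which is the point:
taskkill /F /IM StartMenuExperienceHost.exe   # force-kill; the Start menu host restarts fresh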
I use Windows maybe once a week at most, and then only for about 10 minutes. So I don’t really care, and I’m glad I don’t need to use it more often.
The Windows bloat each new generation is way out of control.
It takes forever to boot, I know that, and that’s from fast food, which is extra pathetic.
fast food
Too many nuggies
Maybe if Windows quit pigging out on tendies and slimmed down it would be fast
Probably that’s the folder explorer or whatever itself crashing.
yeah
and like why does it crash? it worked fine on Windows 10
I’ve given up trying to understand modern PC software. I can barely keep up with the little microcontrollers I work with. They aren’t so little.
I’ll keep saying this: my 2009 i5 750 still feels as fast as my 2-year-old workstation and can play almost everything I want with the 1060.
The tech debt problem will keep getting worse as product teams keep promising more in less time. Keep making developers move faster. I’m sure nothing bad will come of it.
Capitalism truly ruins everything good and pure. I used to love writing clean code and now it’s just “prompt this AI to spit out sloppy code that mostly works so you can focus on what really matters… meetings!”
What really matters isn’t meetings, it’s profits.
They often are worse, because everything needed to be an Electron app so they could hire cheaper web developers for it, and also boast about “instant cross-platform support” even if they don’t release Linux versions.
Qt and GTK could do cross platform support, but not data collection, for big data purposes.
There’s no difference whatsoever between qt or gtk and electron for data collection. You can add networking to your application in any of those frameworks.
I don’t know why Electron has to use up so much memory, though. It seems to use however much RAM is available when it boots; the more RAM the system has, the more Electron seems to think it needs.
Chromium is basically Tyrone Biggums asking if y’all got any more of that RAM, so bundling that into Electron is gonna lead to the same behavior.
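If you’re curious what an Electron app actually costs across all its processes, something like this gives a rough number (“discord” is just an example process name, and RSS over-counts shared pages):
ps -C discord -o rss= | awk '{sum+=$1} END {print sum/1024 " MiB total RSS"}'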
Inb4 “uNusEd RAm iS wAStEd RaM!”
No, unused RAM keeps my PC running fast. I remember the days where accidentally hitting the windows key while in a game meant waiting a minute for it to swap the desktop pages in, only to have to swap the game pages back when you immediately click back into it, expecting it to either crash your computer or probably disconnect from whatever server you were connected to. Fuck that shit.
I mean unused RAM is still wasted: You’d want all the things cached in RAM already so they’re ready to go.
I mean, I have access to a computer with a terabyte of RAM. I’m gonna go ahead and say that most applications aren’t going to need that much, and if they use that much I’m gonna be cross.
Wellll
If you have a terabyte of RAM sitting around doing literally nothing, it’s kinda being wasted. If you’re actually using it for whatever application can make good use of it, which I’m assuming is some heavy-duty scientific computation or running full size AI models or something, then it’s no longer being wasted.
And yes if your calculator uses the entire terabyte, that’s also memory being wasted obviously.
I don’t want my PC wasting resources trying to guess every possible next action I might take. Even I don’t know for sure what games I’ll play tonight.
Well you’d want your OS to cache the start menu in the scenario you highlighted above. The game could also run better if it can cache assets not currently in use instead of waiting for the last moment to load them. Etc.
Yeah, for things that will likely be used, caching is good. I just have a problem with the “memory is free, so find more stuff to cache to fill it” or “we have gigabytes of RAM so it doesn’t matter how memory-efficient any program I write is”.
“memory is free, so find more stuff to cache to fill it”
As long as it’s being used responsibly and freed when necessary, I don’t have a problem with this
“we have gigabytes of RAM so it doesn’t matter how memory-efficient any program I write is”
On anything running on the end user’s hardware, this I DO have a problem with.
I have no problem with a simple backend REST API being built on Spring Boot and requiring a damn gigabyte just to provide a /status endpoint or whatever, because it usually runs on one or a few machines controlled by the company developing it.
When a simple desktop application uses over a gigabyte because of shitty UI frameworks being used, I start having a problem with it, because that’s a gigabyte used per every single end user, and end users are more numerous than servers AND they expect their devices to do multiple things, rather than running just one application.
They’re actually slower than Windows Vista, there, I said it.
Vista honestly wasn’t as bad as we all said/remember, but it was the start of the Windows optimization downturn. It worked great on top-of-the-line systems with tons of power, and it was the best-looking Windows Microslop ever developed.
It just happened to coincide with the start of netbooks and low-power computers going mainstream, and marketing thought that an OS requiring an F1 car should also be sold in a 3-door hatchback with 60 horsepower.
Your mileage with Vista was wildly hardware-dependent. Prior to Vista, if you could run one version of Windows, the next version would run just about as well.
The Indexer and Glass were memory-hungry. If you gave it a decent amount of RAM, it could look like a dream while running well. If you turned off Aero on an under-specced machine, it could also run pretty well, but then you didn’t have much of a reason not to just run 98se.
The other shoe was drivers. No one was ready for WDDM, and a LOT of the small to mid-sized hardware vendors emergency-released slow, buggy, memory-hungry drivers that just made Vista feel horrible.
I had some off-the-shelf Compaqs that ran beautifully; my dual-P3/SCSI workstation with tons of RAM ran like hot garbage.
Websites are probably a better example, as the complexity and bloat have increased faster than the tech.
oh, yes, somebody made this a long time ago in response to the performance of new webpages: https://motherfuckingwebsite.com/
I love it
Well yeah, why would I learn html when I can learn React?!?
(/s but I actually did learn React before I had a grasp of semantic HTML because my company needed React devs and only paid for React-specific education)
I dislike the framing of this a lot.
Yes, the average software runs much less efficiently. But is efficiency what the user wants? No. It is not.
How many people will tell you that they stick to Windows instead of switching to Linux because Linux is all terminal? And the terminal is quicker, more efficient for most things. But the user wants a GUI.
And if we compare modern GUIs to old GUIs… I don’t think modern is 15x worse.
But the user wants a GUI.
Firstly, plenty of Linux instances have GUI. I installed Mint precisely because I wanted to keep the Windows/Mac desktop experience I was familiar with. GUIs add latency, sure. But we’ve had smooth GUI experiences since Apple’s 1980s OS. This isn’t the primary load on the system.
Secondly, as the Windows OS tries to do more and more online interfacing, the bottleneck that used to be CPU or free memory or even graphics is increasingly internet latency. Even just opening the start menu means making calls out online. Querying your local file system has built-in calls to OneDrive. Your system usage is constantly polled, tracked, and monitored as part of the Microsoft initiative to feed their AI platforms. And because all of these off-platform calls create external vulnerabilities, the (abhorrently designed) antivirus and firewall systems are constantly getting invoked to protect you from the online traffic you didn’t ask for.
It’s a black hole of bloatware.
I am not saying Linux is terminal. I am saying that people tell you Linux is all terminal and that they want a GUI.
Linux gui is much prettier than Windows anyway.
TVs became SmartTVs and now need the internet to turn on. The TVs need an OS now to internet to do TV.
Antenna broadcast TV seems like ancient magic.
We’ve deprecated a lot of the old TV/radio signal bandwidth in order to convert it to cellphone signal service.
But, on the flip side, digital broadcasts can carry a lot more information than the old analog signals. So now I’ve got a TV with a mini-antenna that gets 500 channels (virtually none of which I watch). My toddler son has figured out how to flip the channel to the continuous broadcast of Baby Einstein videos, and he periodically hijacks the TV for that purpose when we leave the remote where he can reach.
So there’s at least one person I can name who likes the current state of affairs.
I always have to remind myself being able to stream audio from a cellphone while driving across a city is also a pretty crazy development.
There isn’t anything fundamentally slower about using a GUI vs just text in a console. There’s more to draw, but it scales linearly. The drawing-things-on-the-screen part isn’t the slow bit for slow programs. Well, it can be if it’s coded inefficiently, but there are plenty of programs with snappy GUIs… like games, which generally draw even more complex things than your average GUI app.
Slow apps are more likely because of an inefficient framework (like running in a web browser with heavy reliance on scripts rather than native code), inefficient algorithms that scale poorly, poor resource use, bad organization that results in doing the same operation more times than necessary, etc.
The terminal is quicker. Not because the image is drawn more quickly, but because it is more efficient to do anything.
Technically true, but there’s a threshold on responsiveness. If both user interfaces respond in milliseconds, it doesn’t matter if one is more efficient
Can you elaborate on that? I disagree but would like to understand why you think that. Maybe you’re referring to something I wouldn’t disagree with.