

Off-and-on trying out an account over at @tal@oleo.cafe due to scraping bots bogging down lemmy.today to the point of near-unusability.


Virtual keyboards have never been great
I’m actually surprised that nobody ever fundamentally reinvented text input for touchscreens in a way that caught on.


I’d like to be able to get touchpads with physical buttons on laptops. Very few manufacturers do them, especially if you want three.


I once had dinner with a Stanford professor, years back, who said that he liked teaching in Python because he spent far less time teaching the language, and more time on the higher-level material he was actually trying to get across, than he had when using C++. Lower barrier to entry for new users. I’d guess that in the intervening years, a lot of classes have adopted it for similar reasons. If you want to teach, I dunno, signal processing, and your students maybe don’t have a great handle on the language yet, you want to be spending time on the signal processing stuff, not on language concepts.


My impression from what code I’ve looked at is that little computation is done by the Python code itself (the heavy lifting typically happens inside compiled libraries), so there’s little to be gained by moving to something higher-performance, which eliminates a lot of the reason one would use some other language.
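To make that concrete, here’s a toy timing sketch of my own (it assumes NumPy is installed; it’s an illustration of mine, not anything from this thread):

```python
# Toy illustration: the Python-level code does almost no arithmetic itself;
# the work happens inside NumPy's compiled internals.
import time

import numpy as np

a = np.random.rand(10_000_000)

start = time.perf_counter()
total = 0.0
for x in a:  # pure-Python loop: every addition goes through the interpreter
    total += x
print(f"python loop: {time.perf_counter() - start:.2f}s")

start = time.perf_counter()
total = a.sum()  # one call; the loop runs in compiled code
print(f"numpy sum:   {time.perf_counter() - start:.4f}s")
```

Rewriting the glue code in a faster language buys you roughly what the a.sum() line already gets you, which is the point: the Python is just glue.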
Python’s cross-platform, albeit with a Unix heritage, so it doesn’t create barriers there. It’s already widely-used, a mature language that isn’t going anywhere, with a lot of people who know it.
It’s got an ecosystem for distributing libraries over the network, and there’s a lot of new code going out and being distributed rapidly.
Python isn’t statically-typed. Static typing can help write more-robust code. If I were writing, say, the next big webserver, I’d want to have that checking. But for code that may often be running internally in a research project — and this is an area with a lot of people doing research — a failure just isn’t that big a deal. So, again, some of the reasons that one might use another language aren’t there.
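One hedged aside of mine, not part of the original point: Python does at least have optional type annotations that an external checker like mypy can verify, which recovers some of that robustness without changing how the code runs:

```python
# Optional type hints: ignored at runtime, but a checker such as mypy
# will flag type errors before the code ever runs.
def mean(samples: list[float]) -> float:
    return sum(samples) / len(samples)

print(mean([1.0, 2.0, 3.0]))  # fine

# mypy would reject the following line at check time:
# mean("definitely not a list of floats")
```
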
And I imagine that there’s also inertia. It’s easier to default to using what others use.
If you have another language in mind, you might mention that, and see if there might be more-specific reasons. I could come up with meatier, more-plausible guesses if what you were wondering is something like “why isn’t everyone using Smalltalk?” or something.


DDR4 RAM is presently cheaper than DDR5, but it has also increased dramatically in price recently.
https://pcpartpicker.com/trends/price/memory/
DDR4:
https://lemmy.today/pictrs/image/ed889201-f9e6-46ec-81a8-832f6bfc63ed.jpeg

DDR5:
https://lemmy.today/pictrs/image/35d03746-8d9c-443f-808f-8c88f2914b73.jpeg



I have never seen a DIMM fail, and I don’t believe that their failure rate is remotely on the order of something like hard drives or fans.
I’d imagine, based on the fact that modern DIMMs often have heat spreaders, that there’s probably some level of thermal stress that can cause problems. But it’d be pretty low on my concern list as computer components go.


But a 7 year life on the old R5 1600 is insane
I think that it depends a lot on what one is doing.
So, a lot of games are bound by single-thread performance.
I have a Ryzen 9 7950X3D in my desktop. That’s the blingiest desktop AMD processor from 2023. That’s a six-year gap between the releases of those two processors, plus a jump from a midrange part to a top-end one.
But despite all that.
https://www.cpubenchmark.net/compare/2984vs5234/AMD-Ryzen-5-1600-vs-AMD-Ryzen-9-7950X3D
It benches at less than twice the single-thread performance of the Ryzen 5 1600. It’s faster. But it’s not really transformatively faster. Up until, say, the early 2000s, serial computation performance doubled roughly every 18 months. A six-year difference between processors, like the one between these two, used to mean that the newer one would run pretty much everything about sixteen times faster (four 18-month doublings: 2^4 = 16), even setting aside differences in the processor bin.
Parallel processing has improved at a better clip, either via adding more cores to CPUs or via the massively-parallel computation on GPUs. So if software can really utilize parallel computation effectively, then one might see larger gains over that period. Some software can take advantage, and games are an area where some entrants do. But for a lot of software, hardware just isn’t changing as quickly as it once did.
And for games, it’s very common that the ways in which they can take advantage of more parallel compute are nice-to-have but not really essential, like bumping up resolution or adding some extra visual effects. It’s not “the game becomes unplayable because the game logic can’t keep up” or something like that, the way it typically would have been in the 1990s.
There are definitely things that one can do where parallel compute makes a larger difference. If you’re a computer programmer compiling software and your particular environment can do parallel builds, then you can often get a roughly linear performance increase with the number of cores. If you do 3D rendering or video rendering, you’re probably bound by the CPU, and software there is often written to take advantage of parallelism. But the vast majority of software is mostly limited by serial compute, and serial compute performance just hasn’t been increasing very quickly for quite some years.
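A toy illustration of mine, not anything from the thread: an embarrassingly parallel, CPU-bound workload, which is roughly what a parallel build is, scales close to linearly with core count, while the same work run serially gains nothing:

```python
# Toy benchmark: the same CPU-bound jobs run serially and then across all
# cores. With no shared state, the parallel run scales close to linearly.
import time
from multiprocessing import Pool, cpu_count

def burn(n: int) -> int:
    # Stand-in for one independent unit of work, e.g. compiling one file.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [2_000_000] * cpu_count()  # one job per core

    start = time.perf_counter()
    for job in jobs:
        burn(job)
    serial = time.perf_counter() - start

    start = time.perf_counter()
    with Pool() as pool:
        pool.map(burn, jobs)
    parallel = time.perf_counter() - start

    print(f"serial {serial:.2f}s, parallel {parallel:.2f}s, "
          f"speedup {serial / parallel:.1f}x on {cpu_count()} cores")
```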


How can NVIDIA sell graphics cards without a working driver.
I don’t use Kali Linux, but it sounds like it’s based on Debian’s testing release. Debian hasn’t packaged Blackwell drivers yet, so I wouldn’t be surprised if Kali doesn’t have them packaged either. You can download Blackwell drivers from Nvidia, but the Debian guys won’t have made sure that things don’t break with them.
https://wiki.debian.org/NvidiaGraphicsDrivers
https://www.nvidia.com/en-us/drivers/details/259042/
Supported Products
GeForce RTX 50 Series
NVIDIA GeForce RTX 5090 D v2, NVIDIA GeForce RTX 5090 D, NVIDIA GeForce RTX 5090, NVIDIA GeForce RTX 5080, NVIDIA GeForce RTX 5070 Ti, NVIDIA GeForce RTX 5070, NVIDIA GeForce RTX 5060 Ti, NVIDIA GeForce RTX 5060, NVIDIA GeForce RTX 5050
But you can’t install it with the graphics card inserted, and you can’t install it with it not inserted.
I don’t know why you wouldn’t be able to install the driver with the graphics card inserted.
It freezes forever at loading Ramdisk.
The initrd contains drivers that aren’t directly built into the kernel.
Typically, the way this works on Debian with third-party drivers is that you have the proper linux-headers package matching your current kernel installed. Then a third-party package registers a DKMS module with the driver source, and when you install a new kernel, the driver gets recompiled for that kernel. That driver gets dropped into the initrd, the ramdisk with the out-of-kernel stuff required to boot.
I don’t use Nvidia hardware, so I can’t tell you if that’s what’s supposed to happen, but I would guess so.
If you’re not booting with it, my guess is that something isn’t working as part of that process. Either the Nvidia script didn’t register the module or it didn’t get rebuilt or the installed driver has some issue and isn’t working when you try to load it.
You can probably run sudo dkms status and it’ll show DKMS modules and their current status. That might be a starting point.
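And if you want to check the “did it land in the initrd” step specifically, here’s a rough, Debian-oriented sketch of mine (it assumes the lsinitramfs tool from initramfs-tools and the usual /boot/initrd.img-&lt;kernel&gt; naming; treat it as a starting point, not a known-good procedure):

```python
# Rough sketch: list the current initrd's contents and filter for nvidia
# files, to see whether the DKMS-built driver actually got included.
import platform
import subprocess

initrd = f"/boot/initrd.img-{platform.release()}"  # usual Debian naming
out = subprocess.run(["lsinitramfs", initrd],
                     capture_output=True, text=True, check=True).stdout
hits = [line for line in out.splitlines() if "nvidia" in line]
print("\n".join(hits) if hits else f"no nvidia files found in {initrd}")
```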


If you set up a Kerberos server and use that for sign-in on both the Linux and MacOS systems, then you can have matching goetic demon accounts on both Linux and MacOS. :-)
Kismet can use a GPS sensor and multiple WiFi strength readings as one moves around to do a pretty good job of mapping WAPs.
I’ve been kind of disappointed that F-Droid doesn’t appear to have any program using Android’s Location Services with high-resolution positioning to build a map of the location of nearby Bluetooth devices.
I have, on occasion, not been able to remember where I set my Bluetooth headphones.


linuxmemes
I believe that that’s a MacOS X system. One can only imagine what kind of madness goes on over there.


Ah, thanks. Looks like they enabled zram in Fedora 33:
https://fedoraproject.org/wiki/Changes/SwapOnZRAM#Why_not_zswap?


I commented elsewhere in the thread that one option that can mitigate limited RAM for some users is to get a fast, dedicated NVMe swap device, stick a large pagefile/paging partition on it, and let the OS page out stuff that isn’t actively being used. Flash memory prices are up too, but are vastly cheaper than RAM.
My guess is that this generally isn’t the ideal solution for situations where one RAM-hungry game is what’s eating up all the memory, but for some things you mention (like wanting to leave a bunch of browser tabs open while going to play a game), I’d expect it to be pretty effective.
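If you go that route, one quick sanity check (my suggestion, Linux-specific) is to watch how much swap is actually in use while the heavy stuff is running, e.g. by reading /proc/meminfo:

```python
# Minimal sketch: report swap usage from /proc/meminfo (values are in KiB).
def swap_usage_mib() -> tuple[float, float]:
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            fields[key] = int(value.split()[0])
    total = fields["SwapTotal"] / 1024
    used = (fields["SwapTotal"] - fields["SwapFree"]) / 1024
    return used, total

used, total = swap_usage_mib()
print(f"swap in use: {used:.0f} MiB of {total:.0f} MiB")
```
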
dev tasks, builds…etc
I don’t know how applicable it is to your use case, but there’s ccache to cache compiler output and distcc to do distributed C/C++ builds across multiple machines, if you can corral up some older machines.
It looks like Mozilla’s sccache does both caching and distributed builds, and supports Rust as well. I haven’t used it myself.


The big unknown that’s been a popular topic of discussion is whether Valve locked in a long-running contract for the hardware before the RAM price increases happened. If they did, then they can probably offer favorable prices, and they’re probably sitting pretty. If not, then they won’t.
My guess is that they didn’t, since:
They announced that they would hold off on announcing pricing because they were still working out the hardware cost (which I suspect very likely includes the RAM situation).
I’d bet that they face a lot of uncertainty in how many units the Steam Machine 2.0 will sell. The Steam Deck was an unexpectedly large success. Steam Machine 1.0 kinda flopped. Steam Machine 2.0 could go down either route. They probably don’t want to contract to have a ton of units built and then have huge oversupply. Even major PC vendors like Dell and Lenovo got blindsided and were unprepared, and I suspect that they’re in a much less-risky position than Valve when it comes to committing to a given level of sales and long-running purchases.
I’ve even seen some articles propose that the radical increase in RAM prices might cause Steam Machine 2.0’s release to be postponed, if Valve didn’t have long-running contracts in place and doesn’t think that it can succeed at a higher price point than they anticipated.


Honestly, a system with 64GB of memory is pretty well-provisioned compared to a typical prebuilt computer system from a major vendor.
I’ve felt that historically, PC vendors have always scrimped too far on RAM. In late 2025 with our RAM shortage, it’d be understandable, but in many prior years, it just looked like a false economy to me. Especially on systems with rotational drives — the OS is going to use any excess RAM for caching, and that’s usually a major performance gain there.
EDIT: And battery. At least in 2025, a lot of people are using SSD storage, and caching that in RAM isn’t as huge a win as it is with rotational drives. But lithium batteries have gotten steadily cheaper over the years. The fact that smartphone, tablet, and laptop vendors aren’t jamming a ton of battery in their devices in 2025 is kinda crazy to me.


GPU prices
Outside of maybe integrated GPUs, I doubt it, because they need their own memory and are constrained by the same bottleneck — DRAM.
Or at least CPU prices?
I’ve read one article arguing that CPU prices will likely drop during the RAM shortage.
I don’t know if that’s actually true — I think that depends very much on the ability of CPU manufacturers to economically scale down their production to match demand, and I don’t know to what degree that is possible. If they need to commit to a given amount of production in advance, then yeah, probably.
Go back a couple years, and DRAM manufacturers — who are currently making a ton of money due to the massive surge in demand from AI — were losing a ton of money, because they couldn’t cheaply and rapidly scale production up and down to match demand. I don’t know what the economics are like for CPUs.
https://finance.yahoo.com/news/fear-dram-glut-stifling-micron-155958125.html
November 5, 2018
To be clear, the oversupply concerns that have plagued Micron Technology (NASDAQ:MU) shares for weeks now are completely valid. Micron stock has fallen as much as 40% just since June on this deteriorating dynamic.
In short, the world doesn’t need as many memory chips as Micron and rivals like Samsung (OTCMKTS:SSNLF) and SK Hynix are collectively making. The glut is forcing the price of DRAM (dynamic random access memory) modules so low that it’s increasingly tougher to make a buck in the business.
We had a glut of DRAM as late as early this year:
https://evertiq.com/news/56996
Weak Demand and Inventory Backlogs
Both the DRAM and the NAND markets are still in a state of oversupply, with excess inventory leading to significant price declines through Q4 2024 and Q1 2025. This is driven by multiple factors such as weak consumer demand.
Memory manufacturers ramped up production during previous periods of strong demand, but the market failed to meet these forecasts. This has resulted in inventory backlogs that now weigh on prices.


That’s fair, but my understanding is that VRChat, despite the name, isn’t a VR-only thing.


Honestly, I kinda wish that Bethesda would do a new release of Skyrim that aims at playing well with massive mod sets. Like, slash load time for huge mod counts by defaulting to lazy-loading a lot more stuff. Help avoid or resolve mod conflicts. Let the game intelligently deal with texture resolutions: have mods just provide a single high-resolution image, and let the game scale it down and apply GPU texture compression appropriate to a given system, rather than having mod developers do that tweaking at creation time. Improve multicore support (Starfield has already done that, so they’ve already done the technical work).
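The scale-down half of that isn’t exotic, for what it’s worth. A toy sketch of mine (assuming Pillow; the filename is hypothetical, and real GPU texture compression like BC7 is out of scope here):

```python
# Toy sketch: generate successively halved versions of one high-resolution
# source texture, the way an engine could pick a size per system.
# Assumes Pillow; "mod_texture_4k.png" is a hypothetical filename.
from PIL import Image

def build_mip_chain(path: str, smallest: int = 256) -> list[Image.Image]:
    image = Image.open(path)
    chain = [image]
    while min(image.size) // 2 >= smallest:
        image = image.resize((image.width // 2, image.height // 2),
                             Image.LANCZOS)
        chain.append(image)
    return chain

for mip in build_mip_chain("mod_texture_4k.png"):
    print(mip.size)
```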


I haven’t used it, but my understanding is that it’s vaguely like Second Life, popular with folks creating adult-content-oriented worlds.
From a technical standpoint, that might actually be a pretty good example of a game that would benefit from cloud gaming, since I assume that it’s not all that latency-critical, not the way an FPS would be.
I guess that there would potentially be privacy issues with adult content stuff that would argue against cloud hosting, but in the case of VRChat, the service itself is already living in the cloud, so…shrugs


Biometrics are irrevocable. If you’re worried about stolen personal data, they are not what I would be moving to.