Scottish: got the painters in.
Some things cross language boundaries.


Indeed.
In some ways, this kind of thing is ideal for Rust. It’s at its best when you’ve a good idea of what your data looks like, you know where it’s coming from and going to, and what you really want is a clean implementation that you know has no mistakes. Reimplementing ‘core code’ that hasn’t changed much in twenty years to get rid of any foolish overflows or use-after-free bugs is perfect for it.
Using Rust for exploratory coding, or when the requirements keep changing? I think you’ve picked the wrong tool for the job: invalidate a major assumption and you have to rewrite the whole damn thing. And like you say, an important choice for big projects is choosing a tool that a lot of people will be able to use. And Windows is very big.
They’re smoking crack, anyway. A million lines per dev per month? When I’m doing major refactoring, a couple of thousand lines per week in the same language, mostly moving existing stuff into a new home, is a substantial change. That’s under ten thousand lines a month; they’re claiming over a hundred times that, with a major language conversion on top. Get out of here.
The CentOS “eight-pointed star”?


Menu bar at the top at least makes some sense - it’s easier to mouse to it, since you can’t go too far. Having menus per-window like Linux, or like Windows used to before big ugly ribbons became the thing, is easier to overshoot. (Which is why I always open my menu bars by pressing ‘alt’ with my left thumb, and then using the keyboard shortcuts that are helpfully underlined. Windows likes to hide those from you now since they’re ‘ugly’, and also makes you mouse over the pretty icons to get the tooltip that tells you what they are, which is just a PITA. Pretty != usable.)
Mac OS has had the menu at the top since before it was a multitasking OS. It was there on the first Mac I ever used, a Macintosh Classic II back in 1991 or so, and it was probably like that before then too. It’s not like they’ve been ‘innovating’ that particular feature and annoying their users.


The actual fix is probably ‘enable mixed ASCII / Windows-1252 calls to Windows UTF-16 functions when some strings have different codepages than others’, or something silly. But that fix sounds better.
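If you want to see the failure mode, here’s a hypothetical sketch (made-up input, not the actual patch): the same bytes come out as different characters depending on which codepage you hand to MultiByteToWideChar.

```c
#include <windows.h>
#include <stdio.h>

int main(void) {
    /* Hypothetical input: 0x93/0x94 are curly quotes in Windows-1252,
       but invalid bytes on their own in ASCII or UTF-8. */
    const char in[] = "\x93quoted\x94";
    wchar_t out[16];

    /* Decoded as Windows-1252: 0x93 -> U+201C, 0x94 -> U+201D. */
    MultiByteToWideChar(1252, 0, in, -1, out, 16);
    printf("cp1252: U+%04X ... U+%04X\n", (unsigned)out[0], (unsigned)out[7]);

    /* Decoded as the system ANSI codepage instead: now the result
       depends on the machine's locale settings, which is exactly
       where mixed-codepage string handling goes wrong. */
    MultiByteToWideChar(CP_ACP, 0, in, -1, out, 16);
    printf("CP_ACP: U+%04X ... U+%04X\n", (unsigned)out[0], (unsigned)out[7]);
    return 0;
}
```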
A rising tide lifts all boats - every improvement is welcome.


I had 32GB of RAM in my desktop as 4x8GB; one of the sticks failed a couple of years ago, and it was cheaper to replace the lot with 64GB as 4x16GB than it was to get a replacement 8GB stick.
That’s convenient for work purposes (in fact, I could actually do with more) but massive pointless overkill for most games. Even games which do “big loads” - Witcher 3, say - aren’t noticeably quicker from RAM cache than they are off of an NVMe drive.
Generally, companies are trying to maximise profit, which means the price gets reduced only when a product has stopped selling at the previous price and they want to make sales to the next, more price-conscious, segment of the market. They might want some quick bucks if the company is in financial trouble, or to ‘make the news’ with a sale if they need some publicity.
BG3 sold shedloads, is still selling shedloads, was on multiple games-of-the-year lists and generally ranks amongst the best games of all time, often at the top; and Larian seem sufficiently flush with cash from its success. So like you say, don’t hold your breath waiting for a big sale - it doesn’t make sense for them to do one.


Data centre GPUs tend not to have video outputs, and have power (and active cooling!) requirements in the “several kW” range. You might be able to snag one for work, if you work at a university or at somewhere that does a lot of 3D rendering - I’m thinking someone like Pixar. They are not the most convenient or useful things for a home build.
When the bubble bursts, they will mostly be used for creating a small mountain of e-waste, since the infrastructure to even switch them on costs more than the value they could ever bring.


There’s times when I want to find “exact matches and nothing but” - searching for error messages, for instance - and that’s made much harder than it should be by AI bullshit search engines that don’t want you to switch off their “helpful” features. Considering moving to Kagi instead.
Mine was my local Forgejo server, NAS, DHCP and DNS server for ad blocking on devices connected to the network, torrent server, syncthing server for mobile phone backup, and Arch Linux package proxy, since I’ve a couple of machines that basically pull the same updates as each other.
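The ‘proxy’ bit is less grand than it sounds - roughly the shared-pacman-cache trick from the Arch wiki, something like this (hostname and port made up here; darkhttpd is just one way to serve it):

```
# On the cache box, serve pacman's package cache over HTTP, e.g.:
#   darkhttpd /var/cache/pacman/pkg --port 8080

# On the other machines, at the top of /etc/pacman.d/mirrorlist:
Server = http://cachebox.lan:8080
# Anything it doesn't have (fresh packages, the sync databases)
# 404s and falls through to a real mirror:
Server = https://geo.mirror.pkgbuild.com/$repo/os/$arch
```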
I’ve retired it in favour of a mini PC, so it’s back to being a RetroPie box: loads of old games available in the spare room for when we have a party. Amuses children of all ages.
They’re quite capable machines. If they weren’t so I/O limited, they’d be amazing. They tend to max out at about 10 megabytes/second to SD card or over USB / ethernet. If you don’t need a faster disk than that, they’re likely to be ideal in the role.


Zelda 3? You get fast travel quite early and the world is packed with stuff, it’s not absurdly huge. Doesn’t have that bloody owl in it either, telling you the obvious at great length.
Certainly not Wind Waker, anyway. Now there is a slow game.


No unexpected crashes, no game-breaking bugs. Performance was… dubious. It looks amazing, but UE5 has scalability issues: none of the graphics options seemed to do anything for the frame rate.


The studio is mostly ex-Ubisoft employees. So yeah, it’s their first game as that studio, but they’re by no means novice developers. Fair play to them for following their passion, though - it’s paid off.


“Mostly perfectly unless they’ve got anti-cheat, and you’ll be limited to 30 fps for most of the fancy-graphics titles.”
Actually pretty damn good. Considering the difficulty I had getting frames out of E33 on desktop, having it play reasonably on the go is impressive. 60 fps @ 4K made my PC sound like a vacuum cleaner and was warming up the whole house; it really needs some of the upscaling trickery to be comfortable to play.


Best story, for sure. Most emotionally affecting is Majora’s, for me, but TP is close.
Don’t think the gameplay holds up, though. The Wii version is pure waggle, but even on the GameCube there’s a lot of filler - empty space and backtracking. It doesn’t respect your gaming time.


systemd-networkd is installed by default on Arch, integrates a bit better with the rest of systemd, doesn’t have so many VPN surprises, and the configuration is a bit more obvious to me - a few config files rather than NetworkManager’s “loads of scripts” approach. Small niggles rather than big issues.
Really, I just don’t want duplication of services - more stuff to keep up to date. And if I’ve got systemd anyway, might as well use it…
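For flavour, here’s what ‘a few config files’ means in practice - a minimal sketch for a wired machine on DHCP, with a made-up interface name:

```
# /etc/systemd/network/20-wired.network
[Match]
Name=enp3s0

[Network]
DHCP=yes
```

Drop that in, enable systemd-networkd (and systemd-resolved for DNS), and that’s more or less the whole stack.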


NetworkManager dependencies can now be disabled at build time…
Nice. It was a damned nuisance that Cinnamon brought its own network stack with it. All my headless servers and my Plasma gaming desktop use systemd-networkd, which meant that my Cinnamon laptop needed different configuration. Now they can all be the same.
Hopefully the new release will bash a few of the remaining Wayland bugs; Plasma is great but I prefer Cinnamon for work, and it’s just too buggy for gaming on a multi-monitor setup at the moment.


Especially since any version of Git from the last few years has a passionate hatred of symlinks for this reason, which is a bit annoying if you’ve a legit use case. They’re either very out-of-date, or have done some very foolish customisation…
Criminal waste of elotes, though. I’ll have them if they don’t want them.
The problem is that the volume of slop available completely overwhelms all efforts at quality control. Zealotry only goes so far at turning back the tsunami of shite.