• 0 Posts
  • 572 Comments
Joined 1 year ago
Cake day: December 6th, 2024


  • Unlike in Windows, in Linux the graphics UI concerns are outside the kernel

    It is, in the sense that the Linux kernel isn’t in any way, shape or form optimized for any kind of graphics features, unlike Windows. The software design concerns about graphical interfaces live in user space and, as you say, the “kernel talks to hardware and user space driver talks to kernel”.

    I don’t see how what I wrote is inconsistent with what you wrote: the kernel only delivers access to the graphics hardware and leaves the details of what’s done with it to user space, most notably (by comparison with Windows) all the software design concerns of having a graphical user interface. I just described in high-level Software Architecture terms the “why” for the “how” you described in Software Design terms.

    Granted, with GPUs the complexity is so much greater than with traditional systems (such as networking or data stores) that a lot of the performance improvement happens in the graphics drivers, which are not quite kernel but kernel-adjacent, so the split between kernel space and user space is a little less clean than what I wrote made it seem.
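
    As a concrete illustration of that split (a minimal sketch, assuming a Linux box; the function name is mine): the kernel’s whole graphics-facing contract is a handful of device nodes under /dev/dri, and everything above them - Mesa, compositors, the user-space halves of the drivers - lives in ordinary user-space libraries.

```python
from pathlib import Path

def drm_nodes() -> list[str]:
    """List the DRM device nodes the Linux kernel exposes.

    These nodes (e.g. card0, renderD128) are essentially all the
    kernel offers to graphics user space; Mesa, Wayland compositors
    and the like build everything else on top of them.
    Returns an empty list on systems without DRM (or on non-Linux).
    """
    dri = Path("/dev/dri")
    if not dri.is_dir():
        return []
    return sorted(p.name for p in dri.iterdir())

print(drm_nodes())  # e.g. ['card0', 'renderD128'] on a typical desktop
```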




  • Unlike in Windows, in Linux the graphics UI concerns are outside the kernel, so graphics layers sit pretty close to the hardware. Even just from that, higher performance was to be expected as adaptor layers such as Wine/Proton improved, because the kernel itself is faster and gets less in the way.

    Then, of course, the Linux kernel tends to be developed by people with lots of experience, overseen by people genuinely at the Software Architecture level of expertise, and its architecture benefits from know-how from both the server side and the front-end side of software development.

    My point being that the Linux architecture is not only far more mature and controlled by far more experienced people than the Windows one, but it also gets way less in the way of graphics applications trying to squeeze as much performance as possible from the system. Hence, in Linux, improving such graphics applications or their adapter layers can go further in delivering better performance than in Windows, where the kernel becomes a performance bottleneck sooner.

    What we’re seeing now is those two effects delivering, especially once the Wine/Proton adaptor layers matured and entered a stage of more performance and stability improvements than feature implementation.


  • In at most 5 years from now, there will never have been a better time to be a senior developer.

    Shit is already cracking (just look at all the spectacularly stupid bugs and deeply flawed design choices MS has made recently with Windows 11), and we’ve barely started experiencing what happens when AI-coded software goes through the full software life-cycle. The kind of thing where code by junior developers almost inevitably fails is the indirect stuff, like maintainability or security practices, and AI is basically a junior developer which NEVER LEARNS, no matter how much you explain something to it, even whilst telling you it has totally understood what you mean.

    Imagine that all of a sudden a large fraction of corporate software development is done entirely by teams of junior developers with such high turnover that the team changes every week (so you can’t ever help them learn to avoid certain kinds of errors, and they never improve). Now fast forward a couple of years and imagine what happens when all the software done by those devs has been on the Internet for a couple of years, exposed to all kinds of attackers, has gone through a couple of cycles of bug-fixing and of working new business requirements into it, or has been used for long enough that the data stores hold a year or two of data, all stored according to a junior developer’s idea of what and how data should be stored.

    Shit crashing with the most stupid bugs. Software hacked by script kiddies running 1990s scripts, because there is no proper defensive coding even for Internet-exposed software, or because it’s riddled with holes from the mis-integration of different parts. Years of use leading to systems slowing down to a crawl because databases don’t have the right indices, and the same data being stored multiple times and thus riddled with inconsistencies. Pretty much instant spaghettification of the code-base, especially at the design level, with each block of AI-generated code inconsistent in coding style and software design with every other AI-generated block. Constant and massive integration problems, with systems not at all prepared for upstream data format flaws or erroneous assumptions, and just about every software change breaking downstream systems. The entire life-cycle of a software system, from greenfield project to “so unmaintainable it’s cheaper to rewrite it from scratch”, running in less than a year rather than 5 or 10, with entirely new vibe-coded versions of the software coming out every year WITH DIFFERENT BUGS and DIFFERENT USER INTERFACES with DIFFERENT QUIRKS.

    Basically, every concern above junior-developer level being mishandled in random ways and places, even within the same code-base.

    This shit is what the adoption of AI coding will deliver us.

    Again, look at Windows 11 - we’re already seeing the rate and severity of bugs going back to how things were in the 90s, and the stupidity and obviousness of the errors exceeding even that early era of professionalization of software development.



  • All those seemingly disparate things all serve a simple objective:

    It’s all about suppressing any attempts from civil society to change the system, at a time when it has become obvious that the way the country is managed and run delivers nothing but decay for most people. The only “disruptive” social and political force left to operate in peace is the safe, controlled outlet for public anger that is the far-right parties, as these blame foreigners and social minorities (NEVER the wealthy) for all the ills of the country, thus deflecting that rage away from the people who, having always held power in Britain, made things the way they are.

    You see this in plenty of other countries in Europe, but the UK is way more authoritarian (if more elegant and discreet about it) than the rest, possibly because it’s a far more classist and stratified society, with power and wealth having stayed in the same families for far longer than in the rest of Europe (literally, the last great social upheaval in Britain was in the 17th century, and it was top-down and all about religion).


  • The Snowden revelations showed that the UK had even more pervasive surveillance of civil society than the US - literally, levels of surveillance of the public that would make the Stasi in East Germany green with envy.

    Further, whilst the US walked it back a bit after those revelations, in the UK they just passed a law to retroactively make all of it legal, newspapers got a bunch of D-Notices (the UK’s censorship mechanism) telling them to shut up, and the editor of the newspaper that brought it all out was sacked.

    Even if you know nothing about all the other scandals that have come out over the years around this in the UK (such as Green Party MEPs being under surveillance, or environmentalist groups having been infiltrated by coppers), one’s perspective on why those controlling power in Britain would want to “validate” and gatekeep where people go and what they do online should at least be informed by what Snowden revealed and by how those holding power in Britain dealt with it.


  • nanny state

    It’s almost the opposite.

    The UK is incredibly classist and has almost no social mobility. Born poor, you’ll die poor; same for the working class, same for the rich - your origins pretty much 100% dictate where you will end up, no matter your competence and work ethic. Wealth there has been dynastic for longer than the US has existed, and most land is in the hands of a handful of people, mainly for the great achievement of some great-great… grandparent having licked royal ass.

    In this environment, the richest are educated in their special schools (curiously called “public schools” because they’re supposedly open to everybody … as long as they can pay the very expensive fees that only the upper middle class and above can afford) to be maybe the fakest, most hypocritical people on the face of the planet - a way of being they genuinely think is just “the proper way to be”. This shit then permeates society from the top down: the “higher” one’s social class, the faker they are.

    (I know this from personal experience, having lived there, where I worked in professions and had hobbies that brought me into contact with all social classes, all with the outsider perspective of having also lived in Northern and Southern Europe.)

    It’s not by chance that Theatre in the UK is maybe the best in the World - faking ALL THE TIME is how a lot of people there naturally live their lives.

    British politicians invariably come from at least the middle class, generally higher. Even the party currently in power, called Labour and supposedly representing the working class, is headed by an upper-middle-class public prosecutor who has a Knighthood - and you don’t get a Knighthood from the Queen (back when he got it; now it’s a King) unless you’re either from the elites already or dedicated to preserving the class system in the UK and keeping power in the hands of the “proper people” (or you’re a famous musician or thespian - which is how they whitewash the image of the “honours system” to make it seem meritocratic).

    So British politics nowadays is entirely about painting policies as good when they’re no such thing and, let me tell you, compared with at least the rest of Europe, they’re spectacularly good at it - “World Class” at forms of deceit such as misdirection and misportrayal, and generally at setting things up so that what it says on the box is almost the opposite of what it de facto does.

    (It’s not by chance that 1984 was written by a Briton - the concept of doublethink is just a slightly exaggerated, less elegant and more obvious version of what the upper classes in Britain do.)

    All this to say that what looks like a “nanny state” to outsiders is in reality just part of the mechanics that keep that incredibly calcified and socially segregated structure as it is and always has been (with the notable exception of a period of around 3 decades after WWII) - part of the massive civil society surveillance apparatus they have there (already a decade ago, Snowden showed the UK had even more of it than the US) that detects and suppresses threats to the “established order” of the elites entirely controlling politics and their scions being destined for wealth and power no matter how inept.

    Hell, even sexism there is of the “benevolent” kind - “women are fragile emotional creatures that need to be protected” - which then justifies things like “they’re not really capable of handling high-pressure upper-management positions”. Fake “goodness” that’s really about being condescending and stopping people from being all they can be, “for their own good”.

    At the highest levels Britain is “world class” at fakeness and being anti-meritocratic.



  • Exactly.

    You layer your crisis preparations based on probability, and even on your capability to actually handle a certain kind of crisis (if you’re living in a major city, don’t expect to be able to handle nuclear apocalypse).

    So some cash at home can absorb problems like the power going down or electronic payment systems going down; having money in more than one bank account makes you safer against bank systems being down, or bank mistakes (or identity theft) zeroing or locking your account; and having money in more than one country (for example, bank accounts in multiple countries) or outside the banking system makes you more resilient to a banking crisis (like in 2008).

    And that’s just one side of things. Other things to consider are, for example, how you will power your electronics if the power is down, which can happen for a whole lot of reasons (for example, I’m in Portugal, where I was hit by a 1-day blackout that took out the entire Iberian Peninsula, and some months later by a freak storm that trashed all the high-voltage lines around here, which for me specifically meant 4 days without power, but for some meant weeks or even a month). Fortunately, after the first event I got a power bank, so during the second one my phone and tablet always had power.

    And then, of course, there’s food and water. Years ago, when I lived in the UK, after the 2008 crash I got some cans of freeze-dried food. Never used them; they still have 10 years before their expiry date. However, I also always have some canned food around, and I definitely needed it during those periods without power. I also keep some bottled drinking water in my pantry and, again, definitely needed it when the power went down (the water supply fails not long after, since the water pumps stop working, which is why when the power goes down I also fill some big containers with water to use for washing and for the toilet - again, things you don’t tend to think about until they happen).

    I also have a wind-up radio from back when I got the freeze-dried food, and that too was very useful for figuring out what the hell was going on the two times the power went off for long periods.

    Anyway, all this is agreeing with and extending your point: in my experience, quite a number of small things can help you get through the far more likely smaller “crises” without being unduly inconvenienced, whilst people with zero preparation are getting desperate because they “can’t buy anything because my card doesn’t work” or “don’t have any drinking water at home”.



  • Well, my Mini-PC with Kodi also doubles as the home NAS, so technically I’m not really using SMB for the videos and music anymore, though those files are also shared over SMB in case I want to copy them elsewhere.

    That Mini-PC has an N100 CPU which does the decoding in hardware, so it has no issues whatsoever doing it (CPU usage when playing 1080p is less than 10%), hence there really is no point for me in decoding the video files elsewhere and sending a ton of raw data over the network - it might even perform worse.
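
    Back-of-envelope arithmetic for why shipping decoded video over the network is a bad idea (the numbers are illustrative; real remote-streaming setups re-encode rather than send raw frames):

```python
# Rough size of uncompressed 1080p video vs a Gigabit link.
width, height = 1920, 1080
bytes_per_pixel = 3        # 8-bit RGB; YUV 4:2:0 would be 1.5
fps = 24

raw_bits_per_second = width * height * bytes_per_pixel * fps * 8
print(f"{raw_bits_per_second / 1e9:.2f} Gbit/s")  # 1.19 Gbit/s

# A typical 1080p H.264 file runs at roughly 8 Mbit/s, i.e. ~150x
# smaller, which is why decoding locally on the N100 beats sending
# raw frames over even a Gigabit link.
```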

    In fact, that Mini-PC is also my BitTorrent server with an always-active VPN, and it’s connected to the fibre router over Gigabit Ethernet, which on the other side connects via 1 Gbps fibre to my ISP, so the whole thing is pretty performant (and even with all that, CPU usage is still pretty low).

    Streaming can make sense if you have multiple TVs or devices and you want to share sessions across (i.e. watch a bit in one and then continue watching in a different one from where you stopped), but for a single device setup, not so much IMHO.


  • Same thing, but the waiter took my order directly, without going via their website/app/whatever-the-fuck-was-behind-the-qr-code, when I refused to use their online system, as I had already seen their menu outside and knew what I wanted.

    Funnily enough, about 5 minutes later two people sat at the table next to mine, and they also asked to order directly from the waiter, who commented to them: “Yeah, we actually have quite a number of people who don’t want to use the online system”.

    Mind you, I’m in a country where that shit isn’t at all common; it just comes across as them trying to skimp on service, and most places I’ve seen open up with digital ordering systems ended up closing down after a while.

    I bet that if they didn’t give people the option to order via the waiter, their business would’ve already gone under.



  • It’s more that the consequences of management fucking up have many times the impact of the fuck-ups of individual team members, because a single manager’s choices affect the work of multiple people.

    Further, managers who set the deadlines themselves, actively push the devs to lower their estimates of how long something will take or, even better, take a junior developer’s estimates at face value, are the ones responsible for those deadlines and thus to blame when they’re missed - if in some way or another you forced certain deadlines on the team or fully trusted information from the least knowledgeable, it’s on you, not on others.

    Some managers are actually pretty good in that they don’t do that kind of shit and even properly manage things like client/external dependencies, but in my experience they’re a minority.

    Granted, when the deadlines are missed for such managers it’s not usually their fault: sometimes it’s the devs’ fault and others the fault of somebody upstream in the process (such as the client, an external provider or a business analyst).

    That said, I can see where the stereotype about managers fucking up projects would come from, especially in certain countries where the management culture in Tech is one of bullshit, incompetence and even bullying, so they have way fewer competent managers and way more abuse than countries with better management culture in Tech.


  • The problem is that even if somebody out there is willing to put in the work of making device drivers, bootloaders and OS releases for the hardware, it still can’t happen, because the information about how to talk to the hardware isn’t open (same for the bootloader), and even when those things are reverse engineered, often the very ability to install an OS is locked down in the hardware.

    As you correctly say, the dream of any software just running anywhere isn’t possible because that’s not how hardware works, but the current situation is not one where what’s blocking it is just the natural architectural structure of hardware: it’s one where the hardware makers have purposefully, and to quite an extreme level, locked down their hardware, so that even if people are willing to do what it takes to run an OS there, they can’t, because the necessary info to use the hardware isn’t available and the hardware is even locked against OS installation by anyone without the necessary cryptographic keys.



  • In all fairness, I’ve had that issue only with the first game I pirated to play on Linux, which I actually own, but whose official version won’t run on Linux (under Steam, so that was using Proton), hence the pirated version (which, once a couple of missing DLLs were added, worked fine - so the pirated version is the superior product).

    My point does stand that if you’re used to things like Steam or Lutris running your games on Linux, with pirated repacks there are no helper scripts making sure no DLLs are missing, so unless it’s a recent game from a good repacker like Dodi, you’re probably going to have to check the logs for missing DLLs and add them via Winetricks.
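
    A sketch of what that log check can look like (the log line is Wine’s usual import_dll error, but the exact message format varies between Wine versions, and the DLL name here is just an example):

```python
import re

def find_missing_dlls(log_text: str) -> list[str]:
    """Pick out DLLs that Wine reported as 'not found' in its log."""
    pattern = re.compile(r"Library (\S+\.dll).*not found", re.IGNORECASE)
    return sorted({m.group(1) for m in pattern.finditer(log_text)})

# Example fragment of Wine output (illustrative):
sample_log = (
    'err:module:import_dll Library d3dx9_43.dll '
    '(which is needed by L"C:\\\\game\\\\game.exe") not found\n'
    "fixme:ver:GetCurrentPackageId stub\n"
)
print(find_missing_dlls(sample_log))  # ['d3dx9_43.dll']
```

    Anything listed can then usually be installed into the game’s Wine prefix with Winetricks (e.g. `winetricks d3dx9_43`), or via the Winetricks integration in Lutris.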

    Switching to Proton-GE as the runner in Lutris does often solve problems running a game on Linux (pirated or otherwise), just not always.