• Jarix@lemmy.world · 17 hours ago

      How so?

      Didn’t it just highlight how stable the software is?

      I assume bit flips crash most software. If your software is so stable that hardware errors, which affect everyone equally (which may be my erroneous assumption, I'll admit), account for a meaningful share of its crashes, then it's saying that if Firefox is crashing on you, it might be time to run some diagnostics on your hardware.

      A browser as a litmus test.

      • xxce2AAb@feddit.dk · 17 hours ago

        Fair question. I find it unnerving because there's very little a software developer can meaningfully do if they cannot rely on the integrity of the hardware their software runs on, at least not without significant costs, and if the problem is bad enough even those measures would fail. This finding seems to indicate that a lot of hardware is much, much less reliable than I would have thought.

        I've written software for almost thirty years, across numerous platforms at this point, and the thought that I cannot assume a value stored in RAM will reliably retain its value fills me with the kind of dread I wouldn't be able to explain to someone uninitiated without a major digression. Almost everything you do on any computing device, whether a server or a smartphone, relies on the assumption of that kind of trust. And this seems to show that assumption is not merely flawed, but badly flawed.

        Suppose you were a car mechanic confronted with a survey finding that 10 percent of cars were leaking brake fluid, or fuel. That might illustrate how this makes me feel.
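        To make the dread concrete, here's a toy sketch (my own illustration, nothing to do with Mozilla's code) of how a single flipped bit silently turns one stored value into a completely different one:

```python
# Toy illustration: one inverted bit in a stored integer.
def flip_bit(value: int, bit: int) -> int:
    """Return `value` with a single bit inverted, as a stray cosmic ray might."""
    return value ^ (1 << bit)

stored = 1000                      # what the program wrote to RAM
corrupted = flip_bit(stored, 30)   # bit 30 flips in memory
print(stored, "->", corrupted)     # 1000 -> 1073742824
```

        The program has no way to notice this on its own; to the code, the corrupted value is just another integer.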

        • Jarix@lemmy.world · 16 hours ago

          Hmm, thanks. Also, please massively digress if you'd like to.

          I interpreted it like this: 10% is a lot if it's 10% of a million. That's 100,000. So if there are a million things that crash Firefox, that's a high number.

          But if Firefox only crashes 10 times a year because it runs that well, then 10% is just the 1 crash that came from a bit flip. It's impressive that the rare bit flip makes up such a high percentage of total crashes simply because Firefox doesn't crash very often.
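          The arithmetic above can be sketched in a couple of lines (toy numbers, not Mozilla's actual crash counts):

```python
# Same 10% share, wildly different absolute counts (illustrative numbers only).
share = 0.10  # fraction of crashes attributed to bit flips
for total_crashes in (10, 1_000_000):
    from_bitflips = share * total_crashes
    print(f"{total_crashes} crashes -> {from_bitflips:.0f} from bit flips")
```

          The percentage alone says nothing about whether bit flips are common; it only says how they compare to every other way Firefox can crash.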

          If your dread turns out to be justified, that won't be too surprising to me, given that hardware seems to be getting made less reliable these days, enshittification being the norm and tech being in everything nowadays.

          We obviously need more context from Mozilla, but this could be a canary-in-the-coal-mine type of situation.

          But it would be kind of neat if Firefox unintentionally became something of a reliable test for bit flips.

          • xxce2AAb@feddit.dk · 57 minutes ago

            I agree, and there are a number of other biases to consider. Here are some I can think of:

            • Firefox will mainly be running on desktops, laptops and smartphones. I would expect QA to be significantly better for this type of device than for, say, consumer-grade routers or TV boxes. But more concerning to me is stuff like cheap ATMs, industrial control systems (although Siemens have great QA) and elevator control systems, etc. Infrastructure, not consumer toys, and Mozilla obviously aren't the right people to say anything about the state of any of that.
            • While Mozilla is currently estimating approximately 200 million installs, some of those - especially on Linux - will have disabled telemetry. I know I do. With that said, I can't recall the last time I had a FF CTD (crash to desktop), but I suspect when I did, it wasn't even a bug but an OOM (out-of-memory) kill, because I was browsing on something like a 2 GB RAM micro-portable with insufficient swap. FF is one impressively stable piece of software these days.
            • Firefox usage is not evenly globally distributed, and I have no way to reliably assess whether FF has a larger or smaller proportional usage in regions that may rely more on older or refurbished hardware, which I would expect to have higher HW error rates (although I cannot prove that either - I can't find any good public aggregate data on RAM MTBF trends over time, but I'd be very interested if somebody else knows where to find authoritative answers on that).
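            For a rough sense of scale, here's a back-of-envelope sketch. The soft-error rate is an assumed, purely illustrative figure - real DRAM rates vary by orders of magnitude, and as noted above I don't have authoritative numbers either:

```python
# Back-of-envelope: expected bit flips per machine-year, given an ASSUMED rate.
FIT_PER_MBIT = 100            # assumption: failures per billion device-hours per Mbit
ram_mbits = 8 * 1024 * 8      # 8 GB of RAM expressed in megabits
hours_per_year = 24 * 365

expected_flips = FIT_PER_MBIT * ram_mbits * hours_per_year / 1e9
print(f"~{expected_flips:.1f} expected bit flips per machine-year")  # ~57.4 with these assumptions
```

            The point isn't the specific number; it's that at plausible rates, flips per machine per year land in the "happens regularly" range rather than "practically never".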

            (Un)fortunately, this may be the most insight Mozilla can provide. Their users tend to be particularly sensitive to perceived or practical privacy violations, so I understand - and appreciate - their caution in gathering data.