There have been times it’s been used against a whole carful of people, and cars are bigger than seven inches.


Even in 2000, I feel like they should have been able to compromise, e.g. by doing his sclera and covering up breathing holes on the computer, but still having the rest be makeup, prosthetics and a costume.


ECC genuinely is the only check against memory bitflips in a typical system. Obviously, there’s other stuff that gets used in safety-critical or radiation-hardened systems, but those aren’t typical. Most software is written assuming that memory errors never happen, and checksumming is only used when there’s a network transfer or, less commonly, when data’s at rest on a hard drive or SSD for a long time (but most people are still running a filesystem with no redundancy beyond journaling, which is really meant for things like unexpected power loss).
There are things that mitigate the impact of memory errors on devices that can’t detect and correct them, but they’re not redundancies. They don’t keep everything working when a failure happens, instead just isolating a problem to a single process so you don’t lose unsaved work in other applications etc… The main things they’re designed to protect against are software bugs and malicious actors, not memory errors, it just happens to be the case that they work on other things, too.
Also, it looks like some of the confusion is because of a typo in my original comment where I said unrecoverable instead of recoverable. The figures that are around 10% per year are in the CE column, which is the correctable errors, i.e. a single bit that ECC puts right. The figures for unrecoverable/uncorrectable errors are in the UE column, and they’re around 1%. It’s therefore the 10% figure that’s relevant to consumer devices without ECC, with no need to extrapolate how many single bit flips would need to happen to cause 10% of machines to experience double bit flips.
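For intuition on what “corrects one flip, detects two” means, here’s a minimal sketch of a SECDED (single-error-correct, double-error-detect) code, using a toy extended Hamming(8,4) codeword. Real ECC DIMMs apply the same idea to 64-bit words with wider codes, so this is illustrative, not the actual memory layout:

```python
def encode(data):
    """Extended Hamming(8,4): 4 data bits -> 8-bit SECDED codeword."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4                       # covers positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4                       # covers positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4                       # covers positions 5, 6, 7
    code = [p1, p2, d1, p3, d2, d3, d4]     # codeword positions 1..7
    p0 = sum(code) % 2                      # overall parity, position 0
    return [p0] + code

def decode(word):
    """Return (data, status); status in {'ok', 'corrected', 'uncorrectable'}."""
    c = word[1:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]          # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]          # positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]          # positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3         # position of a single flipped bit
    overall = sum(word) % 2                 # should be even
    if syndrome and not overall:
        return None, 'uncorrectable'        # two flips: detected, not fixable
    fixed = list(word)
    status = 'ok'
    if syndrome:                            # single flip in positions 1..7
        fixed[syndrome] ^= 1
        status = 'corrected'
    elif overall:                           # single flip in the parity bit itself
        fixed[0] ^= 1
        status = 'corrected'
    return [fixed[3], fixed[5], fixed[6], fixed[7]], status

word = encode([1, 0, 1, 1])
flipped = list(word)
flipped[5] ^= 1                             # simulate one cosmic-ray bit flip
print(decode(flipped))                      # the original data comes back intact
```

With two flips, decode reports ‘uncorrectable’ rather than silently returning bad data, which is what lets an ECC machine halt instead of computing on garbage.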


It wasn’t originally my claim - I replied to your comment as I was scrolling past because it had a pair of sentences that seemed dodgy, so I clicked the link it cited as a source, and replied when the link didn’t support the claim.
Specifically, I’m referring to
A single bit flipped by a gamma ray will not cause any sort of issue in any modern computer. I cannot overstate how often this and other memory errors happen.
This just isn’t correct:


That study doesn’t seem to support the point you’re trying to use it to support. First, it’s talking about machines with error-correcting RAM, which most consumer devices don’t have. The whole point of error-correcting RAM is that it tolerates a single bit flip in a memory cell and can detect a second one and, e.g., trigger a shutdown rather than the computer just doing what the now-incorrect value tells it to (which might be crashing, might be emitting an incorrect result, or might be something benign). Consumer devices don’t have this protection (until DDR5, which can fix a single bit flip but won’t detect a second, so it can still trigger misbehaviour). Also, the data in the tables gives figures around 10% for the chance of an individual device experiencing an unrecoverable error per year, which isn’t really that often, especially given that most software is buggy enough that you’d be lucky to use it for a year with only a 10% chance of it doing something wrong.
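For scale, here’s what a flat annual rate compounds to over a device’s lifetime. This treats the 10% per device-year figure from the discussion as independent between years, which is a simplification for illustration only:

```python
# Chance of at least one error over n years, given a flat annual rate p.
# Assumes independence between years - a simplification for illustration.
def p_at_least_one(p_annual, years):
    return 1 - (1 - p_annual) ** years

# With a 10% per device-year rate, roughly 41% of devices would see at
# least one such error over a typical 5-year lifetime.
print(round(p_at_least_one(0.10, 5), 2))
```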


The press widely covered AV as if it was incredibly expensive and didn’t solve any problems, so presented it as if we’d be throwing away beds at children’s hospitals, support for pensioners and equipment for soldiers just to introduce pointless bureaucracy. If the choice was the one most voters thought they were making, then voting against it would have been the sensible option.
Some of the charity is self-serving, e.g. eradicating diseases means he’s less likely to catch them (and really any billionaire not funnelling funds to pandemic prevention etc. is being moronic), and founding charter schools on land he owns, so that over the life of a school it pays more in rent than it cost to build, is just a tax dodge. Most billionaires are just so evil that they won’t spend money on themselves if other people who aren’t paying also benefit, so in comparison, Gates’ better ability to judge what’s in his interests makes him look good.


The main point of 32-bit Windows 10 wasn’t to make it run on non-64-bit hardware; it’s that x86 processors can’t enter the virtual 8086 mode that 16-bit support relies on once they’ve booted into 64-bit long mode, so if you’ve got an old 16-bit Windows, DOS or CP/M app that you’ve absolutely got to run natively instead of through DOSBox, and have to use modern Windows instead of an older version, it needs to be 32-bit. By the time Windows 11 was released, Microsoft had decided that almost nobody still wanted to do that.
You should be leaving enough stopping distance between yourself and the next car that someone can merge easily and you have time to react by slowing down or moving to the next lane to make space for them. If you don’t have that much stopping distance, then you’re already in danger if the car in front brakes suddenly, e.g. if they need to do an emergency stop because of something you’ve not seen, they have a medical event making them lose consciousness and accidentally step on the brake pedal, or their car breaks down in a way that forces the brakes on.
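To get a rough sense of how much gap that implies, here’s the textbook stopping-distance estimate (distance covered while reacting plus braking distance). The reaction time and deceleration values are illustrative assumptions for an alert driver on a dry road, not official figures:

```python
def stopping_distance(speed_kmh, reaction_s=1.5, decel_ms2=6.5):
    """Total stopping distance in metres: distance covered while reacting,
    plus braking distance v^2 / (2a). Parameter values are illustrative."""
    v = speed_kmh / 3.6                      # km/h -> m/s
    return v * reaction_s + v ** 2 / (2 * decel_ms2)

# Around 100 m at motorway speeds - far more than a few car lengths.
print(round(stopping_distance(100), 1))
```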
In a lot of the world they’re regulated as novelty items, so free from the regulation that stops harmful chemicals being in things like kitchen utensils and children’s toys, despite many of the same potential risks being present. You don’t need to use a corner-cutting, regulation-ignoring retailer like Wish to get your fix of toxic plasticisers etc…


As it says in the article, it’ll be smaller and quieter, so less offensive for most people’s living rooms than a full-size desktop. It’s not meant to replace your existing PC if you have one, unless it was getting old and you were about to replace it anyway. If you don’t have a PC, or don’t have one in the living room, then it might be a better option than anyone else’s prebuilt.

The Scott Trust Limited, which is effectively still The Guardian and was created to guarantee its financial and editorial independence. There’s a reason why Snowden went to them specifically to publish his leaks etc.


They focus on the low end, but a lot of people who only want low-end GPUs care about Minecraft, and their OpenGL performance is abysmal, so Minecraft runs like molasses. If you play a lot of Minecraft, Nvidia’s cards manage to be much better value. There’s no reason why this couldn’t be fixed if they’re willing to invest in writing native OpenGL drivers instead of going through a compatibility layer, or mitigated if they invested more in making the compatibility layer faster, but that would mean deciding they’re losing a lot of sales this way, as it wouldn’t be cheap.
It’ll be maintained for a while, so we might get to 3.14.15 or 3.14.16, which will be a better approximation and better because of more bugfixes.
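The approximation claim can be checked numerically. This playful sketch reads the hypothetical future version strings as decimal approximations of pi:

```python
import math

# Read hypothetical future version strings as decimal approximations of pi.
approximations = {"3.14.15": 3.1415, "3.14.16": 3.1416}
for version, value in approximations.items():
    print(version, abs(value - math.pi))

# pi = 3.14159265..., so the fifth decimal rounds up: 3.1416 is about an
# order of magnitude closer than 3.1415, and both beat plain 3.14.
```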
It tends to attract negative attention if you admit there’s a civil war going on.


Usually when people post a source, the numbers say that at median screen sizes and distances from the screen, 4K isn’t perceptibly better than 1440p, and the person writing it up as an article has misunderstood the conclusion as saying 4K isn’t better than 1080p rather than that it isn’t better than 1440p. TVs tend not to be made with 1440p resolution, so upgrading from 1080p gets you right to 4K, skipping the sweet spot.
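The perceptibility argument can be sketched with a pixels-per-degree calculation. The screen size, viewing distance and the ~60 ppd rule of thumb for 20/20 acuity below are illustrative assumptions, and the crossover points move quickly as you change the distance:

```python
import math

def pixels_per_degree(h_pixels, diag_inches, distance_m, aspect=16 / 9):
    """Horizontal pixels per degree of visual field for a flat screen."""
    # screen width from the diagonal and aspect ratio
    width_m = diag_inches * 0.0254 * aspect / math.sqrt(aspect ** 2 + 1)
    fov_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))
    return h_pixels / fov_deg

# Example: a 65-inch 16:9 TV viewed from 2.5 m; ~60 ppd is a common
# rule of thumb for the limit of 20/20 vision.
for name, px in [("1080p", 1920), ("1440p", 2560), ("4K", 3840)]:
    print(name, round(pixels_per_degree(px, 65, 2.5)))
```

At these particular numbers even 1080p sits near the ~60 ppd threshold, so the extra pixels of 1440p and 4K buy little; sit much closer and the ranking changes.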


It’s still a luxury yacht decked out with nearly all the things you’d expect from a half a billion dollar superyacht. Only part of it is customised for research. If the main goal was to turn half a billion dollars into a research boat, this isn’t the boat that would get made.
It’s its default use case - adding MBA idiocy to things that were already fine and didn’t need changing.


Investors have been happy to incentivise companies to hire idiot CEOs and managers who say the right buzzwords but reduce output by making bad decisions and only hiring people who don’t think they’re bad decisions, so an automated buzzword-dispensing idiot isn’t necessarily going to seem to investors like a downgrade compared to what they think most workers are. They’re just as likely to think AI lets them invest in companies where even the lowest tier employees are potential CEO material, and continue not noticing that the per-employee efficiency keeps going down. Data showing that layoffs nearly never pay for themselves doesn’t stop stock prices soaring whenever one happens, so I wouldn’t expect data showing AI makes companies less profitable to stop stock prices going up when a company announces a new dumb way they’ll use it.
That’s what’s keeping the lights on. If they sank the extra billions into making their discrete cards genuinely superior to Nvidia’s (which already means taking it for granted that selling comparable products for less money makes them knockoffs rather than superior), then Nvidia could stop them recouping the development costs by eating into their own margins to drop their prices. Over the last decade or two, ATi/AMD’s big gambles have mostly not paid off, whereas Nvidia’s have, so AMD can’t afford to take big risks, and the semi-custom part of the business is huge long-term orders that mean guaranteed profit.