• 0 Posts
  • 353 Comments
Joined 3 years ago
Cake day: June 16th, 2023





  • Yeah, the croplands came up in a discussion here…

    A fully functioning farm was shutting down because a datacenter operator bought the land. It was more profitable to sell the land than to keep it viable for food production…

    Now the chances of that land ever being appropriate for farming again…



  • In hopes of making you feel better, the cache amount consumed hardly matters. It’s evictable. So if you read a gigabyte in once that you’ll never ever need again, it’ll probably just float in cache because, well, why not? It’s not like an application needs it right now.

    If you really want to feel better about your reported memory usage, sync; echo 3 > /proc/sys/vm/drop_caches. You’ll slow things down a bit as it rereads the stuff it actually needs to reuse, but particularly if your system has a lot of I/O at bootup that never happens again, a single pass can make the accounting look better.

    You could at least do it once to see how much cache can be dropped, so you can feel good about how much memory would actually be available if an application really needs it.

    Though the memory usage of VMs gets tricky, especially with double-caching: inside the VM the cache is evictable, but the host has no idea it is evictable, so host memory pressure won’t reclaim cache held in a guest or peer VM.
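    A quick way to see the distinction, assuming a Linux host (field names from /proc/meminfo):

    ```shell
    # MemFree counts only truly idle pages; MemAvailable adds what the
    # kernel estimates it could evict (page cache, reclaimable slab).
    # MemAvailable is the number that matters when an app asks for memory.
    grep -E '^(MemFree|MemAvailable|Cached):' /proc/meminfo

    # The one-shot cache drop mentioned above needs root; sync first so
    # dirty pages get written back before clean pages are discarded.
    # sync; echo 3 | sudo tee /proc/sys/vm/drop_caches
    ```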




  • Yeah, very good analogy actually…

    I remember back in the day people putting stuff like ‘Microsoft Word’ under ‘skills’. Instead of thinking ‘oh good, they will be able to use Word competently’, the impression was ‘my god, they think Word is a skill worth bragging about, I’m inclined to believe they have no useful skills’.

    ‘Excel skills’ on a resume is just so vague: some people put it down when they’ve just figured out they can click and put things into a table, while others can quickly roll some complicated formula, which is at least more of a skill (I’d rather program the normal way than try to wrangle some of the abominations I’ve seen in Excel sheets).

    Using an LLM is not a skill with a significant acquisition cost. To the extent that it does or does not work, it doesn’t really need learning. If anything, people who overthink the ‘skill’ of writing a prompt just end up with stupid superstitions that don’t work, and when they find out a prompt doesn’t work, they just grow new superstitions on top to ‘fix’ the problem.


  • Unless Nvidia gets in on remanufacturing, those GPUs are never going to be repurposed for residential usefulness. B300 was designed from the outset exclusively for datacenter AI use, with no concession like a theoretical video out, integrated onto a board that demands over 15 kW overall. You couldn’t even power it with a 60A 220V circuit.

    Some of the storage could get more consumer support: SAS is unusual, but if there were a glut, various solutions could emerge. Similarly, EDSFF cages aren’t really a hot consumer item, especially not E1.L, but I could imagine a glut driving home-friendly adaptations.

    DRAM modules are somewhere in between, though practically speaking they won’t be workable outside of their initial application.

    There was a time when home and datacenter got closer together, but there’s been quite the divergence the last few years.
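    As a back-of-the-envelope check on that 15 kW figure against a 60A 220V circuit:

    ```shell
    # 60 A at 220 V is 13.2 kW nameplate; with the common 80% rule for
    # continuous loads, only about 10.6 kW is usable -- well short of 15 kW.
    awk 'BEGIN { printf "%.2f kW nameplate, %.2f kW continuous\n",
                 220 * 60 / 1000, 220 * 60 * 0.8 / 1000 }'
    ```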





  • You don’t have to target every distribution: target a vaguely credible glibc, and of course the kernel, and you are covered.

    As a distribution platform themselves, they don’t have to sweat packaging N different ways; they package the way they want and bundle all the libraries (no different from how they do it on Windows, where they bundle so many libraries).

    They don’t get the advantage of the platform libraries and packaging, but that is how they treat Windows already, because the library situation on Windows is actually really messy, despite being ostensibly a more monolithic ecosystem.
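    One sketch of how that glibc baseline gets checked in practice, assuming binutils is installed and using /bin/sh as a stand-in for a shipped binary:

    ```shell
    # List the glibc symbol versions the binary actually links against;
    # the highest one is the oldest glibc the binary can run on.
    objdump -T /bin/sh | grep -o 'GLIBC_[0-9.]*' | sort -Vu | tail -n 1
    ```

    Vendors typically build against an old glibc on purpose, since symbols are versioned forward-compatibly: a binary wanting GLIBC_2.17 runs on anything newer.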