• Windows Latest discovered Discord and other Chromium and Electron-based applications with high RAM usage
  • RAM usage spikes from 1GB to 4GB on Discord both in and out of voice chat
  • xthexder@l.sw0.com · 19 points · 13 hours ago

    Windows Latest discovered Discord and other Chromium and Electron-based applications with high RAM usage

    Lol, this is news? Where have they been the last 15 years?

    In other news, the sky is blue.

  • UnderpantsWeevil@lemmy.world · 27 points · 15 hours ago

    I remember how the combination of Internet mass distribution of file data and the blossoming gray market for file-share applications really super-charged the technology of file compression.

    I wonder if we’ll see skyrocketing RAM prices put economic pressure on the system bloat rampant through modern OSes.

      • UnderpantsWeevil@lemmy.world · 10 points · 11 hours ago

        I mean, ymmv. The historical flood of cheap memory has changed developer practices. We used to code around keeping the bulk of our data on the hard drive and only use RAM for active calculations. We even used to lean on “virtual memory” on the disk, caching calculations and scrubbing them over and over again, in order to simulate more memory than we had on stick. SSDs changed that math considerably. We got a bunch of very high efficiency disk space at a significant mark up. But we used the same technology in our RAM. So there was a point at which one might have nearly as much RAM as ROM (had a friend with 1 GB of RAM on the same device that only had a 2 GB hard drive). The incentives were totally flipped.

        I would argue that the low-cost, high-efficiency RAM induced the system bloat, as applications could run very quickly even on a fraction of available system memory. Meanwhile, applications that were RAM hogs appeared to run very quickly compared to applications that needed to constantly read off the disk.

        Internet applications added to the incentive to bloat RAM, as you could cram an entire application onto a website and just let it live in memory until the user closed the browser. Cloud storage played the same trick. Developers were increasingly inclined to ignore the disk entirely. Why bother? Everything was hosted on a remote server, lots of the data was pre-processed on the business side, and then you were just serving the results to an HTML/Javascript GUI on the browser.

        Now it seems like tech companies are trying to get the entire computer interface to be a dumb terminal to the remote data center. Our migration to phones and pads and away from laptops and desktops illustrates as much. I wouldn’t be surprised if someone finally makes consumer facing dumb-terminals a thing again - something we haven’t really experienced since the dawn of personal computers in the 1980s.

        But TL;DR: I’d be more inclined to blame “bloat” on web browsers and low-cost memory post-’00s than on AI-written code.

        • nosuchanon@lemmy.world · 1 point · 2 hours ago

          Now it seems like tech companies are trying to get the entire computer interface to be a dumb terminal to the remote data center. Our migration to phones and pads and away from laptops and desktops illustrates as much.

          It is definitely coming, and fast. This was always Microsoft’s plan for an internet-only Windows/Office platform; OneDrive and 365 are basically that implementation now that we have widespread high-speed internet.

          And with the number of SaaS apps, the only thing you need on a local machine is some configuration files and maybe a downloads folder.

          Look at the new Nintendo Switch cartridges as an example. They don’t contain the game, just a license key. The install is all done over the internet.

  • kalpol@lemmy.ca · 19 points · 15 hours ago

    And here I am resurrecting Dell laptops from 2010 with 1.5 GB of DDR RAM and Debian.

    • mcv@lemmy.zip · 10 points · 14 hours ago

      I remember when they changed the backronym for Emacs from “Eight Megabytes And Constantly Swapping” to Eighty Megabytes. Or when a Netscape developer was proud to surpass that memory use.

      What’s the point of more RAM and faster processors if we just make applications that much less efficient?

      • The Quuuuuill@slrpnk.net · 7 points · 14 hours ago

        “unused ram is wasted ram”

        yeah yeah yeah, great. but all you motherfuckers did that and i’m fucking out of ram.

        • pftbest@sh.itjust.works · 1 point · 1 hour ago

          This phrase is just plain wrong, or at least backwards. “Unused” RAM is used by the kernel for the page cache. You always want some RAM free, because without room for a page cache the whole system crawls: the larger the page cache, the more files from the file system can be cached in memory.
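          To make that concrete: on Linux, /proc/meminfo separates truly free memory from reclaimable cache, which is why MemAvailable (not MemFree) is the number that matters. A small, Linux-specific sketch:

```python
# Parse /proc/meminfo (Linux-specific) to show that "free" RAM
# understates what is usable: cached pages can be reclaimed on demand.
def parse_meminfo(path="/proc/meminfo"):
    info = {}
    with open(path) as f:
        for line in f:
            key, _, rest = line.partition(":")
            info[key] = int(rest.split()[0])  # values are reported in kB
    return info

mem = parse_meminfo()
print("MemFree:     ", mem["MemFree"], "kB")       # truly idle pages
print("Cached:      ", mem["Cached"], "kB")        # page cache
print("MemAvailable:", mem["MemAvailable"], "kB")  # free + reclaimable
```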

        • Korhaka@sopuli.xyz · 3 points · 12 hours ago

          I want to run more than 1 process thanks. So fuck off with you trying to eat 3GB to render a bit of text.

  • BlueBockser@programming.dev · 23 points · 18 hours ago

    Yeah, the RAM shortage is definitely to blame on Electron. Won’t someone please think of the poor AI companies who have to give an arm and a leg to get a single stick of RAM!

    • floofloof@lemmy.ca · 1 point · 2 hours ago

      I wouldn’t mind so much if they were giving their own arms and legs, but they seem to be giving ours.

    • HugeNerd@lemmy.ca · 2 points · 7 hours ago

      If you have a better way of generating videos of absurdly obese Olympic divers doing the bomb from a crane, I’d love to hear it.

  • Kissaki@feddit.org · 33 points · edited · 23 hours ago

    I guess the prices give us a new kind of issue ticket template: “new RAM is too expensive for me, please consider optimizing”

    Less abstract, more concrete than “take less of a share please”

    • Kairos@lemmy.today · 2 points · 17 hours ago

      Electron should be a system dependency, so that every single app doesn’t have to be individually updated whenever there’s a Chromium CVE, which seems to be weekly.

  • Lightsong@lemmy.world · 6 points · 16 hours ago

    I have a couple of old 8 GB sticks from my old PC with a 960 GPU. Is there any way for me to stick them in my new PC and have only certain apps use them and nothing else?

    • towerful@programming.dev · 4 points · edited · 15 hours ago

      Only on multi-CPU mobos (and that would mean pinning a thread to a CPU/core with NUMA enabled, so a task accesses local RAM instead of all system RAM). Even then, I think all RAM would run at the lowest frequency.
      I’ve never mixed CPU and RAM speeds; I’ve only ever worked on systems with matching CPUs and RAM modules.

      I think the hardware cost and software complexity to achieve this is beyond the cost of “more RAM” or “faster storage (for faster swap)”.
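      For the thread-pinning half, Linux does expose CPU affinity from user space; steering allocations to a NUMA node’s local RAM needs numactl or libnuma on top. A minimal sketch of just the affinity part (Linux-only):

```python
# Pin the current process to a single CPU (Linux-specific API).
# This covers only the CPU half of NUMA pinning; binding allocations
# to that node's local RAM would need numactl/libnuma as well.
import os

allowed = os.sched_getaffinity(0)      # CPUs we may currently run on
target = min(allowed)                  # pick one of them
os.sched_setaffinity(0, {target})      # restrict the process to it
assert os.sched_getaffinity(0) == {target}
os.sched_setaffinity(0, allowed)       # restore the original mask
```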

    • Logical@lemmy.world · 3 points · 15 hours ago

      As to whether it’s possible to have certain apps use specific physical RAM sticks: I’m not sure, but it seems unlikely and would probably require some very low-level modifications to your operating system. Even before that, you’d have to physically connect the sticks to your new motherboard, which will only work if it has free RAM slots and takes the same generation of RAM as your old PC.

  • GnuLinuxDude@lemmy.ml · 54 points · 1 day ago

    The proliferation of Electron programs is what happens when you have a decade of annoying idiots saying “unused memory is wasted memory,” hand-in-hand with lazy developers or unscrupulous managers who externalize their development costs onto everybody else by writing inefficient programs that waste more and more of our compute and RAM, forcing the rest of us to buy even better hardware to keep up.

    • CeeBee_Eh@lemmy.world · 12 points · 22 hours ago

      annoying idiots saying “unused memory is wasted memory,”

      The original intent of this saying was different, but ya it’s been co-opted into something else

      • pftbest@sh.itjust.works · 1 point · 1 hour ago

        So what was the original intent? As I see it, the phrase is wrong no matter how you look at it, because all RAM is in use at all times: if you have 32 GB of “free” RAM, the kernel will use it as a page cache to speed up the file system. The more free RAM you have, the more files can be cached, avoiding disk access when you read them.
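        The real page cache is far more sophisticated, but a toy LRU cache shows the economics being described: give the cache more memory and fewer reads fall through to the disk. An illustrative sketch only, not how the kernel actually does it:

```python
from collections import OrderedDict

class ToyPageCache:
    """LRU cache of 'disk blocks' -- a crude stand-in for the page cache."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()
        self.disk_reads = 0

    def read(self, block):
        if block in self.pages:
            self.pages.move_to_end(block)   # hit: mark as recently used
            return
        self.disk_reads += 1                # miss: "go to the disk"
        self.pages[block] = True
        if len(self.pages) > self.capacity:
            self.pages.popitem(last=False)  # evict least recently used

workload = [1, 2, 3, 1, 2, 4, 1, 2, 3, 4] * 100  # 4 distinct blocks
for capacity in (2, 4):
    cache = ToyPageCache(capacity)
    for block in workload:
        cache.read(block)
    print(f"capacity={capacity}: {cache.disk_reads} disk reads")
```

        With capacity 4 every block stays resident after its first read (4 disk reads total); with capacity 2 every access in this pattern misses.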

  • Blackmist@feddit.uk · 40 points · 1 day ago

    I wouldn’t mind them all using HTML for UI if they’d learn to share the same one, and only load it when they need to show me something.

    No, Razer, your “mouse driver” does not need to load Chrome at all times, when I’ll only ever look at it once.

    • A_norny_mousse@feddit.org · 30 points · 1 day ago

      No, Razer, your “mouse driver” does not need to load Chrome at all times, when I’ll only ever look at it once.

      It’s funny; on Linux such devices work perfectly but many users complain that they “aren’t supported” because there’s no UI (that sits uselessly in your notification area and eats memory).