  • Phoenixz@lemmy.ca · 34 points · 2 days ago

    So the editor asked AI to come up with an image for the title “Gamers desert Intel in droves” and so we get a half-baked pic of a CPU in the desert.

    Am I close?

  • imetators@lemmy.dbzer0.com · 8 points · 2 days ago

    Intel’s last couple of processor generations were a failure. AMD, on the other hand, has been consistent. Look at all these tiny AMD APUs that can run C2077 on a 35W computer that fits in the palm of your hand. Valve is about to drop a nuclear bomb on nvidia, intel and microslop with the Gabecube.

  • Lfrith@lemmy.ca · 11 points · 2 days ago

    So happy I chose to go with an AM4 board years ago. I was able to go from a Zen+ CPU to an X3D CPU.

    I remember people saying back then that people usually don’t upgrade their CPU, so it’s not much of a selling point. But people didn’t upgrade because they couldn’t, due to constant socket changes on the Intel side.

    My fps numbers were very happy after the CPU upgrade, and I didn’t have to get a new board and a new set of RAM.

  • 🔰Hurling⚜️Durling🔱@lemmy.world · 6 points · 2 days ago

    One thing that may have something to do with people leaving Intel is their relationship with Israel. Not trying to make this political, but it’s something I’ve seen some folks mention before.

    • Echo Dot@feddit.uk · 6 points · 2 days ago

      I’ve always thought it’s a super weird place for them to have a fab in general. It’s never been the most politically stable part of the world, and surely you don’t want your several-billion-dollar infrastructure getting blown up, so why put it somewhere that’s more likely?

      • 🔰Hurling⚜️Durling🔱@lemmy.world · 1 point · 1 day ago

        Could be. Like I said, not making it political. My current setup is running a 12th gen i9 and runs fine. I’m just repeating what I’ve seen others say across Lemmy, Reddit, etc.

  • bufalo1973@piefed.social · 2 points · 1 day ago

    One can only dream about people fleeing x86-64 and going ARM or, even better, RISC-V.

    But no, it’s just changing the collar on the dog. The dog stays the same.

    • Agent_Karyo@piefed.world (OP) · 1 point · edited · 1 day ago

      Why though? X Elite lags x86 on battery life, performance and compatibility (and you can’t really run Linux on X Elite).

      I am not a fan of Intel, AMD, or Nvidia, but what’s the point of moving to ARM for the sake of moving?

      Unlike most, I actually have been running ARM on a home server for almost a decade. For that use case it makes sense because it’s cheap and well supported.

      • bufalo1973@piefed.social · 1 point · 1 day ago

        It would be better to switch to RISC-V because it has no patent problems and anyone can build a RISC-V CPU, not just two companies.

        • Agent_Karyo@piefed.world (OP) · 1 point · 1 day ago

          I would be happy to, but it’s currently not an option for desktop/laptop.

          Would be great for an SBC where the OS and apps are open source and performance is less of an issue.

          ARM has all the same drawbacks as x86, and it’s not a deus ex machina that gives high performance at low power consumption by magic.

          • bufalo1973@piefed.social · 1 point · 1 day ago

            Imagine Europe pushing RISC-V and sharing upgrades with China¹. Flagship RISC-V performance would catch up to ARM or even x86-64 within a few years.

            ¹ China is already using RISC-V as much as it can.

            • Agent_Karyo@piefed.world (OP) · 2 points · 1 day ago

              I would support that, but it would require European unity and a strategic decision to make a permanent break with the US.

  • RamRabbit@lemmy.world · 252 points · 3 days ago

    Yep. Intel sat on their asses for a decade pushing quad cores you had to pay extra to even overclock.

    Then AMD implements chiplets, comes out with affordable 6, 8, 12, and 16 core desktop processors with unlocked multipliers, hyperthreading built into almost every model, and strong performance. All of this while also not sucking down power like Intel’s chips still do.

    Intel cashed in their lead by not investing in themselves, instead pushing the same tired crap year after year onto consumers.

      • nokama@lemmy.world · 46 points · 3 days ago

        And all of the failures that plagued the 13th and 14th gens. That was the main reason I switched to AMD. My 13th gen CPU was borked and had to be kept underclocked.

        • bufalo1973@piefed.social · 1 point · 1 day ago

          In the 486 era (the 90s) there was an unofficial story about the way Intel binned its CPUs: instead of starting slow and speeding up until failure, they started as fast as possible and slowed down until the chip stopped failing.
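
          (That story describes a simple binning loop. A toy sketch in C, purely illustrative: passes_stress_test is a hypothetical stand-in for whatever burn-in test the fab actually ran.)

          ```c
          #include <stdbool.h>
          #include <stdio.h>

          /* Hypothetical stand-in for the factory burn-in test:
             pretend this particular part is only stable up to 66 MHz. */
          static bool passes_stress_test(int mhz) { return mhz <= 66; }

          /* "Start as fast as you can and slow down until it doesn't fail":
             walk down from the top speed and rate the part at the first
             frequency that survives the test. */
          static int bin_top_down(int max_mhz, int step_mhz, int min_mhz) {
              for (int mhz = max_mhz; mhz >= min_mhz; mhz -= step_mhz) {
                  if (passes_stress_test(mhz))
                      return mhz;
              }
              return 0; /* unstable even at the floor: reject the part */
          }

          int main(void) {
              printf("rated at %d MHz\n", bin_top_down(100, 1, 25)); /* prints 66 */
              return 0;
          }
          ```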

          • nokama@lemmy.world · 3 points · edited · 1 day ago

            It would cause system instability (programs/games crashing) when running normally. I had to underclock it through Intel’s XTU to make things stable again.

            This was after all the BIOS updates from ASUS and with all BIOS settings set to the safe options.

            When I originally got it I did notice that it was getting insanely high scores in benchmarks; then the story broke of how Intel and motherboard manufacturers were letting the CPUs clock as high as possible until they hit the thermal limit. Mine started to fail, I think, about a year after I got it.
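
            (The cap above was set with Intel’s XTU on Windows; on Linux, the same kind of ceiling can be applied through the kernel’s standard cpufreq sysfs files, no vendor tool needed. A minimal sketch, assuming that sysfs layout; values are in kHz and root is required:)

            ```c
            #include <stdio.h>

            /* Cap every core's maximum frequency via the standard Linux cpufreq
               sysfs files (the rough equivalent of underclocking in XTU).
               scaling_max_freq takes a value in kHz. */
            int main(void) {
                const long cap_khz = 4800000; /* example ceiling: 4.8 GHz */
                for (int cpu = 0; ; cpu++) {
                    char path[96];
                    snprintf(path, sizeof path,
                             "/sys/devices/system/cpu/cpu%d/cpufreq/scaling_max_freq", cpu);
                    FILE *f = fopen(path, "w");
                    if (!f)
                        break; /* cpuN doesn't exist: every core is done */
                    fprintf(f, "%ld\n", cap_khz);
                    fclose(f);
                }
                return 0;
            }
            ```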

      • billwashere@lemmy.world · 3 points · 2 days ago

        Or the 1200 different versions of CPUs. We just got some new Dell machines for our DR site last year, and the number of CPU options was overwhelming. Is it really necessary to have that many different CPUs?

        • real_squids@sopuli.xyz · 5 points · edited · 2 days ago

          Tbf, AMD is also guilty of that, specifically in the laptop/mobile segment. And the whole AI naming thing is just dumb, although there aren’t that many of those.

      • kieron115@startrek.website · 13 points · edited · 3 days ago

        I just read the other day that at least one motherboard manufacturer is bringing back AM4 boards, since DDR4 is getting cheaper than DDR5 even with the “this isn’t even manufactured anymore” price markup. That’s only possible because of how much long-term support AMD gave that socket.

        • Captain Aggravated@sh.itjust.works · 37 points · 3 days ago

          I think AMD also did a smart thing by branding their sockets. AM4, AM5, what do you think is going to be next? I bet it’s AM6. What came after the Intel LGA1151? It wasn’t LGA1152.

          • Junkers_Klunker@feddit.dk · 16 points · 3 days ago

            Yea, for the customer it really doesn’t matter how many pins a certain socket has, only whether it’s compatible or not.

          • 1Fuji2Taka3Nasubi@piefed.zip · 6 points · 3 days ago

            AMD tried the Intel thing too, though, by dropping support for past-generation CPUs on later AM4 boards. Only after public outcry did they scrap that. Wouldn’t put it past them to try it again on AM5.

            • Captain Aggravated@sh.itjust.works · 4 points · 3 days ago

              Are there a lot of people wanting to plug Zen 1 chips into B550 motherboards? Usually it’s the other way around, upgrading chip in an old motherboard.

              • 1Fuji2Taka3Nasubi@piefed.zip · 1 point · 2 days ago

                It can happen if the old motherboard fails, which was more likely than the CPU failing.

                There was also talk of not providing firmware updates for old chipsets to support new-gen CPUs, which is relevant to the cases you mentioned.

      • UnspecificGravity@piefed.social · 17 points · 3 days ago

        As a person that generally buys either mid-tier stuff or flagship products from a couple of years ago, I found it got pretty fucking ridiculous to have to figure out which socket made sense for any given intel chip. The apparently arbitrary naming convention didn’t help.

        • real_squids@sopuli.xyz · 7 points · 3 days ago

          It wasn’t arbitrary; they named them after the number of pins. Which is fine, but kinda confusing for your average consumer.

          • UnspecificGravity@piefed.social · 16 points · 3 days ago

            Which is a pretty arbitrary naming convention, since the number of pins in a socket doesn’t really tell you anything, especially when that naming convention does NOT get applied to the processors that plug into them.

    • Valmond@lemmy.dbzer0.com · 53 points · 3 days ago

      They really segmented that market in the worst possible way: 2 cores or 4 cores only, whether you could use VMs or overclock, and so on. Add windoze eating up every +5% per year.

      I remember buying the 2600 (maybe the X) and it was soo fast.

      • halcyoncmdr@lemmy.world · 25 points · 3 days ago

        The 2600k was exceptionally good and was relevant well past the normal upgrade timeframes.

        Really it only got left behind because of its 4C/8T limit as everything started supporting lots of threads instead of just a couple, and just being a 2nd Generation i7.

        • Trainguyrom@reddthat.com · 4 points · 3 days ago

          Really it only got left behind because of its 4C/8T limit as everything started supporting lots of threads instead of just a couple, and just being a 2nd Generation i7.

          Past me made the accidentally more financially prudent move of opting for the i7-4790k over the i5-4690k, which ultimately lasted me nearly a decade. At the time the advice was of course “4 cores is all you need, don’t waste the money on an i7,” but those 4 extra threads made all the difference in the longevity of that PC.

    • wccrawford@discuss.online · 37 points · 3 days ago

      All of the exploits against Intel processors didn’t help either. Not only were they a bad look, but the fixes reduced the speed of those processors, making them a much worse deal for the money after all.
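
      (Side note: on Linux the kernel reports exactly which of those fixes are active on a given chip under /sys/devices/system/cpu/vulnerabilities. A small reader, assuming that standard sysfs layout:)

      ```c
      #include <dirent.h>
      #include <stdio.h>

      /* Print the kernel's mitigation status for each known CPU vulnerability
         (spectre_v1, spectre_v2, meltdown, ...) from the standard sysfs dir. */
      int main(void) {
          const char *dir = "/sys/devices/system/cpu/vulnerabilities";
          DIR *d = opendir(dir);
          if (!d) { perror(dir); return 1; }
          struct dirent *e;
          while ((e = readdir(d)) != NULL) {
              if (e->d_name[0] == '.')
                  continue; /* skip . and .. */
              char path[512], line[256];
              snprintf(path, sizeof path, "%s/%s", dir, e->d_name);
              FILE *f = fopen(path, "r");
              if (f) {
                  if (fgets(line, sizeof line, f))
                      printf("%-28s %s", e->d_name, line); /* e.g. "meltdown  Mitigation: PTI" */
                  fclose(f);
              }
          }
          closedir(d);
          return 0;
      }
      ```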

      • MotoAsh@piefed.social · 19 points · 3 days ago

        Meltdown and Spectre? Those applied to AMD CPUs as well, just to a lesser degree (or rather, AMD had its own flavor of similar vulnerabilities). I think they even recently found a similar one for ARM chips…
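
        (The reason these cut across vendors is that the Spectre v1 gadget is just an ordinary bounds check the CPU races past speculatively; the canonical shape from the original paper is roughly this:)

        ```c
        #include <stddef.h>
        #include <stdint.h>

        uint8_t array1[16];
        size_t  array1_size = 16;
        uint8_t array2[256 * 4096]; /* probe buffer: one cache line per byte value */

        /* Spectre v1 (bounds-check bypass) gadget shape. After training the
           branch predictor with in-bounds x, a malicious out-of-bounds x is
           still read speculatively, and the dependent array2 load leaves a
           cache footprint the attacker can recover with a timing probe. */
        void victim(size_t x) {
            if (x < array1_size) {          /* the check speculation bypasses */
                volatile uint8_t tmp = array2[array1[x] * 4096];
                (void)tmp;
            }
        }
        ```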

          • MotoAsh@piefed.social · 6 points · 3 days ago

            Yea, that definitely sounds like Intel… Though it’s still worth pointing out that one of them was a novel way to spy on program memory that affects many CPU types, and isn’t really indicative of a dropped ball (outside of shipping with known vulnerabilities, anyway).

            … The power stuff from the 12th/13th gens or whatever, though… ouch, massive dropped ball.

    • brucethemoose@lemmy.world · 7 points · 3 days ago

      Even the 6-core Phenom IIs from 2010 were great value.

      But to be fair, Sandy Bridge ended up aging a lot better than those Phenom IIs or Bulldozer/Piledriver.

        • PieMePlenty@lemmy.world · 3 points · 2 days ago

          Honestly, not a big deal if you build PCs to last 6-7 years, since you’ll be targeting a new RAM generation every time.

          • da_cow (she/her)@feddit.org · 5 points · 2 days ago

            If your CPU becomes the limiting factor at some point, you can simply upgrade to a CPU a few generations newer without having to swap out your motherboard. You can’t really do that with Intel (AFAIK they switch platforms every 2 CPU generations), so depending on your CPU you may not be able to upgrade at all (that can happen with AMD too, but not nearly as often).

      • boonhet@sopuli.xyz · 1 point · 2 days ago

        I mean the i7s had SMT. You had to pay extra for SMT, whereas AMD started giving it to you on every SKU except a few low-end ones.

        • jnod4@lemmy.ca · 1 point · 2 days ago

          Is it true that all of them had SMT, but Intel just locked it away on lower-tier processors, and some people managed to activate it despite Intel’s efforts?

  • Voytrekk@sopuli.xyz · 126 points · 3 days ago

    Worse product and worse consumer practices (changing sockets every 2 generations) made it an easy choice to go with AMD.

    • Prove_your_argument@piefed.social · 15 points · edited · 3 days ago

      Intel’s DDR4 compatibility did hold on for a while after AM5 went DDR5-only, though.

      The only real issue they had, which has led to the current dire straits, is the 13th/14th gen gradual failures from power/heat, which they initially tried to claim didn’t exist. If that hadn’t happened, AMD would still have next to no market share.

      You still find people swearing up and down that intel is the only way to go, even despite the true stagnation of progress on the processor side for a long, long time. A couple of cherry-picked benchmarks where they lead by a minuscule amount is all they care about, scheduling/parking issues be damned.

      • msage@programming.dev · 23 points · 3 days ago

        Oh hell naw, the issues with Intel came up much sooner.

        Ever since Ryzen came out, Intel just stagnated.

        • Prove_your_argument@piefed.social · 6 points · 3 days ago

          I don’t disagree that intel has been shit for a long time, but they were still the go-to recommendation all the way through the 14th gen. It wasn’t until the 5800X3D came along that people started really looking at AMD for gaming… and if you’re not doing a prebuilt, odds are you wanted the fastest processor, not the most efficient one.

          I had a 5800X because I didn’t want yet another intel rig after a 4790k. Then I went on to the 5800X3D, before the 9800X3D now. The 5800X was behind intel, and for me it was just a stopgap anyway because a 5950X was not purchasable when I was building. It was just good enough.

          As someone who lived through the fTPM firmware issue on AM4… I can confidently state that the TPM freezes were a dealbreaker. If you didn’t use fTPM and had the module disabled, or you updated your firmware once the fix was released, you were fine - but the fTPM bug went unsolved for many, MANY years. It persisted for multiple generations. You could randomly freeze for a few seconds in any game (or any software) at any time… sometimes only once every few hours, sometimes multiple times in the span of a few minutes. That’s not usable by any stretch for gaming or anything important.

          • Atherel@lemmy.dbzer0.com · 7 points · edited · 3 days ago

            and if you’re not doing a prebuilt odds are you wanted the fastest processor, not the one that is most efficient.

            Strongly disagree. Prebuilts are mostly overpriced and/or have cheap components, and in the worst case proprietary connectors.

            I build for the best bang for the buck, and at least in my bubble, so do others.

            • Prove_your_argument@piefed.social · 1 point · 3 days ago

              Somehow I think you misunderstood my meaning.

              Prebuilts have all kinds of hardware, and unfortunately many users go with those. I offered to do a 265k 5070ti build for my brother’s girlfriend, but he instead spent the same amount on a 265k 5070 32gb 5200mhz prebuilt. He does some dev work and she does a tiny amount of creative work, and honestly I think he wanted to make sure her system was inferior to his. 1 year warranty, and you have to pay to ship the whole system in if there are any issues. He wouldn’t even consider AMD or going with a custom build like I do for myself and others (I just finished another intel build over the weekend for a coworker - diehard intel even after the issues…).

              In the custom build world I think you find more gamers and people who want the fastest gear they can afford, which is why we see gamers picking up AMD x3d chips today. They aren’t beaten, and they aren’t just the most expensive option.

              AM5 as a platform still has issues with memory training, though it’s largely set-it-and-forget-it unless you reboot after a month or don’t have memory context restore enabled in the BIOS.

              I’m less familiar with the intel side nowadays despite literally just doing a build. They seem to win on boot times unless you accept the instability of AMD’s fast-boot memory check bypass stuff. Getting a government bailout, though, is enough to make me want to avoid them indefinitely for my own gear, so I doubt I’ll get much hands-on with the current or next gen.

          • msage@programming.dev · 4 points · 3 days ago

            I’ve had AMDs since forever; my first build of my own was a Phenom II.

            They were always good, but the Ryzens were just the best.

            Never used TPM, so I can’t comment on that. And most people never used it.

            But yes, there are so many hardcore Intel diehards, it would almost be funny if it weren’t sad. Like Intel’s legacy of adding wattage to get nothing in return.

          • Mavytan@feddit.nl · 1 point · 2 days ago

            This might be true for top-of-the-line builds, but for anything from budget to just below that, Ryzen has been a good and commonly recommended choice for a long time.

  • SleeplessCityLights@programming.dev · 10 points · 2 days ago

    I have to lower my 12th gen CPU multiplier to stop constant crashing when playing UE games, because everything is overclocked at the factory so they could keep up with AMD performance. Fuck Intel.

  • grue@lemmy.world · 72 points · 3 days ago

    I’ve been buying AMD since the K6-2, because AMD almost always had the better price/performance ratio (as opposed to outright top performance) and, almost as importantly, because I liked supporting the underdog.

    That means it was folks like me who helped keep AMD in business long enough to catch up with and then pass Intel. You’re welcome.

    It also means I recently bought my first Intel product in decades, an Arc GPU. Weird that it’s the underdog now, LOL.

    • brucethemoose@lemmy.world · 15 points · edited · 3 days ago

      AMD almost always had the better price/performance

      Except anything Bulldozer-derived, heh. Those were more expensive and less performant than the Phenom II CPUs and Llano APUs.

      • grue@lemmy.world · 1 point · edited · 8 hours ago

        To be fair, I upgraded my main desktop directly from a Phenom II X4 840(?) to a Ryzen 1700x without owning any Bulldozer stuff in between.

        (I did later buy a couple of used Opteron 6272s, but that’s different for multiple reasons.)

      • Octagon9561@lemmy.ml · 1 point · 2 days ago

        I’ve got an FX 8350; sure, AMD fell behind during that time, but it was by no means a bad CPU imo. My main PC’s got a 7800X3D now, but my FX system is still working just fine to this day, especially since upgrading to an SSD and 16GB of RAM some years ago. It can technically even run Cyberpunk 2077 at console-like frame rates on high settings.

        • brucethemoose@lemmy.world · 1 point · edited · 2 days ago

          I mean… It functioned as a CPU.

          But a Phenom II X6 sometimes outperformed it, single-threaded and multithreaded. That’s crazy given Piledriver’s two-generation jump and huge process/transistor count advantage. Power consumption was awful in any form factor.

          Look, I am an AMD simp. I will praise my 7800X3D all day. But there were a whole bunch of internet apologists for Bulldozer back then, so I don’t want to mince words:

          It was bad.

          Objectively bad, a few software niches aside. Between the cheaper Phenoms and the reasonably priced 2500K/4670K, it made zero financial sense 99% of the time.

    • Someonelol@lemmy.dbzer0.com · 8 points · 3 days ago

      I’ve been buying AMD since the Phenom II days with the X3 720. You could easily unlock its 4th core for an easy performance boost. Most of the time it’d work without a hassle.

      • ripcord@lemmy.world · 3 points · 3 days ago

        My first AMD was a 386-40, and I’ve had several of their CPUs since. But there were a few years there when it was real tough to pick AMD.

    • halfapage@lemmy.world · 4 points · 3 days ago

      I’ve had the same approach as you. The only time I bought Intel, I had this feeling that it just didn’t perform well enough to justify the price. Never regretted AMD, especially the last one, which basically made me abandon discrete GPUs altogether lol.

    • LastYearsIrritant@sopuli.xyz · 1 point · 3 days ago

      Every upgrade, I decide which one to go with. I try not to stay dedicated to one.

      Basically: buy Intel because it was the best last I checked… oh, that was two years ago, now AMD would have been the right one.

      Next upgrade, won’t make that mistake - buy AMD. Shit… AMD is garbage this gen, shoulda gotten Intel. Ok, I’ll know better next upgrade.

      Repeat forever.

      • Omgpwnies@lemmy.world · 7 points · 3 days ago

        TBF, AMD has been pretty rock-solid for CPUs for the last 5-6 years. Intel… not so much.

        My last two computers have been AMD; the last time I built an Intel system was ~2016.

  • somethingold@lemmy.zip · 23 points · 3 days ago

    Just upgraded from an i7-6600k to a Ryzen 7 7800X3D. Obviously a big upgrade no matter whether I went AMD or Intel, but I’m loving this new CPU. I had an AMD Athlon XP in the early 2000s that was excellent, so I’ve always had a positive feeling towards AMD.

    • NιƙƙιDιɱҽʂ@lemmy.world · 8 points · 2 days ago

      AMD has had a history of some pretty stellar chips, imo. The FX series just absolutely sucked and tarnished their reputation for a long time. My Phenom II X6, though? Whew, that thing kicked ass.

    • YiddishMcSquidish@lemmy.today · 4 points · 2 days ago

      I played through Mass Effect 3 when it was new on a discount AMD laptop with an iGPU. Granted, it was definitely not on max settings, but it wasn’t with everything turned all the way down either.

    • Trainguyrom@reddthat.com · 8 points · 3 days ago

      When I updated my wife’s computer for Windows 11, I went AMD for that reason as well. They released 2 generations in a row with now well-documented hardware bugs that slowly kill the processors. 13th and 14th gen CPUs will simply have zero resale value if they last long enough to hit the second-hand market. I briefly worked at an MSP at the beginning of last year, and the number of gaming computers that came in via noncommercial walk-in customers with stability issues that ultimately turned out to be the Intel CPU bugs was incredible.

      • ViperActual@sh.itjust.works · 6 points · 3 days ago

        I initially got the 13 series, then got a 14 series as the warranty replacement. Even with updated BIOS firmware, the 14 series also began suffering from the same instability issues, sooner than the 13 did. Switched after that, because two chips in a row, across different generations, is not a small mistake. Just happy I didn’t have to pay for that 14 series only to see it have the same problem.

        • Trainguyrom@reddthat.com · 1 point · 2 days ago

          That is insane. I know that’s what’s been happening, because I’ve seen it both in the news and in the real world through work, yet I still struggle to comprehend that this is what’s actually happening with these processors.

          • ViperActual@sh.itjust.works · 2 points · 2 days ago

            Yeah, it was highly disappointing. I’d always used Intel CPUs just because I picked one chipset and stuck with it. I even put up with the instability issues from the 13 for a while; at first I figured something else in the PC was dying on me. It wasn’t until it reached the point where I literally couldn’t run certain applications without constant crashes that the news articles started coming out about the chip issues.

  • eli@lemmy.world · 33 points · 3 days ago

    I know we shouldn’t have brand loyalty, but after the near-decade of quad-core-only CPUs from Intel, I can’t help but feel absolute hate towards them as a company.

    I had a 3770k until AMD released their Ryzen 1000 series and I immediately jumped over, and within the next generation Intel started releasing 8-core desktop CPUs with zero issues.

    I haven’t bought anything Intel since my 3770k, and I don’t think I ever will going forward.

    • InFerNo@lemmy.ml · 9 points · 3 days ago

      The 3770k was legendary. I used it for so long. I upgraded to a 7600k almost a decade ago and have now just ordered my first AMD chip (a Ryzen 9700X). The Intel chips were solid and lasted me so long; I hope this AMD system will last as long.

      • CptOblivius@lemmy.world · 9 points · 3 days ago

        Yep, I kept the 3770k until I bought a 7800x3d. It lasted that long, and I gave my son the 3770k system; it was still overkill for the games he wanted to play: Rocket League, Minecraft, Fortnite, etc.

      • eli@lemmy.world · 1 point · 3 days ago

        I still have my 3770k but it’s in storage.

        I bought a 1700X and was using that until upgrading to a 3700X, which I’m still using today in my main gaming desktop.

        I think you’ll be fine!

      • ripcord@lemmy.world · 1 point · 3 days ago

        7700k here; I’ll upgrade from it (likely to AMD) one day. But there’s still almost zero reason to.

  • mesa@piefed.social · 46 points · edited · 3 days ago

    I remember it being a huge issue for programs. Developers were just not supporting other chipsets because Intel was faster than the competition and mostly cheaper. Then Intel got more expensive, did some shitty business with MINIX, and stayed the same speed-wise.

    So now we see what actual competition does.

    • DaddleDew@lemmy.world · 47 points · edited · 3 days ago

      I do want them to stay alive and sort themselves out, though. Otherwise, in a few years it will be AMD who starts putting out overpriced crap, and this time there will be no alternative on the market.

      They’re already not interested in seriously putting competitive pressure on NVidia’s historically high GPU prices.

      • mesa@piefed.social · 18 points · 3 days ago

        I’m personally hoping more 3rd parties start making affordable RISC-V. But yeah, I agree, having Intel stick around would be good for people, as you said.

      • blitzen@lemmy.ca · 8 points · 3 days ago

        Not only that, but (as an American) I do want the US to have some fab capability. A strong Intel is in our national security interest.

        • RamRabbit@lemmy.world · 6 points · 3 days ago

          Yeah, if Taiwan is ever invaded, having US-based fabs will be crucial for world supply. I absolutely want to see Intel survive and TSMC continue to build factories here.

          Nothing would say ‘get fucked’ like Intel going belly up and Taiwan exploding. The supply of any new computer parts would be a dumpster fire for years.