• ag10n@lemmy.world
    11 hours ago

    Quote me in full.

    You can run it at scale, on huawei. You can also run it on a cpu

    • theunknownmuncher@lemmy.world
      11 hours ago

      Quote me in full.

      Okay!

      You can run at scale, on huawei. You can also run it on a cpu

      Yeah, that is absolutely not what you argued.

      Anyway, you’ve conceded that I’m correct that you cannot run it at scale on a CPU, because running on CPU is too slow and inefficient, and that they instead use GPU hardware like Huawei GPUs to run the model at scale. That’s good enough for me!

      • Diurnambule@jlai.lu
        2 hours ago

Okay, then you tried to just screenshot the part after the initial argument. Dude, put in more effort.

      • ag10n@lemmy.world
        10 hours ago

        Your interpretation of the English language has won you an argument! Huzzah

So good of you to concede it runs on CPU.