In a recent survey, we explored gamers’ attitudes towards the use of Gen AI in video games and whether those attitudes varied by demographics and gaming motivations. The overwhelmingly negative attitude stood out compared to other surveys we’ve run over the past decade.

In an optional survey (N=1,799) we ran from October through December 2025 alongside the Gamer Motivation Profile, we invited gamers to answer additional questions after they had looked at their profile results. Some of these questions were specifically about attitudes towards Gen AI in video games.

Overall, the attitude towards the use of Gen AI in video games is very negative. 85% of respondents reported a below-neutral attitude towards the use of Gen AI in video games, and the distribution is heavily skewed: 63% selected the most negative response option.

Such a heavily-skewed negative response is rare in the many years we’ve conducted survey research among gamers. As a point of comparison, in 2024 Q2-Q4, we collected survey data on attitudes towards a variety of game features. The chart below shows the percentage of negative (i.e., below-neutral) responses for each feature. In that survey, 79% had a negative attitude towards blockchain-based games. This helps anchor where the attitude towards Gen AI currently sits. We’ll come back to the “AI-generated quests/dialogue” feature later in this blog post, since another survey question breaks down attitudes by specific AI use.
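The “% negative” figures above come from collapsing a Likert scale at its neutral midpoint. A minimal sketch of that calculation, using made-up response counts chosen only to illustrate the method (not the survey’s actual data):

```python
# Sketch: computing "% negative" (below-neutral) from 7-point Likert counts.
# The counts below are invented for illustration; 1 = most negative, 7 = most positive.
from collections import Counter

responses = Counter({1: 1133, 2: 250, 3: 147, 4: 120, 5: 80, 6: 40, 7: 29})
n = sum(responses.values())
neutral = 4

pct_negative = 100 * sum(c for score, c in responses.items() if score < neutral) / n
pct_most_negative = 100 * responses[1] / n

print(f"{pct_negative:.0f}% below neutral; {pct_most_negative:.0f}% most negative")
```

The same collapse-at-midpoint approach applies to the 2024 feature-attitude data charted below.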

  • leftzero@lemmy.dbzer0.com · 21 points · 23 hours ago

    No, laziness is good. Laziness begets engineering.

    The issue is that “generative AI” (which is neither generative nor intelligent) is built upon the stolen works of countless artists.

    The issue is that it consumes massive amounts of resources and energy to produce mediocre results at best.

    The issue is that it threatens the livelihood of whole segments of society, especially the ones who contribute the most to human culture.

    The issue is that it’s not sustainable. Once it runs out of new content to plagiarize it will be unable to produce anything new. It can’t replace what it’s destroying.

    The issue is that it’s so vastly inefficient that the data centres needed to sustain it are becoming a major contributor to global warming.

    The issue is that its bubble is causing massive price increases in consumer computer parts.

    The issue is that when it pops it’ll take the rest of the economy with it.

    The issue is that it’s a gateway drug. It’s being sold at a loss to destroy the human competition, and will inevitably increase massively in price once it’s become a necessary part of everyone’s process.

    The issue is that it’s being forced everywhere regardless of its uselessness for the task, replacing technologies that were actually useful and making everything less useable and more inefficient.

    The issue is that it’s making everything less reliable, and will inevitably cause massive damage and loss of life.

    The issue is that LLM use has been demonstrated to cause brain damage, yet they elude regulation and the companies selling them have yet to face consequences.

    The issue is that all of this makes it an existential threat to humanity, and a significant contributor to the ones we were already facing.

    The issue is that, once you’ve taken into account all the pros and cons, doing everything possible to ensure it ceases to exist as soon as possible in any way, shape, or form, together with the companies selling it and the CEOs responsible for them and any politicians and investors enabling them, becomes an evident moral and ethical imperative.

    • P03 Locke@lemmy.dbzer0.com · 4 points · edited · 17 hours ago

      If looking at a picture is stealing, then I’m doing so every day when I browse a web page or Google Image Search.

      Energy needs are a concern, but this isn’t a new problem. Our energy needs have been ramping up ever since we learned how to make electricity.

      The ones who “contribute to human culture” are the 1% who are lucky enough to make a career out of making art or making music or whatever other creative talent they had. The problem is oversaturation, not AI. AI makes the problem worse, but so does the Internet and every other technological leap we’ve seen.

      “Once it runs out of new content to plagiarize it will be unable to produce anything new.” Sounds like humans in a nutshell. Good artists borrow, great artists steal. Creativity itself is not sustainable at the rate we consume it. Every new thing is drilled into the ground, and beaten into a bloody pulp, until we find the next new thing. This is not a new problem.

      Capitalism is the enemy of humanity. Capitalists wield AI as a weapon, and we treat it as a scapegoat. We think that we can just get rid of AI, and then the enemy is gone. But, AI isn’t going away, and the same enemy we’ve always had still exists.

      Use it to your advantage. Use local models. Support open source LLMs. The biggest failure of rich capitalist assholes is sheer, absolute overconfidence and an inability to relate to the people they are trying to fleece.

      • Deyis@beehaw.org · 1 point · 3 hours ago

        If looking at a picture is stealing. . .

        Except that’s not what AI and those who use it are doing. This is a deliberate oversimplification to try to excuse derivative and copied works of artists who have had their art stolen. When you do it, it’s copyright infringement. When AI does it, you get a deluge of people who lack the patience and discipline to actually produce any creative work trying to excuse it.

        • P03 Locke@lemmy.dbzer0.com · 1 point · 3 hours ago

          This is a deliberate oversimplification to try to excuse derivative and copied works of artists who have had their art stolen.

          It’s not. You misunderstand both copyright law and how LLMs work.

          Models are GBs of weights, typically in the 4GB to 24GB range. LLMs do not look at a picture and then copy that picture into the model; there isn’t enough disk space to do something like that. Each image is used for training, adjusting weights here and there based on how the image links to its description. You can’t just say “recreate the Mona Lisa” and have it give you a pixel-perfect copy of the original.
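          The disk-space point can be made concrete with a rough calculation. The dataset and model sizes below are illustrative assumptions (a LAION-scale image set and a mid-sized model), not figures from the thread:

```python
# Back-of-envelope: if a model's weights had to store its training images
# verbatim, how many bytes per image would be available?
# Both numbers are assumptions for illustration only.
num_training_images = 5_000_000_000   # assumed LAION-scale dataset
model_size_bytes = 8 * 1024**3        # assumed 8 GiB of weights
bytes_per_image = model_size_bytes / num_training_images
print(f"~{bytes_per_image:.1f} bytes of model capacity per training image")
```

          Under these assumptions there are only a couple of bytes of capacity per training image, far too little for verbatim storage, though memorization of frequently duplicated images is a separate question.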

          When you do it, it’s copyright infringement.

          It’s not copyright infringement to copy a style. People do it all the time. You wouldn’t believe the amount of times I’ve seen something that I thought was some unique style, and thought that one artist did it, but it turns out it’s just another copycat artist “inspired by” the more popular artist.

          Because that’s what people do to something unique, or even remotely rare: copy it a thousand times and drive it into the ground until you’re fucking sick of it.

          • Deyis@beehaw.org · 1 point · 2 hours ago

            uMmMmM aCkTuALLy

            Taking the work of artists without compensating them for your own commercial gain is ethically bankrupt and theft. The fact that you keep likening an AI model to an actual person demonstrates that this isn’t a conversation worth continuing.