• Stepos Venzny@beehaw.org · 16 hours ago

    The claim Nvidia’s making that you can apply it selectively within a given frame kind of makes me want a game where only one character looks like that so as to emphasize their otherworldly wrongness. A Dishonored game where only the Outsider has it, that kind of thing.

  • Lawliss@lemmy.ml · 20 hours ago

    The most ridiculous thing is that it required TWO 5090s: one to run the game, the other just for the AI shit.

    • idlesheep@piefed.blahaj.zone · 20 hours ago

      I think that’s just more evidence that the goal isn’t for consumers to own it, but to pay to rent these services via GeForce Now or whatever, because nobody can afford a PC powerful enough for this.

      Basically we’re not the target of this commercial, shareholders are.

      • knightly the Sneptaur@pawb.social · 16 hours ago

        That’s 100% correct, because the shareholders still haven’t learned their lesson from OnLive, Google Stadia, Amazon Luna, Nvidia GeForce Now, Xbox Game Pass Ultimate, or Playstation Plus Premium.

        Game streaming services are never going to catch on because the capital needed to build out the infrastructure is ridiculous.

        Without even counting the cost of hosting and bandwidth, a game streaming node already costs as much as a gaming PC, easily $2k. Say it gets 100% utilization with four customers each playing 6 hours a day, and that they’re paying $20/month for the service, then it’d take 25 months for that $2k node to pay for itself. However, by that time the hardware will be old and outdated, in need of replacement.

        That’s zero return on investment for the lifetime of the hardware, absolutely no profit even in this idealized case unless you can charge more than $20. Add in the monthly expenses for colocation space, power, bandwidth, and overhead and you get a product with zero ROI even at $35/month.
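        The break-even arithmetic above works out like this (a quick sketch using the comment's own hypothetical figures, not real GeForce Now pricing):

```python
# Back-of-envelope break-even for a game-streaming node.
# All numbers are the hypothetical figures from the comment above.
node_cost = 2000          # dollars per streaming node (~ a gaming PC)
customers_per_node = 4    # "100% utilization": 4 users x 6 h/day = 24 h/day
price_per_month = 20      # dollars per customer per month

monthly_revenue = customers_per_node * price_per_month   # 4 * 20 = 80
months_to_break_even = node_cost / monthly_revenue       # 2000 / 80
print(months_to_break_even)  # 25.0 -- roughly the hardware's useful lifetime
```

        Any ongoing costs (colocation, power, bandwidth) only push break-even past the hardware's lifetime, which is the point being made.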

        Then, consider that this node can only realistically serve a small geographic area due to network latency limitations, and that using the service requires your customers to already be paying for high-quality broadband, and you’ve got a recipe for a tiny potential customer base and massive capital investment to make nodes available in close enough proximity to all major markets.

        • tal@lemmy.today · 11 hours ago

          Game streaming services are never going to catch on because the capital needed to build out the infrastructure is ridiculous.

          I don’t know about “never”, but I’ve made similar arguments on here based on the cost of building out the bandwidth. I don’t think we’re likely to get to the point any time soon where computers living in datacenters are a general-purpose replacement for non-mobile gaming, simply because of what it costs to carry that bandwidth from datacenter to monitor. Any benefit from having a remote GPU doesn’t compare well with the cost of effectively needing a monitor-cable’s worth of bandwidth from every concurrently-used computer to the nearest datacenter.

          But…I can think of specific cases where they’re competitive.

          First, where power is your relevant constraint. If you’re using something like a cell phone or other battery-powered device, it’s a way to deal with power limitations. I mean, if you’re using even something like a laptop without wall power, you probably don’t have more than 100 Wh of battery power, absent USB-C and an external powerstation or something, due to airline restrictions on laptop battery size. If you want to be able to play a game for, say, 3 hours, then your power budget (not just for the GPU, but for everything) is something like 30W. You’re not going to beat that limit unless the restrictions on battery size go away (which…maybe they will, as I understand that there are some more-fire-safe battery chemistries out there).

          And cell phone battery constraints are typically even tighter, around 20 Wh. That means that for three hours of gaming, the phone’s size constraints leave you a power budget of maybe about 6 watts.

          If you want power-intensive rendering on those platforms, then remote rendering is your only real option.
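          Those power budgets are just capacity divided by play time (a sketch using the approximate battery figures above):

```python
# Rough sustained power budget: battery capacity (Wh) / play time (h).
# Capacities are the approximate figures from the comment, not exact specs.
def power_budget_w(battery_wh: float, hours: float) -> float:
    return battery_wh / hours

laptop = power_budget_w(100, 3)  # airline-limited laptop battery: ~33 W total
phone = power_budget_w(20, 3)    # typical phone battery: ~6.7 W total
print(round(laptop, 1), round(phone, 1))  # -> 33.3 6.7
```

          Note that this budget covers the whole device (screen, radios, SoC), not just the GPU, which is why it is so punishing for local rendering.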

          Second, there are (and could be more) video game genres where you need dynamically-generated images, but where latency isn’t really a constraint. Like, a first-person shooter has some real latency constraints. You need to get a frame back in a tightly bounded amount of time, and you have constraints on how many frames per second you need. But if you were dynamically-rendering images for, I don’t know, an otherwise-text-based adventure game, then the acceptable time required to get a new frame illustrating a given scene might expand to seconds. That drastically slashes the bandwidth required.

          What I don’t think is going to happen in the near future is “gaming PC/non-portable video game consoles get moved to the datacenter”.

          • knightly the Sneptaur@pawb.social · 10 hours ago

            I’m confident in my “never” because of the capital economics: the service has to be expensive enough to pay for the infrastructure it requires plus some profit for the shareholders, while simultaneously being cheap enough to offer gamers a better value proposition than buying their own hardware. There’s no margin between those limits, so the only markets left for them to appeal to are niches where local rendering performance is limited but network latency and bandwidth are not. Even then, gamers still have the option of streaming from their own hardware with Moonlight rather than paying for a third-party service, so the only customers left are the ones with more money than sense.

            Don’t get me wrong, I love the concept (I even bought an OnLive microconsole back in the day and still regularly use a Steam Link to stream games to the living room TV), but it isn’t nearly convenient or performant enough to justify itself.

      • Telorand@reddthat.com · 16 hours ago

        My thought, too. Guess I’ll just have to run what I’ve got until it breaks, then make my own retro handheld, because I’m not paying for cloud bullshit.

  • Scrubbles@poptalk.scrubbles.tech · 20 hours ago

    Spouse and I were joking yesterday. Yeah, super, they turned it on in Starfield. Did it suddenly make the game not boring? Is it interesting to play? Is the dialogue at all realistic? No? They’re just all OnlyFans models now? Then who cares.

    • frank@sopuli.xyz · 20 hours ago

      If they spent 1% of the visuals’ time and budget on making the game fun, it’d be a really neat game.