“Well, first of all, they’re completely wrong,” Huang said in response to a question from Tom’s Hardware editor-in-chief Paul Alcorn about the criticism.

“The reason for that is because, as I have explained very carefully, DLSS 5 fuses controllability of the geometry and textures and everything about the game with generative AI,” Huang continued.

Just an elongated way to say AI slop.

  • JcbAzPx@lemmy.world · ↑56 · 7 days ago

    “Well, first of all, they’re completely wrong,”

    Proceeds to explain exactly what everyone hates about it.

  • wonderingwanderer@sopuli.xyz · ↑54 · 7 days ago

    “The consumers don’t know what they want. I, the CEO, know what the consumers want. And the consumers want to give me money!”

      • wonderingwanderer@sopuli.xyz · ↑4 · 6 days ago

        Jensen Huang has all the GPUs; he can probably play games where each character has its own dedicated GPU and every atom and molecule of the environment is rendered in real time with a hyper-realistic physics engine, with built-in AI that plays for you so that even your idle pastimes are automated, giving you more time to WORK AND PRODUCE VALUE FOR THE OWNER-CASTE.

        “This game only needs two GPUs to run, what’s the problem?”

    • Yggstyle@lemmy.world · ↑11 · 7 days ago

      Same exact vibes.

      But we thought everyone was okay with repackaged interpolation! Why not repackaged Instagram filters!?

      • Echo Dot@feddit.uk · ↑4 ↓2 · 7 days ago

        I think most people are ok with frame gen because it doesn’t touch the actual content. It just moves things around a bit with motion vectors, which was actually kind of a thing even before AI, although not very good. It didn’t repaint the game into some different art style.

        Also there were real frames in there.

        This is going to 100% replace the game graphics.
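
        (For context, the “moves things around a bit with motion vectors” part above can be illustrated with a toy backward warp. This is a minimal sketch, assuming the engine supplies a per-pixel motion field; the array names and the nearest-neighbour sampling are illustrative assumptions, not Nvidia’s actual interpolator.)

        ```python
        # Hypothetical sketch: synthesize an in-between frame by warping the
        # previous frame along engine-supplied per-pixel motion vectors.
        import numpy as np

        def interpolate_frame(prev_frame: np.ndarray, motion: np.ndarray,
                              t: float = 0.5) -> np.ndarray:
            """prev_frame: (H, W, 3) image; motion: (H, W, 2) pixel offsets
            from the previous real frame to the next one; t: interpolation
            point between the two real frames."""
            h, w = prev_frame.shape[:2]
            ys, xs = np.mgrid[0:h, 0:w]
            # Backward warp: each output pixel samples the previous frame a
            # fraction t of the way back along its motion vector.
            src_x = np.clip(xs - t * motion[..., 0], 0, w - 1).astype(np.int32)
            src_y = np.clip(ys - t * motion[..., 1], 0, h - 1).astype(np.int32)
            # Nearest-neighbour gather, purely for brevity; real interpolators
            # filter, handle disocclusions, and blend both real frames.
            return prev_frame[src_y, src_x]
        ```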

        • M0oP0o@mander.xyz · ↑4 ↓1 · 6 days ago

          I think most people are ok with frame gen because it doesn’t touch the actual content. It just moves things around a bit with motion vectors, which was actually kind of a thing even before AI, although not very good. It didn’t repaint the game into some different art style.

          Nah, most people are not ok with fake frames, and, like raytracing, it’s getting less and less likely to be left on. Most people, however, hate fake frames not due to the frames themselves but due to the motion blur effect that seems to be needed to make things look ok on top of the frame gen (no one likes motion blur).

          You are right that this is going to replace game graphics to some degree, since it’s another shortcut game studios can use to cut costs (and the industry is kind of struggling at the moment). Why spend effort, time, and money making a model look good when you can use a tool to gloss over the work? While it doesn’t look “good” per se, it will look better than it should.

        • Yggstyle@lemmy.world · ↑1 · 7 days ago

          Yep, I’m more suggesting that this was the logical path they would have continued down.

          I personally don’t like the generation because it’s functionally noise and can affect the feel / responsiveness of the game. Upscaling seems pretty reasonable - but like many I just can’t abide the notion that we are counting a generated frame as a real frame for benchmarking’s sake.

          I’m not against framegen existing. It’s a preference. Same as that feature on TVs. To each their own.

          Back to the new DLSS though: yeah, it was inevitable they’d go here… and I’m personally thrilled this was the line everyone more or less took issue with.

  • FreddiesLantern@leminal.space · ↑48 · 7 days ago

    Given the amount and intensity with which this man tries to sell his garbage through sheer bluff and BS, you’d think he’s running for President someday.

  • nightlily@leminal.space · ↑51 ↓1 · 7 days ago

    So this dumb fuck’s own marketing material has said this operates off final pixel colour and motion vectors (for temporal stability presumably) - that says to me that it’s not working with actual geometry info at all. It probably has a step to infer geometry but it’s still just a fancy Instagram filter working with limited data and an obviously ill-suited training set.

    • lime!@feddit.nu · ↑6 ↓1 · 7 days ago

      the previous versions at least need the software to supply motion vectors. otherwise it’s just guesswork. i’m assuming there will be some way to supply lighting information as well.

      whatever the final product can do, they certainly didn’t show it off in their examples.

      • orgrinrt@lemmy.world · ↑7 · 7 days ago

        Technically, at least on Vulkan, these things can be inferred or intercepted with just an injected layer, though it’s not trivial. If you store a buffer history for depth, you can fairly accurately compute an approximation of the actual (isolated) mesh surfaces from the PoV of the view. But that isn’t the same as the real polygons and meshes that the textures and everything else map to… pretty sure you can’t run that pipeline in real time even with tiled temporal supersampling. It almost definitely works on the output directly, perhaps with some buffers like motion vectors and depth for the same frame, which they’ve needed since DLSS 2 anyway. But it’s pretty suspect to claim full polygons unless it’s running with tight integration from the game itself, and even then the frame budgets are crazy tight as it is, never mind running extra passes at that level.
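
        (A minimal numpy sketch of the depth-buffer idea above: unprojecting a captured depth buffer into approximate view-space surfaces plus normals. The buffer and matrix names are hypothetical stand-ins; a real injected layer would do this on the GPU, and the sketch assumes an OpenGL-style [0, 1] depth range and a standard perspective projection.)

        ```python
        # Hypothetical sketch: approximate surface geometry recovered from a
        # depth buffer alone, as an injected layer could capture it.
        import numpy as np

        def unproject_depth(depth: np.ndarray, inv_proj: np.ndarray) -> np.ndarray:
            """depth: (H, W) values in [0, 1]; inv_proj: 4x4 inverse projection.
            Returns (H, W, 3) view-space positions."""
            h, w = depth.shape
            # NDC coordinates in [-1, 1] for each pixel centre.
            xs = (np.arange(w) + 0.5) / w * 2.0 - 1.0
            ys = 1.0 - (np.arange(h) + 0.5) / h * 2.0
            gx, gy = np.meshgrid(xs, ys)
            ndc = np.stack([gx, gy, depth * 2.0 - 1.0, np.ones_like(depth)], axis=-1)
            view = ndc @ inv_proj.T                # unproject each pixel
            return view[..., :3] / view[..., 3:4]  # perspective divide

        def estimate_normals(pos: np.ndarray) -> np.ndarray:
            """Cross product of screen-space position gradients ~ surface normal."""
            dx = np.gradient(pos, axis=1)
            dy = np.gradient(pos, axis=0)
            n = np.cross(dx, dy)
            return n / (np.linalg.norm(n, axis=-1, keepdims=True) + 1e-8)
        ```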

        • SkunkWorkz@lemmy.world · ↑6 · 7 days ago

          Probably not meshes, since that would be way too expensive. But these guys write the GPU drivers, so of course they have access to the various frame buffers, texture buffers, and light source data. Just from depth and normal map data you can get a good representation of the geometry - much like deferred rendering lights the scene using only the data in the G-buffer, which is 2D, not geometry. (A toy version of that idea follows below.)
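
          (To illustrate the deferred-rendering point - lighting computed purely from 2D G-buffer data, with no geometry in sight - here is a toy single-light Lambert pass. All inputs are hypothetical stand-ins for G-buffer contents, not any driver’s actual API.)

          ```python
          # Hypothetical sketch: deferred Lambert lighting from a 2D G-buffer
          # (positions, normals, albedo); no meshes or polygons involved.
          import numpy as np

          def deferred_lambert(pos, normals, albedo, light_pos, light_color):
              """pos, normals, albedo: (H, W, 3) G-buffer arrays;
              light_pos, light_color: (3,) point-light parameters.
              Returns a lit (H, W, 3) image."""
              to_light = light_pos - pos
              dist = np.linalg.norm(to_light, axis=-1, keepdims=True)
              l_dir = to_light / (dist + 1e-8)
              # N.L diffuse term with a simple inverse-square falloff.
              ndotl = np.clip(np.sum(normals * l_dir, axis=-1, keepdims=True),
                              0.0, None)
              return albedo * light_color * ndotl / (dist ** 2 + 1e-4)
          ```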

    • lb_o@lemmy.world · ↑3 ↓1 · 7 days ago

      Oh, thanks for pointing that out.

      Ignoring that the current version looks sloppy: as a gamedev, I would accept an extra AI beautification post-processing step as an additional feature, but I would never accept a corporation getting its hands into my beloved geometry.

    • kromem@lemmy.world · ↑2 ↓3 · 7 days ago

      That’s what he’s saying: that it doesn’t change the geometry or textures (still completely controlled by the devs), and that the parts it does change are also tunable by the devs.

      He’s responding to the backlash about how it changes models/textures (which it doesn’t) by saying those are still fully in the hands of the devs, and that the parts people are seeing in the demos can be fine-tuned by the dev teams to match their vision of what they want it to do or not do (like changing lighting on material surfaces and hair but not on character faces, for example).

      • nightlily@leminal.space · ↑3 · 7 days ago

        It’s a post-processing screen-space effect. At that point, there’s zero control the game can have over the geometry. If the AI model wants to change it, it can. It fundamentally can’t operate only on lighting, as the marketing claims; it can only make a hallucinating, best-effort statistical guess at what the geometry in the final image should be.

  • Techno-rat@lemmy.blahaj.zone · ↑31 · 7 days ago

    Trillions invested to make unoptimized games barely run, and to make them look worse at the same time, instead of just investing like a tenth of that into optimization during dev cycles.

    NVIDIA really is like a parasitic cancerous growth on the side of the games industry, its existence increasingly predicated on the destruction of current standards, overtaking their function to ensure its survival and its continuous, ever-expanding cancerous growth.

  • Yggstyle@lemmy.world · ↑21 ↓1 · 6 days ago

    This is long-winded, but I firmly believe it explains a lot about the industry’s frenzied push in all these odd directions… All of it. Here seems as good a place as any to dump this mess I’ve been stewing on:

    I really think it’s important that raytracing, while novel, wasn’t created to improve visuals. It wasn’t created to make a programmer’s life easier. It was created because it was computationally difficult and could be optimized for. It was a fantastic play by Nvidia. They created a feature that functionally did very little, but they could get an entire cycle ahead of the competition in that optimization. Differentiation of products, in a duopoly, is a big deal. AMD dove right into it - knowing full well that this would leave them brutally behind… But this was a fortuitous event, despite the disadvantage.

    Why? Simple. GPUs have been struggling against Moore’s law. Framerates were exceeding ranges that even monitors can refresh at. And worse yet, there was another hard limit: our eyes. How do you sell cards that have no perceivable value?

    The reality is we may well be reaching a point where additional resolution and framerate don’t matter. Badly optimized games only buy so much time.

    These companies aren’t stupid. Crypto? They loved it. Computationally expensive. Always need faster… until we didn’t. What now? Demand was plummeting for overpriced high-end cards.

    Go back and look at when AI and Nvidia got in bed. The earnings call was due to be a bloodbath after all these cards were rotting on shelves, unpurchased and depreciating daily. It was coming to light that they had been selling cards to miners under the table, and that was going to get ugly fast. I have never, in my life, heard a company talk so much on an earnings call about a product that wasn’t theirs. Not a word breathed about unsold cards, barely any numbers discussed. ChatGPT was referenced so many times that there was confusion as to whether Nvidia actually owned it. The Q&A at the end was comedy gold. People were so confused.

    AI was the perfect save. AI is a power virus. Want to fix the black box? Train a black box to manage that black box. It’s a computational sinkhole. They’d extracted value from gamers to diminishing returns. Meanwhile they can sell the ultimate snake oil to investors: virtual slave labor. Unpaid workers. In floods private equity. Gamers stopped mattering immediately. All of these advances are software. From a GPU design company. Why? It shuts up the peasants while they continue rebranding the “snake oil” for whoever is buying. We’ve nearly achieved the panacea. Just a bit longer!

    Behold: we have dressed our industry in the finest of the emperor’s newest clothes. You can either start selling them or be the only one who doesn’t.

    🫧

    • Eximius@lemmy.world · ↑10 ↓1 · 6 days ago

      From a programming and visuals standpoint: ray tracing was always sought after, and it is peak graphical fidelity. It makes visuals better and (shader) programming easier and more physics-based. It’s not just differentiation; the industry has been dreaming of realtime ray tracing for 30 years, with slow, continuous movement in that direction.

      • Yggstyle@lemmy.world · ↑9 ↓1 · 6 days ago

        Don’t get me wrong. It’s absolutely a very novel and useful feature. It made shit look great. I’m not down on the tech: I’m just saying the push for it wasn’t for the industry. It was to kill framerates and sell cards.

        • Eximius@lemmy.world · ↑6 ↓1 · 6 days ago

          I doubt it. This thing was in the pipeline for decades. It wasn’t just Nvidia doing the thing because of Moore’s law. Everybody was interested and excited while Moore’s law was alive and well. Literally can’t find better quality, but Intel was pushing tech demos such as this.

          The actual push for adoption and the walled garden of NV RTX is… honestly, just business as usual. Nvidia did the exact same with PhysX. Once they have the technological edge, they push hard to pump their ecosystem. They’ve always played evil.

          • Yggstyle@lemmy.world · ↑4 · 6 days ago

            It is good business. Shit for the consumer (unsurprising)… But really, aside from Jensen’s apparent ego, I’m curious why Nvidia has any interest in the gaming sector at all. I feel like they’ve accomplished the perfect transition.

  • Techno-rat@lemmy.blahaj.zone · ↑25 · edited · 6 days ago

    STOP BUYING NVIDIA

    I feel like we might as well drum up some boycotting spirit while we’re at it. Shaming certainly has an effect, as evidenced by his statement here, and if we could add to that a collectivist ‘no more money for you, doofus’ energy, I think that’d be swell.

    We aren’t their main customers anymore, and it would at least force them to acknowledge this themselves. Hopefully.

    Idk folks, keep adding pressure on these ghouls; it seems to at least have gotten their attention.

    STOP BUYING NVIDIA

        • Echo Dot@feddit.uk · ↑3 · 7 days ago

          I never understood why everyone was completely happy with one manufacturer having dominance. Everyone seemed to think it was just ok for there to be no competition in the market. This is exactly why we need that competition.

          • Yggstyle@lemmy.world · ↑2 · 7 days ago

            Exactly. The more competition the better. Imagine what we’d get if Nvidia was split into three new companies that had to compete. Five total companies would suddenly be very motivated to bring a substantial product to market at a competitive price. We as consumers need this diversity to keep the market honest and moving forward.

    • inclementimmigrant@lemmy.world (OP) · ↑3 · 7 days ago

      Unfortunately, social media users who post are not the average gamer, and that’s very apparent when 95% of the discrete GPU market is Nvidia now.

      Plus it doesn’t help that of the only two other competitors, one is AMD, who are also jumping on the AI bandwagon rather than actively competing, and the other is Intel, who’s in bed with Nvidia now.

    • ScoffingLizard@lemmy.dbzer0.com · ↑2 · 6 days ago

      I gave them up in 2020. The measurable difference between them and AMD didn’t warrant the expenditure, and neither did Intel. Absolutely no issues since.

    • sobchak@programming.dev · ↑1 ↓1 · 6 days ago

      Would have to boycott pretty much all hardware. I don’t know of any large hardware manufacturer that’s not chasing the AI investment money and bribing the Trump admin.

      • Techno-rat@lemmy.blahaj.zone · ↑2 · 6 days ago

        You don’t have to do anything; I’m just being vocal about certain feelings, hoping others do the same. Boycotts don’t work, yadda yadda; sitting on your ass also doesn’t work, blah blah…

        I’m really not invested in what you decide to do at the end of the day; keep buying, or don’t, or whatever suits your fancy… I can just feel this move tainting Nvidia and their products with an inherent ‘ewww’ reaction on my part, which very strongly disincentivises ever buying their shit. And I feel like it’s generally good if people are honest about how they feel about phenomena in general; that’s how shared sentiments, zeitgeist, and ‘common sense’ are created, you know.

        • sobchak@programming.dev · ↑1 · 6 days ago

          Fair enough. I was just trying to point out that the entire hardware industry, and pretty much the entire executive and investor class is doing the same stuff as Nvidia.

  • Tywèle@piefed.social · ↑31 · 7 days ago

    This makes me even more sure of my decision to get an AMD card as my next GPU (currently I have an RTX 4080, so it’s still a long time until that happens, but still).

    • inclementimmigrant@lemmy.world (OP) · ↑6 · edited · 7 days ago

      Just a reminder that AMD is also just another money-hungry corporation that would sell gamers out for AI. Don’t forget that AMD just put a 20% stake of their company and output into an AI partnership with Meta.

      That said, I also bought a 9070 XT a year ago because I didn’t want to directly support Nvidia and their never-ending quest to force AI into games, and this at least kind of justifies that, as I don’t support AI OnlyFans being injected into my games. Steam already has enough of those types of games without direct AI injection.

    • Ibuthyr@lemmy.wtf · ↑1 · 6 days ago

      Dude, that card will survive the bursting AI bubble, world war 3 and then some. You can easily use that card for the next 15 years.

    • OrgunDonor@lemmy.world · ↑1 · 7 days ago

      Right there with you. I have a 4080, and the only reason I got this one was because it was cheaper (by a sizable chunk) than the 7900 XTX when I got it.

    • Vlyn@lemmy.zip · ↑2 ↓32 · 7 days ago

      Every time I got an AMD card I got burned, so that’s not really an option. Last try was a 5700 XT and oh my god was that a pain. So much so that instead of my usual 4-5 year upgrade cycle I grabbed a 3080 one year later.

      Nowadays DLSS is a must for me, it just looks so much better than TAA. FSR is alright, but not great.

        • Vlyn@lemmy.zip · ↑2 ↓4 · 7 days ago

          Meh, I’ve owned an ATI 4870X2, GTX 580, GTX 970, 5700 XT, 3080 and now 5080.

          I’ve also helped out friends with their GPUs: a 2070 Super and a 6800 XT (which has a really shitty fan curve at stock).

          The 5700 XT had the worst drivers of the bunch. Crashes, stuttering, … AMD managed to fix most issues with time, but not all.

          Nvidia drivers early this year were shit too, but at this point they are great again. I don’t care about the brand, I only care about my PC running well.

      • NocturnalMorning@lemmy.world · ↑3 ↓1 · 7 days ago

        You do know the games themselves generally choose the type of anti-aliasing used, right? Your graphics card doesn’t run anti-aliasing on top of everything else.

        • Vlyn@lemmy.zip · ↑2 ↓4 · 7 days ago

          ???

          Of course the game developers choose what to put into their games. Some games have FXAA, TAA, DLSS, FSR and even XeSS.

          With an Nvidia card you can use them all. With AMD you can’t use DLSS.

          Not sure what your last sentence means, of course your GPU runs AA?

    • bearboiblake@pawb.social · ↑16 · 7 days ago

      It’s so demanding that it needs two RTX 5090 GPUs to run. I don’t think it’s really anything except AI hype to keep the bubble inflated a little longer.

      • Crozekiel@lemmy.zip · ↑5 · 7 days ago

        Wait, is that real? I thought the entire point of previous versions of DLSS was to get “better” performance out of “less” hardware? I had suspicions that running every frame through an AI image generator wasn’t going to be an improvement to performance, but that’s even worse than I was expecting.

        • bearboiblake@pawb.social · ↑7 · 7 days ago

          Yeah, DLSS 5 is a big departure in that regard. Here’s a source:

          Nvidia actually used two RTX 5090s for its demos: one plays the game, the other exclusively runs the DLSS 5 technology. The use of two GPUs is required right now as DLSS 5 still has a long way to go in terms of optimisation.

          • Crozekiel@lemmy.zip · ↑6 ↓1 · 7 days ago

            Jesus Christ… A second top-of-the-line GPU just to run the AI slop filter is one hell of a deluded announcement. I kinda feel bad now about all that hate Blizzard got for announcing a Diablo mobile game.

            • bearboiblake@pawb.social · ↑4 · 7 days ago

              They say that they’re gonna optimize it to run on a single GPU, but I’m extremely skeptical about how well that’ll perform. Honestly, I think this announcement is aimed more at investors than gamers, to keep the AI hype train rolling.

              I kinda feel bad now about all that hate Blizzard got for announcing a Diablo mobile game.

              Do you guys not have $10k to blow on GPUs? /s

  • CosmoNova@lemmy.world · ↑25 · 7 days ago

    “No, no! You’re wrong because…”

    And then this charlatan goes on to explain why we’re right. It’s exhausting to listen to these gilded clowns.

    • qarbone@lemmy.world · ↑10 · 7 days ago

      And they’re gilded in the wealth they’ve fleeced from their workers and their normal, individual consumers.

      Prancing and jeering for the AI companies they’ve sold and shackled themselves to.

  • tumblechinchilla@sh.itjust.works · ↑20 · 7 days ago

    Jensen is a useless, deflated wenis sack of a man. This AI arms race is doing nothing but hurting the fabric of humanity and, more importantly, further increasing the rate of environmental destruction. :(