• just_another_person@lemmy.world · 5 months ago

    It still blows my mind that people are willing to spend 2x the cost of the entire machine they're playing on, plus a hefty power bill, to run these awful products from Nvidia. Generational improvements are minor on the performance side, and fucking AWFUL on the product and efficiency side. You'd think people would have learned their lesson a decade ago.
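    Back-of-the-envelope on the power bill point, since people handwave it (a sketch; the wattage, hours, and rate below are assumptions, not measurements):

    ```python
    # Rough annual electricity cost of a high-end GPU under gaming load.
    # All inputs are illustrative assumptions, not measured figures.
    card_watts = 575      # a 5090-class card near its board power limit
    hours_per_day = 3     # assumed daily gaming time
    rate_per_kwh = 0.15   # assumed electricity price in USD

    kwh_per_year = card_watts / 1000 * hours_per_day * 365
    print(f"{kwh_per_year:.0f} kWh/year ≈ ${kwh_per_year * rate_per_kwh:.0f}/year")
    # -> 630 kWh/year ≈ $94/year, on top of the rest of the system
    ```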

    • Eager Eagle@lemmy.world · 5 months ago

      they pay because AMD (or anyone else, for that matter) has no product that competes with a 5080 or 5090

      • just_another_person@lemmy.world · 5 months ago

        Because they choose not to go full idiot, though. They could make their top-line cards compete if they slammed enough into the pipeline and required a dedicated PSU, but that's not where their product line is headed. That's why it's smart.

        For reference: AMD has the most deployed GPUs on the planet as of right now. There's a reason it's in every gaming console except the Switch 1/2, and why OpenAI just partnered with them for chips. The goal shouldn't just be making a product that churns out results at the cost of everything else; it should be making one that's cost-effective and efficient. Nvidia fails at that on every level.
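        To make "cost-effective and efficient" concrete, the comparison that matters is perf per watt and per dollar, not raw fps. A minimal sketch (the fps, wattage, and price numbers are placeholders, not benchmarks):

        ```python
        # Compare GPUs on perf-per-watt and perf-per-dollar.
        # Numbers are placeholders for illustration, not benchmark results.
        cards = {
            "card_a": {"fps": 100, "watts": 350, "price": 1000},
            "card_b": {"fps": 120, "watts": 575, "price": 2000},
        }
        for name, c in cards.items():
            print(f"{name}: {c['fps'] / c['watts']:.3f} fps/W, "
                  f"{1000 * c['fps'] / c['price']:.0f} fps per $1000")
        # card_b wins on raw fps but loses on both efficiency ratios.
        ```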

        • Eager Eagle@lemmy.world · 5 months ago

          this openai partnership really stands out, because the server world is dominated by nvidia, even more than in consumer cards.

          • just_another_person@lemmy.world · 5 months ago

            Actually…not true. Nvidia recently became bigger in the DC because of their terrible inference cards being bought up, but AMD overtook Intel on chips across all major cloud platforms last year, and their Xilinx chips are slowly overtaking sales of regular CPUs for special-purpose processing. By the end of this year, I bet AMD will be the most deployed brand in datacenters globally. FPGA is the only path forward in the architecture world at this point for speed and efficiency in single-purpose processing. Nvidia doesn't have a competing product.

        • ctrl_alt_esc@lemmy.ml · 5 months ago

          Unfortunately, this partnership with OpenAI means they’ve sided with evil and I won’t spend a cent on their products anymore.

            • ctrl_alt_esc@lemmy.ml · 5 months ago

              Oh, so you support grifting off the public domain? Maybe grow some balls instead of taking the status quo for granted.

      • Naz@sh.itjust.works · 5 months ago

        I have overclocked my AMD 7900XTX as far as it will go on air alone.

        Undervolted every step on the frequency curve, cranked up the power, 100% fan duty cycles.
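        (Those knobs are all exposed through amdgpu's sysfs on Linux, by the way. A rough sketch below; the hwmon index, power cap, and offset values are illustrative and vary by kernel and card:)

        ```python
        # Sketch of the tuning knobs above via the Linux amdgpu sysfs interface.
        # Run as root; values are illustrative, not recommendations.
        from pathlib import Path

        dev = Path("/sys/class/drm/card0/device")
        hwmon = next((dev / "hwmon").iterdir())  # index varies per boot

        (dev / "power_dpm_force_performance_level").write_text("manual\n")
        (hwmon / "power1_cap").write_text("400000000\n")  # microwatts = 400 W
        (hwmon / "pwm1_enable").write_text("1\n")         # 1 = manual fan control
        (hwmon / "pwm1").write_text("255\n")              # 100% fan duty cycle
        (dev / "pp_od_clk_voltage").write_text("vo -75\n")  # undervolt offset (mV)
        (dev / "pp_od_clk_voltage").write_text("c\n")       # commit the change
        ```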

        At its absolute best, it trades blows with the 4090D, and it's 6% slower than the RTX 4090 Founders Edition (the slowest of the stock 4090 lineup).

        The fastest AMD card is equivalent to a 4080 Super, and the next gen hasn’t shown anything new.

        AMD needs a 5090-killer. Dual socket or whatever monstrosity that pulls 800W, but it needs to slap that greenbo with at least a 20-50% lead in frame rates across all titles, including raytraced ones. Then we'll see some serious price cuts and competition.

      • Cyberwolf@feddit.org · edited · 5 months ago

        What do you even need those graphics cards for?

        Even the best games don’t require those and if they did, I wouldn’t be interested in them, especially if it’s an online game.

        Probably only a couple people would be playing said game with me.

    • Static_Rocket@lemmy.world · 5 months ago

      Well, to be fair, the 10 series was actually an impressive improvement over what was available at the time. I've since switched to AMD for better SW support, and I know the improvements have dwindled since then.

      • just_another_person@lemmy.world · 5 months ago

        AMD is at least playing the smart game with their hardware releases, going for generational leaps instead of just jacking up power requirements and clock speeds like Nvidia does. Hell, even Nvidia's latest Jetson lines are just recooked versions of years-old parts.

      • yeehaw@lemmy.ca · 5 months ago

        The best part is, for me, ray tracing looks great. When I’m standing there and slowly looking around.

        When I'm running and gunning and shit's exploding, I don't think the human eye is even capable of comprehending the difference between raster and ray tracing at that point.

        • Chozo@fedia.io · 5 months ago

          Yeah, that’s what’s always bothered me about the drive for the highest-fidelity graphics possible. In motion, those details are only visible for a frame or two in most cases.

          For instance, some of the PC mods I’ve seen for Cyberpunk 2077 look absolutely gorgeous… in screenshots. But once you get into a car and start driving or get into combat, it looks nearly indistinguishable from what I see playing the vanilla game on my PS5.
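          The "frame or two" intuition is easy to sanity-check (the frame rates below are just common examples):

          ```python
          # How long a single frame stays on screen at common frame rates.
          for fps in (30, 60, 120, 240):
              print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
          # At 120 fps, a detail visible for two frames lasts ~17 ms,
          # far shorter than a blink (~100-150 ms).
          ```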

    • tormeh@discuss.tchncs.de · 5 months ago

      If you're on Windows, it's hard to recommend anything else. Nvidia has DLSS supported in basically every game. For recent games there's the new transformer DLSS. Add to that ray reconstruction, superior ray tracing, and a steady stream of new features. That's the state of the art, and if you want it, you gotta pay Nvidia. AMD is about 4 years behind Nvidia in terms of features. Intel is not much better. The people who really care about advancements in graphics and derive joy from them are all going to buy Nvidia, because there's no competition.
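      For context on why DLSS buys so much performance: it renders internally at a fraction of the output resolution and upscales. A quick sketch using the commonly cited per-axis scale factors (treat them as approximate):

      ```python
      # Internal render resolution per DLSS mode at 4K output.
      # Scale factors are the commonly cited values; approximate.
      out_w, out_h = 3840, 2160
      modes = {"Quality": 2 / 3, "Balanced": 0.58,
               "Performance": 0.5, "Ultra Performance": 1 / 3}
      for mode, s in modes.items():
          w, h = int(out_w * s), int(out_h * s)
          print(f"{mode:>17}: {w}x{h} ({1 - s * s:.0%} fewer pixels shaded)")
      # Quality at 4K shades 2560x1440 -> ~56% fewer pixels per frame.
      ```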

      • just_another_person@lemmy.world · 5 months ago

        First, DLSS is supported on Linux.

        Second, DLSS is kinda bullshit. The article goes into details that are fairly accurate.

        Lastly, AMD is at feature parity with Nvidia. You can see my other comments, but AMD's goal isn't selling cards to gamers, especially cards that require an entire dedicated PSU to power them.

          • just_another_person@lemmy.world · 5 months ago

            No. AMD. See my other comments in this thread. Though they're in every major gaming console, the bulk of AMD's sales are aimed at the datacenter.