• theunknownmuncher@lemmy.world · 9 points · 2 days ago

    Seemed like a marginal improvement focused on upscaling/framegen, which doesn’t really interest me. I’m still really happy with my 6900 XT. Then again, NVIDIA has offered only marginal improvements with significant TDP (💀) and price increases for several generations now, so whatever 🤷

    • KingRandomGuy@lemmy.world · 1 point · 24 hours ago

      The 40 series over the 30 series was a pretty tangible jump IMO (the 4090 gets something like 30-50% more perf than the 3090 Ti in most tasks at the same TDP), in part thanks to the much larger L2 cache plus the newer process node.
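
      As a rough back-of-the-envelope (illustrative numbers only; 450 W is the commonly listed stock TDP for both cards, not something I’ve measured), a same-TDP perf uplift maps directly onto energy per frame:

      ```python
      # Illustrative only: what "30-50% more perf at the same TDP" implies for
      # energy per frame. 450 W is the commonly listed stock TDP for both the
      # 3090 Ti and the 4090, not a measured figure.
      tdp_watts = 450.0

      for uplift in (0.30, 0.50):
          fps_ratio = 1.0 + uplift          # 4090 fps relative to the 3090 Ti
          # Energy per frame = power * seconds per frame, so at equal power it
          # falls in proportion to the fps gain.
          energy_per_frame_ratio = 1.0 / fps_ratio
          print(f"+{uplift:.0%} perf at {tdp_watts:.0f} W -> "
                f"{1 - energy_per_frame_ratio:.0%} less energy per frame")
      ```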

      The 50 series was a very poor jump though, probably because it’s on the same process node.

    • Buddahriffic@lemmy.world · 4 points · 2 days ago

      I was surprised to see the 9070 XT at about double the 6800 XT’s performance once benchmarks that included both cards started coming out.

      I got it because I also see that if China does follow through with an attack on Taiwan, PC components are going to become very hard to find and very expensive while all of that production capacity is replaced. And depending on how things go after that, this might be the last GPU I ever buy.

      • theunknownmuncher@lemmy.world · 3 points · 2 days ago (edited)

        A huge factor is rendering resolution. I render at 1080p-class resolutions at most (1024x768 or 1600x1200). A 2x performance improvement over the 6800 XT in general sounds very incorrect if the benchmarks are run at 1080p, unless they’re using upscaling and frame gen to pad the numbers. Do you have a link to these benchmarks? I’d be less skeptical of a significant improvement over the 6800 XT if the benchmarks were run specifically at 4K, though, since both AMD and NVIDIA have further optimized their GPUs for 4K rendering with each passing generation.

        Upscaling/framegen and 4K are completely irrelevant to me, so counting those out, it’s a marginal improvement based on the numbers I’ve seen. I’d like to be wrong, though, and I could be.
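
        For what it’s worth, the comparison I mean is just native-vs-native fps; here’s a tiny sketch with completely made-up numbers (not from any real benchmark) of why counting frame-gen output pads the apparent uplift:

        ```python
        # Completely made-up fps numbers, only to show why I don't count frame
        # gen when judging generational uplift.
        native_6800xt_fps = 90.0    # hypothetical native 1080p result
        native_9070xt_fps = 120.0   # hypothetical native 1080p result

        def uplift(new_fps: float, old_fps: float) -> float:
            """Relative performance gain of new_fps over old_fps."""
            return new_fps / old_fps - 1.0

        print(f"native uplift: {uplift(native_9070xt_fps, native_6800xt_fps):.0%}")

        # 2x frame generation roughly doubles the fps counter without rendering
        # more real frames, so the apparent uplift balloons.
        framegen_9070xt_fps = native_9070xt_fps * 2.0
        print(f"uplift with frame gen counted: "
              f"{uplift(framegen_9070xt_fps, native_6800xt_fps):.0%}")
        ```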

        • RejZoR@lemmy.ml · 3 points · 2 days ago

          I do care about upscaling and ray tracing, which is why I didn’t go with AMD for the last few generations. The RX 9070 XT felt like the right time, as they made huge improvements. FSR4 in particular is easily comparable to DLSS, and I use it as an antialiasing replacement while boosting performance. FSR2 works, but it turns into a pixelated mess during fast movement and has a lot of ghosting. FSR4 is near perfect.

          What I also love is how AMD’s Fluid Motion Frames just work in all games with minimal artifacting, and Radeon Chill is something I especially appreciate with summer coming. It dramatically decreases power consumption, and thus heat output, to levels an RTX 5070 Ti could never reach despite being more efficient in raw power-consumption tests, all without affecting the experience. It’s so good I’m using it in Overwatch 2 and Marvel Rivals and I can’t really tell a difference; it controls the framerate that seamlessly.

          • theunknownmuncher@lemmy.world · 1 point · 1 day ago

            Ray tracing just isn’t there yet. Even in the manicured ray tracing demo during AMD’s announcement event for the 9000 series, it’s nothing but surface boil. It looks like analog white static overlaid on all the surfaces.

              • theunknownmuncher@lemmy.world · 1 point · 1 day ago (edited)

                Are you kidding…?? I wish that were true. The worst I’ve seen it is in Marvel Rivals. It’s pretty bad in S.T.A.L.K.E.R. 2: Heart of Chornobyl as well.

                • RejZoR@lemmy.ml · 1 point · 16 hours ago

                  That’s not down to the graphics card; it’s the game. I had horrible boiling in Marvel Rivals on an RTX 3080, to the point that I preferred screen-space reflections over ray-traced Lumen reflections. I still do on Radeon. Surprisingly, the Oblivion Remaster running Unreal Engine 5 doesn’t have this issue even on the RX 9070 XT.

                  • theunknownmuncher@lemmy.world · 1 point · 12 hours ago

                    That’s not down to the graphics card.

                    Yeah, that’s literally my point. Ray tracing just isn’t there yet. It has nothing to do with GPUs.

                    Surprisingly, the Oblivion Remaster running Unreal Engine 5 doesn’t have this issue even on the RX 9070 XT.

                    Because you have aggressive upscaling and frame gen enabled, so you’ve blurred the image to the point that details like boiling are lost, and then artificially re-sharpened it with details an AI guesses were there.
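
                    A crude illustration of that point (just a box downsample plus nearest-neighbour upscale in numpy, not how FSR or DLSS actually work; the random “noise” stands in for the boiling):

                    ```python
                    import numpy as np

                    # Crude stand-in for upscaled rendering: fine per-pixel noise (the
                    # "boiling") is largely averaged away when the image is produced at
                    # a lower resolution and blown back up. This is NOT how FSR/DLSS
                    # work internally.
                    rng = np.random.default_rng(0)
                    native = rng.normal(0.0, 1.0, size=(8, 8))  # per-pixel noise at native res

                    # "render" at quarter resolution: 2x2 box average
                    low_res = native.reshape(4, 2, 4, 2).mean(axis=(1, 3))
                    # naive upscale back to native size by repeating pixels
                    upscaled = np.repeat(np.repeat(low_res, 2, axis=0), 2, axis=1)

                    print(f"noise std at native render:   {native.std():.2f}")
                    print(f"noise std after up/downscale: {upscaled.std():.2f}")  # roughly halved
                    ```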

                    Disable those, set it to render natively, and enjoy the analog static.