• formergijoe@lemmy.world

        Oh it’s a bit of a running joke that every time there’s a new Forza or Gran Turismo, they brag about how round the tires are and how wet the pavement looks.

    • Cethin@lemmy.zip

We technically aren’t at max roundness. Almost every renderer now works with polygons, but it’s possible to build a renderer around other primitives. We can render a perfect cylinder if we want to, or whatever shape you can define mathematically.
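That claim is easy to demonstrate: a ray tracer can intersect a ray with the exact implicit equation of a shape, no triangles involved. A minimal illustrative sketch in Python (not from any engine mentioned here):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Analytic ray-sphere intersection: returns the distance t to the
    nearest hit, or None. The sphere is mathematically exact -- there is
    no polygon mesh approximating it."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    # Solve |o + t*d - c|^2 = r^2, a quadratic in t
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None

# A ray fired down the z-axis at a unit sphere 5 units away hits at t = 4
t = ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
```

Signed-distance-field renderers take the same idea further: any shape you can write as an equation renders perfectly round at every zoom level.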

  • kitnaht@lemmy.world

    Kind of like smartphones. They all kind of blew up into this rectangular slab, and…

    Nothing. It’s all the same shit. I’m using a OnePlus 6T from 2018, and I think I’ll have it easily for another 3 years. Things eventually just stagnate.

    • starman2112@sh.itjust.works

      One company put a stupid fucking notch in their screen and everyone bought that phone, so now every company has to put a stupid fucking notch in the screen

      I just got my tax refund. If someone can show me a modern phone with a 9:16 aspect ratio and no notch, I will buy it right now

      • CancerMancer@sh.itjust.works

        I would love to have a smaller phone. Not thinner, smaller. I don’t care if it’s a bit thick, but I do care if the screen is so big I can’t reach across it with one hand.

    • mrvictory1@lemmy.world

The OnePlus 6 line of phones is one of the very few with good Linux support, I mean, GNU/Linux support. If custom ROMs no longer cut it you can get even more years with Linux. I had an iPhone, was eventually fed up, got an Android aaand I realized I am done with smartphones lol. Gimme a laptop with phone stuff (push notifications w/o killing battery, VoLTE) and my money is yours, but no such product exists.

  • HEXN3T@lemmy.blahaj.zone

    Let’s compare two completely separate games to a game and a remaster.

    Generational leaps then:

    Good lord.

    EDIT: That isn’t even the Zero Dawn remaster. That is literally two still-image screenshots of Forbidden West on both platforms.

    Good. Lord.

    • Maggoty@lemmy.world

      Yeah no. You went from console to portable.

      We’ve had absolutely huge leaps in graphical ability. Denying that we’re getting diminishing returns now is just ridiculous.

      • HEXN3T@lemmy.blahaj.zone

        We’re still getting huge leaps. It simply doesn’t translate into massively improved graphics. What those leaps do result in, however, is major performance gains.

I have played Horizon Zero Dawn, its remaster, and Forbidden West. I am reminded how much better Forbidden West looks and runs on PS5 compared to either version of Zero Dawn. The differences are absolutely there; they're just not as spectacular as the jump from 2D to 3D.

The post comes off like a criticism of hardware not getting better fast enough. Wait until we can create dirt, sand, water or snow simulations in real time, instead of having to fake the look of physics. Imagine real simulations of wind and heat.

And then there’s Gaussian splatting, which absolutely is a huge leap. Forget trees practically being arrangements of PNGs: what if each and every leaf and branch had volume? What if leaves actually fell off?

        Then there’s efficiency. What if you could run Monster Hunter Wilds at max graphics, on battery, for hours? The first gen M1 Max MacBook Pro can comfortably run Baldur’s Gate III. Reducing power draw would have immense benefits on top of graphical improvements.

        Combined with better and better storage and VR/AR, there is still plenty of room for tech to grow. Saying “diminishing returns” is like saying that fire burns you when you touch it.

        • I Cast Fist@programming.dev

          What those leaps do result in, however, is major performance gains.

Which many devs will make sure you never feel, by “optimizing” the game for only the most bleeding-edge hardware.

          Then there’s efficiency. What if you could run Monster Hunter Wilds at max graphics, on battery, for hours? The first gen M1 Max MacBook Pro can comfortably run Baldur’s Gate III. Reducing power draw would have immense benefits on top of graphical improvements.

See, if games were made with a performance-first mindset, that’d be possible already. Not to dunk on performance gains, but there’s a saying that every time hardware gets faster, programmers make their code slower. I mean, you can totally play emulated SNES games with minimal impact compared to leaving the computer idling.

          Saying “diminishing returns” is like saying that fire burns you when you touch it.

Unless chip fabrication can figure out a way to “stack” transistors on top of one another, effectively making 3D chips, they’ll continue to be “flat” sheets that can only increase core count horizontally. Single-core frequency peaked in the early 2000s; from then on it’s been about adding more cores. Even the gains from an RTX 5090 vs an RTX 4090 aren’t that big. Now compare that with the gains from a GTX 980 vs a GTX 1080.
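A toy illustration of why piling on cores hits a wall: Amdahl’s law caps the speedup at whatever fraction of the work stays serial. A quick Python sketch (the 90% figure is an assumption picked for illustration, not a real workload measurement):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only part of a workload
    can be spread across cores; the serial part never gets faster."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# With 90% of the work parallelizable, going from 8 to 64 cores
# (8x the silicon) buys less than a 2x overall speedup:
s8 = amdahl_speedup(0.9, 8)    # ~4.7x over a single core
s64 = amdahl_speedup(0.9, 64)  # ~8.8x over a single core
```

That 10% serial remainder is exactly why single-thread speed still matters for games even in a many-core era.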

        • Maggoty@lemmy.world

          I am reminded how much better Forbidden West looks and runs on PS5 compared to either version of Zero Dawn.

Really? I’ve played both on PS5 and didn’t notice any real difference in performance or graphics. I did notice that the PC version of Forbidden West has vastly higher minimum requirements though. Which is the opposite of performance gains.

          Who the fuck cares if leaves are actually falling off or spawning in above your screen to fall?

          And BG3 has notoriously low minimums, it is the exception, not the standard.

          If you want to see every dimple on the ass of a horse then that’s fine, build your expensive computer and leave the rest of us alone. Modern Next Gen Graphics aren’t adding anything to a game.

          • HEXN3T@lemmy.blahaj.zone

            I’m assuming you’re playing on a bad TV. I have a 4k120 HDR OLED panel, and the difference is night and day.

            I also prefer to enjoy new things, instead of not enjoying new things. It gives me a positive energy that disgruntled gamers seem to be missing.

              • HEXN3T@lemmy.blahaj.zone

                So you’re claiming new hardware isn’t perceivably better, despite not using a display which is actually capable of displaying said improvements. I use such a display. I have good vision. The quality improvement is extremely obvious. Just because not everyone has a high end display doesn’t mean that new hardware is pointless, and that everyone else has to settle for the same quality as the lowest common denominator.

                My best hardware used to be Intel on-board graphics. I still enjoyed games, instead of incessantly complaining how stagnant the gaming industry is because my hardware isn’t magically able to put out more pixels.

                The PS5 is a good console. Modern GPUs are better than older ones. Games look better than they did five or ten years ago. Those are cold, hard, unobjectionable facts. Don’t like it? Don’t buy it.

                I do like it.

    • starman2112@sh.itjust.works

      The fact that the Game Boy Advance looks that much better than the Super Nintendo despite being a handheld, battery powered device is insane

    • HEXN3T@lemmy.blahaj.zone

      It is baffling to me that people hate cross gen games so much. Like, how awful for PS4 owners that don’t have to buy a new console to enjoy the game, and how awful for PS5 owners that the game runs at the same fidelity at over 60FPS, or significantly higher fidelity at the same frame rate.

They should have made the PS4 version the only one. Better yet, we should never make consoles again, because no console can make you comprehend four dimensions, so none of them will ever be new enough.

      • Maggoty@lemmy.world

        The point isn’t about cross generation games. It’s about graphics not actually getting better anymore unless you turn your computer into a space heater rated for Antarctica.

          • Maggoty@lemmy.world

            ARM isn’t going to magically make GPUs need less brute force energy in badly optimized games.

            • HEXN3T@lemmy.blahaj.zone

              …So push ARM. By optimising games.

              EDIT: This statement is like saying “Focusing on ARM won’t fix efficiency because we aren’t focusing on ARM”.

  • Steve Dice@sh.itjust.works

    I mean, how much more photorealistic can you get? Regardless, the same game would look very different in 4K (real, not what consoles do) vs 1080p.

    • hlmw@lemm.ee

      The lighting in that image is far, far from photorealistic. Light transport is hard.
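For a sense of why: the standard rendering equation, the integral every renderer is trying to approximate, says the light leaving a point depends on the light arriving from every direction, which in turn depends on every other surface, recursively:

```latex
L_o(x,\omega_o) = L_e(x,\omega_o)
  + \int_{\Omega} f_r(x,\omega_i,\omega_o)\, L_i(x,\omega_i)\,(\omega_i \cdot n)\, \mathrm{d}\omega_i
```

Rasterizers fake that integral with tricks; path tracers sample it directly, which is why convincing global illumination is still so expensive in real time.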

      • Steve Dice@sh.itjust.works

That’s true, but realistic lighting still wouldn’t make anywhere near the same amount of difference that the other example shows.

  • GraniteM@lemmy.world

    Don’t get me started on Horizon: Forbidden West. It was a beautiful game. It also had every gameplay problem the first one did, and added several more to boot. The last half of the game was fucking tedious, and I basically finished it out of spite.

    • inb4_FoundTheVegan@lemmy.world

      Awww.

      I enjoyed the heck out of the first one, especially the story. Haven’t gotten around to picking up the 2nd so that’s a bummer to read.

      • moody@lemmings.world

        I’d say it’s still worth playing, but the story is way more predictable, and they made some things more grindy to upgrade than they were in the first one. Also they added robots that are even more of a slog to fight through.

        Those giant turtles are bullshit and just not fun.

      • hOrni@lemmy.world

If you liked the stealth aspects of the first game, then there is no point in starting the second. The stealth is gone. It’s also more difficult, and the equipment is much more complicated.

      • ShinkanTrain@lemmy.ml

        I enjoyed learning the backstory of the first one, but I was very disinterested in the story, as in, what is currently happening.

    • hOrni@lemmy.world

      I agree. I loved the first game, considered it one of my favourites. Couldn’t wait for the sequel. I was so disappointed, I abandoned it after a couple of hours.

  • renegadespork

    This is true of literally any technology. There are so many things that can be improved in the early stages that progress seems very fast. Over time, the industry finds most of the optimal ways of doing things and starts hitting diminishing returns on research & development.

    The only way to break out of this cycle is to discover a paradigm shift that changes the overall structure of the industry and forces a rethinking of existing solutions.

    The automobile is a very mature technology and is thus a great example of these trends. Cars have achieved optimal design and slowed to incremental progress multiple times, only to have the cycle broken by paradigm shifts. The most recent one is electrification.

    • Maggoty@lemmy.world

      Okay then why are they arbitrarily requiring new GPUs? It’s not just about the diminishing returns of “next gen graphics”.

      • renegadespork

        That’s exactly why. Diminishing returns means exponentially more processing power for minimal visual improvement.

        • Maggoty@lemmy.world

I think my real question is at what point do we stop trying, until researchers make another breakthrough?

      • AdrianTheFrog@lemmy.world

Path tracing is a paradigm shift: a completely different way of rendering a scene from the way it’s normally done. It’s just a slow and expensive one (it has existed for many years, but only recently became possible in real time thanks to advancing GPU hardware).

        Yes, usually the improvement is minimal. That is because games are designed around rasterization and have path tracing as an afterthought. The quality of path tracing still isn’t great because a bunch of tricks are currently needed to make it run faster.
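To make the “slow and expensive” part concrete, here’s a toy Monte Carlo estimator of the simplest integral a path tracer faces at every bounce (illustrative Python, not tied to any engine discussed here):

```python
import math
import random

def mc_cosine_integral(samples, seed=0):
    """Monte Carlo estimate of the hemisphere integral of cos(theta),
    the kind of integral a path tracer estimates per bounce, per pixel.
    Exact answer: pi. Uniform hemisphere sampling, pdf = 1/(2*pi)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        # For uniform directions on the hemisphere, cos(theta) ~ U[0, 1)
        cos_theta = rng.random()
        total += cos_theta * 2.0 * math.pi  # divide by the pdf
    return total / samples

# Error shrinks only as 1/sqrt(N): a handful of samples is visibly noisy,
# which is why real-time path tracing leans so hard on denoisers.
rough = mc_cosine_integral(64)
fine = mc_cosine_integral(200_000)
```

Thousands of such estimates per pixel per frame is the budget problem; the “tricks” mentioned above (denoising, reservoir sampling, upscaling) exist to get away with far fewer samples.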

        You could say the same about EVs actually, they have existed since like the 1920s but only are becoming useful for actual driving because of advancing battery technology.

        • Maggoty@lemmy.world

          Then let the tech mature more so it’s actually analogous with modern EVs and not EVs 30 years ago.

          • AdrianTheFrog@lemmy.world

Yea, it’s doing that. RT is getting cheaper, and PT is not really used outside of things like Cyberpunk’s “RT Overdrive” mode, which is basically just for show.

            • Maggoty@lemmy.world

              Except it’s being forced on us and we have to buy more and more powerful GPUs just to handle the minimums. And the new stuff isn’t stable anyways. So we get the ability to see the peach fuzz on a character’s face if we have a water-cooled $5,000 spaceship. But the guy rocking solid GPU tech from 2 years ago has to deal with stuttering and crashes.

              This is insane, and we shouldn’t be buying into this.

              • AdrianTheFrog@lemmy.world

It’s not really about detail; it’s about basic lighting, especially in dynamic situations.

                (Sometimes it is used to provide more detail in shadows I guess, but that is also usually a pretty big visual improvement)

I think there’s currently a single popular game where RT is required? And I honestly doubt a card old enough to not support ray tracing would be fast enough for any alternate minimum setting it would have had instead. Maybe the people with 1080 Tis are missing out, but there’s honestly not that many of them. I haven’t played that game and don’t know all that much about it; it might be a pointless requirement for all I know.

Nowadays budget cards support RT, and even integrated GPUs do (at probably unusable speeds, but still).

I don’t think every game needs RT or that RT should be required, but it’s currently the only way to get the best graphics, and it has the potential to completely change what is possible with the visual style of games in the future.

Edit: also, the vast majority of new solid GPUs started supporting RT six years ago, with the 20 series from Nvidia.

                • Maggoty@lemmy.world

                  That’s my point though, the minimums are jacked up well beyond where they need to be in order to cram new tech in and get 1 percent better graphics even without RT. There’s not been any significant upgrade to graphics in the last 5 years, but try playing a 2025 AAA with a 2020 graphics card. It might work, but it’s certainly not supported and some games are actually locking out old GPUs.

  • dragonlobster@programming.dev

    I don’t mind the graphics that much, what really pisses me off is the lack of optimization and heavy reliance on frame gen.

  • RightHandOfIkaros@lemmy.world

Ironically, Zelda: A Link to the Past ran at 60fps, and Ocarina of Time ran at 20fps.

    The same framerates are probably in the Horizon pictures below lol.

    Now, Ocarina of Time had to run at 20fps because it had one of the biggest draw distances of any N64 game at the time. This was so the player could see to the other end of Hyrule Field, or other large spaces. They had to sacrifice framerate, but for the time it was totally worth the sacrifice.

    Modern games sacrifice performance for an improvement so tiny that most people would not be able to tell unless they are sitting 2 feet from a large 4k screen.

    • Maalus@lemmy.world

      Had to, as in “they didn’t have enough experience to optimize the games”. Same for Super Mario 64. Some programmers decompiled the code and made it run like a dream on original hardware.

      • RightHandOfIkaros@lemmy.world

The programming knowledge did not exist at the time. It’s not that they lacked experience; the techniques simply hadn’t been invented yet. You can’t really count that against them.

Kaze optimizing Mario 64 is amazing, but it would have been impossible for Nintendo to have programmed the game like that, because Kaze uses programming techniques and knowledge that literally did not exist when the N64 was new. It’s like saying the NASA engineers who designed the Atlas LV-3B were bad engineers or incapable of a good rocket design just because of what NASA engineers could design today with knowledge that did not exist in the ’50s.

    • CancerMancer@sh.itjust.works

      One of the reasons I skipped the other consoles but got a GameCube was because all the first party stuff was buttery smooth. Meanwhile trying to play shit like MechAssault on Xbox was painful.

      • RightHandOfIkaros@lemmy.world

        I never had trouble with MechAssault, because the fun far outweighed infrequent performance drops.

I am a big proponent of a 60fps minimum, but I make an exception for consoles from the 5th and 6th generations. The amount of technical leap and improvement, both in graphics technology and in gameplay innovation, far outweighs any performance dips as a cost of that improvement. The 7th generation is on a game-by-game basis, and personally the 8th generation (Xbox One, Switch, and PS4) is where it became completely unacceptable to run even a single frame below 60fps. There is no reason that target could not have been met by then, let alone now. The Switch was especially disappointing here: Nintendo built what was basically a 2015 mid-range smartphone, then tried to make games for a real game console on it, with performance massively suffering as a result. 11fps, docked, in Breath of the Wild’s Korok Forest or in Age of Calamity (anywhere in the game, take your pick) is totally unacceptable, even if it happened only once rather than consistently.

        • thisismyhaendel@lemmy.world

          I’m usually tolerant of frame drops, especially when they make hard games easier (like on the N64), but I agree it has gotten much worse on recent consoles. Looking at you, Control on PS4 (seems like it should just have been a PS5 game with all the frame drops; even just unpausing freezes the game for multiple seconds).

    • JoYo@lemmy.ml

      when i was a smol i thought i needed to buy the memory expansion pack whenever OoT fps tanked.

  • Ibaudia@lemmy.world

I don’t understand why developers and publishers aren’t prioritizing spectacle games with simple graphics like TABS, Mount & Blade, or similar. Use modern processing power to just throw tons of shit on screen; make it totally chaotic and confusing. Huge battles are super entertaining.

    • UnderpantsWeevil@lemmy.world

      The dream of the '10s/20s game industry was VR. Hyper-realistic settings were supposed to supplant the real world. Ready Player One was what big development studios genuinely thought they were aiming for.

      They lost sight of video games as an abstraction and drank too much of their own cyberpunk kool-aid. So we had this fixation on Ray Tracing and AI-driven NPC interactions that gradually lost sight of the gameplay loop and the broader iterative social dynamics of online play.

That hasn’t eliminated development in these spheres, but it has bifurcated the space between game novelty and game immersion. If you want the next Starcraft or Earthbound or Counter-Strike, you need to look towards the indie studios and their low-graphics, highly experimental projects (where games like Stardew Valley and Undertale and Balatro live). The AAA studios are just turning out 100-hour-long movies with a few obnoxious gameplay elements sprinkled in.

      • Xanthrax@lemmy.world

If anyone can optimize Disney’s omnidirectional walking pad, we’ll be there. I’d give it three decades if it goes that way. I’ve heard it’s not like real walking; it feels very slippery. All that being said, you don’t have to wrap yourself in a harness and fight friction to simulate walking like with other walking pads. It also seems simple enough, hardware-wise, that it could be recreated using preexisting parts / 3D printing. I’m honestly surprised I haven’t seen a DIY project yet.

    • renegadespork

VR definitely feels like the next 2D-to-3D paradigm shift, with similar challenges, except it hasn’t taken off like 3D did, IMO for two reasons:

      1. VR presents unique ergonomic challenges.

      Like 3D, VR significantly increased graphics processing requirements and presented several gameplay design challenges. A lot of the early solutions were awkward, and felt more like proof-of-concepts than actual games. However, 3D graphics can be controlled (more or less) by the same human interface devices as 2D, so there weren’t many ergonomic/accessibility problems to solve. Interfacing VR with the human body requires a lot of rather clunky equipment, which presents all kinds of challenges like nausea, fatigue, glasses, face/head size/shape, etc.

      2. The video game industry was significantly more mature when (modern) VR entered the scene.

      Video games were still a relatively young industry when games jumped to 3D, so there was much more risk tolerance and experimentation even in the “AAA” space. When VR took off in 2016, studios were much bigger and had a lot more money involved. This usually results in risk aversion. Why risk losing millions on developing a AAA VR game that a small percentage of gamers even have the hardware for when we can spend half (and make 10x) on just making a proven sequel? Instead large game publishers all dipped their toes in with tech demos, half-assed ports, and then gave up when they didn’t sell that well (Valve, as usual, being the exception).

I honestly don’t believe the complaints you hear about hardware costs and processing power are the primary reasons, because many gaming technologies, including 3D, had the exact same problem in their early stages. Enthusiasts bought the early stuff anyway because it was groundbreaking, and eventually costs came down and economies of scale kicked in.

    • The Picard Maneuver@lemmy.world (OP)

      VR is the one thing that feels similar to the old generational leaps to me. It’s great, but I haven’t set mine up in a few years now.

      • Xanthrax@lemmy.world

        Fair. I haven’t played “No Man’s Sky,” yet, but apparently, it’s awesome in VR.

  • drislands@lemmy.world

    The problem as I see it is that there is an upper limit on how good any game can look graphically. You can’t make a game that looks more realistic than literal reality, so any improvement is going to just approach that limit. (Barring direct brain interfacing that gives better info than the optical nerve)

Before, we started from a point so far removed from reality that practically anything was an improvement. Say “reality” is 10,000. Early games started at 10; then when we switched to 3D it was 1,000. That’s an enormous relative improvement, even if it’s far from the max. But now your improvements go from 8,000 to 8,500, and while that’s still a big absolute improvement, it’s relatively minor, and you’re never going to hit a perfect 10,000, so the amount you can improve by gets smaller and smaller.

    All that to say, the days of huge graphical leaps are over, but the marketing for video games acts like that’s not the case. Hence all the buzzwords around new tech without much to show for it.
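The arithmetic of that asymptote in a couple of lines (same made-up 0-to-10,000 scale as above):

```python
def relative_gain(before, after):
    """Relative improvement between two 'realism scores' on the
    hypothetical 0-10,000 scale from the comment above."""
    return (after - before) / before

early = relative_gain(10, 1000)   # the 2D-to-3D era: a 99x jump
late = relative_gain(8000, 8500)  # a modern gen-on-gen bump: ~6%
```

Same hardware-generation cadence, wildly different perceived payoff, because the gain that matters to the eye is the relative one.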

    • jj4211@lemmy.world

Well, you can hypothetically get to a perfect 10,000: from a technical perspective, you can have more geometric/texture/lighting detail than the eye can process.

Of course, the technical capability is only part of the equation. The other part is the human effort to create the environments. The tech sometimes makes things easier on the artist (for example, better light modeling in the engine at run time means less effort spent baking lighting in, and lets the author basically say “etc…” to more detail via smoothing or machine-learning extrapolation). Despite this, more detail means more man-hours to make the most of it, and this has caused massive cost increases as models got more detailed and more models and environments became feasible. The level of artwork that went into the whole of Pac-Man is less than a single model in a modern game.

    • Squizzy@lemmy.world

Graphics are only part of it; given the power that’s there, I’m disappointed in the low quality pushed to release. I loved Jedi Survivor, a brilliant game, but it was terribly optimised. I booted it today and had nothing but asset-loading flashes, as walls and structures in my immediate vicinity and eyeline flashed white into existence.

Good games aren’t solely reliant on graphics, but christ, do they waste what they have. Programmers used to push everything to the max; now they get away with pushing beta releases to print.

  • merthyr1831@lemmy.ml

yeah but the right-hand pic has twenty billion more triangles that are compressed down and upscaled with AI so the engine programmers don’t have to design tools to optimise art assets.

    • Cethin@lemmy.zip

I know you’re joking, but these probably have the same poly count. The biggest noticeable difference to me is the subsurface scattering on her skin: on the left her skin looks flat, but on the right it mostly looks like skin. I’m sure the lighting in general is better too, but it’s hard to tell.

      • merthyr1831@lemmy.ml

        yeah they probably just upped internal resolution and effects for what I assume is an in-engine cutscene. Not that the quality of the screenshot helps lmao

  • atomicbocks@sh.itjust.works

The improvement levels are the same as they used to be. It’s just that adding 100MHz to a 100MHz processor doubles your performance, while adding 100MHz to a modern processor adds comparatively little.

  • PlexSheep@infosec.pub

To be fair, there’s more to it than just graphics.

Something like Zelda: Twilight Princess HD to Zelda: Breath of the Wild was a huge leap in just gameplay. (And also in graphics, but that’s not my point.)

    • UnderpantsWeevil@lemmy.world

Idk. Breath of the Wild felt more like a tech demo than a full game. Tears of the Kingdom felt more fleshed out, but even then, the wideness of the world only underscored its shallowness in a lot of places. Ocarina of Time had a smaller overall map, but every region had this very bespokely crafted setting and culture and strategy. By the time you got to Twilight Princess, you had this history to the setting and this weight to this iteration of the Zelda world.

What could you really do in BotW that you couldn’t do in Twilight Princess? The graphics got a tweak. The amount of running around you did went way up. But the game itself? Zelda really peaked with Majora’s Mask. So much of this new stuff is more fluff than substance.

      • PlexSheep@infosec.pub

What? BotW was awesome! There was so much to explore, the world was interesting, the NPCs are good, and so on. OoT and Majora’s Mask are both amazing too, of course, but BotW is a modern masterpiece.