• GraniteM@lemmy.world
    1 month ago

    Don’t get me started on Horizon: Forbidden West. It was a beautiful game. It also had every gameplay problem the first one did, and added several more to boot. The last half of the game was fucking tedious, and I basically finished it out of spite.

    • inb4_FoundTheVegan@lemmy.world
      1 month ago

      Awww.

      I enjoyed the heck out of the first one, especially the story. Haven’t gotten around to picking up the 2nd so that’s a bummer to read.

      • moody@lemmings.world
        1 month ago

        I’d say it’s still worth playing, but the story is way more predictable, and they made some things more grindy to upgrade than they were in the first one. Also they added robots that are even more of a slog to fight through.

        Those giant turtles are bullshit and just not fun.

        • metaldream@sopuli.xyz
          1 month ago

          If you’re actually struggling with the turtle guys that is 100% a skill issue. Literally just break the shell off and they die very quickly, there’s nothing to “slog” through with them. Out of all the big enemies they are by far the easiest.

          So sick of reading nothing but shitty hot takes when it comes to this game. It’s such a good game but gets unfairly nitpicked by reddit/lemmy and review bombed by fascists.

      • ShinkanTrain@lemmy.ml
        1 month ago

        I enjoyed learning the backstory of the first one, but I was very uninterested in the story itself, as in what’s currently happening.

        • scops@reddthat.com
          1 month ago

          Very much same. I wish the Burning Shores expansion was a bit longer. It’s kinda hard to call it a must-play DLC, but it’s got some big stuff in terms of Aloy’s character development.

      • hOrni@lemmy.world
        1 month ago

        If you liked the stealth aspects of the first game, then there is no point in starting the second. The stealth is gone. It’s also more difficult, and the equipment is much more complicated.

    • hOrni@lemmy.world
      1 month ago

      I agree. I loved the first game, considered it one of my favourites. Couldn’t wait for the sequel. I was so disappointed, I abandoned it after a couple of hours.

  • inbeesee@lemmy.world
    30 days ago

    It’s a corporate problem. Fucking Balatro was a smash hit for its design, art, etc. These ‘next gen’ games get pushed because bigger numbers are objective and quantifiable. A CEO likes number go up, but real artists don’t push polygons.

  • Fandangalo@lemmy.world
    1 month ago

    I’d say there’s been more progress on scale than on visual fidelity. There’s a greater ability to render complexity at scale, whether that’s actors on screen or physics in motion. I agree that progress in still-frame detail has plateaued.

  • piskertariot@lemmy.world
    1 month ago

    NHL 2014 and NHL 2024 are probably the same game, only in NHL 2014 the players don’t spit out their mouthguards like they do in 2024.

    But I need that level of realism /s

  • CitizenKong@lemmy.world
    1 month ago

    What I don’t understand is why games look prettier while things like NPC AI (which is really pathfinding and decision trees, not actual AI), interactivity of the game world, and destructibility of game objects are all objectively worse than they were in games from 10-15 years ago (with some exceptions like RDR2).

    How can a game like Starfield still have all the Bethesda jank but now the NPCs lack any kind of daily routine?

    Most enemies in modern shooters barely know how to flank; compare that to something like F.E.A.R., which came out in 2005!
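
    To make the “decision trees, not actual AI” point concrete, here’s a toy sketch in Python of the kind of hand-written logic that passes for enemy AI (the names and thresholds are invented for illustration, not any engine’s real code):

        import math
        from dataclasses import dataclass

        @dataclass
        class Agent:
            x: float
            y: float
            facing: float  # radians, the direction the agent is looking

        def view_angle(src: Agent, dst: Agent) -> float:
            """Angle between src's facing direction and the direction towards dst."""
            direction = math.atan2(dst.y - src.y, dst.x - src.x)
            diff = abs(direction - src.facing) % (2 * math.pi)
            return min(diff, 2 * math.pi - diff)

        def choose_action(npc: Agent, player: Agent, side_route_clear: bool) -> str:
            """A hand-rolled 'decision tree': no learning, just nested ifs over a few queries."""
            dist = math.hypot(player.x - npc.x, player.y - npc.y)
            player_watching = view_angle(player, npc) < math.radians(45)
            if dist < 5:
                return "melee_rush"
            if player_watching and side_route_clear:
                return "flank"        # path around, out of the player's view cone
            if player_watching:
                return "take_cover"   # being watched and no side route: wait it out
            return "advance"          # player looking away: push straight in

        # Player staring right at the NPC, side corridor open -> "flank"
        print(choose_action(Agent(0, 0, 0.0), Agent(10, 0, math.pi), side_route_clear=True))

    F.E.A.R. famously went a step further with goal-oriented action planning (GOAP), which is a big part of why its squads still read as smart twenty years on.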

      • MudMan@fedia.io
        1 month ago

        I mean, the original image is a cutscene, so…

        But hey, I’ll split the difference. Instead of SMB 1, which was a launch game and literally wasn’t running on the same hardware (because mappers), we can do Mario 3 instead.

        Or, hear me out, let’s not do a remaster at all for current gen leaps. Here’s a PS4 vs PS5 sequel one.

        It doesn’t work as well, though, since taking the absolutely ridiculous shift from 2D to 3D, which has happened once and only once in all of gaming history, is a bit of a cheat anyway.

        Oh, and for the record, and I can’t believe I’m saying this only now, LttP looks a LOT better than OoT. Not even close.

        • warm@kbin.earth
          1 month ago

          Oh, I don’t care about leap comparisons, I was just taking an interest in how graphics have evolved over time. To be honest, graphics have been going downhill in big games for a few years now, thanks to lazy development chasing “good” graphics. Fucking TAA…

          • MudMan@fedia.io
            1 month ago

            I agree that it’s a meme comparison anyway. I just found it pertinent to call out that remasters have been around for a long time.

            I don’t know that I agree on the rest. I don’t think I’m aware of a lazy game developer. That’s a pretty rare breed. TAA isn’t a bad thing (how quickly we forget the era when FXAA vaseline smearing was considered valid antialiasing for 720p games) and sue me, but I do like good visuals.

            I do believe we are in a very weird quagmire of a transitional period, where we’re using what is effectively now a VFX suite to make games that aren’t meant to run in real time on most of the hardware being used to run them and that are simultaneously too expensive and large and aiming at waaay too many hardware configs. It’s a mess out there and it’ll continue to be a mess, because the days of a 1080Ti being a “set to Ultra and forget it” deal were officially over the moment we decided we were going to sell people 4K monitors running at 240Hz and also games made for real time raytracing.

            It’s not the only time we’ve been in a weird interaction of GPUs and software (hey, remember when every GPU had its own incompatible graphics API? I do), but it’s up there.

            • warm@kbin.earth
              1 month ago

              TAA is absolutely a bad thing, I’m sorry, but it’s way worse than FXAA, especially when combined with the new ML upscaling shit.
              It’s only really a problem with big games, or more specifically UE5 games, since temporal AA is baked into the engine.

              Yeah, there was that perfect moment in time where you could just put everything on max, have some nice SMAA on, and be happy with >120fps. The 4K chase started, yeah, but the hardware we have now is ridiculously powerful and could run 4K 120fps natively no problem, if the time were spent achieving that rather than throwing in more lighting effects no one asked for, speedrunning development, and then slapping DLSS on at the end to try and reach playable framerates, making the end product a blurry, ghosting mess. Ugh.
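
              On the ghosting specifically: TAA works by blending each new frame into an accumulated history buffer, and when the motion-vector reprojection or history rejection fails on a moving object, stale history gets smeared into the image. A bare-bones sketch of just the accumulation step (toy numbers, nowhere near a real engine implementation):

                  import numpy as np

                  def taa_accumulate(history: np.ndarray, current: np.ndarray, blend: float = 0.1) -> np.ndarray:
                      """Exponential blend of the new frame into the history buffer.
                      Low blend = smooth but ghost-prone; high blend = sharp but aliased."""
                      return (1.0 - blend) * history + blend * current

                  # Toy 1D "image": a bright object jumps from pixel 2 to pixel 6 between frames.
                  frame_a = np.zeros(10); frame_a[2] = 1.0
                  frame_b = np.zeros(10); frame_b[6] = 1.0

                  history = frame_a.copy()
                  for _ in range(4):  # a few frames after the object has moved
                      history = taa_accumulate(history, frame_b)

                  print(history.round(3))  # pixel 2 still holds ~0.66: that lingering energy is the ghost trail

              Real implementations also reproject the history with motion vectors and clamp it against the current frame’s neighbourhood; the blur and trails people complain about are what you see when that rejection isn’t aggressive enough.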

    • snooggums@lemmy.world
      1 month ago

      Like CGI and other visual effects, realism has some applications that can massively improve the experience in some games, just like how lighting has a massive impact, or sound design, etc.

      Chasing it at the expense of gameplay or art design is a negative, though.

    • ☂️-@lemmy.ml
      1 month ago

      not really. plenty of great games have visual fidelity as a prerequisite of being good.

      i dont think rdr2 would be such a beautiful immersive experience if it had crappy graphics.

      • Cethin@lemmy.zip
        1 month ago

        Visual fidelity isn’t the same as realism. RDR2 is trying to replicate a real experience, so I mostly agree with you. However, it does step away from realism sometimes to create something more.

        Take a look at impressionist art, for example. It starts at realism, but it isn’t realistic. It has more style to it that enhances what the artist saw (or wanted to highlight).

        A game should focus on the experience it’s trying to create, and its art style should enhance that experience. It shouldn’t just be realistic because that’s the “premium” style.

        For example, Mirror’s Edge has a high amount of fidelity (for its time), but it’s highly stylized in order to create the experience they wanted out of it. The game would be far worse if they tried to make the graphics realistic. This is true for most games, though some do try to simulate being part of this world, and it’s fine for them to replicate it because it suits what their game is.

      • Maggoty@lemmy.world
        1 month ago

        I had way more fun in GTA 3 than GTA 5. RDR2 isn’t a success because the horse has realistic balls.

        To put another nail in the coffin, ARMA’s latest incarnation isn’t the most realistic shooter ever made. No amount of wavy grass and moon phases can beat realistic weapon handling in the FPS sim space. (And no, ARMA’s weapon handling is not realistic; it’s what a bunch of keyboard warriors decided was realistic because it made them feel superior.) Hilariously, the most realistic shooter was a recruiting game made by the US Army with half the graphics.

        • ☂️-@lemmy.ml
          1 month ago

          realism and visual fidelity are not the same thing.

          BUT, visual fidelity adds a LOT to the great writing in rdr2.

            • ☂️-@lemmy.ml
              1 month ago

              you are right i didnt notice i had worded it that way and its not what i meant

              • Maggoty@lemmy.world
                1 month ago

                I see, and yeah graphics can help a lot. But how much do we actually need? At what point is the gain not enough to justify forcing everyone to buy another generation of GPUs?

      • CancerMancer@sh.itjust.works
        1 month ago

        Couldn’t disagree more. Immersion comes from the details, not the fidelity. I was told to expect this incredibly immersive experience from RDR2 and then I got:

        • carving up animals is frequently wonky
        • gun cleaning is just autopilot wiping the exterior of a gun
        • shaving might as well be done off-screen
        • you transport things on your horse without tying them down

        Yeah that didn’t do it for me.

        • ☂️-@lemmy.ml
          1 month ago

          realism and visual fidelity are two slightly overlapping but different things.

          a game can have great graphics but its npcs be unrealistic bullet sponges. cp2077 comes to mind, not that this makes it a bad game necessarily.

          i dont actually want to go to the bathroom in-game but i love me some well written story, graphics can help immensely with that. among other things.

          come to think of it, 100% realistic games would probably be boring

    • The Picard Maneuver@lemmy.worldOP
      1 month ago

      So many retro games are replayable and fun to this day, but I struggle to return to games whose art style relied on being “cutting edge realistic” 20 years ago.

      • sploosh@lemmy.world
        1 month ago

        I dunno, Crysis looks pretty great on modern hardware and it’s 18 years old.

        Also, CRYSIS IS 18 WHERE DID THE TIME GO?

      • MudMan@fedia.io
        1 month ago

        Really? Cause I don’t know, I can play Shadow of the Colossus, Resident Evil 4, Metal Gear Solid 3, Ninja Gaiden Black, God of War, Burnout Revenge and GTA San Andreas just fine.

        And yes, those are all 20 years ago. You are now dead and I made it happen.

        As a side note, man, 2005 was a YEAR in gaming. That list gives 1998 a run for its money.

        • snooggums@lemmy.world
          1 month ago

          Did those go for realism though, or were they just good at balancing the more detailed art design with the gameplay?

          • MudMan@fedia.io
            1 month ago

            Absolutely they went for realism. That was the absolute peak of graphics tech in 2005, are you kidding me? I gawked at the fur in Shadow of the Colossus; GTA was insane for detail and size for an open world at the time. Resi 4 was one of the best-looking games that gen, and when the 360 came out later that year it absolutely was the “last gen still looked good” game people pointed at.

            I only went for that year because I wanted the round number, but before that Silent Hill 2 came out in 2001 and that was such a ridiculous step up in lighting tech I didn’t believe it was real time when the first screenshots came out. It still looks great, it still plays… well, like Silent Hill, and it’s still a fantastic game I can get back into, even with the modern remake in place.

            This isn’t a zero sum game. You don’t trade gameplay or artistry for rendering features or photorealism. Those happen in parallel.

            • snooggums@lemmy.world
              1 month ago

              They clearly balanced the more detailed art design with the gameplay.

              GTA didn’t have detail on cars to the level of a racing game, and didn’t have characters with as much detail as Resident Evil, so that it could have a larger world for example. Colossus had fewer objects on screen so it could put more detail on what was there.

              • MudMan@fedia.io
                1 month ago

                Yeah. So like every other game.

                Nothing was going harder for visuals, so by default that’s what was happening. They were pushing visuals as hard as they would go with the tech that they had.

                The big change isn’t that they balanced visuals and gameplay. If anything the big change is that visuals were capped by performance rather than budget (well, short of offline CG cutscenes and VO, I suppose).

                If anything, they were pushing visuals harder than now. There is no way you’d see a pixel-art deck-building game on GOTY lists in 2005; it was all AAA as far as the eye could see. We pay less attention to technological escalation now, by some margin.

                • snooggums@lemmy.world
                  1 month ago

                  Yeah. So like every other game.

                  Except for the ones that don’t do a good job of balancing the two things. Like the games that have incredible detail but shit performance and/or awful gameplay.

        • Cethin@lemmy.zip
          1 month ago

          I would say GoW and SotC at least take realism as inspiration, but aren’t realistic. They’re like an idealized version of realism. They’re detailed, but they’re absolutely stylized. SotC landscapes, for example, look more like paintings than places you’d see in real life.

          Realism is a bad goal because you end up making every game look the same. Taking our world as inspiration is fine, but it should almost always be expanded on. Know what your game is and make the art style enhance it. Don’t just replicate realism because that’s “what you’re supposed to do.”

          • MudMan@fedia.io
            1 month ago

            Look, don’t take it personally, but I disagree as hard as humanly possible.

            Claiming that realism “makes every game look the same” is a shocking statement, and I don’t think you mean it like it sounds. That’s like saying that every movie looks the same because they all use photographing people as a core technique.

            If anything, I don’t know what “realism” is supposed to mean. What is more realistic? Yakuza because it does these harsh, photo-based textures meant to highlight all the pores or, say, a Pixar movie where everything is built on this insanely accurate light transfer, path traced simulation?

            At any rate, the idea that taking photorealism as a target means you give up on aesthetics or artistic intent is baffling. That’s not even a little bit how it works.

            On the other point, I think you’re blending technical limitations with intent in ways that are a bit fallacious. SotC is stylized, for sure, in that… well, there are kaijus running around and you sometimes get teleported by black tendrils back to your sleeping beauty girlfriend.

            But is it aiming at photorealism? Hell yeah. That approach to faking dynamic range, the deliberate crushing of exteriors from interiors, the way the sky gets treated, the outright visible air adding distance and scale when you look at the colossi from a distance, the desaturated take on natural spaces… That game is meant to look like it was shot by a camera all the way. They worked SO hard to make a PS2 look like it has aperture and grain and a piece of celluloid capturing light. Harder than the newer remake, arguably.

            Some of that applies to GoW, too, except they are trying to make things look like Jason and the Argonauts more than Saving Private Ryan. But still, the references are filmic.

            I guess we’re back to the problem of establishing what people mean by “realism” and how it makes no sense. In what world does Cyberpunk look similar to Indiana Jones or Wukong? It just has no real meaning as a statement.

            • Cethin@lemmy.zip
              1 month ago

              If anything, I don’t know what “realism” is supposed to mean. What is more realistic? Yakuza because it does these harsh, photo-based textures meant to highlight all the pores or, say, a Pixar movie where everything is built on this insanely accurate light transfer, path traced simulation?

              The former is more realistic, but not for that reason. The lighting techniques are techniques, not a style. Realism is trying to recreate the look of the real world. Pixar is not doing that. They’re using advanced lighting techniques to enhance their stylized worlds.

              Some of that applies to GoW, too, except they are trying to make things look like Jason and the Argonauts more than Saving Private Ryan. But still, the references are filmic.

              Being inspired by film is not the same as trying to replicate the real world. (I’d argue it’s antithetical to it to an extent.) Usually film is trying to be more than realistic. Sure, it’s taking images from the real world, but they use lighting, perspective, and all kinds of other tools to enhance the film. They don’t just put some actors in place in the real environment and film it without thought. There’s intent behind everything shown.

              I guess we’re back to the problem of establishing what people mean by “realism” and how it makes no sense. In what world does Cyberpunk look similar to Indiana Jones or Wukong? It just has no real meaning as a statement.

              Cyberpunk looks more like Indiana Jones than Persona 5. Sure, they stand out from each other, but it’s mostly due to environments.

              I think there’s plenty of games that benefit from realism, but not all of them do. There are many games that could do better with stylized graphics instead. For example, Cyberpunk is represented incredibly well in both the game and the anime. They both have different things they do better, and the anime’s style is an advantage for the show at least. The graphics style should be chosen to enhance the game. It shouldn’t just be realistic because it can be. If realism is the goal, fine. If it’s supposed to be more (or different) than realism, maybe try a different style that improves the game.

              Realism is incredibly hard to create assets for, so it costs more money, and usually takes more system resources. For the games that are improved by it, that’s fine. There’s a lot of games that could be made on a smaller budget, faster, run better, and look more visually interesting if they chose a different style though. I think it should be a consideration that developers are allowed to make, but most are just told to do realism because it’s the “premium” style. They aren’t allowed to do things that are better suited for their game. I think this is bad, and also leads to a lack in diversity of styles.

              • MudMan@fedia.io
                1 month ago

                I don’t understand what you’re saying. Or, I do, but if I do, then you don’t.

                I think you’re mixing up technique with style, in fact. And really confusing a rendering technique with an art style. But beyond that, you’re ignoring so many games. So many. Just last year, how do you look at Balatro and Penny’s Big Breakaway and Indiana Jones and go “ah, yes, games all look the same now”. The list of GOTY nominees in the TGAs was Astro Bot, Balatro, Wukong, Metaphor, Elden Ring and Final Fantasy VII R. How do you look at that list of games and go “ah, yes, same old, same old”.

                Whenever I see takes like these I can’t help but think that people who like to talk about games don’t play enough games, or just think of a handful of high profile releases as all of gaming. Because man, there’s so much stuff and it goes from grungy, chunky pixel art to lofi PS1-era jank to pitch-perfect anime cel shading to naturalistic light simulation. If you’re out there thinking games look samey you have more of a need to switch genres than devs to switch approach, I think.

                • Cethin@lemmy.zip
                  1 month ago

                  By “all games look the same” I’m being hyperbolic. I mean nearly all AAA games and the majority of AA games (and not an insignificant number of indies even).

                  Watch this video. Maybe it’ll help you understand what I’m saying.

                  Whenever I see takes like these I can’t help but think that people who like to talk about games don’t play enough games, or just think of a handful of high profile releases as all of gaming.

                  Lol. No. Again, I was being hyperbolic and talking mostly about the AAA and AA space. I personally almost exclusively play indies who know what they’re trying to make and use a style appropriate to it. I play probably too many games. I also occasionally make games myself, I was the officer in a game development club in college, and I have friends in the industry. I’m not just some person who doesn’t understand video games.

      • conditional_soup@lemm.ee
        1 month ago

        STALKER is good, though I mostly played Anomaly, and I’m not sure STALKER was ever known for bleeding-edge graphics.

        • UltraGiGaGigantic@lemmy.ml
          1 month ago

          STALKER GAMMA is free if anyone wants to try it out. I ended up buying the OG games cause I liked it so much.

          The 2nd one is good, but I would advise people to wait until they implement more promised features before they buy it.

    • ProfessorProteus@lemmy.world
      1 month ago

      I agree generally, but I have to offer a counterpoint with Kingdom Come: Deliverance. I only just got back into it after bouncing off in 2019, and I wish I hadn’t stopped playing. I have a decent-ish PC and it still blows my entire mind when I go roaming around the countryside.

      Like Picard said above, in due time this too will look aged, but even 7 years on, it looks and plays incredible even at less-than-highest settings. IMHO the most visually impressive game ever created (disclaimer: I haven’t seen or played Horizon). Can’t wait to play KC:D 2!

    • conditional_soup@lemm.ee
      1 month ago

      Idk, I’d say that pursuing realism is worthy, but you get diminishing returns pretty quick when all the advances are strictly in one (or I guess two, with audio) sense. Graphical improvements massively improved the experience of the game moving from NES or Gameboy to SNES and again to PS1 and N64. I’d say that the most impressive leap, imo, was PS1/N64 to PS2/XBox/GameCube. After that, I’d say we got 3/4 of the return from improvements to the PS3 generation, 1/2 the improvement to PS4 gen, 1/5 the improvement to PS5, and 1/8 the improvement when we move on to PS5 Pro. I’d guess if you plotted out the value add, with the perceived value on the Y and the time series or compute ability or texture density or whatever on the x, it’d probably look a bit like a square root curve.
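
      Putting rough numbers on that square-root guess (purely illustrative, not real data), here’s a quick Python sketch of how the same chunk of extra effort buys less and less perceived value:

          import math

          def perceived_value(effort: float) -> float:
              # Toy model: perceived value grows like the square root of "graphics effort"
              # (compute, texture density, artist hours, whatever).
              return math.sqrt(effort)

          step = 1000  # the same fixed chunk of extra effort each generation
          for effort in [1000, 2000, 4000, 8000, 16000]:
              gain = perceived_value(effort + step) - perceived_value(effort)
              print(f"effort {effort:>5} -> +{step}: perceived gain {gain:.1f}")
          # The gain shrinks every time (roughly 13, 10, 7.5, 5.4, 3.9 here), which is the
          # diminishing-returns curve described above.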

      I do think that there’s an (understandably, don’t get me wrong) untapped frontier in gaming realism in that games don’t really engage your sense of touch or any of the subsets thereof. The first step in this direction is probably vibrating controllers, and I find that it definitely does make the game feel more immersive. Likewise, few games engage your proprioception (that is, your knowledge of your body position in space), though there’ve been attempts to engage it via the Switch, Wii, and VR. There’s, of course, enormous technical barriers, but I think there’s very clearly a good reason why a brain interface is sort of thought of as the holy grail of gaming.

      • jpreston2005@lemmy.world
        1 month ago

        Having a direct brain-interface game that’s realistic enough to overcome the uncanny valley would destroy people’s lives. People would, inevitably, prefer their virtual environment to the real one. They’d end up wasting away, plugged into some machine. It would lend serious credence to the idea of a simulated universe, and reduce the human experience by replacing it with an improved one. Shit, give me a universe wherein I can double-jump, fly, or communicate with animals, and I’d have a hard time returning to this version.

        We could probably get close with a haptic feedback suit, a mechanism that allows you to run/jump in any direction, and a VR headset, but there would always be something tethering you to reality. A direct brain-to-machine interface would have none of that; it would essentially be hijacking our own electrical neural network to run simulations, much like humans trying to play Doom on literally everything. It would be as amazing as it was destructive, finally realizing the warnings from so many parents before its time: “that thing’ll fry your brain.”

        • UltraGiGaGigantic@lemmy.ml
          1 month ago

          People would, inevitably, prefer their virtual environment to the real one. They’d end up wasting away, plugged into some machine. It would lend serious credence to the idea of a simulated universe, and reduce the human experience by replacing it with an improved one.

          Have you considered making the real world better?

        • conditional_soup@lemm.ee
          1 month ago

          Tbf, it’s kinda bullshit that we can’t double jump IRL. Double jumping just feels right, like it’s something we should be able to do.

          Yeah, no, it’d likely be really awful for us. I mean, can you imagine what porn would be like on that? That’s a Fermi paradox solution right there. I could see the tech having a lot of really great applications too, like training simulations for example, but the video game use case is simultaneously exhilarating and terrifying.

    • Dil@is.hardlywork.ing
      1 month ago

      We should be looking at more particles, more dynamic lighting, more effects. Realism is for sure a goal, just not in the way you think: Pixar movies have realistic lighting and shadows but aren’t “realistic”.

      After I started messing with Cycles in Blender, I went back to wanting more “realistic” graphics; it’s better for stylized games too.

      But yeah, I want the focus to shift towards procedural generation (I like how Houdini and Unreal approach it right now), more physics-based interactions, elemental interactions, real-time fire, smoke, fluid, etc. Destruction is the biggest disappointment; I was really hoping for an FPS that let me spend hours bulldozing and blowing up the map.

      • mrvictory1@lemmy.world
        1 month ago

        Destruction is the biggest disappointment; I was really hoping for an FPS that let me spend hours bulldozing and blowing up the map.

        Ever heard of The Finals?

  • merthyr1831@lemmy.ml
    1 month ago

    Yeah, but the right-hand pic has twenty billion more triangles that are compressed down and upscaled with AI so the engine programmers don’t have to design tools to optimise art assets.

    • Cethin@lemmy.zip
      1 month ago

      I know you’re joking, but these probably have the same poly count. The biggest noticeable difference to me is the subsurface scattering on her skin. On the left her skin looks flat, but on the right it mostly looks like skin. I’m sure the lighting in general is better too, but it’s hard to tell.
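
      For anyone curious what that actually changes: plain Lambert diffuse clamps to black the instant a surface faces away from the light, which is a lot of why skin reads as flat, while SSS-style shading lets light wrap around and soften the falloff. A toy “wrap lighting” comparison in Python (a common cheap stand-in for subsurface scattering, not whatever this game actually does):

          def lambert(n_dot_l: float) -> float:
              # Standard diffuse: hard cutoff at the terminator.
              return max(n_dot_l, 0.0)

          def wrap_diffuse(n_dot_l: float, wrap: float = 0.5) -> float:
              # Shift and rescale the falloff so light "wraps" past the terminator,
              # roughly mimicking light scattered inside skin.
              return max((n_dot_l + wrap) / (1.0 + wrap), 0.0)

          for n_dot_l in [1.0, 0.5, 0.0, -0.2, -0.5]:  # cosine of the angle between normal and light
              print(f"N.L = {n_dot_l:+.1f}   lambert = {lambert(n_dot_l):.2f}   wrap = {wrap_diffuse(n_dot_l):.2f}")
          # Lambert goes to zero as soon as the surface turns away from the light;
          # the wrapped term stays softly lit, which is roughly the flat-vs-skin
          # difference described above.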

      • merthyr1831@lemmy.ml
        1 month ago

        yeah they probably just upped internal resolution and effects for what I assume is an in-engine cutscene. Not that the quality of the screenshot helps lmao

  • Melonpoly@lemmy.world
    1 month ago

    Slightly improved graphics while having worse enemy AI, Unreal Engine stutter, constant hand-holding with in-game puzzles, and restricted character creation, all while having to wait for updates to fix issues that shouldn’t be there at launch.

    • The Picard Maneuver@lemmy.worldOP
      1 month ago

      Don’t forget how many modern AAA games feel like you’re playing a gamified version of your car’s navigation app.

      Waypoint>cutscene>waypoint>cutscene>waypoint>cutscene

  • jmcs@discuss.tchncs.de
    1 month ago

    What big shift do you expect? Even regarding 3D realism we are way past the point of diminishing returns in terms of development costs.

  • drislands@lemmy.world
    1 month ago

    The problem as I see it is that there is an upper limit on how good any game can look graphically. You can’t make a game that looks more realistic than literal reality, so any improvement is going to just approach that limit. (Barring direct brain interfacing that gives better info than the optical nerve)

    Before, we started from a point that was so far removed from reality that practically anything would be an improvement. Like, say “reality” is 10,000. Early games started at 10, then when we switched to 3D it was 1,000. That’s an enormous relative improvement, even if it’s far from the max. But now your improvements are going from 8,000 to 8,500, and while it’s still a big absolute improvement, it’s relatively minor – and you’re never going to get a perfect 10,000, so the amount you can improve by gets smaller and smaller.
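
    Running the numbers on that made-up 0 to 10,000 scale shows why the later jumps feel so small even when the absolute step is big:

        ceiling = 10_000  # the made-up "reality" score from above
        jumps = [(10, 1_000), (1_000, 8_000), (8_000, 8_500)]

        for before, after in jumps:
            times_better = after / before
            gap_closed = (after - before) / (ceiling - before)
            print(f"{before:>5} -> {after:>5}: {times_better:>5.1f}x better, closes {gap_closed:.0%} of the remaining gap")
        # 10 -> 1,000 is a 100x jump that closes ~10% of the gap; 8,000 -> 8,500 is only
        # ~1.1x and closes a quarter of what's left, even though +500 is a big absolute step.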

    All that to say, the days of huge graphical leaps are over, but the marketing for video games acts like that’s not the case. Hence all the buzzwords around new tech without much to show for it.

    • Squizzy@lemmy.world
      1 month ago

      Graphics are only part of it; with the power that’s there, I’m disappointed in the low quality pushed to release. I loved Jedi: Survivor, a brilliant game, but it was terribly optimised. I booted it up today and had nothing but asset-loading flashes as walls and structures in my immediate vicinity and eyeline flashed white into existence.

      Good games aren’t solely reliant on graphics, but Christ, do they waste what they have. Programmers used to push everything to the max; now they get away with pushing beta releases to print.

    • jj4211@lemmy.world
      1 month ago

      Well, you can hypothetically get to a perfect 10k: from a technical perspective, you can have more geometric/texture/lighting detail than the eye can process.

      Of course, the technical capability is only part of the equation. The other part is the human effort to create the environments. The tech sometimes makes it easier on the artist (for example, better light modeling in the engine at run time means less effort spent baking lighting in, and the ability for the author to basically “etc…” their way to more detail via smoothing or some machine-learning extrapolation). Despite this, more detail does mean more man-hours to try to make the most of it, and this has caused massive cost increases as models got more detailed and more models and environments became feasible. The level of artwork that goes into the whole of Pac-Man is less than a single model in a modern game.

    • The Picard Maneuver@lemmy.worldOP
      1 month ago

      VR is the one thing that feels similar to the old generational leaps to me. It’s great, but I haven’t set mine up in a few years now.

      • Xanthrax@lemmy.world
        1 month ago

        Fair. I haven’t played “No Man’s Sky” yet, but apparently it’s awesome in VR.

    • renegadespork@lemmy.jelliefrontier.net
      1 month ago

      VR definitely feels like the next 2D->3D paradigm shift, with similar challenges, except it hasn’t taken off like 3D did, IMO for two reasons:

      1. VR presents unique ergonomic challenges.

      Like 3D, VR significantly increased graphics processing requirements and presented several gameplay design challenges. A lot of the early solutions were awkward, and felt more like proof-of-concepts than actual games. However, 3D graphics can be controlled (more or less) by the same human interface devices as 2D, so there weren’t many ergonomic/accessibility problems to solve. Interfacing VR with the human body requires a lot of rather clunky equipment, which presents all kinds of challenges like nausea, fatigue, glasses, face/head size/shape, etc.

      2. The video game industry was significantly more mature when (modern) VR entered the scene.

      Video games were still a relatively young industry when games jumped to 3D, so there was much more risk tolerance and experimentation even in the “AAA” space. When VR took off in 2016, studios were much bigger and had a lot more money involved. This usually results in risk aversion. Why risk losing millions on developing a AAA VR game that a small percentage of gamers even have the hardware for when we can spend half (and make 10x) on just making a proven sequel? Instead large game publishers all dipped their toes in with tech demos, half-assed ports, and then gave up when they didn’t sell that well (Valve, as usual, being the exception).

      I honestly don’t believe the complaints you hear about hardware costs and processing power are the primary reasons, because much gaming tech, including 3D, had the exact same problem in its early stages. Enthusiasts bought the early stuff anyway because it was groundbreaking, and eventually costs come down and economies of scale kick in.

      • Xanthrax@lemmy.world
        1 month ago

        If anyone can optimize Disney’s omnidirectional walking pad, we’ll be there. I’d give it 3 decades if it goes that way. I’ve heard it’s not like real walking; it feels very slippery. All that being said, you don’t have to wrap yourself in a harness and fight friction to simulate walking like with other walking pads. It also seems simple enough, hardware-wise, that it could be recreated using preexisting parts/3D printing. I’m honestly surprised I haven’t seen a DIY project yet.

  • kitnaht@lemmy.world
    1 month ago

    Kind of like smartphones. They all kind of blew up into this rectangular slab, and…

    Nothing. It’s all the same shit. I’m using a OnePlus 6T from 2018, and I think I’ll have it easily for another 3 years. Things eventually just stagnate.

    • starman2112@sh.itjust.works
      1 month ago

      One company put a stupid fucking notch in their screen and everyone bought that phone, so now every company has to put a stupid fucking notch in the screen

      I just got my tax refund. If someone can show me a modern phone with a 9:16 aspect ratio and no notch, I will buy it right now

    • u/lukmly013 💾 (lemmy.sdf.org)@lemmy.sdf.org
      1 month ago

      I was hoping that eventually smartphones would evolve to do everything. Especially when things like Samsung DeX were introduced, it looked to me like maybe in the future phones could replace desktops, running a full desktop OS when docked and some simplified mobile UI + power saving when in mobile mode.

      But no, I only have a locked-down computer.

      • Trainguyrom@reddthat.com
        1 month ago

        Yeah, whatever happened to that? That was such a good idea and could have been absolutely game-changing if it had actually been marketed to the people who would benefit the most from it.

        • pufferfisherpowder@lemmy.world
          1 month ago

          I used it for a while when I worked two jobs. I’d clock out of job 1, and I had an agreement with them to be allowed to use the screen and input devices at my desk for job 2. Then I’d plug in my Tab S8 and get to work, instead of having to carry two chunky laptops.
          So it still exists! What I noticed is that a Snapdragon 8 Gen 1 feels underpowered and that Android, and this is the bigger issue, does not have a single browser that works as a full-fledged desktop version. Every browser I tested had some shortcomings, especially with drag and drop or context menus. Things work, but you’re constantly reminded that you’re running a mobile OS: weird behavior, oversized context menus, that kind of thing.

          I wish you could launch into a Linux VM instead of the DeX UI, or for Samsung to double down on the concept. The Motorola Atrix was so ahead of its time. Like your phone transforming into your tablet, into your laptop, into your desktop. How fucking cool is that?
          Apple would be in a prime position: their entire ecosystem is now ARM-based and they have the chips with enough power. But it’s not their style to do something cool that threatens their bottom line. Why sell one phone when you can sell a phone, laptop, tablet, and desktop separately?

          • Trainguyrom@reddthat.com
            1 month ago

            It’s super easy to forget, but Ubuntu tried to do it back in the day with Convergence as well, and amusingly this article also compares it to Microsoft’s solution on Windows Phone. It’s a brilliant idea, but apparently no corporation with the ecosystem to make it actually happen has the will to risk actually changing the world, despite every company talking about wanting an “iPhone moment”.

            Apple would be in a prime position: their entire ecosystem is now ARM-based and they have the chips with enough power. But it’s not their style to do something cool that threatens their bottom line. Why sell one phone when you can sell a phone, laptop, tablet, and desktop separately?

            Let’s be real: Apple’s biggest risk would be losing the entire student and young-professional market by actually demonstrating that they don’t need a MacBook Pro to use the same 5 web apps that would work just as well on a decent Chromebook (if such a thing existed).

          • u/lukmly013 💾 (lemmy.sdf.org)@lemmy.sdf.org
            1 month ago

            Linux VM

            Or just something like Termux, a terminal emulator for Android. Example screenshot (XFCE desktop over VNC server), I didn’t know what to fit in there:

            Full desktop apps, running natively under Android. For better compatibility Termux also has proot-distro (similar to chroot) where you can have… let me copy-paste

            Supported distributions (format: name < alias >):
            
              * Alpine Linux < alpine >
              * Arch Linux < archlinux >
              * Artix Linux < artix >
              * Chimera Linux < chimera >
              * Debian (bookworm) < debian >
              * deepin < deepin >
              * Fedora < fedora >
              * Manjaro < manjaro >
              * openKylin < openkylin >
              * OpenSUSE < opensuse >
              * Pardus < pardus >
              * Ubuntu (24.04) < ubuntu >
              * Void Linux < void >
            
            Install selected one with: proot-distro install <alias>
            

            Though there is apparently some performance hit. I just prefer Android, but maybe you could even run full LibreOffice under some distro this way.

            If it can be done by Termux, then someone like Samsung could definitely make something like that too, but integrated with the system and with more software available in their repos.

            What’s missing from the picture but is interesting too is an NGINX server (reverse proxy, lazy file sharing, serving wget-mirrored static websites), kiwix-serve (serving ZIM files, including the entire Wikipedia, from the SD card) and Navidrome (music server).
            And all of it can be brought to any internet-connected computer via Cloudflare QuickTunnel (because it needs neither an account nor a domain name). The mobile data upload speed will finally matter, a lot.

            You get the idea, GNU+Linux. And Android already has the Linux kernel part.

      • CancerMancer@sh.itjust.works
        1 month ago

        I would love to have a smaller phone. Not thinner, smaller. I don’t care if it’s a bit thick, but I do care if the screen is so big I can’t reach across it with one hand.

    • mrvictory1@lemmy.world
      1 month ago

      The OnePlus 6 line of phones is one of the very few with good Linux support (I mean GNU/Linux support). If custom ROMs no longer cut it, you can get even more years out of them with Linux. I had an iPhone, was eventually fed up, got an Android, aaand I realized I’m done with smartphones lol. Gimme a laptop with phone stuff (push notifications w/o killing the battery, VoLTE) and my money is yours, but no such product exists.

  • parlaptie@feddit.org
    1 month ago

    There’s no better generational leap than Monster Hunter Wilds, which looks like a PS2 game on its lowest settings and still chugs at 24fps on my PC.

    • upandatom@lemmy.world
      1 month ago

      Could’ve done your research before buying. Companies aren’t held to standards because people are uninformed buyers.

      • parlaptie@feddit.org
        1 month ago

        Never said I bought it. Why would I buy a 70€ game without running the benchmark tool first?

        I just still find it ridiculous that it looks and runs like ass when MH World looks and runs way better on the same PC. Makes me wonder what’s really behind whatever ‘technological advancements’ have been put into Wilds. It’s like it’s an actual scam to make people buy new hardware with no actual benefit.