Absolutely bizarre that a first-party title doesn’t seem optimized for the console they’re developing it for. This makes me skeptical that the PC version will be optimized either.

  • Ghoelian@lemmy.dbzer0.com

    The people who keep saying that should really just try using a 144+ Hz monitor for a while. Surely they’d be able to notice the difference as well.

    • lengau@midwest.social

      If someone’s saying that about 30fps, they should just set their refresh rate to 30 Hz and move their mouse.
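
      If you want to feel that without touching your display settings, here’s a rough, illustrative pygame sketch (assuming you have pygame installed) that caps the frame rate of a cursor-following dot; flip FPS_CAP between 30 and 144 and wave the mouse around:

      ```python
      # Illustrative sketch only: a dot chases the mouse at a capped frame rate.
      import pygame

      FPS_CAP = 30  # try 30, then 60, then 144

      pygame.init()
      screen = pygame.display.set_mode((800, 600))
      clock = pygame.time.Clock()

      running = True
      while running:
          for event in pygame.event.get():
              if event.type == pygame.QUIT:
                  running = False

          screen.fill((20, 20, 20))
          # Draw a dot at the current mouse position every frame.
          pygame.draw.circle(screen, (220, 220, 220), pygame.mouse.get_pos(), 10)
          pygame.display.flip()

          clock.tick(FPS_CAP)  # limit the loop to FPS_CAP frames per second

      pygame.quit()
      ```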

    • Thrashy@lemmy.world

      Might just be my middle-aged eyes, but I recently went from a 75 Hz monitor to a 160 Hz one and I’ll be damned if I can see the difference in motion. Granted, I don’t play much in the way of twitch-style shooters anymore, but for me the threshold of visual smoothness is closer to 60 Hz than whatever bonkers 240 Hz+ refresh rates current OLEDs are pushing.

      I’ll agree that 30fps is pretty marginal for any sort of action gameplay, though historically console players have been more forgiving of mediocre performance in service of more eye candy.

      • mephiska@fedia.io

        Are you sure you have the refresh rate set correctly on your video card? The difference between 75 Hz and 160 Hz is very clear just from moving your mouse cursor around. Age shouldn’t have anything to do with it.

        • Thrashy@lemmy.world

          Quite sure – and given that one game I’ve been playing lately (and the exception to the lack of shooters in my portfolio) is Selaco, I ought to have noticed by now.

          There’s a very slight difference in smoothness when I’m rapidly waving a mouse cursor around on one screen versus the other, but it’s hardly the night-and-day difference that going from 30fps to 60fps was back in Ye Olden Days, and watching a small, fast-moving, high-contrast object doesn’t make up the bulk of gameplay in anything I play these days.

      • OminousOrange

        If the two are beside each other, you’ll definitely see the difference.

        • Thrashy@lemmy.world

          The old one and the new one are literally side by side on my desktop, don’t know what to tell you…

          • OminousOrange

            Hmm, I’ve found it quite noticeable. Perhaps turn an FPS counter on and see what it’s actually running at. If you have a game showing on both screens, it’ll likely limit the fps to match the slower display’s refresh rate.

            • Gerudo@lemm.ee

              This is a good point. A lot of people just assume plugging the monitor in gets them the higher refresh rate, but a lot of the time you have to select it in your display settings.
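
              To actually check, on a typical Linux/X11 setup you can ask xrandr what each output is really running at (the active rate is the one marked with a *). A rough, Linux-only illustrative sketch in Python:

              ```python
              # Illustrative only: assumes Linux/X11 with the `xrandr` CLI available.
              # Prints each connected output's active resolution and refresh rate;
              # xrandr marks the currently active rate with a trailing '*'.
              import subprocess

              def print_active_modes():
                  out = subprocess.run(["xrandr", "--query"], capture_output=True, text=True).stdout
                  output_name = None
                  for line in out.splitlines():
                      if " connected" in line:
                          output_name = line.split()[0]
                      elif output_name and "*" in line:
                          parts = line.split()
                          resolution = parts[0]
                          rate = next(p for p in parts[1:] if "*" in p).rstrip("*+")
                          print(f"{output_name}: {resolution} @ {rate} Hz")

              print_active_modes()
              ```

              On Windows the same check lives in the advanced display settings; the point is just to confirm the panel is actually set to its advertised rate.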

      • ParetoOptimalDev@lemmy.today

        Games feel almost disgusting on 60 Hz now, but they felt fine before I tried 144 Hz.

        Maybe if I were stuck at 60 Hz for a long time I’d get used to it.

        Now though, if I switch for 30 minutes I can’t ignore the difference.

      • jorp@lemmy.world

        A 160 Hz refresh rate gives the software a ~6 ms render budget (1000/160 ≈ 6.25 ms). Do things actually even run at that rate?

        • SpacetimeMachine@lemmy.world

          If your comp is good enough, absolutely. Strong PCs can now run sub-5 ms frame times at 4K pretty regularly, especially for competitive games that aren’t designed to look incredible.
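
          The budget math is easy to sanity-check: 160 Hz leaves 1000/160 ≈ 6.25 ms per frame, and a 5 ms frame time works out to 200 fps. A throwaway Python snippet just to show the arithmetic (the helper names are my own):

          ```python
          # Toy arithmetic: refresh rate <-> per-frame render budget.
          def frame_budget_ms(hz: float) -> float:
              """Milliseconds available to render one frame at a given refresh rate."""
              return 1000.0 / hz

          def fps_from_frame_time(ms: float) -> float:
              """Frame rate implied by a given frame time in milliseconds."""
              return 1000.0 / ms

          for hz in (30, 60, 144, 160, 240):
              print(f"{hz:>3} Hz -> {frame_budget_ms(hz):5.2f} ms per frame")

          print(f"5 ms frame time -> {fps_from_frame_time(5):.0f} fps")
          ```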

      • r00ty@kbin.life

        That’s weird. I’m getting to the age where I wouldn’t see the point in 4K; I’d need to have my head right on top of the screen to see it. But refresh rate can be felt in fluid scrolling etc., and, even if only on an unconscious level, it definitely improves awareness in games too.

      • lengau@midwest.social

        It also really depends on what one’s doing. For many things, including many games, 30fps is fine for me, but I need at least 60fps for mousing. Beyond that I don’t notice the mouse getting any smoother, but in some games I do have a better experience at 120fps. And I’m absolutely sold on 500+ fps for simulating paper.

    • lustyargonian@lemm.ee

      My work MacBook can only do 60 Hz while my ROG Ally can do 120 Hz, and damn, the mouse feel at 120 is so much better that I hate that my work laptop can’t do it.