• Darkassassin07
    ↑9 ↓49 · 7 hours ago

    So poorly optimised you need future technology to run it isn’t the future-proofing strategy I’d go with, but ok…

    • tal@lemmy.today
      ↑30 · edited 5 hours ago

      So, I’ve seen this phenomenon discussed before, though I don’t think it was from the Crysis guys. They’ve got a legit point, and I don’t think that this article does a very clear job of describing the problem.

      Basically, the problem is this: as a developer, you want to make your game able to take advantage of computing advances over the next N years other than just running faster. Okay, that’s legit, right? You want people to be able to jack up the draw distance, use higher-res textures further out, whatever. You’re trying to make life good for the players. You know what the game can do on current hardware, but you don’t want to restrict players to just that, so you let the sliders enable those draw distances or shadow resolutions that current hardware can’t reasonably handle.

      The problem is that the UI doesn’t typically indicate this in very helpful ways. What happens is that a lot of players who have just gotten themselves a fancy gaming machine will, immediately upon getting a game, go to the settings and turn everything up to maximum so that they can take advantage of their new hardware. If the game doesn’t run smoothly at those settings, they complain that the game is badly written. “I got a top-of-the-line GeForce RTX 4090, and it still can’t run Game X at a reasonable framerate. Don’t the developers know how to do game development?”

      To some extent, developers have tried to deal with this by using terms that sound unreasonable, like “Extreme” or “Insane” instead of “High”, to hint to players that they shouldn’t expect to just run at those settings on current hardware. I am not sure that they have succeeded.

      I think that this is really a UI problem. That is, the idea should be to clearly communicate to the user that some settings are really intended for future computers. Maybe label them “Future computers”, or “Try this in the year 2028”, or something.

      I suppose that games could just hide some settings and push an update down the line that unlocks them, though I think that’s a little obnoxious and I’d rather not have it happen in games that I buy; and if the game company goes under, those settings might never get unlocked at all.

      Maybe if games consistently had some kind of really reliable auto-profiling mechanism that could run various “stress test” scenes at a variety of settings to find reasonable defaults for the given hardware, players wouldn’t head straight for all-maximum settings. That requires pretty much all games to implement it well, though, or I expect that players won’t trust the feature to take full advantage of their hardware. And if mods enter the picture, it’s hard for developers to create a reliable stress-test scene to render, since they don’t know what mods will do.
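
      Just to make the auto-profiling idea concrete, here is a rough sketch of the kind of thing I mean. Everything in it is invented for illustration: the preset names, the fake frame-time numbers, and runBenchmarkScene() itself, which in a real engine would be an actual timed render of a stress-test scene.

      ```cpp
      #include <iostream>
      #include <string>
      #include <vector>

      // Stand-in for an engine call that renders a stress-test scene at the
      // given quality level and returns the average frame time in milliseconds.
      // The numbers are fake; a real engine would measure a real render.
      double runBenchmarkScene(int qualityLevel) {
          return 5.0 * (qualityLevel + 1);
      }

      // Walk the presets from highest to lowest and keep the first one that
      // stays inside the target frame-time budget.
      int pickPreset(double targetFrameMs) {
          const std::vector<std::string> presets = {
              "Low", "Medium", "High", "Extreme", "Future hardware"
          };
          for (int level = static_cast<int>(presets.size()) - 1; level >= 0; --level) {
              if (runBenchmarkScene(level) <= targetFrameMs) {
                  std::cout << "Recommended preset: " << presets[level] << '\n';
                  return level;
              }
          }
          return 0;  // nothing met the budget; fall back to the lowest preset
      }

      int main() {
          pickPreset(16.7);  // roughly a 60 fps frame-time budget
      }
      ```

      The specifics don’t matter; the point is that the game, not the player, is the one deciding which preset actually fits the hardware in front of it.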

      Console games tend to solve the problem by just taking the controls out of the player’s hands. The developers decide where the quality controls are, since players have – mostly – one set of hardware, and then you don’t get to touch them. The issue is really on the PC, where the question is “should the player be permitted to push the levers past what current hardware can reasonably do?”

      • MentalEdge@sopuli.xyz
        ↑1 · 10 minutes ago

        I feel like, instead of the “settings have been optimized for your hardware” pop-up that almost always picks something that doesn’t account for the looks-versus-framerate trade-off a player actually wants, there should be a “these settings are designed for future hardware and may not work well today” pop-up when a player sets everything to max.
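
        Something as simple as this check would cover it. This is just a sketch; the Setting struct and the warning text are invented for the example, not from any actual engine.

        ```cpp
        #include <algorithm>
        #include <iostream>
        #include <vector>

        // A slider's current position and its maximum position.
        struct Setting {
            int value;
            int max;
        };

        // If every slider is pegged at its maximum, warn the player instead of
        // silently accepting values aimed at future hardware.
        void warnIfMaxedOut(const std::vector<Setting>& settings) {
            bool allMaxed = std::all_of(settings.begin(), settings.end(),
                [](const Setting& s) { return s.value == s.max; });
            if (allMaxed) {
                std::cout << "These settings are designed for future hardware "
                             "and may not run well today.\n";
            }
        }

        int main() {
            warnIfMaxedOut({{3, 3}, {4, 4}, {2, 2}});  // everything maxed, so the warning fires
        }
        ```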

        I’ve noticed some games also don’t actually max things out when you select the highest preset.

        I also really like the settings menu of the RE engine games. It has indicators that aggregate how much “load” you’re putting on your system as you turn each setting up or down, which lets you make more informed decisions about what to enable or disable. It will in fact straight-up tell you when you’ve turned something too high, warning that things might not run well if you do.
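
        I don’t know how the RE engine computes that number internally, but the basic idea could be as simple as a weighted sum over the settings, with a warning when it passes some estimate of what the current GPU can handle. A sketch with made-up weights and a made-up budget:

        ```cpp
        #include <iostream>
        #include <string>
        #include <vector>

        // Each option contributes an estimated cost per slider step. The weights
        // and the budget below are invented for the example, not taken from the RE engine.
        struct GraphicsOption {
            std::string name;
            int level;            // current slider position
            double costPerLevel;  // estimated cost of each step up
        };

        double estimatedLoad(const std::vector<GraphicsOption>& options) {
            double load = 0.0;
            for (const auto& o : options) {
                load += o.level * o.costPerLevel;
            }
            return load;
        }

        int main() {
            std::vector<GraphicsOption> options = {
                {"Shadow resolution", 3, 1.5},
                {"Draw distance",     4, 2.0},
                {"Texture quality",   2, 1.0},
            };
            double load = estimatedLoad(options);
            double budget = 10.0;  // made-up estimate of what this GPU can handle
            std::cout << "Estimated load: " << load << " / " << budget << '\n';
            if (load > budget) {
                std::cout << "Warning: this combination may not run well on your hardware.\n";
            }
        }
        ```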

      • FrostyCaveman@lemm.ee
        ↑1 · 2 hours ago

        As much as I find it distasteful to ship “mandatory” patches for single-player games years down the line to fix issues that should’ve been caught during QA… this might be a decent use case for them

    • MoonlitSanguine@lemmy.one
      ↑53 ↓1 · 7 hours ago

      That is not at all how it works or what they are saying. For the time, it was not poorly optimised. Even at low settings it was one of the best-looking games released, and it pushed a lot of tech that we take for granted in games today.

      Being designed to scale does not mean it’s badly optimised.