• frezik@midwest.social · 13 days ago (edited)

    They don’t have zero latency. It’s a misconception.

    The industry-standard way to measure screen lag is from the middle of the screen. Say you have a 60 Hz display, you hit the mouse button to shoot at the very moment it’s about to draw the next frame, and the game manages to process the input before the draw starts. The beam starts to draw, and when it reaches the middle of the screen, we take our measurement. That takes 1 / 60 / 2 ≈ 8.3 ms. I’ll put a quick sketch of that arithmetic at the end of this comment.

    Some CRTs could do 90Hz, or even higher, but those were really expensive (edit: while keeping a high resolution, anyway). Modern LCDs can do better than any of them, but it took a long time to get there.
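
    Here’s that sketch in Python (my own quick illustration; the helper name is made up, nothing standardized):

        def mid_screen_lag_ms(refresh_hz: float) -> float:
            """Time for scanout to reach the middle of the screen, in milliseconds."""
            frame_time_ms = 1000.0 / refresh_hz  # one full refresh
            return frame_time_ms / 2             # half a frame to reach mid-screen

        print(f"60 Hz: {mid_screen_lag_ms(60):.1f} ms")  # 8.3 ms
        print(f"90 Hz: {mid_screen_lag_ms(90):.1f} ms")  # 5.6 ms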

    • Björn Tantau@swg-empire.de · 14 days ago

      Actually, 60 Hz was too low to use a CRT comfortably. I think it started to work well at 75 Hz, and better at 80 or 85. I don’t know if I ever had a 90 Hz one, especially at a resolution above 1280x960. But if you valued your eyes, you never went down to 60.

      No idea why 60 Hz on an LCD works better, though.

      • DefederateLemmyMl@feddit.nl · 13 days ago (edited)

        > No idea why 60 Hz on an LCD works better, though.

        Because LCD pixels are constantly lit up by a backlight. They don’t start to dim in between refresh cycles. They may take some time to change from one state to another, but that is perceived as ghosting, not flickering.

        On a CRT, the phosphor dots are periodically lit up (or “refreshed”) by an electron beam, and then start to dim afterwards. So the lower the refresh rate, the more time they have to dim between strobes. At low refresh rates this is perceived as flickering. At higher refresh rates the dots don’t have enough time to noticeably dim, so the image looks stable. 60 Hz happens to be the refresh rate where this flicker effect becomes quite noticeable to the human eye.
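
        A toy model of that decay in Python (the decay constant is completely made up, purely to show the trend, not real phosphor data):

            import math

            PHOSPHOR_TAU_MS = 5.0  # assumed decay constant, purely illustrative

            def brightness_before_next_strobe(refresh_hz: float) -> float:
                """Relative brightness (1.0 = just refreshed) right before the next strobe."""
                gap_ms = 1000.0 / refresh_hz  # time between strobes
                return math.exp(-gap_ms / PHOSPHOR_TAU_MS)

            for hz in (60, 75, 85):
                print(f"{hz} Hz: dims to {brightness_before_next_strobe(hz):.2f} of peak")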

      • frezik@midwest.social · 14 days ago

        60Hz is what any NTSC TV would have had for consoles. Plenty of older computers, too. Lots of people gamed that way well into the 2000s.

        Incidentally, if you do the same calculation above for PAL (50 Hz), you end up at 10 ms, or about 2 ms more lag than NTSC. Many modern LCDs have response times under 2 ms (which is on top of the console’s internal framerate, matched to NTSC or PAL). The implication for retro consoles is that the lag difference between NTSC CRTs and modern LCDs is about the same as the difference between NTSC and PAL CRTs.
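
        The same arithmetic as above, in Python (again just my illustration; the helper name is made up):

            def mid_screen_lag_ms(refresh_hz: float) -> float:
                return 1000.0 / refresh_hz / 2  # half a frame, in milliseconds

            ntsc = mid_screen_lag_ms(60)  # ~8.3 ms
            pal = mid_screen_lag_ms(50)   # 10.0 ms
            print(f"PAL adds about {pal - ntsc:.1f} ms over NTSC")  # ~1.7 ms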