I was reading an article on the new LG display with a refresh rate of 7680Hz and it says:
While a typical refresh rate for a monitor might be 60Hz-240Hz, an outdoor display designed to be viewed from a distance needs to be much higher
The idea that there’s an intrinsic link between refresh rate and viewing distance is new to me and feels unintuitive. I can understand the need for high brightness at long viewing distances. I could also understand refresh rate mattering for a non-persistent (CRT) display. But for an LED display, surely you can see it from far away even if it refreshes once a second?
Refresh rate normally needs to be high enough to avoid motion appearing to “jump” between refreshes on high-resolution displays, so wouldn’t longer viewing distances allow you to decrease the refresh rate?
Is the article just spouting bullshit? Or is there an actual link between refresh rate and view distance?
Maybe so it won’t flicker when filmed?
Isn’t that more about the specific number than it just being a high number? Like it needs to be a multiple of the refresh rate of the video recorder.
In theory yes, but as there are many different video recorders, going with a super high refresh rate could be a better catch-all so to speak.
Yes and no. If the refresh rate is higher, the tolerance window gets wider and you don’t need the sync to be as exact, because the refresh is so fast the camera can’t pick it up.
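To make the “catch-all” intuition from the replies above concrete, here’s a toy model (my own assumption, not from the article): during one camera exposure the display completes some number of refresh cycles, and the worst-case brightness difference between camera frames is roughly one partial cycle out of the total cycles captured. Higher refresh rates make that leftover fraction tiny, regardless of the camera’s exact frame rate.

```python
# Toy model (assumed): a display at f_hz completes f_hz * exposure_s
# refresh cycles during one camera exposure. The unsynced leftover is
# at most one partial cycle, so the worst-case relative brightness
# variation between frames is about 1 / (cycles captured).

def worst_case_flicker(f_hz: float, exposure_s: float) -> float:
    """Worst-case relative frame-to-frame brightness variation:
    at most one partial refresh cycle out of f_hz * exposure_s cycles."""
    cycles = f_hz * exposure_s
    return min(1.0, 1.0 / cycles)

exposure = 1 / 120  # hypothetical camera with a 1/120 s shutter
for f in (60, 240, 7680):
    print(f"{f} Hz -> up to {worst_case_flicker(f, exposure):.2%} variation")
```

Under this model a 60Hz display can vary by up to 100% frame to frame (visible banding), while at 7680Hz the variation is under 2%, which is why no per-camera syncing is needed.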