We don’t need another video cable standard. Current GPUs can barely handle 4K gaming at high refresh rates, it will be a while before they can do 8K at high refresh rates, and it will be even longer before affordable 8K monitors exist. I’m sure there will be a new DP standard by then that can handle it.
I’m wondering if this is less about technological expansion and more about decoupling from the Western-controlled (and fee-charging) standards bodies behind HDMI and DisplayPort (VESA).
DisplayPort is open and entirely free tho… Only HDMI is closed and charges fees.
DisplayPort 1.2 and later is very much not an open and free standard. Access to the specification is locked behind an NDA and a VESA membership that costs thousands of dollars annually.
DisplayPort 1.1a is a freely available standard and has enough bandwidth to support a single display at either 1080p/150Hz, 1440p/90Hz, or 4K/30Hz. Any higher than that and it’s proprietary. Still, VESA doesn’t seem to be as restrictive about its standard as the HDMI Forum, which goes so far as to deliberately prohibit HDMI 2.1 in anything open source (FOSS drivers like Nouveau can only work with it if the actual support is handled by closed-source firmware).
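For anyone curious, those mode claims roughly check out. Here’s a quick sanity check, assuming 4 lanes of HBR (2.7 Gbit/s each) with 8b/10b encoding and ignoring blanking overhead, so these are best-case numbers:

```python
# Quick sanity check of the DP 1.1a bandwidth claims above.
# Assumptions: 4 lanes of HBR (2.7 Gbit/s each), 8b/10b encoding,
# 24-bit color, blanking overhead ignored -- so best-case numbers.

LANES = 4
HBR_GBPS = 2.7                            # raw rate per lane
EFFECTIVE = LANES * HBR_GBPS * 8 / 10     # 8b/10b -> 8.64 Gbit/s usable

def required_gbps(w, h, hz, bpp=24):
    """Pixel data rate in Gbit/s, excluding blanking intervals."""
    return w * h * bpp * hz / 1e9

modes = {
    "1080p/150Hz": (1920, 1080, 150),
    "1440p/90Hz":  (2560, 1440, 90),
    "4K/30Hz":     (3840, 2160, 30),
}
for name, (w, h, hz) in modes.items():
    need = required_gbps(w, h, hz)
    ok = "fits" if need < EFFECTIVE else "too much"
    print(f"{name:>12}: {need:5.2f} of {EFFECTIVE:.2f} Gbit/s -> {ok}")
```

All three modes land under the ~8.64 Gbit/s effective limit, with 1440p/90Hz being the tightest fit once you add blanking back in.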
VESA’s fees are for the membership itself rather than per-device like HDMI’s are, but a completely separate organization that’s unrelated to the DP standard tries to charge per-device license fees on all DP devices: MPEG LA demands $0.20 per DP device for protection from their patents, which is much higher than HDMI’s per-device fee, though the claim that their patents apply at all seems to be disputed.
Not to be that guy, but I believe 8K might be the end of the line for monitors in terms of practical improvements. At the distances monitors are typically used, the human eye can’t differentiate resolutions beyond 8K.
Please correct me if I am wrong. :)
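For what it’s worth, a back-of-the-envelope check suggests the claim holds up. A rough sketch, assuming a 32″ 16:9 panel viewed from 60 cm and the commonly cited ~60 pixels per degree for 20/20 acuity (both numbers are my assumptions):

```python
import math

# Back-of-the-envelope: pixels per degree for 4K/8K/16K monitors.
# Assumptions (mine, not from the thread): 32" 16:9 panel, 60 cm
# viewing distance, ~60 px/deg as the 20/20 acuity limit (1 arcmin).

DIAGONAL_IN = 32
DISTANCE_IN = 60 / 2.54
ACUITY_PPD = 60                                  # px/deg a 20/20 eye resolves

width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)  # panel width, ~27.9"
# length one degree of visual angle covers on the panel at this distance
inch_per_deg = 2 * DISTANCE_IN * math.tan(math.radians(0.5))

for name, h_px in {"4K": 3840, "8K": 7680, "16K": 15360}.items():
    ppd = (h_px / width_in) * inch_per_deg
    verdict = "beyond" if ppd > ACUITY_PPD else "below"
    print(f"{name:>3}: {ppd:5.1f} px/deg ({verdict} the ~{ACUITY_PPD} px/deg limit)")
```

With those assumptions 8K comes out at roughly double the acuity limit while 4K sits just below it, which lines up with 8K being the practical ceiling for a desk monitor.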
One place I can see a need to expand beyond 8K is the displays in VR goggles. Think about trying to emulate a virtual 2K (1080p) monitor experience using 8K displays. As in, think about sitting at a desk in VR and looking at a “monitor” in VR. The actual physical displays producing the image are 8K, but they must cover your entire field of view, so only a subset of those 8K pixels can be used to reproduce the “virtual 2K monitor”. Here’s an image for reference:
Now move your face close enough to your Lemmy viewer that the picture takes up your entire field of view. Look how big that 2K block is. Now imagine you want that level of detail at normal “monitor distance” in VR. That’s going to take more than an 8K physical display in your VR goggles.
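To put rough numbers on that, here’s a sketch of the pixel budget. All the parameters are my own assumptions for illustration (the FOV, the per-eye panel width, the virtual monitor’s size and distance, and the ~2× resampling rule of thumb):

```python
import math

# Rough pixel budget for a virtual 1080p monitor inside an "8K" headset.
# Everything here is an assumption for illustration: ~100 deg horizontal
# FOV covered by a 7680-px-wide panel per eye, a 24" 16:9 virtual monitor
# floating 60 cm away, and the rule of thumb that you want ~2 physical
# pixels per virtual pixel to survive lens-distortion resampling.

FOV_DEG = 100          # horizontal field of view the panel is stretched over
PANEL_W = 7680         # horizontal pixels of the "8K" panel
VIRT_W_PX = 1920       # native width of the virtual 1080p monitor
VIRT_DIAG_IN = 24      # virtual monitor diagonal
VIRT_DIST_CM = 60      # virtual viewing distance

width_cm = VIRT_DIAG_IN * 2.54 * 16 / math.hypot(16, 9)  # ~53 cm wide
monitor_deg = 2 * math.degrees(math.atan(width_cm / 2 / VIRT_DIST_CM))
budget = monitor_deg / FOV_DEG * PANEL_W  # physical px across the monitor

print(f"virtual monitor spans ~{monitor_deg:.0f} of {FOV_DEG} degrees")
print(f"physical pixels across it: ~{budget:.0f}")
print(f"physical px per virtual px: {budget / VIRT_W_PX:.2f} (want ~2)")
```

That works out to roughly 1.9 physical pixels per virtual pixel, right at the edge for a single close-up 1080p screen; move the virtual monitor further away, or make it 1440p, and an 8K panel runs out of budget.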
VR is definitely one example where 16K might be useful, but VR is still relatively niche.
I’d argue that VR is niche only because high-res, lightweight displays aren’t available at scale yet.
That’s a big ask though: such displays are extremely expensive, and VR alone doesn’t have the scale to bring the cost down.
I think 8K is overkill for normal consumer use. Apart from something like photo/video editing or CAD on a large monitor, there are not a lot of uses where it will make much of a difference. Even movie theaters only use 2K and 4K resolutions for digital projection.
I’m confused. How does the power work? Do you not plug the monitor in with a power cord? Or does power come from the monitor and power the device plugged into it?
The latter.
The latter seems next to useless. What computer would you choose to power from a monitor with these specs?