- cross-posted to:
- [email protected]
Sounds like the HDMI Forum are a bunch of twats. Time for a new format.
DisplayPort already exists
We cannot have two standards, that’s ridiculous! We need to develop one universal standard that covers everyone’s use cases.
There are now three competing standards.
I know what you are referencing, but displayport already covers everybody’s use cases
Oh? Let me CEC on that…
I’ll just pull it up on this display that’s more than 9 feet away from the source…
#switchtodisplayport
Yes, I agree. And it needs to be open bloody source!!
Hi, my name is USB-C!
And what does that use? That’s right it’s Displayport Alternate Mode! Oh you’ve got Thunderbolt? Guess what, also Displayport!
Chuck Testa!
Hard to find on non-pc gear, but that’s a fair point
It’s usually easy enough to adapt it as needed. It can typically send signals compatible with HDMI and DVI-D just fine.
The passive adapters that connect to DP++ ports probably still rely on HDMI-specific driver/firmware support for these features.
can’t you just mod it?
And also USB c
USB-C display output uses the DisplayPort protocol
Can it use others, and is there a benefit? USB C makes a lot of sense; lower material usage, small, carries data, power and connects to almost everything now.
I believe USB-C is the only connector supported for carrying DisplayPort signals other than DisplayPort itself.
The biggest issue with USB-C for display, in my opinion, is that cable specs vary so much. A cable with a Type-C end could carry anywhere from 60 to 10,000 MB/s and deliver anywhere from 5 to 240 W. What’s worse is that most aren’t labeled, so even if you know what spec you need, you’re going to have a hell of a time finding it in a pile of identical black cables.
Not that I dislike USB-C. It’s a great connector, but the branding of USB has always been a mess.
would be neat to somehow have a standard color coding. kinda like how USB 3 is (usually) blue; maybe there could be thin bands of color on the connector?
better yet, maybe some raised bumps so visually impaired people could feel what type it was. for example one dot is USB 2, two could be USB 3, etc
Have you looked at the naming of the USB standards? No you haven’t, otherwise you wouldn’t make this sensible suggestion.
Please think of the shareholders… :(
I think that the biggest issue with DP over USB-C is that people are going to try to use the same cable for 4k and large data transfers at the same time, and will then whine about weird behaviour.
4K works for mine (it’s 3.2).
Yep, very true. I didn’t understand this until I couldn’t connect my Mac to my screen with the USB-C cable that came with the computer; I had to buy another (and order it in specially). Pick up a cable and I have no idea which version it is.
Don’t forget the limited length. I can’t remember exactly, but USB-C delivering power has a max length of around 4 metres.
Yeah I have multiple USB cables, some at 30w, and some at 140w. Get them mixed up all the time! More companies need to at least brand the wattage on the connectors.
This is the big issue I have with “USB-C”. USB-C is just the connector, which can be used for so many things. What’s actually supported depends on things you can’t see, like the cable construction or what the device supports.
There’s some really high bandwidth stuff that USB-C isn’t rated for. You have to really press the limits, though. Something like 4k + 240Hz + HDR.
That doesn’t even seem so unreasonable. Is that the limit though? My cable puts a gigabyte a second down it so I wouldn’t imagine that would hit the limit.
USB-C with Thunderbolt currently has a limit of 40 Gbit/sec. Wikipedia has a table of what DisplayPort can do at that bandwidth:
https://en.wikipedia.org/wiki/DisplayPort
See the section “Resolution and refresh frequency limits”. The table there shows it’d be able to do 4k/144hz/10bpp just fine, but can’t keep above 60hz for 8k.
It’s an uncompressed video signal, and that takes a lot of bandwidth. Though there is a “visually lossless” compression mode (DSC).
It’s trivial arithmetic: 4.5 × 240 × 3840 × 2160 ≈ 9 GB/s. Not even close. Even worse, that cable will struggle to deliver ordinary 4k at 60 Hz.
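If anyone wants to sanity-check those numbers, here’s a rough back-of-envelope in Python. It ignores blanking intervals and link-coding overhead, reads “10bpp” as 10 bits per colour channel, and assumes 4.5 bytes per pixel (12-bit HDR) for the GB/s figure, so treat it as an approximation rather than the spec’s exact limits.

```python
# Back-of-envelope bandwidth checks for the figures above.
# Raw pixel rates only: blanking intervals and DisplayPort link-coding
# overhead are ignored, so real limits are a bit tighter than this.

def gbit_per_s(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 40 Gbit/s Thunderbolt-style link vs. the two cases from the table:
print(gbit_per_s(3840, 2160, 144, 30))  # 4k144, 10 bits/channel: ~35.8 -> fits
print(gbit_per_s(7680, 4320, 60, 30))   # 8k60,  10 bits/channel: ~59.7 -> doesn't

# The "~9 GB/s" figure: 4k240 at 4.5 bytes/pixel (12 bits/channel HDR)
print(3840 * 2160 * 240 * 4.5 / 1e9)    # ~9.0 GB/s
print(3840 * 2160 * 60 * 4.5 / 1e9)     # ~2.2 GB/s, still over a ~1 GB/s cable
```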
USB-C is just a connector; you might be referring to DisplayPort over USB-C, which is basically just the same standard with a different connector at the end. That or Thunderbolt, I guess
I thought thunderbolt was DP passthrough as well
USB-C seems like a good idea, but in reality all it really did was take my 5 different, non-interchangeable but visually distinct cables and make them all look identical and require labeling
I love having mysterious cables that may or may not do things I expect them to when plugged into ports that may or may not support the features I think they do.
If the implementation is so broad that I have to break out my label maker, can we even really call it a “standard”?
you mean thunderbolt?
deleted by creator
Linux has very little to do with DisplayPort. My Windows PCs use DisplayPort. You can get passive adapters to go from DisplayPort (DP++) to HDMI, etc.
deleted by creator
The problem is those passive adapters only work because one side switches to the other’s protocol.
What exactly doesn’t work over HDMI?
deleted by creator
More people should try DP.
I thought I had NSFW turned off… 🤣
( ͡° ͜ʖ ͡°)
What do Dill Pickles have to do with being work safe?
When you’re trying to get into DPs, the outside can be slippery and the screw part can be tight! Very dangerous for the workplace.
As already mentioned, DisplayPort exists. The problem is adoption. Even getting DisplayPort adopted as the de facto standard for PC monitors hasn’t done anything to get it built into TVs.
also there’s still no alternative to hdmi-cec
DisplayPort supports CEC.
From Wikipedia:
The DisplayPort AUX channel is a half-duplex (bidirectional) data channel used for miscellaneous additional data beyond video and audio, such as EDID (I2C) or CEC commands.
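If you’re curious what that side channel actually carries on a Linux box, the kernel exposes each connector’s EDID through sysfs. A minimal sketch, assuming DRM connector paths like /sys/class/drm/card0-DP-1/ (the exact card/connector names vary per machine):

```python
# Minimal sketch: dump the raw EDID blobs that connected DisplayPort
# displays reported back over the AUX channel. Connector names vary.
from pathlib import Path

for edid_path in sorted(Path("/sys/class/drm").glob("card*-DP-*/edid")):
    data = edid_path.read_bytes()
    connector = edid_path.parent.name
    if len(data) >= 128:
        # Bytes 10-11 of a valid EDID block hold the product code.
        product = int.from_bytes(data[10:12], "little")
        print(f"{connector}: {len(data)}-byte EDID, product code 0x{product:04x}")
    else:
        print(f"{connector}: no display connected")
```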
huh didn’t know. does it work in practice tho?
DisplayPort for life!
Is there a reason for DisplayPort having so many connection issues specifically on port replicators (docking stations), or a way to prevent them?
In corporate environments I find so many times that you plug them in over and over, unplug over and over, and check the connection a million times before turning everything off one final time, holding the power button on everything (kind of like an SMC reset) and then booting everything up like you originally did, and they come up. Is this a result of the devices trying to remember a previous setup, or is there an easy way to avoid it?
I’ve hooked up dozens of them and still ran into issues when a family member brought a setup home to work when they were sick last week.
We use Dell WD-19 docks. Not sure if you use similar. Updated dock firmware and laptop drivers made a difference for us with connection issues. Sometimes you gotta perform a reset on them to make them behave (disconnect dock power and USB-C and hold power button for just over 15 sec). Sometimes the laptop NVRAM needs to be reset instead (for Dell, disconnect all devices and power while off and hold button for just over 30 sec). Overall, though, no huge issues with DP specifically if the dock and laptop firmwares are up to date. Third-party docks/replicators definitely have way more issues, though.
DP for life!
ftfy
That doesn’t do audio too though right?
It has since version 1.0, it seems?
TIL
Thank God, for a moment I thought I had auditory hallucinations.
Visual ones are OK though
This is really frustrating. This is the only thing holding Linux gaming back for me, as someone who games with an AMD GPU and an OLED TV. On Windows 4k120 works fine, but on Linux I can only get 4k60. I’ve been trying to use an adapter, but it crashes a lot.
AMD seemed to be really trying to bring this feature to Linux, too. Really tragic that they were trying to support us, and some anti-open source goons shot them down.
I’ve found that the issue in my experience is that X11 only supports a max of 4k60, but Wayland supports 4k120 and beyond. I don’t think the cable matters, as the same cable I’m using works on Windows with 4k160.
It’s a matter of cable bandwidth. 4k120 4:4:4 requires more bandwidth than hdmi 2.0 can provide. You can drop down to 4:2:0, but that’s a pretty bad experience and ruins the image quality.
I’ve been using an adapter cable, but it’s really flaky; I don’t know if it’s a bad cable or what. But a normal HDMI cable just plain works on Windows, since the Windows AMD driver supports HDMI 2.1.
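Rough numbers for the bandwidth point above, assuming 8 bits per channel and ignoring blanking and audio: HDMI 2.0’s 18 Gbit/s TMDS link leaves roughly 14.4 Gbit/s for pixel data after 8b/10b coding, and 4k120 4:4:4 simply doesn’t fit.

```python
# Why 4k @ 120 Hz 4:4:4 doesn't fit in HDMI 2.0 (8 bits/channel assumed,
# blanking and audio ignored). 18 Gbit/s raw TMDS ~= 14.4 Gbit/s usable.
usable_gbps = 14.4
pixels_per_second = 3840 * 2160 * 120

full_chroma = pixels_per_second * 24 / 1e9  # 4:4:4 -> ~23.9 Gbit/s, too much
subsampled  = pixels_per_second * 12 / 1e9  # 4:2:0 -> ~11.9 Gbit/s, fits

print(f"4:4:4: {full_chroma:.1f} Gbit/s vs {usable_gbps} Gbit/s usable")
print(f"4:2:0: {subsampled:.1f} Gbit/s vs {usable_gbps} Gbit/s usable")
```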
Yes, that is a valid factor. I meant more that the type of cable doesn’t matter, but yes, an up-to-date cable spec is just as important.
are you unable to use display port?
The OLED TV probably only has HDMI. TVs don’t normally have DisplayPort
There are DisplayPort to HDMI adapters. Like this one:
https://www.amazon.com/dp/B08XFSLWQF
None of the adapters seem to support VRR though, for whatever reason.
Same reason probably: the HDMI forum
Freesync works for me with that cable, but I have some awful link training problem or something. Every time the screen is reset, it refuses to go above 4k60 or turn on Freesync, and it’ll only work again after I’ve had the screen connected for a few minutes. So whenever I start my computer, I have to switch down to 4k60 and wait like half an hour to be able to use the high refresh rate. It’s been doing this for months.
I don’t know if I have a bad cable, weird interference, or if FRL adapters are just flaky.
I’m a bit confused by your comment. I have a 120 Hz monitor and use an AMD GPU on Linux without issues. Connected via the DisplayPort on my GPU to the HDMI port on my monitor (because Samsung does not enable DDC on the DisplayPort for some reason).
I’m using an LG C2 OLED TV that doesn’t have DisplayPort.
I’m using an LG C2 OLED TV that doesn’t have DisplayPort.
Connected via the display port on my GPU to the HDMI Port on my monitor
ichbinjasokreativ seems to be connecting their viewing device via HDMI, the same as your OLED TV.
Unless I’m missing something?
Edit: You discuss this issue further down in the topic, so no need to reply.
Could have saved myself the time of replying to you if I had scrolled all the way through first, then backtracked, but that’s kind of unintuitive to do, especially on a cell phone browser.
No worries, I didn’t see they meant an adapter cable, either.
I have one too. Go take a look at Cable Matters. I am able to play games at 4k120 with my mac. See if something will work for you and you can always send a message to their customer support to ask questions.
Yeah, that’s the one I have. Maybe I’ll ask their support. It has the latest firmware but it’s so flaky about being able to do high bandwidth.
Should… should we sic the EU on them?
They won’t change…
Apple didn’t want to change either hehe
Yeah, what I mean is that HDMI can be more easily replaced than Apple :)
We already have DisplayPort, which is a royalty-free standard. If possible, we should stop buying HDMI cables, and stop using HDMI as much as possible.
So why is it rejected?
Just because they’re still trying to use HDMI to prevent piracy? Who in fuck’s name is using HDMI capture for piracy? On a 24fps movie, that’s 237MB of data to process every second just for the video. A 2-hour movie would be 1.6TB. Add the audio and it would likely be over 2TB.
I’ve got a Jellyfin server packed with 4K Blu-ray rips that suggest there are easier ways to get at that data.
The CEO’s of the media companies are all fucking dinosaurs who still think VCRs should have been made illegal. You will never convince them that built in copy protection is a dumb idea and a waste of time.
Where are they finding dinosaurs to fuck that know what a VCR is?
Mine can barely work the TV remote!
HDMI Splitter + capture card.
No video put on a streaming service produced in the next 40 years will need HDMI 2.1 to display.
Can’t you compress what the HDMI outputs in real time so that it would have a normal size?
Sure. But why bother when you can rip it right from the disc in higher quality than you could ever hope to capture in real time?
All I can think of would be capturing a live broadcast of something airing on TV, and only on TV. Which… Has to be pretty rare these days. And you still have better methods to capture even that!
Even setting aside that HDMI capture is simply an awful way of obtaining that data, it’s even more pathetic when that “protection” can be defeated by a $30 capture card on Amazon…
The profiles HDMI 2.1 enables are even worse - 4k@120fps type stuff. Not exactly something needed for a movie.
Most people don’t pirate 4K media due to file size and internet speed constraints. Most people pirate 1080p video. There’s also the prospect of people pirating live television, which HDMI capture would be perfect for.
Then most people need to get a better ISP. My crappy $60/mo fixed 5G can download an entire 4K film in under 10 minutes or start streaming it within a second. Y’all should see if there are any options beyond cable and DSL in your town. You might be pleasantly surprised what’s available these days.
Is that not a compressed stream though? Genuinely asking. A 4K Blu-ray rip and a 4K stream from a service (or whatever it saves for offline viewing on an app) are pretty different. I think things are getting conflated between capturing live 4K television and capturing a 4K Blu-ray as it plays, both of which might be using an HDMI cable.
I use Stremio and only stream full 4K Blu Ray rips, with HDR and Dolby Atmos and all. So nothing is recompressed. 50-70GB files but it starts streaming almost instantly.
I have a poor 5G signal due to a tree that’s blocking my view of the antenna, so I get anywhere between 400Mbps and 1400Mbps (I’m supposed to get a gigabit but it’s usually closer to 500). Even with a poor signal it’s still way faster than any other ISP in my town.
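For what it’s worth, the “under 10 minutes” claim roughly checks out against those speeds, assuming one of the 50–70 GB full-bitrate rips mentioned above:

```python
# Download time for a ~60 GB full-bitrate 4K rip at the speeds quoted above.
file_gb = 60
for mbps in (400, 800, 1400):
    minutes = file_gb * 8_000 / mbps / 60
    print(f"{mbps:>4} Mbit/s -> about {minutes:.0f} minutes")
```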
The raw images are that big, but they’re compressed (even losslessly) to a fraction of the size.
Any good sources for those rips? You can PM if it helps
I just use the trusty old Radarr stack to find them. Pulls from https://1337x.to/ https://thepiratebay.org/ and https://therarbg.to/ on my set up.
You have to get there early to have much chance of getting a full 60GB+ 4K Blu-ray rip in a timely manner, but the ~15GB x265 rips are indistinguishable to me.
Recently I’ve become a fan of KickassTorrents; they usually have an x265 version with a bunch of Blu-ray extras, and Prowlarr already knows who they are.
They’re back? Thought they closed years ago
you and me both
I’ll have to add them to the list.
Is therarbg safe? The original rarbg closed one or two years ago.
Also, don’t forget private torrent trackers. They’re harder to get into (signups are usually closed, or you need an invite from someone who’s already in), but they’re very good!
Lots of high-quality content, well organized, usually with many seeds.
Of course you need to follow their rules and seed enough.
Usenet is also a surprisingly good way to find content, but you’ll need to pay both an indexer and a server.
You can pirate media that uses that new Blu-ray DRM by plugging a capture card into the overpriced compatible Blu-ray player and recording the video. Also, it’s a way to transfer saved content from a DVR, as their hard drives are always encrypted (do those still exist?). The video stream on all this stuff is encrypted with HDCP to prevent this, but there are HDCP strippers. It seems to still be possible to buy them, even on Amazon. Stock up before they get banned. Frankly I’m surprised they aren’t banned already.
This sucks, as all new TVs use HDMI 2.1 for modern features, and modern games consoles rely on those for 4k 60Hz HDR, etc.
So now Valve can’t just make their own home console with Steam OS for TVs directly (and support high-end features at least).
I believe this is specifically for FRL. Other features should still work afaik.
What’s the over/under that this was about preventing people getting around HDCP using a modified driver?
FRL != HDCP. I think HDCP works on Linux already?
DisplayPort supports HDCP as well though?
Alright, from now on I will never again buy any electronics with HDMI.
No disrespect meant towards GamingOnLinux, but this article from Tom’s Hardware has a much better description of what’s going on, including quotes.
This is the best summary I could come up with:
If you were hoping at some point to see HDMI 2.1+ on Linux with AMD + Mesa, you’re out of luck right now as it’s simply not going to be happening.
There’s been a bug report on the Mesa GitLab of “4k@120hz unavailable via HDMI 2.1” that’s been open for a few years now, with lots of comments and chatter about the issue.
In an update on the bug report, AMD engineer Alex Deucher commented: "The HDMI Forum has rejected our proposal unfortunately.
So if you’re on Linux, it’s going to continue to be best to buy hardware that uses DisplayPort.
On the NVIDIA side though, it seems like it may not be an issue, as developer Karol Herbst wrote on Mastodon: "Even though AMD might not be able to add support for HDMI 2.1, nouveau certainly will as Nvidia’s open source driver also supports HDMI 2.1 so there is no reason to believe that at least some drivers can’t support HDMI 2.1
It’s quite backwards, but apparently having all the logic inside firmware (like Nvidia does) will probably help us implementing support for HDMI 2.1"
The original article contains 244 words, the summary contains 183 words. Saved 25%. I’m a bot and I’m open source!
Boo! Get off the stage HDMI Forum!
Eli5, what are the security risks of my HDMI cable?
Piracy being easier is the only risk. Once again ruining the experience of legitimate customers to try and stop a thing that they have had no success at even slowing down.
Even further, it made products more expensive to buy because of all the dumb licensing fees that all the middlemen try to shoehorn in.
The security of mega Corp IP.
HDMI 2.1 specs are closed source.
Having to use systems that swing to more closed solutions is going to degrade security in more places.
The first thing I could think of is taking advantage of ARC (the audio return channel) to spy on you.
Always thought that DisplayPort is better anyway, lol. Anything that HDMI does or has that DisplayPort doesn’t?
Support on low end devices
Audio? Forgive my ignorance; it’s out of actual ignorance, or rather lack of knowledge.
Naah, DisplayPort carries everything: audio, USB, displays, etc. Version 1.2 even allows daisy-chaining displays, so you don’t need a bunch of cables going to your PC. When it comes to audio, version 1.4 supports a 1536 kHz maximum sample rate at 24 bits and supports 32 individual audio channels. Scary good! Overall it’s a significantly better protocol.
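Taking those audio figures at face value, the implied bandwidth is tiny next to the video link; a quick check, using DP 1.4 / HBR3’s ~25.9 Gbit/s of usable rate after 8b/10b coding:

```python
# Implied audio ceiling from the DP 1.4 figures quoted above:
# 32 channels x 24-bit samples x 1536 kHz, vs the link's usable data rate.
channels, bits, sample_rate_hz = 32, 24, 1_536_000
audio_gbps = channels * bits * sample_rate_hz / 1e9   # ~1.18 Gbit/s
link_gbps = 25.92                                     # HBR3 after 8b/10b

print(f"Audio: {audio_gbps:.2f} Gbit/s ({audio_gbps / link_gbps:.1%} of link)")
```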
DisplayPort supports audio.
More recent implementations of DP have audio
I guess HDMI-CEC and some DRM.
DisplayPort supports CEC and HDCP DRM.
I have yet to see a reason why HDMI is better than DP other than what port is available on your device.
Yeah, unfortunately most TVs only have HDMI, not DP
Really? Seems like I am late to the party.
Protocol level DRM?
VESA or bust
bastards