I do not really have a body for this. I was not aware that this is a thing and still feel like this is bs, but maybe there is an actual explanation for HDMI Forum’s decision that I am missing.

  • BoycottTwitter@lemmy.zip · 10 hours ago

    If you want change, you’ve got to direct your comments to the HDMI Forum. We can talk about it here forever, but if they never see anything, they won’t change. I sent the following email to admin@hdmiforum.org:

    Dear HDMI Forum,

    I recently saw the news that the HDMI Forum is blocking open-source implementations of the HDMI 2.1 specification, and I want to express that I believe this is a bad idea. I hope the HDMI Forum will reconsider and allow it. I can’t say I understand the concern or the reason for blocking it, but I doubt that whatever issue is envisioned will actually come to fruition. Instead, I believe that allowing open-source implementations will benefit adoption of the standard, and since, if I understand correctly, the licensing fees are based on hardware sold, having open-source code would of course not exempt anyone from HDMI licensing rules.

    Thank you so much for your consideration,

    (Name)

    Maybe it’s not perfect (I already wish I’d worded one sentence better), but I think what matters most is trying your best and using your voice whenever you can. Be sure to send your own email too; the more they receive, the higher the chances this works. Of course, use your own wording, I just put mine here as an example.

        • muusemuuse@sh.itjust.works · 2 hours ago

          Fun fact: DisplayPort can carry HDMI signals, so you can connect a cheap cable with DP on one end and HDMI on the other. The only catch is that it goes DP -> HDMI, not the other way around.

  • chillpanzee@lemmy.ml · 13 hours ago

    maybe there is an actual explanation for HDMI Forum’s decision that I am missing.

    HDMI has never been an open standard (to the best of my understanding anyway). You’ve always needed to be an adopter or a member of HDMI forum to get the latest (or future) specs. So it’s not like they’ve just rejected a new idea. The rejection is fully consistent with their entire history of keeping the latest versions on lockdown.

    Standards organizations like HDMI Forum look like a monolith from the outside (like “they should explain their thinking here”), but really they are loosely coupled amalgamations of hundreds of companies, all of whom are working hard to make sure that (a) their patents are (and remain) essential, and that (b) nothing mandatory in a new version of the standard threatens their business. Think of it more like the UN General Assembly than a unified group of participants. There likely isn’t any unified thinking other than that many Forum members are also participants in the patent licensing pool, so giving away something for which they collect royalties is just not a normal thought. Like… they’re not gonna give something away without getting something in return.

    I was a member of HDMI Forum for a brief while. Standards bodies like this are a bit of a weird world where motivations are often quite opaque.

    • Kazumara@discuss.tchncs.de · 7 hours ago

      HDMI has never been an open standard (to the best of my understanding anyway). You’ve always needed to be an adopter or a member of HDMI forum to get the latest (or future) specs. So it’s not like they’ve just rejected a new idea.

      Okay, not publishing the spec is the same as it has always been, but something else is new nonetheless.

      AMD is an adopter*: they have the spec, and they implemented an HDMI 2.1 driver intended to be open-sourced in Linux, but they were still blocked from publishing it. For HDMI 1.4 that wasn’t an issue yet, from what I’ve found (though it’s always hard to search for non-existence); open-source implementations of HDMI 1.4, even in hardware description languages, seem to exist.

      *you can search for “ADVANCED MICRO DEVICES” here to confirm for yourself

      • chillpanzee@lemmy.ml · 1 hour ago

        I may have misread or misunderstood the article, but it seemed as though Steam wanted to open source their 2.1 implementation, which would effectively publish the 2.1 specification. I’m pretty sure their agreements with HDMI Forum and HDMI.org prohibit that.

    • rumba@lemmy.zip · 10 hours ago

      Translation: Nothing’s happening until someone needs to get bribed.

      /s

    • Phoenixz · 12 hours ago

      You want companies to stop supporting and using your shitty standard? Because that is how you get customers to stop using your standard and, by extension, your companies.

      • De Lancre@lemmy.world · 7 hours ago

        Because that is how you get customers to stop using your standard and, by extension, your companies

        There are no top-spec TVs with DP or DP over Type-C. It’s that simple.

        Don’t like it? Your problem. Wanna vote with your wallet in favor of DP? You can’t. And all that controversy aside, a unified standard isn’t that bad; it’s amazing from a user perspective, tbh. Most hardware works fine with it; the only outlier right now is the Linux AMD driver, specifically due to the RDNA architecture, as I understand it.

    • chillpanzee@lemmy.ml · 1 hour ago

      It probably already has been, and Steam likely already has the specification. They just can’t open source an HDMI 2.1 implementation without consequences.

    • addie@feddit.uk · 9 hours ago

      I’m going to guess it would require kernel support, but certainly graphics card driver support. AMD and Intel: not so difficult, just patch and recompile; NVIDIA’s binary blob, ha ha, fat chance. Stick it in a repo somewhere outside the zone of copyright control, add it to your package manager, boom, done.

      I bet it’s not even much code: a struct or two that map the contents of the 2.1 handshake, and an extension to a switch statement that says what to do when it comes down the wire.
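
      Purely as an illustration of that shape (not the actual AMD code, which is exactly what can’t be published): a made-up sketch loosely based on the capability bits an HDMI 2.1 sink advertises. All names here are invented.

      ```c
      /* Hypothetical sketch only: field names are invented, but HDMI 2.1
       * sinks really do advertise a maximum FRL (Fixed Rate Link) rate
       * that the source uses to pick a link configuration. */
      #include <stdint.h>
      #include <stdbool.h>

      struct hdmi21_sink_caps {
          uint8_t max_frl_rate;  /* 0 = TMDS only, 1..6 = FRL tiers */
          bool    dsc_supported; /* Display Stream Compression      */
          bool    vrr_supported; /* Variable Refresh Rate           */
      };

      /* Usable link bandwidth in Gbit/s for the advertised FRL tier. */
      static int hdmi21_link_gbps(const struct hdmi21_sink_caps *caps)
      {
          switch (caps->max_frl_rate) {
          case 0:  return 18; /* no FRL: fall back to HDMI 2.0 TMDS */
          case 1:  return 9;  /* 3 lanes x 3 Gbit/s  */
          case 2:  return 18; /* 3 lanes x 6 Gbit/s  */
          case 3:  return 24; /* 4 lanes x 6 Gbit/s  */
          case 4:  return 32; /* 4 lanes x 8 Gbit/s  */
          case 5:  return 40; /* 4 lanes x 10 Gbit/s */
          case 6:  return 48; /* 4 lanes x 12 Gbit/s */
          default: return -1; /* unknown tier: treat as unsupported */
          }
      }
      ```

      (The genuinely fiddly part is the FRL link training and DSC handling behind those numbers, but struct-plus-switch is roughly the shape of it.)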

      • ozymandias117@lemmy.world · 7 hours ago

        Nvidia has HDMI 2.1, last I checked.

        They can do it because their driver (even Nvidia “open”) is a proprietary blob.

      • PieMePlenty@lemmy.world · 8 hours ago

        Nouveau? Switch between drivers: Nouveau when you wanna use HDMI 2.1, proprietary Nvidia when you wanna game! It won’t make any sense, but it will piss off the right people :D

  • m-p{3}A · 20 hours ago

    I really hope we’ll see TVs with DisplayPort one day.

    • Kevlar21@piefed.social · 20 hours ago

      I think I’d like DisplayPort over a USB-C connector. It seems like this might be an easier sell too, since the general non-techy populace is already used to everything going to USB-C (thanks EU). Maybe one day we can actually just use the same cable for everything. I realize that not all USB-C cables are equal, but maybe if TVs used USB-C, we’d see more cables supporting power, data, and video.

      • Captain Aggravated@sh.itjust.works · 16 hours ago

        Mildly spicy take: USB is an unrecoverable disaster and we need an entirely unrelated team to invent something entirely new to replace it because we’re never getting this sleeping bag back in the little bag it shipped in.

        USB 1.1 was cool in 1996; it replaced PS/2, RS-232, Centronics parallel, several proprietary connectors, several use cases for SCSI, ADB, Apple’s DIN serial ports, and probably some stuff I’m missing. There was an A plug and a B plug; the main problem was that neither made it very obvious which way up you were supposed to plug them in. Speed was low, but FireWire existed for high-speed connections.

        USB 2.0 was cooler in 2000. The plugs and sockets were identical, the cable was similar but with better shielding, and it was as fast as or faster than FireWire 400. They did start introducing more plugs, like Mini-B and Micro-B, mainly for portable devices. There were also Mini-A and Micro-A; I’ve never personally seen them. That pretty much finished off external SCSI. Higher-speed FireWire was still there if you needed faster than USB, but USB 2.0 did basically everything. To indicate USB 2.0 devices and ports, they made the tongues black, in contrast with USB 1.1’s white tongues. Didn’t really matter in practice; by the time people had devices that needed the speed, USB 2.0 ports were all machines had.

        USB 3.0 took too long to arrive in 2008. The additional speed was sorely needed by then: FireWire was mostly an Apple thing that PCs had but often didn’t use, so PCs mostly didn’t have anything faster than 480 Mbit/s until Obama was sworn in. USB 3.0 is best thought of as a separate tech bolted on top of USB 2.0: they added five more wires, a ground wire and two pairs of high-speed data lines for 5 Gbit/s full duplex. The original four wires are also in the cable for power and 480 Mbit/s half-duplex. They managed to make the A plug and socket entirely forwards and backwards compatible, and the 3B sockets are compatible with 2B plugs (same with micro), but 3B plugs are not compatible with 2B sockets (again, same with micro). Which means we’ve just added two more kinds of cable for people to keep track of! So a typical consumer now likely has a printer with a USB A-B cable, some Bluetooth headset or MP3 player they’re still using that has a Mini-B plug, an Android smartphone with a Micro-B plug, an iPod Touch with a Lightning plug because Apple are special widdle boys and girls with special widdle needs, and now an external hard drive with a 3A to Micro-3B cable, which just looking at it is obviously a hack job.

        Computer manufacturers didn’t help. It’s still common for PCs to have 2.0 ports on them for low-speed peripherals like mice, keyboards, printers, and other sundry HIDs, to leave the 3.0 ports open for high-speed devices. To differentiate these to users, 3.0 ports are supposed to be blue. In my experience, about half of them are black. I own a Dell laptop made in 2014 with one 2.0 and two 3.0 ports; all are black. I own two Fractal Design cases; all of their front USB ports are black. The only blue ports are on my ASRock motherboards. I’ve had that laptop for nearly 12 years now, and I STILL have to examine the pinout to tell which one is the USB 2.0 port. My Fractal cases aren’t that bad because they have no front 2.0, but I built a PC for my uncle that does have front 2.0 and 3.0 ports, and they’re all black.

        USB 3.1 showed up in 2013, alongside the USB-C connector, and the train came entirely off the rails. USB 3.1 offers even higher 10 Gbit/s duplex throughput, maybe on the same cable as 3.0, if the port supports it. How do you tell a 3.1 port from a 3.0 port? They’ll silk-screen on a logo in -8 point font that’ll scratch off in a month; it is otherwise physically identical. Some motherboard manufacturers break with the standard in a good way and color 3.1-capable ports a slightly teal-ish blue. USB A-B cables can carry a USB 3.1 10 Gbit/s signal. But they also introduced the USB-C connector, which is its own thing.

        USB-C was supposed to be the answer to our prayers. It’s almost as small as a Micro-2B connector, it’s reversible like a Lightning port, it can carry a LOT of power for fast charging and even charging laptops, and it’s got not one, but two sets of tx/rx pins, so it can carry high speed USB data in full duplex AND a 4k60hz DisplayPort signal AND good old fashioned 480Mbit/s USB2.0 half-duplex for peripherals. In one wire. That was the dream, anyway.

        Android smartphones moved over to USB-C, a lot of laptops went mostly or entirely USB-C, PCs added one or two… and that’s where we are to this day. Keyboards, mice, wireless dongles, and HIDs still all use USB-A plugs; there doesn’t seem to have been any move at all to migrate. Laptops are now permanently in dongle hell as bespoke ports like HDMI are disappearing, yet monitors and especially televisions are slow to adopt DP over USB-C.

        Also, about half of the USB-C cables on the market are 4-wire USB 2.0 cables: no high-speed data pairs at all, just D+ and D- plus power. They’re phone charging cables; they’re sufficient for plugging a phone into a wall wart or car charger, but they often don’t carry laptop amounts of power, and they don’t carry high-speed data or video.

        USB 3.2 turned up in 2017, added the ability to do two simultaneous 3.1 10 Gbit/s connections in the same cable (a boon for external SSDs), retroactively renamed 3.0 and 3.1 to 3.2 Gen 1 and 3.2 Gen 2 (with the new one being 3.2 Gen 2x2), changed to different case logos to match, pissed in the fireplace, and started jabbering about Thunderbolt. Thunderbolt was an Intel thing to put PCIe lanes over Mini DisplayPort cables, usually for the purpose of connecting external GPUs to laptops but also for general-purpose high-speed data transfer. Well, around this time they decided to transition to USB-C connectors for Thunderbolt.

        Problem: they use a lightning bolt logo to denote a Thunderbolt port. Lightning bolts, or angled squiggly lines, have been used to mean “high speed”, “Power delivery”, “Apple Lightning”, and now “Thunderbolt.”

        “Power delivery”, sometimes but not always denoted by a yellow or orange tongue, means that port delivers power even with the device turned off… or something. And it has nothing to do with the fact that USB-C cables now have chips in them to negotiate with power bricks and devices for how much power can be delivered, and nobody marks the cables as such, so you just have to know what your cables can do. They’re nearly impossible to shop for, and if you want to set up a personal system of “my low-speed cables are black, my high-speed cables are white, my high-power cables are red”, fuck you, your Samsung will come with a white 2.0 cable and nobody makes a high-power red cable.

        USB4 is coming out now, it’s eaten Thunderbolt to gain its power, it’ll be able to do even higher speed links if you get yet another physically indistinguishable cable, and if you hold it upside down it’ll pressure wash your car, but only Gigabyte Aorus motherboards support that feature as of yet.

        The “fistful of different cables to keep track of” is only getting worse as we head into the USB4 era and it needs to be kicked in the head and replaced entirely.

        • PolarKraken@lemmy.dbzer0.com · 54 minutes ago

          This is even worse than I already knew, holy shit.

          4-wire USB 2.0 in a USB-C cable should be fucking illegal. Utterly counterintuitive if one stops short of reading specs and merely “keeps up” by having to use this nonsense all day every day. Learning about that alone put to bed some head-scratching confusion I’ve run into with my own stuff over the years.

          The cabling problems in particular…just woof!

        • muusemuuse@sh.itjust.works · 2 hours ago

          There is a very good reason PCs still have USB 2 ports: they are reliable. They always work. Before an OS loads, getting USB 3 ports working is iffy, and manufacturers rarely fix their implementations in firmware, so if you want to boot that Linux distro on that Dell or HP laptop, start with USB 2.

          USB 3 also causes interference with wireless transmitters

          • hcbxzz@lemmy.world · 41 minutes ago

            USB 3 also causes interference with wireless transmitters

            Oh god, I hate this so much. The proliferation of USB 3.0 devices in the office has made 2.4 GHz keyboards and mice nearly unusable. I’d much rather they change the frequency to fix this than work on whatever Thunderbolt stuff they’re dealing with right now.

            • muusemuuse@sh.itjust.works · 25 minutes ago

              That interference doesn’t span an entire office. Simply place the transmitter in a USB 2 port, or get a short USB 2 extension cord and put the transmitter in that, giving it some distance from the USB 3 port.

              This does get me thinking there’s an attack you could pull on machines with USB 3 ports, where the signals emitted by the ports are picked up further away and used to break an air gap.

        • Ferk@lemmy.ml · 2 hours ago

          I don’t think we would be throwing USB-C away completely, because it even became mandated by law in the EU with the goal of slowing down the rate at which people generate trash by getting new cables and power bricks for every new generation of connectors.

          But I agree that, at the very least, there should be clear labeling mandated by consumer protection laws as well… it’s a nightmare and a scenario that opens the door for a lot of scams… and this is made even worse by the fact that nowadays you can have malicious software running inside the connector of a cable plugged into an extremely capable port without realizing it, messing with your device even though the only thing you wanted was to charge it.

        • webghost0101@sopuli.xyz · 13 hours ago

          The renaming, while still selling it in older packaging for years, has been angering me since it happened.

          Honestly it would not be so much of a problem if things were actually labeled appropriately, with all the actual specs and supported features on the package, but it’s more profitable to keep you guessing (and going for the higher-priced one just in case).

          They do the same thing with USB Bluetooth audio transmitters: they’re “high quality audio” and “PS5 compatible”, but they don’t tell me whether they support aptX or not.

          Also the whole “buy a product clearly pictured with a USB Type-A connector, receive a USB Type-C variant, if you’re lucky with an added adapter” thing.

          • anomnom@sh.itjust.works · 1 hour ago

            Imagine how big the connector shroud would have to be to show all the features in a cable!

            Incidentally the last power cable I bought to replace my failing MacBook power cable is labeled for max wattage.

            • webghost0101@sopuli.xyz · 26 minutes ago

              I have done research into what product I should buy three times, and I am still not sure enough to answer this. That says enough about how confusing the modern tech market is.

              They are definitely proprietary and a competitor to LDAC from Sony. Finding something that supports both Sennheiser and Sony headsets without being overpriced is a nightmare.

          • Captain Aggravated@sh.itjust.works · 11 hours ago

            Yeah, “3.0” and “3.2 Gen 1” mean the same thing. Same with “3.1” and “3.2 Gen 2x1”. I’ve bought computer cases in 2025 with the front IO labeled “3.0”.

            How are normal people supposed to keep track of this?

        • JackbyDev@programming.dev · 11 hours ago

          You end with

          The “fistful of different cables to keep track of” is only getting worse as we head into the USB4 era and it needs to be kicked in the head and replaced entirely.

          But started with

          need an entirely unrelated team to invent something entirely new to replace it

          You want more cables?

          • hcbxzz@lemmy.world · 43 minutes ago

            You want more cables?

            Yes, I absolutely want different cables with different connectors.

            Being able to physically plug two USB-C devices together is not a benefit if the devices can’t actually talk to each other properly over the cable. I’d much rather have three different connectors, each of them guaranteeing protocol compatibility, than USB-C, where for any given device-cable-device combination the behavior is nearly impossible to predict.

          • Captain Aggravated@sh.itjust.works · 10 hours ago

            I remember the original roll-out of USB: things like mice and keyboards very quickly transitioned to USB and came with one of those USB-to-PS/2 dongles for a while for compatibility with older computers, and then we were into the USB era.

            That hasn’t happened with USB-C. Large market segments don’t seem interested in making it happen, it’s not getting better, and in fact it seems to be getting worse. So kick it in the head and start over from scratch.

      • IMALlama@lemmy.world · 19 hours ago

        DisplayPort over USB-C is totally a thing. With things like USB-PD, USB seems to be getting dangerously close to becoming the standard for everything. The cables are a wreck, though, and way too hard for a layperson to tell apart.

          • Chronographs@lemmy.zip · 18 hours ago

            It’s pretty simple and straightforward: all you have to do is buy the cable and a professional cable tester to see what specs it’s actually in compliance with.

            • mic_check_one_two@lemmy.dbzer0.com · 15 hours ago

              Don’t worry, I’m sure when USB 4 releases, they’ll retroactively change the names of USB 3.2 Gen 1 and USB 3.2 Gen 2 to “USB 4.3 Gen 0.01” and “USB 4.3 Gen 0.02” respectively. Then USB 4 will actually be named “USB 4.4 Gen 5” just because.

              And none of the cables will be labeled, nor will they simultaneously support high power delivery and full data speed. We’ll need to wait for “USB 4.4 Gen 4” for that, which is when the old standard will get renamed to “USB 4.4 Gen 3.5” instead.

            • amorpheus@lemmy.world · 17 hours ago

              These days a ~10€ gadget can tell you about the electricity going through a USB connection and what the cable is capable of. I don’t like the idea of basically requiring this to get that knowledge, but considering the limited space on the USB-C plugs I’m not sure anything is likely to improve about their labeling.

          • IMALlama@lemmy.world · 17 hours ago

            Nope! That’s part of the fun sadly. At least if you’re technical you’ll know that not all type-c cables are the same.

      • ramble81@lemmy.zip · 16 hours ago

        I mentioned this in another thread but “DP Alt” (DP over USB-C) is not a default feature of the USB spec and is an optional extension that needs to be added via additional hardware and supported by the device. At that point you’re basically adding in DP with just a different port.

        To that end, it’s still the same situation: TV manufacturers just aren’t adding DP support, regardless of the connector.
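
        As a rough sketch of what “supported by the device” looks like from the Linux side, assuming a kernel that exposes the USB Type-C class in sysfs (the paths and the DisplayPort alt-mode SVID 0xff01 are from memory, so treat this as illustrative rather than authoritative):

        ```c
        /* Illustrative only: list the USB-C alternate modes the kernel has
         * enumerated and flag the DisplayPort one (VESA SVID 0xff01).
         * Assumes the typec class is present under /sys/class/typec. */
        #include <dirent.h>
        #include <stdio.h>
        #include <string.h>

        int main(void)
        {
            const char *base = "/sys/class/typec";
            DIR *d = opendir(base);
            if (!d) {
                perror(base);
                return 1;
            }

            struct dirent *e;
            while ((e = readdir(d)) != NULL) {
                /* Alt modes show up as entries like "port0-partner.0". */
                if (e->d_name[0] == '.' || !strchr(e->d_name, '.'))
                    continue;

                char path[512], svid[16] = "";
                snprintf(path, sizeof(path), "%s/%s/svid", base, e->d_name);
                FILE *f = fopen(path, "r");
                if (!f)
                    continue;
                if (fgets(svid, sizeof(svid), f) && strstr(svid, "ff01"))
                    printf("%s advertises DisplayPort alt mode\n", e->d_name);
                fclose(f);
            }
            closedir(d);
            return 0;
        }
        ```

        If nothing turns up, the port is charging/data only and you’re back to needing a dedicated DP or HDMI output anyway.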

        • [object Object]@lemmy.world · 14 hours ago

          Isn’t USB-C able to carry Thunderbolt, which subsumed DisplayPort at some point? I thought Thunderbolt and DisplayPort were thus merged into whatever the USB standard was at the time.

          • Cooper8@feddit.online · 3 hours ago

            Thunderbolt is a proprietary specification by Intel and Apple, while DisplayPort is an open standard developed by VESA.

            USB-C hardware can carry Thunderbolt or DisplayPort, but the device has to actually implement and conform to those specifications. Most do not.

      • Gamma@beehaw.org · 19 hours ago

        My monitor (TV) supports USB-C and I like it! The flexibility was nice during my single battlestation move.

        • Sir_Kevin@lemmy.dbzer0.com · 10 hours ago

          Projectors have improved dramatically over the years. Any white wall can easily become a 100+inch display that’s good enough for movies.

        • Iheartcheese@lemmy.world · 19 hours ago

          Because we don’t all live in dorm rooms sitting at desks watching TV. Some of us need something besides a 32-in.

          • Captain Aggravated@sh.itjust.works · 15 hours ago

            I have enough back problems to remember a time when a 32-inch television WAS a big screen. My family had a 35-inch Sony Trinitron that weighed as much as a motorcycle. You do not NEED a 50+ inch screen.

        • Dudewitbow@lemmy.zip · 20 hours ago

          Well, it’s not the only option; it’s the only consumer-end option.

          The corporate option is large-format display / digital signage screens.

          • accideath@feddit.org · 20 hours ago

            If you actually give a fuck about image quality beyond size and brightness, digital signage also isn’t really an option. You won’t find many commercial OLED displays, for example.
            The best option for home entertainment, imo, is still a consumer TV that you just never connect to the internet, and use a set-top box with instead.

            • thethunderwolf@lemmy.dbzer0.com · 2 hours ago

              We need a good jailbreak for smart TVs and a TV-oriented Linux distro with Plasma Bigscreen to install on the jailbroken smart TVs

            • Dudewitbow@lemmy.zip · 19 hours ago

              Of course, I’m not suggesting anyone should use them. I’m just saying they exist. The companies that make the good screens are all part of the HDMI Forum, which is effectively why DisplayPort won’t be offered at these screen sizes.

              Basically no company is going to forgo HDMI unless a new upcoming screen company either develops proprietary DisplayPort tech, or VESA spins up monitor production.

              Unironically, the only company I can remotely see doing that is Apple.

            • Dudewitbow@lemmy.zip · 16 hours ago

              Many are HDMI-only, but there are several that have DisplayPort as well. I see a lot of them, since I work in lease-return / e-waste recycling.

  • DonutsRMeh@lemmy.world · 20 hours ago

    That’s why HDMI needs to die and DisplayPort needs to take over. The TV industry is too big for that to happen, of course; they make a shit ton of money off of HDMI.

  • HelloRoot@lemy.lol · 20 hours ago

    but maybe there is an actual explanation for HDMI Forum’s decision that I am missing.

    Licensing money.

      • mkwt@lemmy.world · 19 hours ago

        The license holder is attaching additional terms and conditions that are incompatible with publicly disclosing the driver source code.

        • Fedizen@lemmy.world · 15 hours ago

          It still boggles my mind things can be licensed/copyrighted without being forced to disclose source code. The lack of transparency we’re okay with in society is absolutely unsustainable.

      • dohpaz42@lemmy.world · 19 hours ago

        This wouldn’t work at scale. If Valve paid to license the spec for the Linux kernel, it would have to pay for every person who downloaded the driver, which is far more than the number of people who buy the Steam Machine.

        Unless of course you’re suggesting that the kernel driver for the new spec become closed source.

        • CerebralHawks@lemmy.dbzer0.com · 8 hours ago

          Unfortunately, I am — or rather, I am suggesting that Valve be granted a license they can use.

          I like open source, but not so much that I’d prefer hardware that already exists be held back from a feature because others can’t benefit for free.

          I’d prefer a workaround.

        • legion02@lemmy.world · 17 hours ago

          OK. Fine. Then it’s going to be reverse engineered and everyone will use it anyways and they’ll get nothing.

      • HelloRoot@lemy.lol · 19 hours ago

        If it ever gets open sourced, anybody will just use it without paying.

  • cmnybo@discuss.tchncs.de · 20 hours ago

    AMD should remove the HDMI port from all of their GPUs as a nice F.U. to the HDMI forum. They shouldn’t be paying the licensing fees if they are not allowed to make full use of the hardware.

      • dohpaz42@lemmy.world · 19 hours ago

        There would be an uproar, but, like with the audio jack on phones, people would come around. All it would take is one big enough company to pull it off, and the rest would follow.

        • Captain Aggravated@sh.itjust.works · 15 hours ago

          Apple could remove the audio jack from iPhones because 1. they’re Apple. They could remove the eyes from their customers and 9/10ths of them would stay loyal. And 2. eliminating the headphone jack mostly locked people out of $20-or-less earbuds that might have come free with a previous phone anyway. People grumbled, and carried on using the Bluetooth headphones a lot of them already owned.

          AMD doesn’t have the following that Apple does; they’re the objectively worse but more affordable alternative to Nvidia. Eliminating the HDMI port would lock them out of the HTPC market entirely; anyone who wanted to connect a PC to a TV would find their products impossible to use, at least not without experience-ruining adapter dongles. We’re talking about making machines that cost hundreds or thousands of dollars incompatible.

        • Jarix@lemmy.world · 17 hours ago

          Just bought a new phone that has an audio jack. Some of us refuse to “come around”. They can fit a stylus and an audio jack in this thing. Why did they remove the audio jack again? Not enough room? Bullshit

          • dohpaz42@lemmy.world · 16 hours ago

            The point isn’t whether it’s needed or not. It’s not about space or features. The point is that a major player made a design decision and bucked the system. And while there may still be some phones with audio jacks, the majority of mainstream phones don’t have them. That major player is still successful, and other companies followed suit.

            Can we agree this is what should happen to HDMI, no?

          • dubyakay · 16 hours ago

            Tbh I looked at audio jack internals, and they usually have double the footprint on a PCB compared to what you see from the outside, at least on low-end consumer devices.

            That’s not to say that they couldn’t put something more compact in a high-end device like a smartphone.

            • Jarix@lemmy.world · 16 hours ago

              Okay, but I have a USB-C port, speakers, a stylus, and an audio jack all on the bottom of my new phone. It’s bullshit that they needed the room, as evidenced by this 2025 phone.

              It can also use an SD card. Greedy fucking corporations just want you to repurchase stuff you already have.

              • moopet@sh.itjust.works · 10 hours ago

                There are sane reasons to ditch an audio port. Like, physical connectors are fragile. Why use something that’s so often broken, when you don’t need to? Why include circuitry for something that you don’t need? At this point, physical audio ports are there for backwards compatibility. I’m not saying wired headphones are bad - I have wired headphones - but phones are the least useful place for them.

                • Jarix@lemmy.world · 5 hours ago

                  None of those reasons are the reasons that were stated for removing it from devices by the manufacturers.

    • Seefra 1@lemmy.zip · 20 hours ago

        For now, but DP, and especially DP over USB-C, is gradually becoming more popular for computer hardware. Someone paying 400 euros for a GPU doesn’t mind paying 10 bucks extra for an adapter if they have an HDMI monitor. But most monitors nowadays come with DP anyway.

      • Midnitte@beehaw.org · 11 hours ago

        I’m wondering if Valve might just include a DP-to-HDMI cable with the Steam Machine, since it includes DP.

        Not sure it’s economically viable for device makers to drop HDMI altogether, since TVs will never do that.

        • Hirom@beehaw.org · 8 hours ago

          If they sell two variants of the Steam Machine, they could remove HDMI from one and just put it in the more expensive variant, to reflect the extra headaches and cost that come with HDMI.

          That’d encourage people to get screens with DisplayPort. Many computer screens have DP.

        • addie@feddit.uk · 9 hours ago

          HDMI -> DP might be viable, since DP is ‘simpler’.

          Supporting HDMI means supporting a whole pile of bullshit, however - lots of handshakes. The ‘HDMI splitters’ that you can get on e.g. Alibaba (which also defeat HDCP) are active, powered things, and tend to get a bit expensive for high resolution / refresh rates.

          The Steam Machine is already being closely inspected on price. Adding a fifty-dollar dongle to the package is probably out of the question, especially a ‘spec non-compliant’ one.

  • NeatNit@discuss.tchncs.de · 11 hours ago

    Yes, this isn’t new, but it’s resurfacing thanks to the Steam Machine. Basically (from memory), part of your title is accurate: AMD did create a FOSS driver with HDMI 2.1 which does not violate HDMI Forum requirements, but the HDMI Forum still vetoed it. I don’t know if it would necessarily “disclose the specification” as the first part of your title suggests, but I didn’t dig into the details enough to say for certain.

    Basically a dick move by HDMI. Maybe Valve can throw their weight around on this; we’ll see.