Trying to figure out why plugging an HDMI cable into my video card turns off HDR in Windows 22 and won’t let me turn it back on. Or rather it does, then after a second it turns right back off on its own.

I could swear I did this in the past.

  • Uncle · 34 points · 1 year ago

    Windows 22

    someone has an advanced copy

  • Dudewitbow@lemmy.ml · 13 points · 1 year ago

    Yes. I have an OLED next to a generic 1080p IPS display. When I move a game from the IPS to the OLED, the auto-HDR notification kicks in.

  • Carighan Maconar@lemmy.world · 8 points · 1 year ago

    Yep, my current setup is a 1440p iiyama with HDR in the center, with two 1080p monitors to the sides that can’t do it. I did turn it off because this cheap thing can only do HDR or 165 Hz and I vastly prefer the latter, but it is pretty pretty!

  • uberkalden@lemmy.world · 5 points · 1 year ago

    Does Windows 10 support this? For some reason, when I connect two monitors, HDR turns off. As soon as I go down to one, it turns back on again.

  • countstex@feddit.dk · 4 points · 1 year ago

    Working for me in Windows 10 using a GTX 1660 Ti on an aging Intel i5-3570, so anything from the last 10 years should be fine. I should mention, however, that my HDR monitor is running over DisplayPort rather than HDMI.

  • empireOfLove@lemmy.one · 2 points · edited · 1 year ago

    It should be fine, yes; I can do that on my current AMD rig without issues.

    What graphics card? Monitor specs?

    Depending on the age of your hardware (and cables), HDR may not be supported over HDMI. Real HDR support only arrived with HDMI 2.0a in 2015. If you have an older GPU, such as an Nvidia 900-series card, it won’t have enough HDMI bandwidth for HDR at higher resolutions. Even a too-old cable can break HDR, since it won’t meet the signal requirements of HDMI 2.0a+ and the link may fall back to slower signal rates.
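    The bandwidth point is easy to sanity-check on paper. A minimal sketch, using the published HDMI link rates and the standard CTA-861 4K60 pixel clock; the 80% factor assumes TMDS 8b/10b encoding (back-of-the-envelope arithmetic, not a cable tester):

```python
# Why 4K60 HDR can exceed older HDMI bandwidth (rough numbers).
# Assumption: 8b/10b TMDS encoding, so usable rate is ~80% of raw link rate.
HDMI_EFFECTIVE_GBPS = {
    "HDMI 1.4": 10.2 * 0.8,   # ~8.16 Gbit/s usable
    "HDMI 2.0": 18.0 * 0.8,   # ~14.4 Gbit/s usable
}

def video_gbps(pixel_clock_mhz: float, bits_per_pixel: int) -> float:
    """Uncompressed video data rate in Gbit/s for a pixel clock and bit depth."""
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

# CTA-861 4K60 timing uses a 594 MHz pixel clock (4400 x 2250 total @ 60 Hz).
sdr_8bit  = video_gbps(594, 24)  # 8-bit RGB  -> 14.256 Gbit/s
hdr_10bit = video_gbps(594, 30)  # 10-bit RGB -> 17.82 Gbit/s

for name, cap in HDMI_EFFECTIVE_GBPS.items():
    print(f"{name}: 4K60 8-bit fits: {sdr_8bit <= cap}, "
          f"4K60 10-bit HDR fits: {hdr_10bit <= cap}")
```

    So 4K60 8-bit RGB just squeaks onto HDMI 2.0, but the 10-bit signal HDR needs does not, which is why HDMI 2.0 setups drop to 4:2:0/4:2:2 chroma subsampling for 4K60 HDR, and why HDMI 1.4 gear can’t do it at all.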

    • RanchOnPancakes@lemmy.world (OP) · 1 point · edited · 1 year ago

      Nvidia 3070. Primary display is an Alienware AW3423DWF over DisplayPort; the second display is literally a 4K TV over HDMI.

      I don’t care about HDR on the TV; I just want to have it on my gaming monitor while also keeping the TV plugged in.

      • empireOfLove@lemmy.one · 1 point · edited · 1 year ago

        Hmm. Could be a Windows problem. Even setting the HDMI connection aside, and even if that cable only meets an older HDR standard, the DP connection should easily support all forms of HDR, since DisplayPort is more cable-agnostic than HDMI. Windows 11 is a bit of a buggy shit…

        Have you tried backing your gaming monitor off from 165 Hz down to something like 60 and seeing if it allows HDR then? You could be hitting some unexpected cable limit that makes it back off from HDR.
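        For what it’s worth, the 60 Hz suggestion is easy to sanity-check on paper too. The sketch below assumes a DP 1.4 HBR3 link (4 lanes × 8.1 Gbit/s raw, 8b/10b encoding → ~25.92 Gbit/s usable) and guessed CVT-reduced-blanking overhead; the 80-pixel/60-line blanking figures are assumptions, and the AW3423DWF may well use DSC compression in practice, which changes the math entirely:

```python
# Rough check: does a given mode fit DisplayPort 1.4 without DSC?
# Assumptions: HBR3 link (4 lanes x 8.1 Gbit/s raw), 8b/10b encoding,
# and guessed reduced-blanking overhead (80 px horizontal / 60 lines vertical).
DP14_EFFECTIVE_GBPS = 4 * 8.1 * 0.8  # ~25.92 Gbit/s usable

def mode_gbps(width, height, hz, bits_per_pixel, hblank=80, vblank=60):
    """Approximate uncompressed video data rate in Gbit/s, blanking included."""
    return (width + hblank) * (height + vblank) * hz * bits_per_pixel / 1e9

# 3440x1440 10-bit RGB (30 bpp) at the two refresh rates in question:
at_165 = mode_gbps(3440, 1440, 165, 30)  # right at / slightly over the limit
at_60  = mode_gbps(3440, 1440, 60, 30)   # fits with lots of headroom
print(f"165 Hz: {at_165:.2f} Gbit/s, fits: {at_165 <= DP14_EFFECTIVE_GBPS}")
print(f" 60 Hz: {at_60:.2f} Gbit/s, fits: {at_60 <= DP14_EFFECTIVE_GBPS}")
```

        Under these assumptions, 3440×1440 at 165 Hz with 10-bit color sits right at or just past the uncompressed DP 1.4 limit, while 60 Hz fits with room to spare, which is exactly the kind of edge where a marginal link would quietly drop HDR.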

  • FeelzGoodMan420@eviltoast.org · 1 point · edited · 1 year ago

    Windows still has some annoying-as-shit multi-monitor bugs when it comes to HDR. It depends on which monitors and which drivers. My personal advice is to have only your main HDR monitor active when you want to use HDR. Otherwise it doesn’t matter.