In response to "Wayland Breaks Your Bad Software"

I say that the technical merits are irrelevant because I don’t believe that they’re a major factor any more in most people moving or not moving to Wayland.

With only a slight amount of generalization, none of these people will be moved by Wayland’s technical merits. The energetic people who could be persuaded by technical merits to go through switching desktop environments or in some cases replacing hardware (or accepting limited features) have mostly moved to Wayland already. The people who remain on X are there either because they don’t want to rebuild their desktop environment, they don’t want to do without features and performance they currently have, or their Linux distribution doesn’t think their desktop should switch to Wayland yet.

    • WuTang@lemmy.ninja · 10 months ago

      It should not. The first reason should be that it is not open source and is 100% the cause of X black screens on upgrades.

      AMD plays the game (no pun intended), so let’s go with it. If you need Nvidia for CUDA for ML, standards are on the way that will allow any GPU to be used.

      • TechieDamien@lemmy.ml · 10 months ago

        I already do ML on AMD, and it works great. There are usually a few extra steps that need doing, as binaries aren’t always available, but that, too, will improve with time.
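
        A minimal sketch of what that looks like in practice, assuming the ROCm build of PyTorch is installed (an assumption; ROCm builds reuse the regular torch.cuda device API, so ordinary training code runs unchanged):

            # Check that a ROCm-enabled PyTorch build can see the AMD GPU.
            # Assumes PyTorch came from the ROCm wheel index rather than the
            # default CUDA builds (one of the "extra steps" mentioned above).
            import torch

            # ROCm builds expose AMD GPUs through the same torch.cuda API.
            if torch.cuda.is_available():
                print(torch.cuda.get_device_name(0))  # e.g. an AMD Radeon card

            # From here, ordinary training code needs no changes.
            device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
            model = torch.nn.Linear(16, 1).to(device)
            x = torch.randn(32, 16, device=device)
            print(model(x).shape)  # torch.Size([32, 1])

        If is_available() returns False, the usual culprit is having installed a default (CUDA-only) wheel instead of the ROCm one.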

    • woelkchen@lemmy.world · 10 months ago

      Only reason I’m not using it is Nvidia.

      Don’t buy Nvidia GPUs. NVidia’s broken Linux support has been a well-known fact for at least a decade.

      • Ineocla@lemmy.ml · 10 months ago

        For gaming, AMD is as good as NVIDIA or even better. For anything else, though, it’s a dumpster fire. AMF still isn’t on par with NVENC, ROCm is pure garbage, and AMD cards are basically useless for any compute task.

        • woelkchen@lemmy.world · 10 months ago

          For anything else, though, it’s a dumpster fire. AMF still isn’t on par with NVENC, ROCm is pure garbage, and AMD cards are basically useless for any compute task.

          Those specific compute tasks are not “anything else”. Pretty much every everyday task done by ordinary people works better on GPUs with proper Mesa drivers than on a GeForce, and there is absolutely no reason you need to output your graphics from the NVidia GPU anyway. Do your compute tasks on dedicated Nvidia hardware if you have to. Even notebooks come with AMD and Intel iGPUs that are perfectly fine for non-gaming graphics output.
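
          A rough sketch of that split, assuming a CUDA-capable PyTorch build and that the dedicated NVidia card is device 0 (both are assumptions; the index varies per machine): the desktop stays on the iGPU, while the compute job pins itself to the discrete GPU.

              # Sketch: leave graphics to the iGPU, send compute to the dedicated card.
              import os
              os.environ.setdefault("CUDA_VISIBLE_DEVICES", "0")  # machine-specific index

              import torch

              device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
              if device.type == "cuda":
                  print("compute device:", torch.cuda.get_device_name(0))

              # The display server never needs this GPU; only the compute job uses it.
              x = torch.randn(1024, 1024, device=device)
              print((x @ x).sum().item())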

          • Ineocla@lemmy.ml · 10 months ago

            Yep, you’re right. Mesa covers almost everything. But streaming and recording, photo and video editing, 3D rendering, AI training, etc. aren’t “specific compute tasks”; they represent the vast majority of the market, with billions of dollars in revenue. And no, the solution isn’t to use another GPU. It’s for AMD to make their software stack actually usable.

            • woelkchen@lemmy.world · 10 months ago

              photo and video editing

              Which photo editor for Linux even supports special NVidia features? It’s not like Linux has Photoshop or something like that – there aren’t that many photo editors on Linux. It’s one of the areas Windows people complain about most loudly when it comes to Linux. Seems to me you conflated Windows with Linux when hyping Nvidia above everything.

              AI training, etc. aren’t “specific compute tasks”

              AI training isn’t a specific compute task? What is it then? Why do you train your AI on the GPU that outputs your regular graphics and not on dedicated hardware, like sane people do?