• notfromhere@lemmy.ml
      2 months ago

      If they released some cards with a boatload of VRAM that worked great on ROCm/Vulkan for inference, they might be able to take back some market share.

      • OpticalMoose@discuss.tchncs.de
        2 months ago

        That would be nice. And maybe a bridge for sharing memory between cards, since Nvidia got rid of theirs (NVLink on consumer cards).

        I’d love a 24 GB card, under 270 mm long, without the cursed power connector.