Also sourced from Chiphell, the Radeon RX 9070 XT is expected to command a price tag between $479 for AMD’s reference card and roughly $549 for AIB units, depending on the exact model. At those prices, the Radeon RX 9070 XT undercuts the RTX 5070, which will start at $549, while offering 16 GB of VRAM, albeit of the older GDDR6 spec.

  • Poopfeast420@discuss.tchncs.de · 1 day ago

    They need to bring a lot more value than the 5070, otherwise only Linux users and fanboys will buy these. A $50 discount is not enough, and I don’t know if the rumored $70 will do it either.

    • filister@lemmy.world · 22 hours ago

      AMD is notorious for screwing up their releases. They usually announce prices very close to NVIDIA’s, reviewers give them subpar reviews, and after a couple of months they start lowering prices. By then it’s usually too late: very few reviewers update their verdicts, so people still think the cards are crap even though the now far more competitive price makes the whole package a bargain.

      I don’t know why they do it that way, and I can only imagine that this hurts their brand and sales.

    • TechAnon@lemm.ee · 11 hours ago

      I like the 16 GB of memory vs. the 12 GB on the 5070, though it’s somewhat slower memory. Hard to say how much that will matter in the real world. I’m also dual-booting Linux, so AMD wins there.

      For me it will come down to performance comparisons once they are released. Huge note: I don’t really care much about ray tracing. It’s cool tech, but not a big enough difference for me in the vast majority of games.

      • DdCno1@beehaw.org · 1 hour ago

        You’re still seeing ray tracing as a graphics option instead of what it actually is: Something that makes game development considerably easier while at the same time dramatically improving lighting - provided it replaces rasterized graphics completely. Lighting levels the old-fashioned way is a royal pain in the butt, time- and labor-intensive, slow and error-prone. The rendering pipelines required to pull it off convincingly are a rat’s nest of shortcuts and arcane magic compared to the elegant simplicity of ray tracing.
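
        To make that “elegant simplicity” concrete, here’s a toy C++ sketch (purely illustrative, made-up scene values, nothing like a production engine): the same ray/sphere intersection routine answers both “what does the camera see?” and “can this point see the light?”, so shadows and direct lighting fall out of one primitive operation instead of a pile of shadow maps and baked lightmaps.

        // Toy example: one intersection routine gives both camera visibility
        // and shadows, so direct lighting needs no shadow maps or lightmaps.
        // All scene values here are arbitrary, for illustration only.
        #include <algorithm>
        #include <cmath>
        #include <cstdio>
        #include <optional>

        struct Vec { double x, y, z; };
        static Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
        static Vec add(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
        static Vec scale(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
        static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
        static Vec normalize(Vec a) { return scale(a, 1.0 / std::sqrt(dot(a, a))); }

        struct Sphere { Vec center; double radius; };

        // Distance along a normalized ray to the sphere, or nothing on a miss.
        static std::optional<double> intersect(Vec origin, Vec dir, const Sphere& s) {
            Vec oc = sub(origin, s.center);
            double b = 2.0 * dot(dir, oc);
            double c = dot(oc, oc) - s.radius * s.radius;
            double disc = b * b - 4.0 * c;   // dir is unit length, so a == 1
            if (disc < 0.0) return std::nullopt;
            double t = (-b - std::sqrt(disc)) / 2.0;
            if (t > 1e-4) return t;          // small epsilon avoids self-hits
            return std::nullopt;
        }

        int main() {
            Sphere sphere{{0, 0, -3}, 1.0};
            Vec light{2, 2, 0};
            Vec origin{0, 0, 0}, dir{0, 0, -1};  // one camera ray, straight ahead

            auto t = intersect(origin, dir, sphere);
            if (!t) { std::puts("miss"); return 0; }

            Vec hit = add(origin, scale(dir, *t));
            Vec normal = normalize(sub(hit, sphere.center));

            // Shadow ray: reuse the exact same intersection test toward the light.
            Vec to_light = sub(light, hit);
            double dist = std::sqrt(dot(to_light, to_light));
            to_light = scale(to_light, 1.0 / dist);
            auto blocker = intersect(hit, to_light, sphere);
            double brightness = (blocker && *blocker < dist)
                                    ? 0.0                                    // in shadow
                                    : std::max(0.0, dot(normal, to_light));  // Lambert term
            std::printf("brightness: %.3f\n", brightness);
            return 0;
        }

        That single intersect() call is the whole trick; a real renderer just scales it up with acceleration structures and many more rays per pixel.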

        In other words: It doesn’t matter that you don’t care about it, because in a few short years, the vast majority of 3D games will make use of it. The necessary install base of RT-capable GPUs and consoles is already there if you look at the Steam hardware survey, the PS5, Xbox Series and soon Switch 2. Hell, even phones are already shipping with GPUs that can do it at least a little.

        Game developers have been waiting for this tech for decades, as has anyone who has ever gotten a taste of actually working with or otherwise experiencing it since the 1980s.

        My personal “this is the future” moment was with the groundbreaking real-time ray tracing demo heaven seven from the year 2000:

        https://pouet.net/prod.php?which=5

        I was expecting it to happen much sooner though, by the mid to late 2000s at the latest, but rasterized graphics and the hardware that runs it were improving at a much faster pace. This demo runs in software, entirely on the CPU, which obviously had its limitations. I got another delicious taste of near real-time RT with Nvidia’s iRay rendering engine in the early 2010s, which could churn out complex scenes with PBR materials (instead of the simple, barely textured geometric shapes of heaven seven) at a rate of just a few seconds per frame on a decent GPU with CUDA, and even in real time on a top-of-the-line card. Even running entirely on the CPU, this engine was as fast as a conventional CPU rasterizer. I would sometimes preach about how this was a stepping stone towards this tech appearing in games, but people rarely believed me back then.

      • 9488fcea02a9@sh.itjust.works · 11 hours ago

        I used to think VRAM wasn’t a big deal, but my 10 GB 3080 is already useless for some newer games, namely Indiana Jones.

        Not literally unplayable, but severely hamstrung, which is not OK for a high-end card that’s only 2 generations old (soon to be 3, I guess).

    • SolOrion@sh.itjust.works · 1 day ago

      Realistically, people just won’t be buying them because “AMD drivers bad” or “I’ve always had Nvidia.”

      • swankypantsu@lemmy.world · 1 day ago

        I’ve had 2 cards in the past 10 years. One was AMD, and it was a nightmare to find the one driver that didn’t crash my PC at least twice a month. The other card was NVIDIA, and I never had a crash on any driver ever.

        Maybe it’s just bad luck but I won’t buy AMD again after that crap and probably won’t go green either. I’ll keep waiting to see how Intel advances.

        • SolOrion@sh.itjust.works · 1 day ago

          Fair enough. I’m not necessarily advocating for people to buy AMD cards, just that realistically the price is pretty irrelevant for a lot of people buying GPUs. They’re going to read AMD and bow out.

          I’m also very interested in how Intel shapes up, though.