30 votes

AMD Radeon RX 9070 XT reviews and launch

42 comments

  1. [25]
    Venko
    Link

    I failed to pick up an NVIDIA 5070 Ti for the UK MRRP. Since then I've been waiting to see what AMD's offering is and, to a layman like me, it looks like a very attractive option. I'm strongly considering picking one up at 2PM tomorrow when it supposedly launches and then building a new gaming PC around it.

    I don't have any experience with AMD though. There are many manufacturers and it's difficult for me to understand what the differences are between the MRRP models. In the UK the MRRP models that I've seen are:

    • ASRock Radeon RX 9070 XT Steel Legend
    • PowerColor Radeon RX 9070 XT Reaper
    • Sapphire Pulse Radeon RX 9070 XT Gaming
    11 votes
    1. [6]
      trim
    (edited)
      Link Parent

      Let me know how it goes. I've been on AMD graphics since forever, and I'm looking for an upgrade to my RX 6600, but I only have an 850W PSU - though that is a recent upgrade too, along with my 5800X3D CPU.

      Nearly bought a 7800XT recently for Wilds but didn't.

      Edit: FWIW I've always used PowerColor cards

      Edit2: I don't like the new naming. It's confusing

      8 votes
      1. [5]
        Sodliddesu
        Link Parent

        "I don't like the new naming. It's confusing"

        They're likely just trying to 'out-number' Nvidia. What do you want, the 5070 or the 9070?

        4 votes
        1. jcd
          Link Parent

          Their next gen will change the architecture (UDNA vs RDNA). Might as well end this round on a 9xxx, then start fresh next year.

          7 votes
        2. [2]
          stu2b50
          Link Parent

          They’re definitely trying to out-number Nvidia. You can tell because their numbering scheme has been “drifting”. It’s now coincidentally such that it aligns with Nvidia’s naming.

          2 votes
          1. creesch
            Link Parent

            Afaik they have been fairly transparent about this being the purpose.

            2 votes
        3. streblo
          Link Parent

          The cowards should have brought back the 9800 label for the 9x series instead of some lame attempt at mimicking the competition.

    2. [10]
      streblo
      Link Parent

      I’ve bought only Sapphire cards for the last decade and a half. They’re the best AMD AIB partner imo and I’ve not had a bad experience yet.

      8 votes
      1. [9]
        Akir
        Link Parent

        I've bought a lot of AMD and ATi cards in the past, and Sapphire was pretty much the only brand I bought that didn't eventually break down. The exception was the last one I bought, which was from Gigabyte. That was a really great card too, except that the cooler was extremely heavy and I worried it would break the PCIe slot.

        2 votes
        1. [7]
          fuzzy
          Link Parent

          In the future consider buying a GPU arm or bracket to help carry some of the weight. Someone I know built a PC and was having random crashing issues … until he braced his GPU. No more sag, no more crashes.

          4 votes
          1. [5]
            teaearlgraycold
            Link Parent

            I’ve got one that attaches to the motherboard that I like. Seems like a more sturdy approach than the car jack style that goes between the bottom of the case and the GPU.

            1. [2]
              hungariantoast
              Link Parent

              I call that a "wooden dowel"

              2 votes
              1. vord
                Link Parent

                I added some blue tack to my cut-to-fit chopstick for some extra stability.

                2 votes
            2. [2]
              fuzzy
              Link Parent

              Attaches to your motherboard…how?

              I’ve been using a jack-style one for the past few years and it’s been solid. Though obviously it wouldn’t be ideal if I regularly transported or otherwise jostled my computer.

              1 vote
              1. teaearlgraycold
                Link Parent

                It screws into the standoffs through the motherboard.

                2 votes
          2. Akir
            Link Parent

            It worked fine even with a bit of sag. But there was really no reason why it should have been that heavy. The shroud over the fans was made of weirdly thick metal.

        2. teaearlgraycold
          Link Parent

          Sapphire was like the EVGA of AMD back when I had AMD cards.

          2 votes
    3. BeardyHat
      Link Parent

      I switched to AMD a couple of years ago. Currently running an ASRock 6700XT and an ASRock 6650 XT in an eGPU enclosure and both have been good to me.

      I hear PowerColor and Sapphire are supposed to be the higher end in the AMD space.

      3 votes
    4. ButteredToast
      Link Parent

      Both Sapphire Nitro+ cards I’ve owned in the past several years (5700XT and 6900XT) have been great. Quiet, great performance, generally do what they’re supposed to with minimal fuss.

      My main gaming tower has an EVGA 3080Ti FTW3 which is one of the best Nvidia cards of that generation and I think I would’ve been just as happy with the 6900XT.

      That said, I’m not the biggest proponent of the sorts of bells and whistles people have been buying Nvidia cards for recently (raytracing, fake frames, CUDA) so YMMV.

      3 votes
    5. TheJorro
      Link Parent

      They'll offer different coolers and video port connections. Some may also have different power requirements (due to the cooler or any factory overclocking). Examine the specs and choose what works best for you, or is in your price/availability zone. They're all reputable brands at least.

      2 votes
    6. [5]
      Grzmot
      Link Parent

      While Sapphire cards are generally the most recommended ones, I believe they are one of the few companies that went with the 12VHPWR connector for their AMD cards too, which probably isn't a good thing considering how many problems it's causing. I'd look for a card that uses the standard 8-pin connectors.

      1. [3]
        teaearlgraycold
        Link Parent

        This card doesn’t use as much power as the Nvidia cards that have caused melted cables. So I wouldn’t be too worried. It’s also a rare issue in practice. Just make sure the cable is fully seated.

        1. Grzmot
          Link Parent

          You're right, plus there's a chance that Sapphire actually put a load balancer onto the PCB, but the issues with 12VHPWR have gone beyond poorly seated cables, especially with the new 5090: https://www.youtube.com/watch?v=oB75fEt7tH0

          It's not just making sure that it's seated correctly.

          I found this (didn't watch it all though): https://www.youtube.com/watch?v=2HjnByG7AXY and it seems that Sapphire is not actively load balancing anything.

        2. CptBluebear
          Link Parent

          It's that the connector is pushing too much current through individual pins rather than spreading it evenly, so a pin rated for 16A ends up carrying 20A and heating up until it melts. Roughly. It was mainly a seating issue on the 40xx series, but it has been reported on cards with correctly seated cables in the 50xx series.

          Grzmot's linked video from der8auer outlines this perfectly.
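
          A rough back-of-the-envelope sketch of those per-pin numbers, in Python, under assumed figures (a 12V rail, six 12V current-carrying pins in a 12VHPWR plug, roughly 575W board power for a 5090-class card and roughly 304W for the 9070 XT; these are approximations, not measurements):

          # Back-of-the-envelope per-pin current on a 12VHPWR connector.
          # Assumed figures only: 12V rail, six 12V current-carrying pins.
          RAIL_VOLTAGE = 12.0  # volts
          CURRENT_PINS = 6     # 12V pins in a 12VHPWR plug

          def per_pin_current(board_power_watts, sharing_pins=CURRENT_PINS):
              """Current through each pin if the load is shared across sharing_pins pins."""
              total_current = board_power_watts / RAIL_VOLTAGE  # amps
              return total_current / sharing_pins

          print(f"~575 W card, evenly balanced: {per_pin_current(575):.1f} A per pin")
          print(f"~575 W card, load on 2 pins:  {per_pin_current(575, 2):.1f} A per pin")
          print(f"~304 W card, evenly balanced: {per_pin_current(304):.1f} A per pin")

          Evenly shared, roughly 575W works out to about 8A per pin; if most of the load ends up on one or two pins, individual pins can see well over 20A, which lines up with the melting reports. A roughly 300W card has far less total current to misbalance in the first place.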

  2. [3]
    AugustusFerdinand
    Link

    I've been rocking AMD hardware for years (CPU and GPU) and have been very happy. Intel's and Nvidia's prices for the performance have never sat well with me, and their business practices even less so.
    Are they usually the fastest overall? Sure.
    Are they twice as fast for twice (or in this case thrice) the price? Nope and never have been.

    I also don't game in 4k, nor see the point in doing so, as 1440p at 120Hz has yet to leave me feeling like I'm missing something (and I have gamed in 4k with no noticeable difference; we're well within the realm of diminishing returns), nor have any intention of ever buying a single component of my PC that costs as much as my mortgage.
    If I wasn't perfectly content with, and had never had any issues with, my current 6900XTXH, I'd upgrade to this and hand the 6900XTXH down to my wife (and subsequently hand down the Vega 64 she currently uses to someone else).

    7 votes
    1. [2]
      ButteredToast
      Link Parent

      I don’t really see the point of 4k gaming at desktop monitor sizes either. In that situation I’m generally too involved with the game to gain much benefit from the increased PPI… it’s not like text-heavy applications such as code, where sharper characters can make a substantial difference.

      To me the place where 4k gaming starts to become more of a consideration is with large (75”+) TVs, but even then it depends on how far away one sits.

      4 votes
      1. vord
        Link Parent

        I've noticed a fair difference at 4k for FPS games, where that higher resolution translates to more pixels for far away objects.

        With a well-tuned mouse, it makes it easier to make longer shots.

        4 votes
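
        As a rough illustration of the "more pixels for far-away objects" point above, here is a small Python sketch under purely illustrative assumptions (a 1.8m tall target, a 90° horizontal FOV, a 16:9 aspect ratio, and a simple pinhole projection for a target near the centre of the screen):

        import math

        # Approximate on-screen height, in pixels, of a distant target.
        # Assumptions are illustrative only: 1.8 m target, 90-degree horizontal
        # FOV, 16:9 aspect ratio, pinhole projection near the screen centre.
        def target_height_px(distance_m, vertical_res, target_height_m=1.8,
                             horizontal_fov_deg=90.0, aspect=16 / 9):
            half_h_tan = math.tan(math.radians(horizontal_fov_deg) / 2)
            half_v_tan = half_h_tan / aspect  # tan of the vertical half-FOV, from the aspect ratio
            return vertical_res * (target_height_m / distance_m) / (2 * half_v_tan)

        for name, rows in (("1440p", 1440), ("4K", 2160)):
            px = target_height_px(distance_m=300, vertical_res=rows)
            print(f"{name}: a 1.8 m target at 300 m covers ~{px:.1f} px vertically")

        Under those assumptions the same distant target covers roughly 8 pixels of height at 1440p versus roughly 12 at 4K, which is the extra long-range detail being described.
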
  3. Pavouk106
    Link

    It seems I have my evening video program set up for me. I'm really curious whether AMD took the opportunity, and I hope they did.

    2 votes
  4. [4]
    scrambo
    Link

    I think I'll try to snag one of these tomorrow morning. I'm currently rocking a GTX 970, so it's about time for an upgrade. Maybe I'll even use it to run some local AI model, who knows. All I know is that Rocket League and Helldivers are going to look and run SO much better if I can get this. I might even get into VR for Dirt Rally!

    2 votes
    1. vord
      Link Parent

      I did finally turn in my 970 for a 7800XT. Pretty happy with it so far.

      3 votes
    2. [2]
      sqew
      Link Parent

      I’m also still running a GTX 970! Thinking about getting one of these or something last gen from AMD, but these prices sting compared to the MSRPs in the 970’s era…

      1 vote
      1. Venko
        Link Parent

        I'm also running a 970 still. Unfortunately, I wasn't able to buy a 9070 XT before the stock vanished in all UK stores, though, so I'll be sticking with my Xbox Series X for gaming for the foreseeable future.

        1 vote
  5. Nihilego
    Link

    This makes the 4070 Ti that I spent around $700 on (before taxes) look stupid, but I couldn’t guarantee either global availability or competitiveness for the AMD cards, so I went with a bad-value GPU because it was the most “”reasonable”” GPU for the price that I saw. The GPU market has a chance of recovery with these cards.

    Though this makes it more exciting to think about what kind of upgrade for the price I’d see after 2 or 3 generations.

    2 votes
  6. [6]
    Wafik
    Link

    My g-sync monitor is the only thing keeping me loyal at this point.

    Correct me if I am wrong, but my monitor aside, an AMD GPU is just as plug and play as Nvidia, right? Like, would I need to change anything else on my Intel PC if I switch in the future? I have always been Intel/Nvidia out of pure laziness/comfort level.

    1. [2]
      creesch
      Link Parent

      You probably do want to get DDU (Display Driver Uninstaller) and make sure the Nvidia drivers are truly gone once you've installed an AMD card.

      Basically, boot into safe mode, run it, and you're good to go.

      2 votes
      1. Wafik
        Link Parent

        Appreciate the info and suggestions!

    2. [3]
      vord
      Link Parent

      A lot of newer G-Sync monitors also support FreeSync; AMD has a list.

      1 vote
      1. [2]
        Wafik
        Link Parent

        My monitor is pretty old so I doubt it, but I'll confirm. I was thinking this would be a good excuse to upgrade my monitor as well.

        2 votes
        1. sqew
          Link Parent

          You can get pretty great monitors for not too much these days, too. I picked up a 27” 1440p IPS 144Hz monitor from Acer a few years ago for ~$200, and its current version still seems to be around that price.

          1 vote
  7. 0x29A
    Link

    While I have a new enough Nvidia card that I won't be upgrading for a while, switching to AMD is the definite path for me going forward. I switched to an AMD CPU recently and want to do the same for the GPU once budget allows for it. Probably one or two more hardware generations in the future, I'll make the leap.

  8. feylec
    Link

    I’m considering upgrading my RTX 3070 and wouldn’t mind popping over to AMD’s higher end. Is this new card AMD’s new top-of-the-line card?