31 votes

Nvidia announces four next-gen RTX 5000 GPUs

61 comments

  1. [18]
    teaearlgraycold
    Link
    $2000 for a gaming card? 16 GB on the 5080?? As usual, the best model I could see myself recommending will probably be the 80 Ti variant. I’ve seen quite a lot of use of the 4090 for AI workstations. In those environments it’s actually a good deal. But the 90 class cards don’t make much sense for gaming outside of epeen and ridiculous setups.

    20 votes
    1. [13]
      Akir
      Link Parent
      Buying a super high end graphics card always sounded dumb to me. The law of diminishing returns has been strong for quite a long time. But those returns have been diminishing at an astonishing rate. My 2080 Super is still able to play everything I have been able to throw at it reasonably well, short of some demos that are designed to push limits. The pace at which graphics have been improving has been getting slower and slower; I honestly haven't seen any examples where modern games on screaming PCs are notably better than a PS5 version without some serious pixel peeping.

      Nvidia has made more than enough money. Maybe we can use our hands instead of shovels and wheelbarrows when we're throwing our money at them?

      16 votes
      1. [11]
        stu2b50
        Link Parent
        It’s just a hobby for enthusiasts. $2k isn’t a whole lot of money. Car people spend way more on their hobby. I’m about to pay $4k to Apple to have the privilege of having 32gb of ram. I paid $6000 for a lens last year.

        You’re definitely getting marginal returns on your money, but it is at least a demonstrably measurable benefit.

        12 votes
        1. Promonk
          Link Parent
          You and I occupy very different tax brackets.

          23 votes
        2. [7]
          asparagus_p
          Link Parent
          My issue with this is that it's become normalized to think of high prices as just part and parcel of enjoying a hobby. Obviously, some technology is just expensive, but I'm not sure comparing PC components to car parts is equivalent. I'm not familiar with the ins and outs of PC parts manufacturing, but I remember when a high-end graphics card was a few hundred dollars, and my concern is that the huge leap in price is not simply because the raw materials/manufacturing are as expensive as the retail price suggests. My hunch is that these companies know how much consumers are willing to pay because our expectations have changed. So a profit margin of 20% has now become 50%, for example. Yes, that's business, supply and demand, etc., but consumers can and should still try and fight against it rather than just saying, "it's not a whole lot of money if you're an enthusiast". It is a whole lot of money, and we should still be fighting for good value. Let's face it, there's very little competition in this space.

          12 votes
          1. [4]
            stu2b50
            Link Parent
            my concern is that the huge leap in price is not simply because the raw materials/manufacturing are as expensive as the retail price suggests. My hunch is that these companies know how much consumers are willing to pay because our expectations have changed.

            That’s not a hunch, or a conspiracy, that’s like, exactly how things are supposed to be priced in a market economy. Supply and demand - if demand increases, prices go up, even if supply costs stay constant.

            Everyone should simply look at the offerings, and make a decision based on how much utility they would personally gain from the purchase. In that respect, the 5090 is not the right card for most people, and that’s fine. It’s nothing more than cost and benefit.

            Perhaps the price depreciation of GPU performance has largely ended, but so has the demand for GPU performance (outside of compute tasks like MLP training).

            It used to be that if you weren’t getting a high tier GPU, you wouldn’t be able to run cutting edge games like Crysis at all. But that isn’t really the case anymore.

            6 votes
            1. [3]
              asparagus_p
              Link Parent
              Supply and demand - if demand increases, prices go up, even if supply costs stay constant.

              Yes, of course, and I mentioned this in my comment. But it shouldn't be viewed as the system working perfectly if the market has become skewed. There is barely any competition with just 2 major players, and 1 also-ran. And of the two major players, 1 is far ahead in market share. So Nvidia know they can charge these very high prices.

              I know this is the way the market economy works, but gaming is not just an enthusiast hobby. It's mainstream, and more casual users are paying very high prices because they don't have much choice. We need more choice, and this is why I'm glad Intel are trying to break into it. But Nvidia has such a stranglehold, it's not looking good.

              3 votes
              1. cdb
                Link Parent
                casual users are paying very high prices because they don't have much choice

                There is plenty of choice out there. You can buy a very capable gaming PC for $500. The entire thing, not just the video card. No one needs a 5090 or even a 5070 to enjoy PC gaming. Top end components have never been a good value, but luckily they also have never been necessary for entering the hobby.

                The fact that there are more expensive alternatives doesn't signal that the system is rotten. What's rotten is the idea out there that if you don't have the best equipment, then your system sucks.

                5 votes
              2. stu2b50
                Link Parent
                "AI" is a different story, as it seems more and more a strategic resource. But gaming is ultimately a luxury market. No one needs to game, and that has large implications on how the demand curve...

                "AI" is a different story, as it seems more and more a strategic resource. But gaming is ultimately a luxury market. No one needs to game, and that has large implications on how the demand curve works.

                If your $300, which used to get you a 70 class card, now gets you a 60 class card... boo hoo? You'll have to lower some sliders?

                Not to mention that Intel entered the market precisely because it seems that demand far surpasses supply in the gaming market right now. There's a market gap, and beleaguered Intel can make a bet in the lower end. That's price signals in action. Not to mention Apple also entered the market, in their own way.

                The gaming market is broad, as well. GPUs aren't simply in competition with other GPUs, they're in competition with all-in-ones like the PS5, or the Switch, or the Steam deck.

                All in all, I don't think consumers need to be more strategic than simply evaluating how much they need, and how much they're willing to spend. If the 5090 is the right card for you, because you're an enthusiast and you want to drive your 4k screen with maximum ray tracing, that's fine - pull the trigger. In the end, the shifts of the gaming market don't matter that much.

          2. raze2012
            Link Parent
            My issue with this is that it's become normalized to think of high prices as just part and parcel of enjoying a hobby

            That's technically always been true. Gaming has simply been an exception outside of the ones who would invest $3000 in a rig, even back when GPUs weren't $500+. Most sports you get into as a hobby can get expensive, especially if you need to rent out a field/court. Clothing can easily hit thousands, following pro sports has super expensive merch and live games go into the hundreds per game. Making a sizeable garden requires land.

            This is more about simple corporate greed than a notion that all hobbies people go deep into are cheap. But if people keep buying...

            3 votes
          3. skybrian
            (edited )
            Link Parent
            Some historical perspective: if you go back far enough, personal computing was a very high-priced hobby, particularly if you bought a Mac. A Mac Plus (with a black and white screen and no hard drive) was $2600 when introduced, corresponding to $7220 in today’s dollars.

            Nowadays a Mac Mini is a very powerful machine for $600 new. $2k for a graphics card really is a lot in comparison, but that’s because there actually is low-priced competition that serves most people pretty well.

            (Also, nobody actually needs a 4k screen.)

            2 votes
        3. [2]
          Akir
          Link Parent
          You might remember this story making the rounds a while back. $2000 is a whole lot of money for the vast majority of people.

          I'm not against the idea of high end products being expensive per se, but I think the value proposition isn't there for anyone but the most hardcore, and for those who do think that it's reasonable to buy one of these, well, maybe they should reconsider. Even if just sticking to nVidia, it makes much more sense to pay half the amount for a 5080. You can put the money you saved in a savings account and then spend it next year to buy the next generation equivalent alongside any extra hardware capabilities that may have come out at that time, and likely make a profit from the interest.
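
          A minimal sketch of that "bank the difference" math, using the list prices from the thread; the 4% savings APY is my assumption, not something the comment states:

          ```python
          # Hypothetical: buy the $999 5080 instead of the $1,999 5090 and park
          # the gap in savings for a year. The 4% APY is an assumed rate.
          price_5090 = 1999
          price_5080 = 999
          apy = 0.04

          saved = price_5090 - price_5080      # $1,000 banked
          interest = saved * apy               # simple interest over one year
          print(f"Saved ${saved}, earning ~${interest:.0f} toward next year's card")
          ```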

          2 votes
          1. stu2b50
            Link Parent
            I don't mean in an absolute sense; $2k is not a lot in the realm of expensive hobbies middle to high income people have, which is a pretty broad category. It's too much for some people, but no one is forced to buy it.

            My point is that there's more than enough money sloshing around in the "mildly expensive hobbyist items" category for 5090 sales to make sense.

            I'm not against the idea of high end products being expensive per se, but I think the value proposition isn't there for anyone but the most hardcore

            Perhaps, but that's nvidia's problem, if it is a problem. If it isn't a problem, then it isn't a problem, evidently.

            By all accounts, the value proposition is there, at least enough to sell through the units.

            2 votes
      2. Baeocystin
        Link Parent
        I have two machines, one with a 3060Ti and one with a 3090Ti. The 3060 variant is driving a 1080P display, the 3090 a 4k, and honestly the perceptual difference between the two in the middle of a game is close to nil. As long as the framerate stays high, you really aren't going to notice the extra detail in the middle of a firefight. I feel no need to upgrade the 3060, much less the 3090, and probably won't for a long time.

        1 vote
    2. [3]
      SteeeveTheSteve
      Link Parent
      Makes as much sense as buying a $1500 phone, $3000 big screen tv just to watch the big game or to clearly see the strings going to the hand in Thunderbirds, sports car, low-rider truck, a car with a lift kit, a truck that will be used purely for commuting (extra points for a lift kit and huge mud tires that will never leave pavement), tickets to sit so far from a game that you need binoculars, >$4k a year for all the tv channels, collector toys you'll never play with or even open the box, yards and yards of fabric you may never use, a portable sawmill to make your own lumber, a boat, a bunch of expensive fishing poles and reels, a 10" telescope with great light filters that lets you see Uranus' crack slightly clearer than an indistinct blob and if you're lucky make out Titania and Oberon, a Manta, an industrial 3D printer, a freeze dryer, ... this list could go on a while.

      In the end it all depends on your niche; people are willing to fork over quite a bit for things they like even if it doesn't make sense to the rest of us. Personally, I like the challenge of building a computer that can run any game on high settings for as cheap as possible but still be high enough quality that it won't break down within 5 years, which got A LOT harder when miners came along and F'd over the video card market, to the point my old used card cost more than I paid for it at one point. No longer can we buy last year's or the year before's card that still ran the majority of games on the highest settings for dirt cheap. :/

      3 votes
      1. [2]
        AriMaeda
        Link Parent
        Most everything you've described sounds like a poor purchasing decision, an addiction to buying new things given the label of "hobby" to justify them. I think they ought to be viewed critically, not accepted as a valid niche; the expensive cards in question are no different.

        1 vote
        1. steezyaspie
          Link Parent
          If we can at least assume the person buying these things is a responsible adult who isn’t buying a TV, telescope, whatever, instead of paying their bills, there is nothing wrong with spending money on something they enjoy. Just because they aren’t sticking every cent in an index fund and eating beans for dinner every night doesn’t mean they’re necessarily making a poor purchasing decision.

          6 votes
    3. TheJorro
      Link Parent
      I'd be very surprised if the upgraded model of the 5080 (likely will be another Super) has more RAM. The 4080 Super had the same amount as the regular 4080 and it came out two years later. And it seems one of the big changes with this generation is improved VRAM usage.

      1 vote
  2. [28]
    whs
    (edited )
    Link
    /rant

    I'm buying the 5090.

    Tech media and people on news aggregators are saying "don't buy NVIDIA", and so I've held on to my 1080 for years. I haven't heard any tech media say "this is the one to buy" about any highest-performing gaming GPU since the 1080, and at best the 4090 got praised as the best 40 series card in terms of performance leap from the 30 series.

    At the same time, it feels like the latest wave of games is the sign to upgrade my 1080. In the past I could get a stable 60fps from any game I played at medium settings. Now Horizon Forbidden West won't run at 60fps on any settings without XeSS (there's no DLSS on this card). There have been plenty of times I died in PoE2 because at the lowest settings no upscaler can handle lightning spike effects and I can't see any AoE under me (nor can my partner, who is on a 4060 + DLSS). I haven't played any UE5 games yet, but it should be clear that the 9-year-old card is now showing its age. The only thing it still does pleasantly is running a 22B LLM, whose underlying math didn't even exist before the GPU did. The 8GB of VRAM is a limiting factor, but the 22B model surprisingly gives quality answers while running at a bearable speed.
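
    For context, a back-of-envelope estimate of why 8GB is tight for a 22B model; the quantization level is an assumption, since the comment doesn't say which was used:

    ```python
    # Rough VRAM needed just for the weights of a 22B-parameter model.
    params = 22e9
    for bits in (4, 8):                          # common consumer quantization levels
        weights_gb = params * bits / 8 / 1024**3
        print(f"{bits}-bit weights: ~{weights_gb:.1f} GB")
    # 4-bit: ~10.2 GB, 8-bit: ~20.5 GB -- both exceed the 1080's 8 GB, so layers
    # spill to system RAM, which is why the speed is "bearable" rather than fast.
    ```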

    I don't think the 5090 will meet my expectation - stable 120fps 1440p path tracing at native resolution. There'd probably be DLAA and Ray Reconstruction in the mix, so it's not 100% AI-free yet. But I can't hold out for another generation.

    13 votes
    1. Pavouk106
      (edited )
      Link Parent
      When you are upgrading from an almost 10-year-old GPU generation, it is understandable that you go for it even with such a price tag. If the 5090 serves you as well as the 1080 did (multiple years), it's not a bad investment.

      The thing is - where will this price climb stop? And where are the xx50 series cards that were affordable and still relevant and useful (I have a 1650 in my HTPC that I bought for 170€ before the crypto boom)? Nvidia went all out on power and money and lost relevance in the low-end gaming sector. AMD stepped in as the reasonable choice - you don't have to whip out as much money and you may get similar performance (but a worse power/performance ratio, I guess). And nowadays Intel went for the lowest sector price-wise. They managed to make a card that beats both Nvidia and AMD on price/performance while still being "cheap".

      I'm sure there are customers for the 5090, and for other GPUs as well. I'm not one of them, though; I would never buy such a card for gaming/personal use.

      For two grand you could buy two Steam Decks and a Framework laptop! Or two middle-class gaming PCs (not GPUs, the whole towers). Or multiple good smartphones. Or a used car. It is a really hefty price to pay for a GPU. But again - if you didn't upgrade for multiple generations, I can kinda understand buying it. Still, I would buy a Steam Deck and a Framework laptop and have a few hundred bucks left. (Actually, if I had two grand of free money, I wouldn't buy those; I'd have other, actually practical, things to buy.)

      11 votes
    2. [19]
      kaffo
      Link Parent
      [continue rant]
      I guess I kind of get the "don't buy Nvidia" flag waving. They do hike prices more and more. But honestly, in my humble opinion, they still release the best, most stable graphics cards on the market.
      I bought a 3080ti after my 1070ti and the difference was pretty sizable. I play at 1440p and I went from medium/high at 60fps to high/ultra at 144fps for most games.

      I've also had AMD cards, and I've had dozens of friends with AMD cards who bought into the "don't buy Nvidia" sales pitch, and more than half have had the card die on them 3+ years after getting it, which I just don't hear about with Nvidia cards.
      I mean, if you are happy to get a new card every few years, sure, go for it. But personally I'd rather pay out of pocket now for a good card and have it last until the end of time. Every card I've bought I've passed on to a friend or a younger brother who's wanted something to play games with, and I don't think I could do that with AMD.
      My brother still uses the 780ti I gifted him god knows how many years ago.

      7 votes
      1. [8]
        trim
        Link Parent
        As a Linux user, I'm not going anywhere near nvidia cards. They were terrible for the longest time, and I've recently done an upgrade on an older system where the nonfree driver borked the process again.

        The situation may be a bit better now, but I'm never going back

        15 votes
        1. [2]
          kaffo
          Link Parent
          I've seen others say the same thing as you. Admittedly I haven't used Linux as a daily driver for a few years now but on my 1070 and 780 I never had any issues with the nonfree drivers. But I cannot say anything about the newer cards I'm afraid.

          5 votes
          1. Pavouk106
            Link Parent
            I have used Nvidia only on my Linux-only desktop PC since the GTS 450 came out. I used that, then a GTX 750, and currently I have a 1650 in my desktop and a 960 in my daughter's; I also have my old 750 in my server to transcode on the fly, and a 1050 in my other server to do the same. I never had any problems with their proprietary drivers. And on top of that, all the machines run Gentoo, which can be a pain in the ass all by itself. No problems at all! But just like you, I don't know anything about newer cards.

            I don't use the cards for CUDA, rendering (other than video) or any professional stuff, just gaming. Not even recording or streaming my gaming. You could say my usage is very narrow, but it always worked for me.

            But if I was building a new desktop, I would lean away from Nvidia. First - price. Second - I would support Intel in their crusade (a matter of principle; also their B580 seems like a card made just for me - cheap, yet powerful enough).

            I also have a friend rocking an AMD 7900XT on his Arch Linux gaming PC. He is very pleased with how it works and says he wouldn't go back to Nvidia.

            There are many different people with many different experiences and also different kinds of usage (multimonitor, streaming, professional stuff...).

            4 votes
        2. [3]
          davek804
          Link Parent
          Not attempting to change your mind or invalidate your POV in any manner.

          I've recently finished an Unraid build with two GPUs in it: a GTX 970 (my old card <3) and a GTX 1660. I'm able to pass those two GPUs through to containers / VMs hosted on the Unraid server without any issues. Sure, it took a day or so of learning and config to get it all set up properly. I had trial-by-fire because I saw how slow tdarr transcodes were going on my CPU.

          3 votes
          1. [2]
            trim
            Link Parent
            There certainly was a huge chunk of time during which the proprietary nature of the nvidia driver caused massive kernel building and packaging headaches. Every update to anything would break something or other.

            Glad for nvidia users if it's not like that now, but my journey has been long, and pepperidge farm remembers.

            5 votes
            1. davek804
              Link Parent
              Ha. Yep. I know the feel!

        3. whs
          Link Parent
          My Linux hearsay about AMD cards dates from when it was still ATi. I remember back then people who ran ATi were told to buy a new GPU for Linux, and Intel had the only plug-and-play setup but was too slow to run Compiz Fusion, so only the NVIDIA people got the eye candy. So far I think the NVIDIA experience is consistent - it has always worked for me, maybe rocky at times, but there will be a combination that works.

          2 votes
        4. WiseassWolfOfYoitsu
          Link Parent
          That's the boat I'm in. My work RHEL laptop has an Nvidia graphics card and causes me no end of headaches dealing with the drivers. My home Fedora system has a 7900XTX. I had a couple problems in the beginning as a result of the newness of the card when I got it but they got fixed pretty quickly, and since then it's been rock solid. Having the open source kernel integrated driver also be the mainline vendor supported driver is a huge advantage here. And this isn't even for doing VM or AI stuff, it's actually for gaming under Proton - it works great and has been essentially zero hassle.

          It's not that I'm a militant AMD fan. Had Nvidia for the two cards before this one. But for this use case AMD makes a lot of sense.

          2 votes
      2. adutchman
        Link Parent
        It's anecdotal evidence, granted, but my AMD card has been going strong for 4 and a half years. Honestly, I don't care about which brand is better, the only thing I care about is Linux support and AMD is plain better at that. I hope Nvidia improves so we get more choice. Hence I am only buying AMD for the foreseeable future.

        6 votes
      3. [3]
        whs
        Link Parent
        The card I had before the 1080 was a GTX 950. It played every game 1-2 years before its release at max settings. I think it was Unity 4 that killed it.

        A few years later I bought a Mini-ITX PC for a home server. The shop called me: "your AMD APU doesn't actually have a GPU". So I put the 950 into the server to install the OS. After that it sat dormant for years until I realized Frigate (software NVR for CCTV) can use the GPU. I tested it out, and it seems that GPU detection consumes more power than the CPU, but GPU video decoding does reduce CPU temps.

        Later on I added ollama. It can run at most 8B models, which do RAG tasks fine, but without much internal knowledge.

        Sadly I don't think the 1080 would fit into the Mini-ITX case, so the server won't get the hand-me-down.

        1 vote
        1. [2]
          bret
          Link Parent
          "your AMD APU doesn't actually have a GPU".

          doesn't an APU definitionally have a GPU lol

          3 votes
          1. whs
            Link Parent
            I was expecting every AMD CPU to be an APU since it's their major selling point, so I ordered a Ryzen 5 3500. I didn't realize AMD had CPUs without a GPU, especially ones without a suffix (like the "F" suffix on Intel chips indicating no integrated graphics).

            1 vote
      4. Macha
        Link Parent
        For some counter anecdata, my 1080 is my only GPU that has ever died on me.

        1 vote
      5. [5]
        asparagus_p
        Link Parent
        I've had dozens of friends with AMD cards bought into the "don't buy Nvidia" sales pitch and more than half have the card die on them 3+ years after getting it,

        Not sure of your exact stats there, but "more than half [of dozens] die after 3+ years" sounds like a terrible rate of failure. I'm not sure the situation is that bad with AMD. I think we also need more info on what brands are being bought, because there's a big difference between say a Sapphire card and a Club 3D card.

        1 vote
        1. teaearlgraycold
          Link Parent
          The discussion sounds far too similar to 1000s of reddit threads on AMD vs. Nvidia. I don't trust those kinds of anecdotes. Personally I have a 3080 Ti right now but I've had many AMD and many Nvidia cards over the years. The only issues I can recall with AMD were 2 games having minor AMD-specific graphical glitches.

        2. Akir
          Link Parent
          It’s very likely that it’s market bias. The cards get made by third parties and AMD is the cheap option, so lower quality products are more likely to be AMD.

        3. [2]
          kaffo
          Link Parent
          Yeah, that's the issue with subjective evidence, I suppose. I don't have exact stats, and I'm probably biased towards Nvidia to some extent.
          I guess I'll put it this way: in my circles I've only ever heard of 1 Nvidia failure, from a power user who uses Premiere. Everything else (I'd say around 5 failures I remember happening in the last 10 years) has been AMD failures, and then the outcome has usually been "I'm not buying that again".
          So I guess take that as you will.

          1. BeardyHat
            Link Parent
            Weren't Nvidia cards torching their PSU connectors though?

            I have no team affiliation, I run both brands in my various machines that game and have only had one issue with AMD, which is related to switchable graphics and my eGPU and only for one particular game. I just buy whatever gives me the best performance for the money I'm willing to spend, which has been AMD the last two times I upgraded a graphics card and they've been just fine.

            Thinking of my graphics card habits going back a while:

            Voodoo 1 > Matrox...g800? I don't remember exactly what model > GeForce 256 > GeForce for the next 24 years > 6700 XT and a 6650 XT

            First time I've ever actually owned ATi/AMD

    3. bret
      Link Parent
      The performance leap from 1080 to a 5090 is huuuge. I think you'll be happy with it :)

      2 votes
    4. [2]
      Macha
      Link Parent
      The 3080 got pretty widespread praise before the scalping tripled the de facto price.

      2 votes
      1. bret
        Link Parent
        I got lucky getting it when it was first released at Amazon, decided it was too much power for my system, and someone drove 3 hours to come buy it from me at cost.

        1 vote
    5. onceuponaban
      (edited )
      Link Parent
      To be fair, if there was any NVIDIA GPU to hold onto for a decade, it would absolutely be the 1080 (and to a greater extent the 1080 Ti). The GTX 10 lineup was a leap in performance that hadn't been seen in quite a while and hasn't been seen since, and a big part of this can be attributed to fear of AMD's competing lineup convincing NVIDIA to pull out all the stops... only for AMD to seriously under-deliver, meaning NVIDIA had absolutely no reason to keep it up afterward, hence every lineup since being comparatively disappointing. The fact that it took 9 years for the 1080 to stop keeping up with modern games' spec requirements, when 9 years before that a single generation rendering the previous one obsolete was a regular occurrence, is a testament to how special that lineup's launch was.

      But yeah, at this point it's clear the 1080's time has passed; in your place I wouldn't bat an eye at picking this upcoming gen to upgrade, regardless of how good or bad the performance increase over the previous gen turns out to be... so long as it's not negative, at least.

      2 votes
    6. [2]
      tauon
      Link Parent
      As someone who doesn’t have and never had a “tower” desktop PC – why not get a 4090 now on (presumably) sale?

      1 vote
      1. WiseassWolfOfYoitsu
        Link Parent
        Unfortunately current speculation I've seen is that it isn't going to drop much, at least any time soon. Availability is still tight and it's heavily scalped, plus there's a lot of concern about the tariffs increasing prices generally here in a couple months, so retailers have been stocking up in advance.

        1 vote
    7. Gazook89
      Link Parent
      If you have decent (fiber) internet, meaning ~300-500mbps, I suggest looking into Shadow.Tech PC. It’s a streaming desktop service which gives you a Windows 11 desktop with the ability to install Steam or whatever else you would normally run on your own desktop. Games will play with maxed-out settings.

      For $30/month for their capable mid-tier offering, you’d have 5 years of a good gaming PC for the cost of a 5000 series graphics card. And Shadow manages the upgrades over time and houses the PC/server, so you don’t need a big rig at all that slowly goes out of date.

      For the top tier $50/month option, you’d have an excellent machine for 3.3 years (compared against just the graphics card).
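
      A quick check of that break-even math using the prices quoted above; this is a sketch that ignores interest and resale value:

      ```python
      # Months of Shadow subscription that equal the 5090's $1,999 list price.
      gpu_price = 1999
      for monthly in (30, 50):                 # mid and top tiers as quoted above
          months = gpu_price / monthly
          print(f"${monthly}/mo: {months:.0f} months (~{months / 12:.1f} years)")
      # $30/mo: ~67 months (~5.6 years); $50/mo: ~40 months (~3.3 years)
      ```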

      I started doing this in 2020 when I had a 2014 MacBook Pro. Being able to play PC games on max settings from a tiny MacBook while on my couch, while also doing Windows development on the same machine, was awesome. I’ve had it for 4-5 years now with no issues at all, though now I have a Mac mini.

      I know, I know, another subscription?!? I don’t own my computer?!? Well, either you front-load a huge upfront cost for rapidly depreciating hardware, or you can spread your cost over time with no depreciation. You still have your own computer for doing your own private work like financials or browsing, but use Shadow just for gaming.

      Anyway, enough of my free advertising I guess.

      (Also, if you don’t have fast internet, this is not really an option, especially for games that require it, like FPSs.)

  3. Greg
    Link
    They’ve also announced DIGITS, an integrated GPU and ARM CPU with 128GB (shared V)RAM for $3,000. A heavily enhanced Jetson, or a heavily cut down GB200, depending how you look at it.

    Looks to be aimed as a “low end” entry point for LLM development, from what I can see. Enough memory for actually large models, accessible to people who can’t spend the $30k+ it would otherwise cost, but with enough drawbacks that it won’t cannibalise sales of the more expensive hardware.
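
    As a rough capacity check on the 128GB figure (model sizes and quantization levels here are illustrative assumptions, not Nvidia's specs):

    ```python
    # What 128 GB of shared memory buys in terms of raw LLM weights.
    def weights_gb(params_billion: float, bits: int) -> float:
        return params_billion * 1e9 * bits / 8 / 1024**3

    for params_billion, bits in [(70, 16), (70, 8), (200, 4)]:
        gb = weights_gb(params_billion, bits)
        verdict = "fits" if gb < 128 else "does not fit"
        print(f"{params_billion}B @ {bits}-bit: ~{gb:.0f} GB ({verdict})")
    # 70B @ 16-bit: ~130 GB (no); 70B @ 8-bit: ~65 GB (yes); 200B @ 4-bit: ~93 GB (yes)
    ```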

    7 votes
  4. [2]
    updawg
    Link
    Four new RTX 50-series GPUs are on the way, including the $1,999 RTX 5090 and $999 RTX 5080.

    Not as bad as I expected, to be honest.

    5 votes
    1. mild_takes
      Link Parent
      I can't see them dropping prices when people are still buying them, and I can't see a good reaction if prices go higher.

      Still expensive... I still can't see myself paying more than $500 (if even that) for a card any time soon. I'm fairly happy with 1080p and cheaper cards right now though.

      8 votes
  5. [4]
    danke
    Link
    The 5090's 15-30% raster uplift over the 4090 is comparable to the 2080Ti launch, not terrible but unexciting. Some of the "neural textures" they demonstrated using ¼ the memory, particularly the silk texture, were so bad I think I'd outright prefer ¼ mipmapped textures instead.

    I'll be grabbing one regardless though, in spite of my thorough disillusionment with the AI slop that dominated the keynote. Dolphin brings even a 4090 to its knees before maxing every setting.

    4 votes
    1. [3]
      tape
      Link Parent
      I'm down for not terrible. My 2080ti is still chugging along, but I've got 3x1440p with the main one being 360Hz, and I'd love to even get over 100 on it in stuff lol. Not to mention I do VR, which doesn't use AI DLSS nonsense anyway, so I need the best raster I can get.

      3 votes
      1. Sodliddesu
        Link Parent
        I'm still using a 1070ti here, even for VR, but I'll likely buy the rumored standalone VR headset from Valve before I drop $1999 on a GPU, even when that card has lasted longer than some... hell, most of my relationships!

        That said, I'll wait to see AMD's offering as October means the gaming PC needs an OS upgrade...

        3 votes
      2. danke
        Link Parent
        Ah, I completely forgot about the DP2.1b UHBR20 support. Yeah, that's a complete game changer that'll have me seriously considering a 4K 240Hz OLED monitor.

        1 vote
  6. [6]
    Venko
    Link
    I have an over-ten-year-old gaming desktop with an Nvidia 970 3.5/4GB GPU that's really showing its age. I'd like to build a replacement soon. For the GPU I'm wondering about the 5080 versus the 5070 Ti, as both feature 16GB of RAM (I can't justify buying a card with less memory). I don't have a 4k monitor but am considering getting a 2k one.

    In the UK the list prices are:

    • 5080 £979
    • 5070 Ti £729

    Does the 5070 Ti look suitable?
    Which processor would people recommend pairing with the GPU? I know that I'll need to figure out all of the other components too.

    3 votes
    1. [4]
      infpossibilityspace
      (edited )
      Link Parent
      Best CPU right now is AMD 9800X3D

      https://gamersnexus.net/cpus/best-cpus-2024-intel-vs-amd-gaming-production-budget-efficiency

      I built my computer in 2023 and got a 7800X3D (the previous model, despite the number difference), and it's still a top CPU if you want to save a bit of money. Plus, the motherboard should work for many CPU generations: AMD kept AM4 going for like 5 generations, and if they do something similar with AM5, it'll be good for another 2-3 generations.

      Not passing judgement on the GPUs until the reviews drop :)

      4 votes
      1. [3]
        Greg
        Link Parent
        The X3D chips are impressive as hell, but even so I’m not sure I’d recommend them to most users. The place they really shine is CPU-bound extremely high fps gaming, and that makes them great for things like esports titles at 1080p on a 360Hz display, but it’s more into diminishing returns at “only” 120fps or so.

        The 9900X has a full four more cores and slightly higher maximum clock speeds, and the 9600X is enough to avoid bottlenecking in most situations at half the price, so either of those can be compelling options over the 9800X3D depending on the exact priorities.

        4 votes
        1. trim
          Link Parent
          I recently installed the Ryzen 7 5800X3D as the last hurrah for my old AM4 platform and it's a juicy monster. 32GB of fast RAM, a pair of 2TB NVMe drives. Should last another 2, 3, maybe more years, I reckon.

          3 votes
        2. infpossibilityspace
          Link Parent
          Since they specified gaming desktop in their comment, it made sense to prioritise games. But the link goes through multiple scenarios (efficiency, production etc.) so even if gaming isn't the priority, there's plenty there to make an informed decision :)

          2 votes
    2. whs
      Link Parent
      Looking at the GamersNexus 4070 Ti Super review, you get 145FPS in Resident Evil 4 at 1440p without AI upscalers. The 50 series is expected to be 10-30% better than that. With max ray tracing (without path tracing) in Cyberpunk 2077 you get 35.2FPS before any AI shenanigans.

      Adding AI to that, the result for 1440p RE4 RT Medium with FSR Quality is 121fps (neither GamersNexus nor LTT Labs had frame generation results). I think you'll be set for 120fps gaming for a while, and if you're using it past its prime, DLSS 4 1:3 frame generation may also help with longevity.
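
      Scaling those quoted baselines by the rumored uplift gives a rough projection; these are estimates, not benchmarks:

      ```python
      # 4070 Ti Super baselines from the reviews above, scaled by the rumored 10-30%.
      baselines = {"RE4, 1440p, no upscaling": 145.0, "Cyberpunk 2077, max RT": 35.2}
      for scenario, fps in baselines.items():
          print(f"{scenario}: {fps * 1.10:.0f}-{fps * 1.30:.0f} fps projected")
      # RE4: ~160-188 fps; Cyberpunk max RT: ~39-46 fps
      ```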

      4 votes
  7. [2]
    canekicker
    (edited )
    Link
    I'll be curious about the reviews when these launch in the next few weeks. I was one of the lucky few who was able to get a 3rd-party 3080 at launch for MSRP, and while it performs just fine, the thermal management seems to be crap despite good airflow in my case (a Meshify), and the thing sounds like a jet engine next to my desk even though I undervolted it.

    Having a 250W 5070 vs a 320W 3080 seems like an unnecessary upgrade, but 3080s appear to be going for the mid-$300s right now, so with the 5000s launching I can still see getting a few hundred for it, which would offset the upgrade price.
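
    On the power angle, a rough running-cost estimate; the hours of use and electricity price are assumptions, not from the thread:

    ```python
    # Yearly cost difference between a 320 W 3080 and a 250 W 5070.
    watts_saved = 320 - 250
    hours_per_day = 3                    # assumed gaming time
    usd_per_kwh = 0.15                   # assumed electricity price

    kwh_per_year = watts_saved / 1000 * hours_per_day * 365
    print(f"~{kwh_per_year:.0f} kWh/year, ~${kwh_per_year * usd_per_kwh:.0f}/year saved")
    # ~77 kWh/year, ~$11/year -- the efficiency gain alone won't pay for the upgrade
    ```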

    Someone tell me I'm an idiot or that this makes no sense.

    1. elight
      Link Parent
      I bought a 3090 4 years ago. While it still performs well, I admit I'm curious about the 5070 Ti. If the claim is true (2x 4070 Ti performance), then it should blow away the 3090 in many cases that don't require 24GB of VRAM.

      2 votes