109 votes

Practically no one's buying current generation video cards

115 comments

  1. [54]
    CptBluebear
    Link

    To the absolute surprise of nobody at all, the current generation of GPUs is not selling well.
    With the ridiculous pricing and very small or non-existent incremental improvements in performance over the previous generation of cards, it's no wonder there's no movement in the market.

    While the absolute high-end cards perform well, with a sticker price of nearly $2,000 the RTX 4090 is practically only interesting for people trying to make money using the cards rather than people buying them recreationally.

    On the lower end of the spectrum, we see benchmark results at a rather... disappointing level over the same tier card from the generation prior.

    It seems both AMD and nVidia are suffering from this, but neither is seemingly willing to do something about it. nVidia is banking hard on AI, upscaling technologies, and power consumption (though it seems to me that's just a side-effect rather than the intent) to make their cards worthwhile. Meanwhile, AMD is plodding along doing not much of either; while they're releasing decent cards at a decent (not good, decent) price point, they show just ever so slightly worse performance at tech people find important (raytracing is one).

    I've been looking for an improvement over my now very old GTX 1080 but I just can't see the current value being worth it.

    Couple of things I'm interested in:
    What's your current setup?
    Have you thought about upgrading or did you do it already?
    What do you think would be an appropriate change in the market?
    Any thoughts on the reason they're not selling well?
    What would make a new card worth it for you at this price?

    69 votes
    1. [20]
      vxx
      Link Parent

      I also get the impression that games are stagnant in graphics and don't evolve because they're stuck with consoles, so there's absolutely no reason to upgrade your graphics card.

      23 votes
      1. [10]
        Halio
        Link Parent

        I doubt it actually. The PS4/XBO were outdated even when they launched a decade ago, but we still got more than a few multi-platform games that pushed the hardware on PC.

        The current generation of consoles has very capable hardware in comparison, even better than the average Steam user’s PC, so I do not think they’ll be what’s holding games back graphically, especially since graphics can scale pretty well without affecting overall game design.

        22 votes
        1. [2]
          CptBluebear
          Link Parent

          Games are currently often made for multiple generations of consoles; in that sense I can sort of see how they may not be particularly pushing the envelope on current design.

          1 vote
          1. Halio
            Link Parent

            That’s true, but right now we are held back by the last generation rather than the current.

            5 votes
        2. [7]
          vxx
          Link Parent

          The only games that are programmed well in that regard seem to be the console exclusives, with a few exceptions like Cyberpunk. When they finally release on PC, they can run on old hardware pretty well.

          Do you have any examples for games with exceptional graphics that make it worth upgrading a graphics card when you go with a single monitor setup?

          1 vote
          1. [6]
            Halio
            Link Parent

            Metro Exodus, Control, and Red Dead Redemption 2 are recent examples, they released at the end of last gen and looked fantastic on PC. Especially the first two with ray tracing.

            8 votes
            1. [2]
              vxx
              Link Parent

              You've got a point. So now we're waiting for the release of GTA VI to determine if an upgrade to the newest generation makes sense.

              1. Halio
                Link Parent

                We’ll see, but that won’t prove anything really. I just don’t think consoles are the reason graphics don’t evolve as they did 15 years ago; it’s because we’ve reached a point where negligible improvements have a huge performance impact (e.g. path tracing).

                When we get to a point where lighting can be perfectly simulated, games will look almost lifelike. Consoles aren’t the problem here, as even a 4090 struggles with path tracing.

                6 votes
            2. [3]
              havok
              Link Parent

              Looked fantastic but Exodus was merely decent as it had a few game-breaking bugs (which still existed when I played a year later and again on my replay) while featuring raytracing without it being useful unless you were okay with 30fps. Control was a wonderful game from a technical standpoint (again, didn't play on release since it was another Epic exclusive, but when I did play it I was amazed by how well it ran) so no arguments here. RDR2 was and I believe still is a horrible port with many performance issues, a poster child for bad console ports. Sure, all three look great, but performance matters and the performance roulette for console ports gets spun every time one of those games comes out.

              1. Halio
                Link Parent

                Looked fantastic but Exodus was merely decent as it had a few game-breaking bugs (which still existed when I played a year later and again on my replay) while featuring raytracing without it being useful unless you were okay with 30fps.

                It's still a game that is clearly not being held back by consoles; it was being held back by PC hardware at the time. I personally didn't encounter any game-breaking bugs, so I'm not quite sure what you're referring to there.

                RDR2 was and I believe still is a horrible port with many performance issues, a poster child for bad console ports.

                It's a demanding game, but nowhere near a horrible port. It always ran fine and looked great on low/med settings; people who complained that it was a shit port cranked it up to ultra, where it was crazy demanding for little visual improvement. It performed a little bit worse than expected at launch and has improved since, but it was never a poster child for a bad port, people are just idiots.

                3 votes
              2. st3ph3n
                Link Parent

                I'm currently playing RDR2 in 1440p locked at 60fps with most of the bells and whistles turned on. I didn't have the hardware for it when it was new, but with my current Ryzen 9 5900X and 6750 XT combo it runs fabulously. Maybe it sucked harder on the hardware that was typical when it was released on PC in 2019, but in its current state I love it.

                1 vote
      2. [5]
        PuddleOfKittens
        Link Parent

        I also get the impression that games are stagnant in graphics and don't evolve because they're stuck with consoles

        I agree games are stagnant (or at least slow) in graphics, but I disagree that it's caused by consoles - I think visuals are just running out of room - back in 2000, graphics improvements were night and day, but nowadays you can't easily guess whether something was released in 2023 or 2017. The new Tomb Raider games all have good graphics, and holy crap the most recent one came out in 2018? I thought it was 202X. Which I guess proves my point.

        The other aspect is that as photorealistic graphics become more detailed (i.e. more photorealistic), they're becoming more expensive - it takes more time to place details when you have a higher number of details, after all.

        Come to think of it, what if the causation is backwards here? What if we're stuck with old-hardware consoles because graphics haven't evolved enough to justify a new console generation?

        I think there's value in raytracing and upscaling (and personally I'd like both, if only for trying with Minecraft/Minetest), but until that's available sub-$200 that's kind of irrelevant.

        15 votes
        1. luka
          (edited )
          Link Parent

          I think visuals are just running out of room

          I'm also thinking that people don't buy games for photo-realistic graphics anymore the way they did maybe a decade ago. This is considering the rise of indie studios in that time frame, which generally do not have the budget to focus on this aspect. It's much more about the art style and atmosphere, which is why even a game like Witcher 3 that was released in 2015 doesn't feel like it has dated graphics, even though it may not be on the same technical level as a contemporary game.

          6 votes
        2. [3]
          TreeFiddyFiddy
          Link Parent

          I think visuals are just running out of room

          I’ll try and find a link, but this was actually a topic of discussion online last year. From what I understand, it’s not a hardware limitation but a software limitation. We’re running up against the edge of what we’re able to code graphically. It will take a minor revolution in coding to break through to truly next-gen graphics.

          4 votes
          1. supergauntlet
            Link Parent

            I would say it's also that making things look better takes more and more time as fidelity increases. More complex models take more time to make.

            7 votes
          2. luka
            Link Parent

            That breakthrough will probably be generative art.

            3 votes
      3. [2]
        the9tail
        Link Parent

        Exactly. Unless you are doing something special that needs ridiculous resolutions like VR you don’t need the latest cards to play the latest games and the price point just pushes people out as well.

        4 votes
        1. sunset
          Link Parent

          you don’t need the latest cards to play the latest games

          I think the relationship is the other way around. The latest games don't require the latest cards because the studios don't believe people would spend that much on a high-end GPU.

          the price point

          You nailed it on the head here; this is what it boils down to. A 4080 is $1200. A 4090 is $2000. Normal people don't have that much money to spend on a GPU, especially under a looming recession; those prices are insane. Game developers know this, so they don't make games that require this hardware.

          3 votes
      4. itdepends
        Link Parent

        I believe this is true to an extent. In the PC space, graphics improvement options now go past the Low-to-High settings and include multiple monitors, resolutions higher than HD, 144Hz, etc. While in the days of old the cutting-edge gamer could only hope for dual HD monitors, now they can have massive 144Hz 4K displays and VR gear that need something more than a 1080 to make use of them.

        But if, like me, you're content with "Ultra at HD resolution on a 60Hz monitor" being "max settings", then yeah, you can chug along with an older card for a while. In my modest setup with a GTX 1080 I can run RDR2 on max settings with a browser with several tabs open on another monitor, with no discernible reduction in performance and instant alt-tabbing.

        On top of that, I guess a lot of people from the "1080 generation" are getting older and gaming at the bleeding edge of what's available is not a priority for them nor will they be buying every single AAA title each year.

        That said, Starfield and the upcoming GTA VI are the only releases I'm genuinely looking forward to, and the former's recommended specs have got me wondering how much cash I'll have to drop for a general upgrade to get the best experience out of them.

        2 votes
      5. MaoZedongers
        Link Parent

        My RX 570 still runs new games just fine, so I haven't felt any need to upgrade.

        And due to the price I'd probably just not buy a game that did.

        1 vote
    2. [3]
      Ranovex
      Link Parent

      GPU: 8 GB RX 580
      RAM: 16GB
      CPU: i7

      I could use a better GPU but can't justify the price point of an upgrade. Until my card fries or I find a game I cannot play with this rig, I don't plan on changing anything.

      9 votes
      1. Tryptaminer
        Link Parent

        Same but a 1080 Ti. I'd certainly like to upgrade, but I'm not even close to needing an upgrade. I can't think of anything I played that my PC can't handle at or near max settings. For 1440p, I'm fine.

        5 votes
      2. admicos
        Link Parent

        I have the exact same setup as you (i7 7700K for the CPU, just to be a bit more precise), and if someone gave me money and said "upgrade your PC right now" the upgrades I'd go for would probably be a new keyboard (& maybe mouse, but my keyboard is broken right now so that's a priority) and better headphones. Mayyybe another 16GB of RAM just because I have the slots for it, but that's really not that important.

        1 vote
    3. [3]
      Minithra
      Link Parent

      Couple of things I'm interested in:

      What's your current setup?

      RTX 3080, upgraded my entire PC from an old one with a 1080... massive improvement. At the time, a similar build with a 3090 was surprisingly more expensive, so I stopped at the 3080. I'm satisfied with the performance (I play a ton of games and do some graphics stuff).

      Any thoughts on the reason they're not selling well?

      Too expensive unless you have fuck off money or it's a business expense.

      What would make a new card worth it for you at this price?

      Honestly, I'm not going to think about a new GPU unless I hit the point where my current one can't fit my needs - if new games come out that I love, but my GPU is struggling, I'll look into upgrading. But as long as it works, it works.

      6 votes
      1. [2]
        lightning2x
        Link Parent

        Do you think it's worth it to upgrade from a 2080s to a 3080?

        1. Minithra
          Link Parent

          Unless you're struggling, definitely not. While the 3080 is better in almost all ways, it's a marginal improvement; the 2080S should still be able to run all modern games at comfortable settings. Unless you're doing crazy demanding VR or massive resolutions, there's no point in an upgrade.

          4 votes
    4. [4]
      disk
      Link Parent

      I'm currently running a 6600 out of necessity, since I moved recently and couldn't take my old computer with me. I think it's a fair card; the default fan curve was absurdly aggressive, but that's its only fault.

      Since this was built fairly recently, I don't plan on upgrading (except RAM or storage) for the next 5 years, especially considering the prices and the types of games I play, which fail to take advantage of RTX-enabled or otherwise more powerful cards.

      I'm genuinely hoping that there is a return to a proper low end graphics card segment, that isn't bottlenecked by frankly idiotic decisions with regards to lane width or memory limitations, because currently, we've got a number of cards with some potential that are hampered by what I hope are unfortunate technical decisions and not greed.

      Adding to that, the gains are simply ridiculously tiny. The 4060 Ti was dead on arrival because it failed to provide better performance than a 6700XT (at the same price point, a two-year-old card with better memory). We're not seeing any reason to upgrade from two-year-old mid-range cards, and this is because companies flat out refuse to drop prices. The 7600 showed some promise in price reduction, but that was a knee-jerk reaction, not a calculated decision, and the 7600 is also quite bad for its price.
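
      The "same price, no better performance" argument above is really a value-ratio comparison. Here is a minimal sketch of that comparison, with purely hypothetical fps and price figures (none of them come from this thread or any real benchmark):

```python
def fps_per_dollar(avg_fps, price_usd):
    """The rough value metric reviewers lean on when comparing cards in a tier."""
    return avg_fps / price_usd

# Illustrative numbers only: an older mid-range card vs. a newer card at a similar price.
old_card = fps_per_dollar(avg_fps=95, price_usd=380)
new_card = fps_per_dollar(avg_fps=90, price_usd=400)

# If the new card's ratio is no higher, there is no reason to "upgrade" to it.
print(new_card > old_card)  # prints False with these numbers
```

      With figures like these the newer card is strictly worse value, which is exactly the "dead on arrival" situation being described.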

      5 votes
      1. [3]
        CptBluebear
        Link Parent

        that isn't bottlenecked by frankly idiotic decisions with regards to lane width or memory limitations

        It boggles the mind. The 4060, 4060 Ti, and 4070 all look incredibly unappealing with their low VRAM and/or bus width.

        8GB VRAM is already starting to bottleneck on certain games.

        10 votes
        1. st3ph3n
          Link Parent

          That gives me serious General Motors back in the day vibes, where nothing was allowed to be better than the Corvette even if they had the tech already developed.

          3 votes
        2. majromax
          Link Parent

          I think it might be an attempt to segment the AI market towards more expensive, "professional" cards.

          Graphics cards/TPUs face strong memory pressure for ML training. More memory means both bigger models and more training data "close at hand" for batch processing. If consumer GPUs consistently offered 16GB of VRAM, then they'd further displace the more expensive, more profitable cards that NVidia wants to sell to outfits like OpenAI.
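
          A back-of-envelope sketch of that memory pressure, assuming fp16 weights, one gradient copy, and two Adam-style optimizer buffers (my assumptions, not the comment's), and ignoring activations entirely:

```python
def training_vram_gb(n_params, bytes_per_param=2, grad_copies=1, optimizer_states=2):
    """Rough lower bound on training VRAM: weights + gradients + optimizer states.
    Activations, which often dominate at large batch sizes, are ignored."""
    copies = 1 + grad_copies + optimizer_states  # the weights themselves plus the extras
    return n_params * bytes_per_param * copies / 1e9

print(training_vram_gb(1e9))  # 8.0  -> a 1B-parameter model already fills an 8 GB card
print(training_vram_gb(7e9))  # 56.0 -> firmly "professional card" territory
```

          Under these assumptions even a modest model saturates an 8 GB consumer card before activations are counted, which is one way a VRAM cap conveniently segments the market.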

          1 vote
    5. [2]
      paddirn
      Link Parent

      I’ve had an RTX 2070 Super from before the pandemic and it still seems to play every game I throw at it pretty well. I’d like to upgrade so that arbitrary, pretty numbers go up, but every time I randomly decide to start looking at better models, the prices are too ridiculous and the benefit doesn’t seem worth it. I’ll probably keep chugging along until my card gives up the ghost or until I see a good deal come along.

      5 votes
      1. Hytechlowlife
        Link Parent

        I’d like to upgrade so that arbitrary, pretty numbers go up, but every time I randomly decide to start looking at better models, the prices are too ridiculous and the benefit doesn’t seem worth it.

        I'm in the same boat, with the same card. It was already like pulling teeth to get the 2070 Super but in hindsight that was actually a good deal in comparison to the ridiculously nonexistent price/perf increases of this current generation of cards. If anything the GPU market seems to be going backwards instead of forwards.

        While I can afford to buy top of the line now and not think about upgrading for another 5 years or so, there simply isn't enough benefit to my gaming and I can't justify supporting how corrupt and anti-consumer the industry has become.

        Guess I'm sticking to older games until my 2070 kicks the bucket or prices return to sane levels, which seems unlikely.

    6. [6]
      teaearlgraycold
      Link Parent

      I’ve got a 12700KF, 32GB of DDR4, and a 3080 Ti. My current computer is too new to justify an upgrade. But I wouldn’t be super happy with the prospect of buying a 40 series card. I would probably get a top end AMD card (and one of their CPUs) if I was building my computer today.

      I don’t get the AMD graphics card hate. Their GPU works amazingly in the Steam Deck - and that’s while also running in a compatibility layer. I’ve had multiple AMD cards before. I can think of only one glitch that wouldn’t have happened on an NVidia card (Metro 2033 shaders ran out of order and caused fire shaders to interact with scope reticles). Also, I don’t know if I’ve ever done more than load up a game that supports DLSS or RTX. I could not care less about AMD’s lack of support.

      I think new generations of graphics cards should ideally make you excited about what you can now get for a certain price.

      3 votes
      1. [5]
        supergauntlet
        Link Parent

        AMD is pretty good but I don't think VCE works very well in Parsec yet, so I'm waiting for that. The 7900 xt and xtx are looking less and less bad with time, but honestly I'm more interested in Battlemage. Intel's dedicated GPU offerings are surprisingly good now that their drivers are up to par. Battlemage should have cards at a more reasonable price point but with better performance, but we're looking at Q2 '24 at the earliest for that. That still leaves a year of price cuts and hopefully refurb cards as the AI hype dies down or as fab capacity increases to cope with the demand.

        There have been worse times to buy a GPU. Lightly used 30 series cards are going for great prices, and those are very powerful cards. The 3070 is no slouch. If I had to buy a card right now it would probably be an r/hardwareswap 3080 or 3070.

        1 vote
        1. [4]
          CptBluebear
          Link Parent

          The 7900 xt and xtx are looking less and less bad with time,

          Did they ever look bad? Aside from some driver nonsense at the start they've been top on the performance side from the get go.

          2 votes
          1. [3]
            supergauntlet
            Link Parent

            Driver problems and the vapor chamber issues, but the performance even after the initial updates that made it work at a basic level was a bit disappointing. Was just hoping for more is all.

            1 vote
            1. [2]
              CptBluebear
              Link Parent

              Oh I forgot about that vapor chamber stuff. That was a relatively small batch in reference cards only though right?

              The performance of the xtx at the moment seems to marginally edge out the 4080 at a lower price point. I think that's worthwhile if you're looking for something in that price range.

              1 vote
              1. supergauntlet
                Link Parent

                Yeah, they fixed it pretty fast. The 7900 series is definitely more interesting than the 4080 but I think that's mostly a testament to just how bad the 40 series is.

                1 vote
    7. draconicrose
      Link Parent

      I'm currently running with an Nvidia GeForce GTX 1660. High-end games haven't appealed to me in a decade, and the MMOs and indie games I do play run fine on it. I have no reason to upgrade other than, eh, maybe things would be a little bit smoother, which is not worth the prices being asked right now.

      I fully believe that the only reason the cards aren't selling is the exorbitant prices.

      3 votes
    8. Pavouk106
      Link Parent

      I have recently bought a used 1080 for 140€ for my relative. I still consider it the best buy for years to come. The first NEW card with roughly the same performance is 300€ here, while a card with a bit better performance is far over 400€. Why would I buy that, then?

      A friend still has a 1070. He said that if a new card isn't affordable and doesn't offer +100% power, he ain't switching.

      2 votes
    9. lunaronyx
      Link Parent

      I'm soon to be going from a 1080 to a 4070 (have it, just haven't built my new PC yet). If I were trying to build on a budget, I would have gone with a cheaper option, since there are definitely cheaper decent cards out there. And if I had a more recent card, I probably wouldn't bother upgrading at all yet. But I paid $430 for a 770 back in 2013, and $520 for a 1080 when that was relatively new, so $599 for a 4070 didn't feel that much more painful to me.

      2 votes
    10. VoidSage
      Link Parent

      I've got a 2060 Super; it still works great for 99% of games.

      I've considered upgrading to a mid-range AMD card a few times (mainly because I use Linux) but just haven't been able to justify it.

      1 vote
    11. badamsz
      Link Parent

      I replaced my 1080 Ti with a 3080 that I was able to get from EVGA at MSRP during the great shortage. It's in my Threadripper 2950X workstation, which is also starting to show its age. It's a good card and I'm not tempted by the 40 series cards at the moment, but honestly I have been reaching for my Steam Deck more often than not lately.

      1 vote
    12. prota
      Link Parent

      My desktop’s motherboard died, and instead of upgrading I decided to simply get a Steam Deck, which ended up being significantly cheaper and covers most of my gaming interests. I’ve fallen out of love with much of the current AAA landscape, and the Deck handles most games outside of those without significant compromise.

      1 vote
    13. WrathOfTheHydra
      Link Parent
      I upgraded from my 970 to the 3080 (right before Nvidia went full troll-mode; if I had known at the time I would have gone AMD). I'm going to be honest, if I hadn't upgraded then, I'd probably...

      I upgraded from my 970 to the 3080 (right before Nvidia went full troll-mode; if I had known at the time I would have gone AMD). I'm going to be honest: if I hadn't upgraded then, I'd probably have hopped to AMD the year after out of spite. I think an appropriate change in the market would be everyone else going AMD and Intel and watching Nvidia try to panhandle out their cards for a while. They're not selling well because everyone who'd buy one is an enthusiast, and anyone who's an enthusiast knows Nvidia is just trying to squeeze money out of people for as little gain as possible. A new card would be worth it if they came down several hundred dollars (for the higher-end cards) and managed to actually make a jump in both a) power consumption and b) performance.

      I think people are looking for new cards that can outperform the previous by more than marginal percent differences.

    14. Protected
      Link Parent
      I bit the bullet and upgraded from my 1070 Ti to a 3070 Ti during the pandemic, for a mind-boggling €850. VR can use the 3070 Ti to its fullest and more. But why would I upgrade to a 4070 Ti so "soon"...

      I bit the bullet and upgraded from my 1070 Ti to a 3070 Ti during the pandemic, for a mind-boggling €850. VR can use the 3070 Ti to its fullest and more. But why would I upgrade to a 4070 Ti so "soon" when it has an even heftier RRP of €900 to €1200 (that being more than $1300)? For the upgrade to be technologically worthwhile, I'd want at least a 4080, with an RRP of €1400 to €1700 ($1850). A top-of-the-line 4090 might cost more than €2100 (almost $2300). They're all bad deals, and I can't afford to spend that kind of money right now. No new card would be worth it for me at these prices, because the current one is still good for at least another couple of years.

    15. [2]
      AboyBboy
      Link Parent
      My current setup is an FX-4350, an RX 560 4GB with 896 compute units, and 8GB of DDR3. I built this PC in late 2016 and have only lightly upgraded it since. The upgrades include a better WiFi card...

      My current setup is an FX-4350, an RX 560 4GB with 896 compute units, and 8GB of DDR3. I built this PC in late 2016 and have only lightly upgraded it since. The upgrades include a better WiFi card, more storage, and that's about it. The RX 560 is effectively a like-for-like replacement for an RX 460 4GB that started artifacting during the peak of the crypto craze.

      I've strongly considered buying a used FX-8350 so I can get more consistent performance while playing Sea of Thieves and Deep Rock Galactic. I've no immediate interest in upgrading anything else as I have no need to. I've noticed that the types of games I'm interested in aren't typically all that demanding in terms of hardware. Additionally, upgrading the GPU would likely lead to bottlenecks elsewhere; in fact, I think I'm CPU-bottlenecked in certain scenarios already. Upgrading off of the FX series would also require a new motherboard and RAM.

      An appropriate change in the market would obviously be massive price drops, but also a renewed focus on the mid to low end.

      The newer generation cards aren't selling well because they are poor value, and because they do not provide a substantial enough performance uplift over previous generations, especially when you consider price to performance.

      Two things would be necessary to make a new card worth it: a multitude of interesting games that are too demanding for my current setup, and much better low-end offerings than the garbage we are being offered now.

      1. sqew
        Link Parent
        Can't help but reply to you since I hadn't heard of anyone still running a Piledriver chip in a while! If I remember right, I almost used one of its siblings (FX-6300 maybe?) in my first build,...

        Can't help but reply to you since I hadn't heard of anyone still running a Piledriver chip in a while! If I remember right, I almost used one of its siblings (FX-6300 maybe?) in my first build, but I got delayed building it to the point where I just waited a month or two for availability of first-gen Ryzen chips.

        I've honestly thought about picking up a used FX-8350 as well; it sounds kinda fun to have a backup PC with a "this was what was current when I first got into PC building" chip in it for the nostalgia.

        1 vote
    16. Interesting
      Link Parent
      And the frustrating thing about VR being the only real justification for modern cards is that the graphics fidelity of VR games is almost all stuck at the bottom of the barrel now, because games want...

      And the frustrating thing about VR being the only real justification for modern cards is that the graphics fidelity of VR games is almost all stuck at the bottom of the barrel now, because games want to be cross-platform for the Oculus Quest. I'm hoping that there will be some non-exclusive PSVR games that push the envelope a little.

    17. Thomas-C
      Link Parent
      I've been running with laptops for years, and about a year ago upgraded mine to an Asus G513QR. I went from a 1060 to a 3070. The processor in the new one is a 5900HX; I don't remember the old...

      I've been running with laptops for years, and about a year ago upgraded mine to an Asus G513QR. I went from a 1060 to a 3070. The processor in the new one is a 5900HX; I don't remember the old one. 32GB of RAM and two 2TB SSDs.

      I don't upgrade very often, like once every three to five years, so I won't be doing that for a while. The Steam Deck means it's probably gonna be longer than that - I'd rather follow that product line and just live with reduced fidelity, to be totally honest.

      GPU pricing has to come down, it's that simple. I've settled into a pattern of being one gen behind because the pricing didn't really do that. Maybe shifting to AI leads to something, but from what I can see there's dedicated hardware for that, so who knows.

      At these prices I'm not sure I know what would make them worth it. There aren't any existing things I'd need so badly I'd pay out like that. Honestly a tough one to answer; truly I can't come up with anything.

    18. sqew
      Link Parent
      My current PC is an R5 1600, 2x GTX 970s, 16GB RAM. I would've avoided SLI if I could've, but I built it during the first GPU shortage and couldn't afford a new GPU. My buddy upgraded to a 1080...

      My current PC is an R5 1600, 2x GTX 970s, 16GB RAM. I would've avoided SLI if I could've, but I built it during the first GPU shortage and couldn't afford a new GPU. My buddy upgraded to a 1080 and offered me the 970s as a short-term loan that eventually became permanent.

      Since building it in 2017, I haven't done too much to it. I upgraded from the stock AMD cooler to a nice tower cooler and added some extra storage, but I haven't been using it enough to really justify shelling out for a new GPU or upgraded CPU. It still runs games well enough at 1080p that I don't really mind holding off on doing anything new.

      I'm currently thinking that I'll wait another year or so to see if things cool off a bit, then go for a fresh build. All I really want is to be able to game at 1440p with whatever I build and then use it for all my productivity and programming needs, so my plan is basically just to get a decent CPU and then look at what GamersNexus or Linus Tech Tips or someone seem to think a reasonable 1440p gaming GPU is and pick that up. At some level, I'm more excited about getting one of the fancy new PCIe Gen4/5 NVMe drives than anything else; my SATA SSDs and HDD are brutally slow compared to what even a PCIe Gen3 drive can do.

    19. Caliwyrm
      Link Parent
      Corporate hindsight is 20/20. We are a blended family of gamers in my house. Myself and 4 teenagers who should have been their dream demographic. At any given time I would update at least 1 or 2...

      What do you think would be an appropriate change in the market?

      Corporate hindsight is 20/20. We are a blended family of gamers in my house. Myself and 4 teenagers who should have been their dream demographic. At any given time I would update at least 1 or 2 computers a year, we'd pick up a new console or two after the initial price dropped, etc.

      Not too long ago we could have sold our 4-year-old, slightly above middle of the road graphics cards for MORE than I paid for them; that's how bad it was. When we could no longer find reasonably priced cards to upgrade to, we learned to replay older PC games and/or get around to our back catalogs and use our older consoles, since these companies did NOTHING to stop the scalpers and miners. For anything else, they play mobile games or the Switches we luckily got right before they were scalped out of existence.

      Since we weren't their intended demographic anymore, we moved on.

      While I don't wish unemployment on anyone, part of me does hope that the graphics card companies eat quite a bit of shit over their decisions to turn a blind eye on scalpers/miners.

    20. 1338
      Link Parent
      I just upgraded to a 3060 Ti a month or so ago. I'm mixed on whether it was even worth it; the only reason I upgraded was because I was having a freezing issue I thought was related to my graphics card....

      I just upgraded to a 3060 Ti a month or so ago. I'm mixed on whether it was even worth it; the only reason I upgraded was because I was having a freezing issue I thought was related to my graphics card. Turned out it wasn't; it was due to one of my old hard drives. Before that I was using an R9 390 and it was good enough in everything. I remember being tempted to get an RTX a few years ago for the ray-tracing feature, but realistically none of the games I play often have ray tracing, so what's the point? Maybe when the Cyberpunk DLC comes out and I circle back to that game it'll feel worth it.

      Either way it will be some years before I upgrade again. My CPU is an i7-8700k which would be next on the list, then going from 16 GB DDR4 RAM to 32+ DDR5, and even getting a multi-TB SSD feels infinitely more useful than the marginal difference between a few gens of GPU.

  2. [16]
    A1sound
    Link
    Why should I upgrade at this point, when a new flagship card costs the same as a decent used car, and my RX580 that I bought nearly 6 years ago runs all my games at medium-high settings?

    Why should I upgrade at this point, when a new flagship card costs the same as a decent used car, and my RX580 that I bought nearly 6 years ago runs all my games at medium-high settings?

    36 votes
    1. [7]
      bln
      Link Parent
      I think you haven’t looked recently at prices for a decent used car.

      I think you haven’t looked recently at prices for a decent used car.

      24 votes
      1. [6]
        A1sound
        Link Parent
        I look all the time! Autotrader, Facebook Marketplace, etc. I saw a super clean, low-mileage '98 Saab 9-3 on Autotrader the other day for £900 that actually really tempted me!

        I look all the time! Autotrader, Facebook Marketplace, etc. I saw a super clean, low-mileage '98 Saab 9-3 on Autotrader the other day for £900 that actually really tempted me!

        14 votes
        1. [4]
          g33kphr33k
          Link Parent
          1998, feels like yesterday, but there's a couple of decades in the way.

          1998, feels like yesterday, but there's a couple of decades in the way.

          10 votes
          1. [3]
            A1sound
            Link Parent
            I'll just crawl into my grave right now then, shall I?

            I'll just crawl into my grave right now then, shall I?

            6 votes
            1. [2]
              g33kphr33k
              Link Parent
              I just meant that for a car, 25 years is quite old, even though it feels like 1998 was not long ago (am an old bloke).

              I just meant that for a car, 25 years is quite old, even though it feels like 1998 was not long ago (am an old bloke).

              4 votes
              1. A1sound
                Link Parent
                Yeah, I suppose. Plenty run fine though!

                Yeah, I suppose. Plenty run fine though!

                2 votes
        2. countchocula
          Link Parent
          Nice, I once had a 2003 Saab 9-3. I loved that car but it hated me. Electrical gremlins and all that; the engine was brilliant though.

          Nice, I once had a 2003 Saab 9-3. I loved that car but it hated me. Electrical gremlins and all that; the engine was brilliant though.

          2 votes
    2. [7]
      supergauntlet
      Link Parent
      Power consumption is the big one but a $200 lightly used or new card will still at least double your performance. If you don't find yourself needing more performance I see no reason to upgrade.

      Power consumption is the big one but a $200 lightly used or new card will still at least double your performance. If you don't find yourself needing more performance I see no reason to upgrade.

      7 votes
      1. A1sound
        Link Parent
        I should also probably mention that I play at 1024x768 on a monitor pushing 20 years old... I imagine if you're playing at 1080p or even going for 1440, then you might need more performance!

        I should also probably mention that I play at 1024x768 on a monitor pushing 20 years old... I imagine if you're playing at 1080p or even going for 1440, then you might need more performance!

        3 votes
      2. [5]
        lucg
        Link Parent
        What about heavily used? Is there anything that truly wears out on chips? I had always assumed that a new fan should mean a microchip that functions (if you do a benchmark or so) is as good as new.

        lightly used or new card

        What about heavily used? Is there anything that truly wears out on chips? I had always assumed that a new fan should mean a microchip that functions (if you do a benchmark or so) is as good as new.

        1. [4]
          supergauntlet
          Link Parent
          Sure, but the fan bearings wear out and solder can fail eventually. But yeah generally it's mostly the fan.

          Sure, but the fan bearings wear out and solder can fail eventually. But yeah generally it's mostly the fan.

          1 vote
          1. [3]
            lucg
            Link Parent
            Oh, is it due to repeated thermal expansion, or why does that happen with age (if you know)?

            solder can fail eventually.

            Oh, is it due to repeated thermal expansion, or why does that happen with age (if you know)?

            1 vote
            1. [2]
              sqew
              Link Parent
              As far as I've heard, it usually is the expansion that does it. I think that it's usually pretty rare, though, unless there was some defect in workmanship on the original solder job that just took...

              As far as I've heard, it usually is the expansion that does it. I think that it's usually pretty rare, though, unless there was some defect in workmanship on the original solder job that just took a lot of cycles to show up.

              For the most part, it seems like old chips can usually be trusted as long as they weren't beat to hell by previous owners (bad conditions, overvolting, etc.). If I remember right, for a while people were even saying that mining GPUs might be better, since the miners typically undervolted them for energy efficiency and cooled them well.

              2 votes
              1. supergauntlet
                Link Parent
                If the fans were replaced I would trust a mining GPU just fine, especially if the discount was significant.

                If the fans were replaced I would trust a mining GPU just fine, especially if the discount was significant.

                2 votes
    3. Crimson
      Link Parent
      I've got a 1660 Ti in my machine and I honestly cannot see a point in the next 5 years (if not more) that I would upgrade it. The only way I would even consider upgrading it is if it died on me....

      I've got a 1660 Ti in my machine and I honestly cannot see a point in the next 5 years (if not more) that I would upgrade it. The only way I would even consider upgrading it is if it died on me. Honestly the 1660 Ti is overkill for the games I play.

      2 votes
  3. [4]
    skybrian
    Link
    Being able to play whatever games you want on the hardware you have seems like good news for gamers?

    Being able to play whatever games you want on the hardware you have seems like good news for gamers?

    19 votes
    1. [3]
      CptBluebear
      Link Parent
      Ha, that's an interesting take. I suppose it is, though I'm sometimes slightly miffed that there are no boundary-pushing games being released. Shadow of the Colossus comes to mind. At the time of...

      Ha, that's an interesting take. I suppose it is, though I'm sometimes slightly miffed that there are no boundary-pushing games being released.

      Shadow of the Colossus comes to mind. At the time of release it absolutely strained the PS2 for all it was worth, with incredible results. I'm missing that feeling of seeing something truly groundbreaking.

      Nevertheless I do agree with you. I don't particularly mind that the 1080 runs most games at a decent level. It saves me the hassle of spending a thousand or more to get some improvements.

      3 votes
      1. skybrian
        Link Parent
        There are other ways that a game can be groundbreaking. Maybe it will result in less emphasis on special effects? At least, on regular computers. VR has extreme performance requirements and any...

        There are other ways that a game can be groundbreaking. Maybe it will result in less emphasis on special effects?

        At least, on regular computers. VR has extreme performance requirements and any performance improvements there will make a big difference.

        1 vote
      2. FeminalPanda
        Link Parent
        The only reason I upgraded my GPU was for VR; I think those are the most taxing games right now, at least for base playability.

        The only reason I upgraded my GPU was for VR; I think those are the most taxing games right now, at least for base playability.

  4. [5]
    Earthboom
    Link
    My rule of thumb is two generations at least in between upgrades. I always buy the one right below the top one and then I forget it until new features seem interesting. You absolutely do not need...

    My rule of thumb is two generations at least in between upgrades. I always buy the one right below the top one and then I forget it until new features seem interesting.

    You absolutely do not need the bleeding edge. I didn't even need to upgrade my 1080, as I could still pull off 2K at respectable frames for most games. I did upgrade because I wanted 4K and ray tracing and all the other goodies.

    If I didn't want 4k, or any of the bells and whistles? I wouldn't have upgraded. No reason to. The 1080 is a solid card and plays most games at 1080p fairly well. The gaming industry has kind of plateaued anyway on the graphics department. There's some beauties coming out but they've slowed down because of the amount of work and sacrifice required to make them.

    13 votes
    1. [4]
      Octofox
      Link Parent
      2K? Do you mean 1440p? 2k would be basically the same as 1080p.

      pull off 2k

      2K? Do you mean 1440p? 2k would be basically the same as 1080p.

      3 votes
      1. owyn_merrilin
        Link Parent
        1440p is often marketed to gamers as 2k for some bizarre reason, even though you're absolutely right that 1080p is a 2k resolution itself. Then again the whole xk thing is intentionally confusing...

        1440p is often marketed to gamers as 2k for some bizarre reason, even though you're absolutely right that 1080p is a 2k resolution itself.

        Then again, the whole x-k thing is intentionally confusing marketing to begin with. They swapped which of the two numbers is used in the shorthand on us to make the difference sound bigger than it is. 2160p doesn't sound as much bigger than 1080p as 4k does if you realize that the k means thousand, even though it's exactly as much bigger, because the 4k comes from it being double the width and the 2160 comes from it being double the height.

        9 votes
      2. [2]
        Earthboom
        Link Parent
        2k is 1440p I thought with 4k being 2160p.

        2k is 1440p I thought with 4k being 2160p.

        3 votes
        1. teaearlgraycold
          Link Parent
          2k means about 2,000 pixels wide. 1080p would be 2k. 1440p would be 2.5k, 2160p is 4k.

          2k means about 2,000 pixels wide. 1080p would be 2k. 1440p would be 2.5k, 2160p is 4k.

          10 votes
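        The doubling arithmetic in this subthread can be sketched quickly. This is a small illustrative snippet; the "k" labels are the common informal marketing names, not a formal standard:

```python
# Marketing "k" labels come from the approximate horizontal pixel count,
# while the familiar 1080p/1440p/2160p names use the vertical count.
resolutions = {
    "1080p (~2k)": (1920, 1080),
    "1440p (~2.5k)": (2560, 1440),
    "2160p (4k)": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels")

# 2160p doubles BOTH dimensions of 1080p, so it has four times the pixels;
# "4k vs 1080p" and "2160p vs 1080p" describe the same-sized jump.
assert 3840 * 2160 == 4 * (1920 * 1080)
```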
  5. [2]
    compsciwizkid
    Link
    I built my PC in 2018, and at the time I bought a GeForce GTX 1060 for ~$300. I can still play most games fine; some require low settings. I'm starting to fall below the minimum requirements on...

    I built my PC in 2018, and at the time I bought a GeForce GTX 1060 for ~$300.

    I can still play most games fine; some require low settings. I'm starting to fall below the minimum requirements on the beastliest new games, which I guess is a sign that maybe it's time to upgrade.

    When I do, I'll want to build a new PC. It's not like that GPU would be the only bottleneck on my performance. I just can't really convince myself to do it yet.

    I really like https://www.logicalincrements.com/

    It helps me get an idea of what is out there, and if I follow a budget like my 2018 build that'd put me in the ~$1200 "very good" category, which includes the RTX 3070 for $480. I really don't love the idea of spending that much on a GPU, and I was hoping prices would come down more, though that's probably not very likely. It's not entirely clear to me, either, whether that would be an "enormous" upgrade. Will I still be able to play most games in 5 years with a card like that?

    I'm hopeful that their sales tanking like this means that they will adjust their prices to fit the market.

    7 votes
    1. sparksbet
      Link Parent
      I honestly think they got spoiled by the GPU shortage prices and are trying to sell these cards as if we were still in that market. I snagged my 3070 right at the beginning of the shortage for 600€ and was lucky to do so...

      I honestly think they got spoiled by the GPU shortage prices and are trying to sell these cards as if we were still in that market. I snagged my 3070 right at the beginning of the shortage for 600€ and was lucky to do so, because it was easily going for 2-3x that price a few months later. But we're no longer in that environment, and the most recent gen just isn't enough better to be worth the prices they're asking. Hopefully this brings them back to reality when it comes to pricing.

      4 votes
  6. [4]
    dave1234
    Link
    I'm still using a GTX 970, in its second PC. I built my current PC in late 2021 and planned to upgrade the GPU in 2022, when the outrageous prices were finally coming down. But even when the...

    I'm still using a GTX 970, in its second PC.

    I built my current PC in late 2021 and planned to upgrade the GPU in 2022, when the outrageous prices were finally coming down.

    But even when the prices came down, I didn't feel like it was worth upgrading. The GTX 970 still runs everything I want to play well enough, and I'm finding it difficult to justify spending hundreds of dollars just for better graphics. Maybe I'm not the gamer that I once was.

    I'll upgrade when my GTX 970 can no longer play the games that I want to play.

    4 votes
    1. [2]
      Not_Enough_Gravitas
      Link Parent
      The GTX 970 is what I refer to as "that old mate": a card that can show up, not put up a fuss, and get the job done. You never have problems with it and it's always fun to hang out with. I own a...

      The GTX 970 is what I refer to as "that old mate": a card that can show up, not put up a fuss, and get the job done. You never have problems with it and it's always fun to hang out with.

      I own a few of them, and I've realized that if I play games with these cards I focus less on maxing out my graphics and more on just enjoying the games. My 1080 Ti is also slowly becoming an old mate, but I'm not in any position to upgrade at this time and don't really see a need to.

      2 votes
      1. Kryvens
        Link Parent
        I think I’ll be a bit sad when I eventually upgrade from my 1080 Ti. It does everything I need at 1440p, and the only thing that gives me a real desire to upgrade is ray tracing, but I can’t justify...

        I think I’ll be a bit sad when I eventually upgrade from my 1080 Ti. It does everything I need at 1440p, and the only thing that gives me a real desire to upgrade is ray tracing, but I can’t justify an upgrade.

        I’ve not really thought about it until your comment but you hit the nail bang on the head. It’s like an old reliable mate. Not a rock star but someone you’re very happy to spend hours with at the pub!

        2 votes
    2. Diff
      Link Parent
      Same here. My GTX 970 is theoretically getting long in the tooth, but honestly it's still able to handle all the tasks I throw at it. Especially since I got my Steam Deck, it's really driven home...

      Same here. My GTX 970 is theoretically getting long in the tooth, but honestly it's still able to handle all the tasks I throw at it. Especially since I got my Steam Deck, it's really driven home that you don't need to be able to max things out to look good and play well and have fun.

      And in non-gaming things, while there are much tighter limits on what I can do, I do half my work on my MacBook Pro, which is far, far more limited than my desktop. As long as I can still get things done on my laptop, how could I justify needing more on my desktop?

      1 vote
  7. [5]
    adam_kadmon
    Link
    Why would anyone? The only people who really want one are trend-chasers, people who insist on absolutely maximum graphical fidelity with 4k and ray tracing, and maybe people who self-host various...

    Why would anyone? The only people who really want one are trend-chasers, people who insist on absolutely maximum graphical fidelity with 4k and ray tracing, and maybe people who self-host various AI stuff. They also have to be at least upper-middle class, even if they're from richer parts of the world, since a whole new PC supporting something like a 4090 can easily cost over three, four, or five grand.

    The vast majority of people I know game on 1080p/60Hz monitors, and you don't need a latest-gen GPU for that. Last year near Christmas I found a deal on a 3440x1440 144Hz monitor for less than $300, and finally upgraded my build from a slowly dying GT 1030 to an RX 6600 (quite a jump, I know), and it's honestly enough for me. The whole thing cost like $800 and I can play whatever there is on the market, maybe sometimes sacrificing some graphical settings to get to 60fps on the newest, graphically impressive releases like The Last of Us, or just turning on FSR if I can't be bothered. I can even run Stable Diffusion for some quick concept art iterating.
    My point is - even if I could comfortably afford it, or the 4090 suddenly cost $200, there's literally no reason for me to upgrade. People who own 3090s have even less.

    If anyone is interested in my full build:
    PowerColor Fighter RX6600
    Intel i3-12100f
    ASUS Prime H610M-K D4 motherboard
    Corsair Vengeance LPX 3200 2x8GB sticks
    Samsung 980 1TB SSD
    ASUS TUF Gaming Bronze 750W PSU
    And Zalman S2 TG for a case.

    You can probably go cheaper by getting some 500-600W power supply (I measured and didn't see it drain more than 250W from the socket) and some lower-priced Kingston SSDs, but overall I consider this a fairly economical, future-proof setup. You don't need to replace the CPU cooling on the 12100F, but the stock fan is pretty loud. I just jammed in whatever compatible decent cooler I found in my local tech shop and now I can't hear my PC at all.

    4 votes
    1. [4]
      bertro
      Link Parent
      Well, I'm no trend chaser, but I got a 4070Ti because the only thing I play is flight simulation and it's so CPU-bottlenecked that Nvidia's frame generation feature was my only hope of getting...

      Well, I'm no trend chaser, but I got a 4070Ti because the only thing I play is flight simulation and it's so CPU-bottlenecked that Nvidia's frame generation feature was my only hope of getting decent FPS across the 3 screens in my home cockpit, totaling a resolution of 8560x1440. I would have happily gone for the 4080 if the price were more reasonable, because at this resolution, the sim definitely pushes my VRAM to the limit.

      6 votes
      1. [3]
        adam_kadmon
        Link Parent
        But you have to agree, this is a very specific scenario which doesn't apply to the vast majority of users. It's a rather niche title with an incredibly niche hardware configuration.

        But you have to agree, this is a very specific scenario which doesn't apply to the vast majority of users. It's a rather niche title with an incredibly niche hardware configuration.

        4 votes
        1. [2]
          bertro
          Link Parent
          Yeah, absolutely. What I meant is that I didn't buy it just to say that I'm on the latest gen. It actually has a feature that makes a huge difference to me. If it weren't for frame gen, I would...

          Yeah, absolutely. What I meant is that I didn't buy it just to say that I'm on the latest gen. It actually has a feature that makes a huge difference to me. If it weren't for frame gen, I would have definitely skipped this generation.

          1 vote
          1. adam_kadmon
            Link Parent
            I didn't mean for my comment to come off like that, sorry. There are a lot of possible reasons someone might want or need top of the line GPU. Specific games and setups require it, some jobs too,...

            I didn't mean for my comment to come off like that, sorry. There are a lot of possible reasons someone might want or need a top-of-the-line GPU. Specific games and setups require it, some jobs too, but it's still insane for Nvidia to expect great sales for a $2k card after the decimation of cryptofarming. Those powerhouses have their place, but really, what consumers want are decent cheap cards, while corporations are intent on producing homemade jet engines.

            1 vote
  8. ChingShih
    Link
    The Steam Hardware & Software Survey: June 2023 is out. It doesn't tell us much about the changes in GPU usage aside from DX version, but if you scroll down and select the "Windows only" filter...

    The Steam Hardware & Software Survey: June 2023 is out. It doesn't tell us much about the changes in GPU usage aside from DX version, but if you scroll down and select the "Windows only" filter for the combined stats it shows a pretty stark picture of GTX 1650 being the most common GPU across DX versions (which implies across OS versions as well). The next most common GPUs are Intel HD variants. Both of those are common in laptops and that suggests to me that's where a lot of gaming is being done.

    As laptop sales have picked up over the last 10+ years and people really prefer having fully portable devices, that seems like it's leaving less room in the market for people to upgrade their own hardware. And there are fewer people who know how because they've grown up on a system where hardware upgrades weren't possible.

    On a related note, I'd like to see a chart comparing "frequent/infrequent gamers" to the systems they own. The increase in hardware profiles recognized by Steam in the late May/early June timeframe implies that a lot of people with Intel/Nvidia computers turned them on at that time to check out a new game, but didn't stick around long. Maybe some new AAA title or F2P content dropped? But if it doesn't hold their interest, then their representation in the stats returns to normal. I suspect people who game more regularly (who are more likely to game on common hardware) have a higher representation on these hardware surveys.

    4 votes
  9. interrobang
    Link
    I've always used eVGA cards and now that they're gone I am not looking forward to trying another brand. Going to hold onto my 2070 as long as I can.

    4 votes
  10. pete_the_paper_boat
    Link
    My CPU is more ancient than a GTX 960. I've got no good reason to upgrade unless I do the whole PC.

    And I don't think that's worth it at the moment. I'm not playing very demanding titles, if I've even got time for that anymore.

    3 votes
  11. [2]
    zzzz
    Link
    I recently upgraded to a 4090 purely for stable diffusion and local LLaMA. I would say it has not been cost effective, but on the other hand I doubt I would be experimenting as much if I were paying hourly via vast.ai. I don't play modern gaming titles, and so only the occasional flight simulator run benefits much at all.

    I think right now the Nvidia valuation is propped up almost entirely by the AI valuation bubble. If they don't translate into profits soon there is some risk of a pop, given the high interest rate environment.

    And on the laptop end it's absolutely insane. Even someone extremely wealthy does a double take when laptops cost over $5000 USD, and that's with only 16gb VRAM compared to 24gb for the desktop 4090.

    2 votes
    1. pageupdraws
      Link Parent
      I am in your camp. I built a 4090 system at the start of 2023 and have only used it for locally run generative models. Mostly SD, but also LLM and now some audio generation.

      I have always been a gamer but so far I have installed literally zero games because the generative local models have been way more interesting.

      Games on the Switch are better, despite their older graphics.

  12. [3]
    GFXcrossfire
    Link
    Because plenty of people are happy with 1440p 144 instead of having to spend a crazy amount to upgrade to 4K and still have to lock to 60 due to poor optimization. I'm still running a 2600K and 1070TI for casual level 1440p gaming and that's more than sufficient. It also seems like there hasn't been a wow-factor improvement in several GPU generations now.

    2 votes
    1. [2]
      dhcrazy333
      Link Parent
      I'd say a majority of gamers are fine with 1080p 60hz. You don't need the latest gen for that. As you start getting into the 144hz 1440p range, yeah more people are willing to buy better cards to achieve those specs for a smoother and higher fidelity gaming experience, but the current gen cards are just a terrible value.

      6 votes
      1. vord
        Link Parent
        Going higher than 1080p also introduces new problems, like poorly scaling UIs.

        I think 1080p is that magic sweet spot where resolution stopped being the primary limiting factor for gameplay.

        1 vote
  13. JoshuaJ
    Link
    I think the most taxing game I play is star citizen. I upgraded my RAM for that and cities skylines. I can routinely use >20gb of ram for those.

    But nothing else really taxes my system and it’s pretty modest.

    1 vote
  14. BeardyHat
    Link
    The last new card I bought was a GTX 970, back in what...2015? 2016? I can't recall. Anyway, it chugged away no problem until last year, my buddy upgraded his system and gave me his old 1070 as a little upgrade. The 970 is still in service in the family computer connected to the TV and regularly gets gamed on, never really showing its age.

    I keep thinking about upgrading my card, but then I ask myself, why? I've been a PC gamer for almost 30 years now and I really just don't care about the latest, greatest graphics anymore; to my eyes graphics seem to have plateaued about 10 years ago and since then, it's mostly minor little upgrades that I find don't change the game for me. Couple that with a lack of interest in anything in the AAA space currently and I find my laptop with an MX150 and my Steam Deck more than adequately cover the types of games that I want to play these days. The most mind blowing thing I've played recently was Pentiment and that'll run on pretty much anything.

    Not to mention, having older hardware, I tend to find myself exploring my library a little more. So many games over the years that I bought, put an hour into and then put down forever or even others I bought and never actually got around to playing. I have well over 1000 games between Steam, GoG, Epic, etc, so whenever I feel like playing something in a specific genre, I'll look through my libraries and see what I have, rather than playing the latest thing. Diablo 4 coming out recently gave me an itch for an ARPG, so I looked through my library and started playing Torchlight 2, a game I hadn't touched in ten years and I'm having a great time with it.

    What do I need a new GPU for again? I've got plenty to play already and wiser things to spend $500 on (cough Warhammer cough).

    1 vote
  15. Stumpdawg
    Link
    I snagged a red devil rx6900xt ultimate a few months back, upgraded my 3800x to a 5800x3d a little before that...I'm good on new hardware for a while.

    1 vote
  16. st3ph3n
    Link
    Agreed 100%. I was in the market for a new GPU a couple of months ago and wound up going with an AMD 6750XT rather than anything current-gen. Nvidia in particular have been rinsing their consumers for a while now. I think that they got fat and lazy in the crypto boom.

    1 vote
  17. LGUG2Z
    Link
    I think that with the release and popularity of the Steam Deck we'll see more and more games targeting good performance on that hardware, which will translate into longer lifespans for people with older video cards. It seems like the majority of people I know who are buying the 4090s are using it to run local LLM models.

    1 vote
  18. Houdini
    Link
    Running a 1060 and have not had any issues with it really. 5 years on and I'm just now noticing it's not loading textures as well as it used to.

  19. Benson
    Link
    I have a 3080 bought when it was brand new. I don’t see myself upgrading for years, why would I?

    On the other hand, I'm still not sure if you can even get a PS5 yet, as every time I've heard of them it's been that they're sold out everywhere. Makes me think Sony might not be doing so hot selling games, since not every house has a PS5 in it.

  20. Oradi
    Link
    What kills me is 1.) I don't want to sit in front of my computer any more after I close out of work and 2.) It's just so expensive. Given I'm not all in on gaming anymore it's such a financial risk. Really considering a PS5 as a result.

  21. [7]
    No-Exit-4
    Link
    Internet will go faster, cheaper and mobile in every new generation of tech. Streaming games is the logical solution.

    1 vote
    1. [5]
      Octofox
      Link Parent
      GPUs will also get cheaper, smaller, and consume less power. The laws of physics set a minimum latency for streaming which can never be improved. A latency which is unacceptably high. Companies like Apple are betting pretty hard that putting super powerful chips on device and running everything possible locally is the future.
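
      To put rough numbers on that physical floor: a quick back-of-envelope sketch (the fiber propagation speed and the fixed encode/decode overhead here are assumed illustrative figures, not measurements of any real service):

      ```python
      # Back-of-envelope lower bound on streaming round-trip latency.
      # Assumptions: signals in fiber travel at roughly 2/3 the speed of
      # light (~200 km per millisecond), plus a fixed overhead for video
      # encode, decode, and display. Both numbers are illustrative.

      C_FIBER_KM_PER_MS = 200.0  # km traveled per millisecond in fiber

      def min_round_trip_ms(distance_km: float, overhead_ms: float = 20.0) -> float:
          """Lower bound on input-to-photon latency for one streamed frame."""
          propagation = 2 * distance_km / C_FIBER_KM_PER_MS  # there and back
          return propagation + overhead_ms

      for km in (50, 500, 2000):
          print(f"{km:>5} km to server: >= {min_round_trip_ms(km):.1f} ms added latency")
      ```

      Even with a hypothetical server 50 km away, the propagation term is small but the encode/decode overhead never goes away, which is the part local hardware simply doesn't pay.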

      13 votes
      1. g33kphr33k
        Link Parent
        This is a cyclical trend, going from run local, to run server, to run local, to run server.

        2 votes
      2. [3]
        adam_kadmon
        Link Parent
        Latency will be less of a problem the more spread out the servers become. It's a few servers per continent right now, but it might become multiple servers in every big city at some point. I've played some single player games like Control and Witcher 3 on GFN for a few hours, and honestly it didn't feel bad at all.

        1. [2]
          adamcarrot
          Link Parent
          I played Elite Dangerous on GeForce Now for probably 100 hours or so and had no problems at all. Though I did play solo and didn't do much combat, but still. It worked well for me.

          2 votes
          1. vord
            Link Parent
            Heck older Freelancer still more or less holds up, and that can run on pretty much any laptop made after 2016.

    2. dhcrazy333
      Link Parent
      The problem with game streaming is input lag/latency. For very casual games it may not be much of an issue, but any game that requires somewhat accurate timing it's practically unplayable. It may be an issue they can solve down the line eventually, but we are a ways off of that.

      On top of that, many areas that don't have any ISP competition have data caps. Streaming games can easily eat through that and won't be feasible for many.
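
      A rough sketch of how quickly streaming burns through a cap (the bitrates here are assumed ballpark figures for illustration, not numbers published by any particular streaming service):

      ```python
      # Estimate how fast game streaming consumes a monthly data cap.
      # Assumed bitrates: ~15 Mbps for a 1080p60 stream, ~40 Mbps for 4K.

      def gb_per_hour(mbps: float) -> float:
          """Convert a sustained bitrate in megabits/s to GB consumed per hour."""
          return mbps / 8 * 3600 / 1024  # Mb/s -> MB/s -> MB/h -> GB/h

      def hours_until_cap(cap_gb: float, mbps: float) -> float:
          """Hours of streaming before a monthly cap is exhausted."""
          return cap_gb / gb_per_hour(mbps)

      CAP_GB = 1024  # a common 1 TB monthly cap
      for label, mbps in (("1080p60 @ ~15 Mbps", 15), ("4K @ ~40 Mbps", 40)):
          print(f"{label}: {gb_per_hour(mbps):.1f} GB/h, "
                f"cap lasts ~{hours_until_cap(CAP_GB, mbps):.0f} h of play")
      ```

      Under those assumptions a 4K stream alone could exhaust a 1 TB cap in a couple of hours a day, before counting the household's other traffic.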

      5 votes
  22. BasedOnAir
    Link
    yeah, the prices are too high!

  23. rchiwawa
    Link
    One sure-fire way to get me to click off of an article or video is a Moore's Law Is Dead reference

    One sure-fire way to get me to click off of an article or video is a Moore's Law Is Dead reference