16 votes

Unlaunching the 12GB 4080

36 comments

  1. [23]
    KittyCat
    Link

    This means that the cheapest Nvidia card from the 4000 series now costs a quite staggering $1,199.

    It wasn't that long ago that that amount of money could have gotten you the top of the line card, right?

    I really hate the PC industry's hyperfixation on high-end, expensive components and its neglect of reasonable, affordable ones (particularly from GPU vendors).

    11 votes
    1. [18]
      vektor
      Link Parent

      It wasn't that long ago that that amount of money could have gotten you the top of the line card, right?

      It wasn't even long ago that that amount got you a very decent gaming rig.

      18 votes
      1. [16]
        Greg
        Link Parent

        I think the silver lining in all of this is that 30 series cards are now "previous gen" pricing, but the 40 series are only appealing to a very small niche, so you can still build a very solid setup around a new 3060 or used 3070 for around a grand. Intel coming into the ring adds some interesting variables at the mid to low end too, even if it is very clearly a first iteration with some pretty rough edges.

        It's still not ideal that the high end of the market is anchored at such absurd prices nowadays, and it does seem to be dragging the rest upwards, but if all you care about is a decent experience at 1080p there are still some deals to be had.

        6 votes
        1. [2]
          KittyCat
          Link Parent

          It's possible that part of the reason they pulled this card is that it would otherwise be cheap enough to compete with some of the 3000 series stock, which presumably they want to sell.

          6 votes
          1. Greg
            Link Parent

            Could well be! I was kind of betting on it being quickly rebranded rather than pulled or delayed, but it’s pretty much impossible to tell for now.

            2 votes
        2. [13]
          TheRtRevKaiser
          (edited )
          Link Parent

          if all you care about is a decent experience at 1080p there are still some deals to be had.

          I haven't been paying much attention the last few years since mining and chip shortages had put a new GPU way out of my price range, and my PC has been working okay running things at 1080. I've started noticing issues even at that resolution lately, though, and I know my CPU is a bottleneck as well, but I can't really upgrade without upgrading my motherboard, so I've been musing about just building a new PC altogether. What're some good cards these days if I don't care about 4k gaming and just want good framerates at 1080?

          2 votes
          1. Autoxidation
            (edited )
            Link Parent

            A 3060ti (or 2080 Super or 2080 Ti) would be plenty for 1080p gaming and would support higher refresh rates like 144Hz. Here are some benchmarks from GamersNexus that include 1080p.

            5 votes
          2. [4]
            MimicSquid
            Link Parent

            What're your thoughts on the Steam Deck?

            3 votes
            1. [3]
              TheRtRevKaiser
              Link Parent

              Huh. Honestly, I hadn't really thought about it at all. I work from home so I have a desk setup where I tend to play games, so I was thinking of another desktop PC. I don't know, though, like I said I hadn't really given it much consideration.

              2 votes
              1. [2]
                MimicSquid
                Link Parent

                I'm in the same situation, where I've got a home work/play setup with a solid PC, but I've been pondering getting off the PC upgrade cycle in favor of a Steam Deck plus a PC that handles my work without the sort of overhead needed for gaming as well. It may not work as well for you if your work requires some power in your PC, but most of mine is just dealing with stuff in the cloud and so really doesn't need much.

                2 votes
                1. teaearlgraycold
                  Link Parent

                  I have a very nice gaming computer but it's nice to have a Steam Deck for transit and couch gaming.

                  3 votes
          3. [6]
            Greg
            Link Parent

            It's almost definitely going to be worth giving it two weeks until the RDNA 3 announcement from AMD, just because that'll likely push the current AMD prices down a bit as well and make everything even more competitive.

            Depending on budget and exactly what's available where you are (especially variable if you're looking second hand), anything from a 6600XT, 3060, 6700XT, or 3060Ti will serve you well - those are listed in approximate order of performance. I'm seeing the 6600XT around £250 in completed eBay listings and the 3060Ti about £325. A 3060Ti Founders is £369 new from Nvidia at the moment.

            For context, Cyberpunk is one of the most resource intensive games to benchmark on right now and they'll all just about hit 60fps at ultra settings, pushing towards 70fps at the 3060Ti end. Drop the settings a bit and you're moving towards 90fps even from the 6600XT, and on more optimised games like Shadow of the Tomb Raider you can run max settings from ~120 (6600XT) to ~150fps (3060Ti).

            3 votes
            1. [3]
              TheRtRevKaiser
              Link Parent

              Thanks! Yeah I'm in no particular hurry. The PC I've got now is still working pretty well, I'm just starting to notice more issues and I'm getting close to the point where I think it will make sense to build something new.

              I'll probably gravitate more toward Nvidia because I'm also interested in doing some AI art generation, and most of the stuff I've looked at is designed to run on CUDA cores (although I've seen some hacks to get Stable Diffusion running on AMD hardware so it might not be a big deal in a few months).

              2 votes
              1. [2]
                Greg
                Link Parent

                That makes sense, and yeah, I'd definitely go Nvidia if you're doing any non-gaming stuff with it - although the AMD announcement could still push down prices across the board depending what they come up with.

                You'll want to consider VRAM capacity as well in that case, I sometimes find even 10GB to be a bit limiting for ML work and the 3060Ti is only 8GB. Depending how serious you are about it, I do think the 3090 is kind of a steal at £750; it seems like a lot of the same people who were willing to buy a super-mega-overkill card at pandemic prices are the same ones who'll sell it for whatever as soon as there's a new top end model available. It's ludicrous overkill for 1080p gaming, but it'll absolutely rip through CUDA tasks and it's still the max VRAM you'll get in a consumer card.

                For £1550 (£800 for this machine sans 6700XT, £750 for a second hand 3090) you're getting a pretty incredible machine. Maybe swap for a slightly beefier PSU, maybe tell me that whole idea is absurd and a <£1000 total will be fine, but you get the idea.

                2 votes
                1. TheRtRevKaiser
                  Link Parent

                  Yeah it's certainly tempting. I'll have to keep an eye on prices and see what things look like after the new year maybe.

                  2 votes
            2. [2]
              teaearlgraycold
              Link Parent

              that'll likely push the current AMD prices down a bit

              That almost never happens

              1 vote
              1. Greg
                Link Parent

                Fair point, "likely" was perhaps a stretch - but for the sake of a two week wait I'll still take a bet on "almost"! The only time I remember things actively going the other way was last generation chip shortage madness, I think?

                2 votes
          4. Grzmot
            Link Parent

            It depends on what kinda games you play and what quality you want to achieve. But for 1080p the 3080 is already overkill. I think like 3060 is good? AMD is even better at that price range, and they had a massive price drop recently. I think it's gonna drop again once the new cards get announced.

            2 votes
      2. babypuncher
        Link Parent

        It still gets you an excellent gaming rig. The RTX 4000 launch is just bonkers. Intel's new GPUs are surprisingly competent, and RDNA 3 is just around the corner.

        Nvidia will either adjust their pricing, launch lower end cards, and/or cede the low and midrange market to Intel and AMD. But that market isn't going anywhere any time soon.

        4 votes
    2. Toric
      Link Parent

      Heck, my current rig's first iteration (it's a bit of a Ship of Theseus) cost less than that back in 2016!

      1 vote
    3. [3]
      Atvelonis
      Link Parent

      I haven't followed the GPU scene closely for years. Does AMD really not have anything that can compete with Nvidia? I'd like to build a new rig within the next year, but not at such an unreasonable price point.

      1 vote
      1. AugustusFerdinand
        (edited )
        Link Parent

        Unless you're obsessed with ray tracing, benchmarking, or measure your personal worth by watching the frames per second of every game you play AMD goes toe to toe with NVidia. And for much nicer price points during this and the last generation. It depends on how much bench racing you want to do and what kind of rig you plan to build.
        The main question being: 1080p, 1440p, or 4k?

        Being that you don't intend to build a rig at an unreasonable price point, I'm going to assume you aren't going for a 4k build.

        Assuming 1440p at Ultra settings:

        GPU | Price | FPS | $ per FPS
        4090 | $1600 | 140 | $11.42
        6950XT | $950 | 115 | $8.26
        3090ti | $1100 | 114 | $9.64
        6900XT | $700 | 106 | $6.60
        3090 | $1000 | 106 | $9.43
        6800XT | $600 | 100 | $6.00
        3080ti | $900 | 103 | $8.73
        3080 | $700 | 95 | $7.36
        6800 | $550 | 87 | $6.32
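        If you want to check the value column yourself, here's a quick sketch. The prices and average 1440p Ultra FPS figures are the ones quoted above; the table appears to truncate (not round) to two decimal places, so this does the same.

        ```python
        # Reproduce the $-per-FPS value figures from the table above.
        import math

        cards = [
            ("4090",   1600, 140),
            ("6950XT",  950, 115),
            ("3090ti", 1100, 114),
            ("6900XT",  700, 106),
            ("3090",   1000, 106),
            ("6800XT",  600, 100),
            ("3080ti",  900, 103),
            ("3080",    700,  95),
            ("6800",    550,  87),
        ]

        def dollars_per_fps(price, fps):
            """Price paid per average frame per second; lower means better value."""
            return math.floor(price / fps * 100) / 100

        # Sort by value to see why the 6800XT leads the pack.
        for name, price, fps in sorted(cards, key=lambda c: dollars_per_fps(c[1], c[2])):
            print(f"{name:7s} ${price:5d} {fps:4d}fps ${dollars_per_fps(price, fps):.2f}/fps")
        ```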

        The AMD 6800XT is the best performance deal on the market, and the AMD 7000 series launch is imminent to answer NVidia's 4000 series.

        Shopping around and research can net you some deals of course, but I've yet to see NVidia beat AMD's price points even during a fire sale. I just built my new rig to hand down my old one to my wife. New build has a 6900XTXH which is a 6950XT in a 6900XT box (and price) and is more than enough to survive until the 8000 series is imminent (I like to build top of the line of the outgoing generation). My old build/wife's new rig is a Vega 64 setup which will still pull 100fps at 1080p Ultra which is perfectly fine for her and the games she plays (first game she booted up was Stray, second was Stardew Valley).

        8 votes
      2. Whom
        Link Parent

        We're currently at the launch of a new generation, where we know what's going on with Nvidia but AMD hasn't revealed anything of substance. We don't yet know what AMD's going to offer or what prices they'll be at.

        Lately the deal has been that Nvidia has the more powerful cards in general, but there are certain workload / price point combinations where AMD makes more sense. I'm certainly no industry analyst, but I wouldn't be surprised if that continued to be the case, or even if AMD kept with roughly previous gen pricing and made some gains like that.

        Regardless, Nvidia's pricing here is almost certainly just to scare people into buying up the extra 3000 series stock that was overproduced before the crypto crash. Nvidia have proven to be assholes time and time again, but I don't think they'd be dumb enough to have these be their actual long term price points.

        3 votes
  2. [4]
    Grzmot
    Link

    I have no idea what this means. Surely not a recall? I feel like they're saving the 12GB 4080 and are going to turn it into the 4070Ti later on, but this really is the weirdest way to do it.

    4 votes
    1. [3]
      Greg
      Link Parent

      It always looked more like a 70 card in comparison to the 16GB so maybe it will just be the 4070 now? Or perhaps they even flip the usual release schedule and the 4070Ti (i.e. this card) comes out before the normal 4070 that was already planned. Could even be a new middle ground like a 4075 or something I guess. Either way, it seems like a bit of a marketing shitshow, and I'm surprised that they expected the $900 GPU consumer base not to pull them up on the fact that the branding only mentioned the memory when everything else was different as well.

      There are some weird decisions in this range all around, though. The fact they've released the first card that can reasonably run games at 4K faster than 120Hz* and then didn't add DisplayPort 2.0 seems like a huge oversight, especially when a $100 Intel Arc card does include it. The only thing I can think is that they really are expecting the only buyers to be professional users rather than gamers, which honestly makes more sense to me, but then their entire marketing pitch contradicts that.


      *No I don't really know why you'd need that either, but it's clearly important to some people because there are monitors that support it!

      5 votes
      1. [2]
        AugustusFerdinand
        (edited )
        Link Parent
        • Exemplary

        It shouldn't even be a 4070, it's a 4060ti.

        From another thread comparing generational improvements:


        60% performance of the flagship is typically relegated to xx70 tier, and 50% to xx60ti. No matter what the marketing department has named them, or what the tech engineers named the AD10x chips, Nvidia's value proposition is for you to spend $900 for a midrange chip with 50% or less of the flagship's performance.

        Take a look at what would happen if we added a fourth card to the mix:

        GPU | Chip | Core Count | Boost Clock | Core*Clock | % of Flagship | Should Be Named
        4090 | AD102 | 16384 | 2.52GHz | 41,289 | 100% | 4090
        n/a | — | 12800ish | 2.51GHz | 32,128 | 78% | 4080
        4080 16GB | AD103 | 9728 | 2.51GHz | 24,320 | 59% | 4070
        4080 12GB | AD104 | 7680 | 2.61GHz | 19,968 | 48% | 4060ti

        The above would follow other generations' naming-performance schemes. Here's the proof:

        2020: Ampere
        Note how close 3080 is to 3090, justifying its positioning as "the flagship". Performance was actually even closer than 88% due to under-utilization on the 3090. Best price:perf value in a high-end NVidia chip in the last 10 years!

        GPU | Core Count | Boost Clock | Core*Clock | % of Flagship
        3090 | 10240 | 1.66GHz | 16,998 | 100%
        3080 | 8704 | 1.71GHz | 14,884 | 88%
        3070 | 5888 | 1.72GHz | 10,127 | 60%
        3060Ti | 4864 | 1.67GHz | 8,123 | 48%

        2019: Turing v2 (Super)

        GPU | Core Count | Boost Clock | Core*Clock | % of Flagship
        2080Ti | 4352 | 1.65GHz* | 7,180 | 100%
        2080 Super | 3072 | 1.82GHz | 5,591 | 78%
        2070 Super | 2560 | 1.77GHz | 4,531 | 63%
        2060 Super | 2176 | 1.65GHz | 3,590 | 50%

        *approx speed of cards with refined silicon which typically reached higher boost clocks than first-year 2080Ti.

        2018: Turing v1

        GPU | Core Count | Boost Clock | Core*Clock | % of Flagship
        2080Ti | 4352 | 1.55GHz | 6,746 | 100%
        2080 | 2944 | 1.71GHz | 5,034 | 75%
        2070 | 2304 | 1.62GHz | 3,732 | 55%
        2060* | 1920 | 1.68GHz | 3,226 | 48%

        *GTX 1660 / 1660 Ti held the xx60 non-ti performance slot, with RTX 2060 where a "Ti" card would normally be. Note how each of these was a poor value compared to the flagship, even before you remember how overpriced the RTX cards were given that in terms of rasterization performance (which was the only thing back then) 2080 = 1080Ti, 2070 = 1080, 2060 = 1070-1070Ti. There was literally no reason to buy before the Super cards debuted.

        2017: Pascal v2
        GTX 1070 takes its true place as the 60% card. There was no xx60ti, with a huge gap between 1060 6GB and 1070. 1060 3GB was discontinued, and to shift the bottom up, 2GB 1050 was replaced with 3GB 1050, with the same core count and higher clocks than 1050ti. It was a weird time.

        GPU | Core Count | Boost Clock | Core*Clock | % of Flagship
        1080Ti | 3584 | 1.58GHz | 5,663 | 100%
        1080 | 2560 | 1.8GHz* | 4,608 | 81%
        1070 | 1920 | 1.68GHz | 3,226 | 59%

        2016: Pascal v1
        Note that GTX 1070's too-heavily cut GP104 chip looked bad when compared to a nearly-full GP102 die in 1080Ti, so Nvidia held back the GTX 1080Ti and made uncut GP104 the flagship for 2016. As such, the numbers are all over the place. As venerable as the GTX 1060 6GB eventually proved to be, spending an extra $100 to step up to 6GB wasn't the greatest value at the time.

        GPU | Core Count | Boost Clock | Core*Clock | % of Flagship
        1080 | 2560 | 1.73GHz | 4,429 | 100%
        1070 | 1920 | 1.68GHz | 3,226 | 73%
        1060 6GB | 1280 | 1.71GHz | 2,189 | 49%
        1060 3GB | 1152 | 1.71GHz | 1,970 | 44%

        2014-15: Maxwell

        GPU | Core Count | Boost Clock | Core*Clock | % of Flagship
        980Ti | 2816 | 1.08GHz | 3,041 | 100%
        980 | 2048 | 1.22GHz | 2,499 | 82%
        970 | 1664 | 1.18GHz | 1,964 | 65%
        960 OEM | 1280 | 1.20GHz | 1,536 | 51%
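        The methodology running through all of these tables can be sketched in a few lines: core count times boost clock as a crude throughput proxy, normalized against the generation's flagship, then mapped onto the historical naming tiers. The core counts and clocks below are the Ada figures quoted above; the exact tier cutoffs are my own reading of the generational tables and are approximate.

        ```python
        # Crude "what should this card be named" estimate from core count x clock.
        ada = {
            "4090 (AD102)":      (16384, 2.52),
            "4080 16GB (AD103)": (9728, 2.51),
            "4080 12GB (AD104)": (7680, 2.61),
        }

        def pct_of_flagship(cards):
            """Core*clock for each card as a rounded percentage of the best card."""
            scores = {name: cores * ghz for name, (cores, ghz) in cards.items()}
            top = max(scores.values())
            return {name: round(100 * score / top) for name, score in scores.items()}

        def tier_name(pct):
            """Approximate historical naming tier: ~60% of flagship = xx70, ~50% = xx60ti."""
            if pct >= 90:
                return "xx90"
            if pct >= 70:
                return "xx80"
            if pct >= 55:
                return "xx70"
            return "xx60ti"

        for name, pct in pct_of_flagship(ada).items():
            print(f"{name:20s} {pct:3d}% of flagship -> historically a {tier_name(pct)}")
        ```

        Run against the Ada numbers, the "4080 16GB" lands in the historical xx70 band and the "4080 12GB" in the xx60ti band, matching the argument above.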

        EPIC FINAL CONCLUSION:
        Nvidia either has a cut-back AD102 die (or uncut AD103?) waiting and stockpiling. This unannounced card should have been the RTX 4080. Either that, or they're sandbagging to force higher prices on AMD's announcement in October, after which they will "come around to what our customers want". Or they're complete idiots for leaving a 40% gap between the flagship 4090 and the thus-far-named 4080 16GB. Which, to be fair, they've done before with GTX 1060 and 1070...

        But my best guess is that they think we're idiots who will buy anything up for any price.

        17 votes
        1. Greg
          Link Parent

          It's detail like this that makes me love this site - that's extremely interesting to see, and I'm genuinely surprised that even my cynicism about Nvidia wasn't cynical enough!

          3 votes
  3. [9]
    Tygrak
    Link

    I have an 8-year-old graphics card and it still runs everything I want to do. It's a bit hard for me to understand why most people would need such overpriced new cards. I'd guess it's probably mostly really useful nowadays for ML and rendering stuff. And I guess it allows game developers to be lazier when optimizing their games for one :D (speaking as a game dev).

    I would be interested to hear what people who want to buy new graphics cards here want to use them for.

    1 vote
    1. Greg
      (edited )
      Link Parent

      I would be interested to hear what people who want to buy new graphics cards here want to use them for.

      I think you definitely hit the nail on the head with ML and other professional uses - cloud servers with decent GPUs can hit a couple of grand in usage relatively quickly, so it’s not too hard for a top end card to be economical if it’s a business expense that you’re making money on.

      Does seem a bit odd that the marketing is all gaming all the time, though. At a guess they know the pros will figure out their needs anyway, and marketing to gamers will catch enough people with high disposable income and a propensity towards the best and shiniest at any cost?


      [Edit] That's me specifically talking about the 90 series for sure, and probably the newer iterations of the 80s as well, BTW. The current gen mid tier and previous gen top tier still absolutely have a place in gaming if you want high fps on newer monitors or VR.

      3 votes
    2. [4]
      AugustusFerdinand
      Link Parent

      I have an 8-year-old graphics card and it still runs everything I want to do.

      Which is?

      2 votes
      1. [3]
        Tygrak
        Link Parent

        Nvidia GTX 980. I even tried out Stable Diffusion, which I thought would be the thing that would finally not work, yet it worked pretty great.

        2 votes
        1. [2]
          AugustusFerdinand
          Link Parent

          Nice, how long did the image generation take?
          Is that what you usually do with the card, image creation?

          3 votes
          1. Tygrak
            Link Parent

            It takes like ±20 seconds.

            Not really, I use it for everything, the most performance intensive stuff I sometimes do is probably some 3d modeling (and rendering of it), game dev and just games I guess.

            1 vote
    3. [3]
      cfabbro
      (edited )
      Link Parent

      I would be interested to hear what people who want to buy new graphics cards here want to use them for.

      My (admittedly overkill) monitor setup, which includes a 100hz 3440x1440, 60hz 2560x1080, and 3x 60hz 1920x1080 monitors (one of which is my TV on the opposite side of the room). Edit: Oh, and a 1024x600 in-case monitor now too, for displaying an AIDA64 sensor panel.

      And my old 1070Ti simply wasn't cutting it anymore (framerate wise) with that setup, especially for gaming at full resolution on the 1440p monitor, even with middling game quality settings. I now have a 3070Ti and it rarely goes above 70% utilization in games with Gsync/Vsync enabled (1440p at 100hz/100fps), even with absolutely maxed out game quality settings.

      2 votes
      1. Tygrak
        Link Parent

        Wow! That's a setup :D. I am still working on a single 1920x1080 monitor at home, but that's a good point, at work I have a 4k + 2560x1440 monitor and there it definitely becomes more worthwhile.

        1 vote
      2. Protected
        Link Parent

        This is similar to my upgrade path, from 1070 to 3070ti (last year), and my experience is the same, gaming on a 1440p monitor (currently running at 120hz though, vsync usually on). The 1070 was struggling with games, VR, and video encoding, as well as running Unity with heavy scenes. The 3070ti can handle everything I need easily; games usually run on whatever max/ultra settings loadout they provide, though I rarely bother to explore the highest, most outrageous forms of antialiasing that are sometimes provided. No more lag in Beat Saber, and raytracing works well.

        1 vote