3070: $499, available in October.
3080: $699, available September 17.
3090: $1499, available September 24.
Fabricated by Samsung on a "custom" 8nm process.
Digital Foundry has gone hands on with a 3080. Performance is between 165% and 200% that of a 2080.
That's great. I knew that the next generation would be coming soon, but I didn't think that the expensive graphics card I bought just about a month ago would be eclipsed.
AMD is starting to be competitive again. Still not there yet at the high end, but good enough at the midrange for those of us looking to escape NVIDIA.
Amazing how performance leapfrogs when there's a viable competitor.
This is actually in line with Nvidia's usual pattern. Every other generation of graphics cards represents a big leap in performance. It is somewhat similar to Intel's old tick-tock pattern. Turing was a major architectural change from Pascal and didn't bring much better performance, but came with a lot of new features. Ampere is an iteration on Turing and brings basically the same feature set with a big performance boost.
I'm aware of that. It's why it feels like nVidia in particular is screwing over the market. Their pricing is clearly designed to extract the most money out of consumers' pockets (though to be fair, what product is not?), and I can't help but wonder if they tune their products' performance down in such a way that they can quickly reveal a higher-end product as soon as there is a whisper of competition coming to the market.
This pattern feels more like a product of how engineering works than some grand conspiracy to abuse their market share. AMD products show similar patterns despite not being dominant in the GPU market, and only recently becoming competitive in the CPU market.
If we look at the transistor count for Turing, it definitely was a big leap from Pascal, but it didn't show in benchmarks because a lot of those new transistors were dedicated to features very few games took advantage of.
Well, like I said, it's just a feeling. When it comes to GPUs I am very skeptical about just about every statement they make. They make me nervous because they are essentially a giant "trade secret" black box that we will never truly understand as long as the owners can get away with it.
It's fair to feel that way; these patterns always pop up in tech. Consider what Nvidia did to get this boost: they moved from a 12nm to an 8nm process. When you shrink your chips like this you always get monster performance gains, but retooling fabrication like this is difficult as hell and takes years and massive investments. Samsung got there first and won a contract, so Nvidia changed chip manufacturers for this generation of cards.
It's quite a luxury now having multiple companies you can approach for fabrication this cutting edge. They'll change again if they can get a 6nm or 4nm process. It would surprise me if this was a move Nvidia had been keeping in their back pocket. I expect they've been chomping at the bit to get better fabrication and jumped at it the moment it was on the table. One can time the launch for better impact, but a 200% performance gain from a process shrink is not something you sit on for long or expect to have often.
You want to maximize your impact on your competition, and the longer you sit on it, the more money they make that could have gone to you instead, and the more time you gift them to catch up to you or surpass you.
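For a sense of scale on the shrink argument above, here's a naive back-of-the-envelope sketch (my own illustration, not from the thread; "12nm" and "8nm" are marketing node names rather than literal feature sizes, so real gains differ):

```python
# Naive scaling intuition only: node names are marketing labels, not
# literal dimensions, so treat the result as a rough upper-bound intuition.
old_node_nm = 12
new_node_nm = 8

# If node names were literal linear feature sizes, transistor density would
# scale with the inverse square of the feature size.
naive_density_gain = (old_node_nm / new_node_nm) ** 2
print(f"Naive density gain from 12nm -> 8nm: {naive_density_gain:.2f}x")  # 2.25x

# Extra density budget can go into wider shader arrays, larger caches, or
# new fixed-function units, which is how a shrink turns into frame rate.
```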
Performance is between 165% and 200% that of a 2080.
I started thinking that my 1060 was getting long in the tooth a couple of months ago. If that's accurate I'm really glad I waited to see what the 3xxx series brought to the table. Planning to go for a 3080/3090 this time around since I tend to keep my video cards for quite a while. Going to be quite interesting to see the difference.
There's an AMA happening on Reddit with quite a few people from NVIDIA involved. They're just collecting questions today and will be posting the answers tomorrow.
I wrote this up for my own submission but forgot to post it yesterday, heh. Just a highlights summary.
NVIDIA Marbles at Night Demo
New technology: NVIDIA Reflex, NVIDIA Broadcast, RTX IO
Quick version:
This GPU generation seems like the biggest leap in power ever if NVIDIA's claims are true
The $500 RTX 3070 outperforms the $1200 RTX 2080Ti
The last time a GPU generation's new 3rd tier card outperformed the previous generation's highest end card was in 2004 when the GeForce 6600GT outperformed the GeForce FX 5950 Ultra
All possible thanks to NVIDIA's new partnership with Samsung on an 8nm process; their previous generation was infamously priced due to NVIDIA's problems with their older fabrication process
The RTX 3090 is effectively the new Titan; NVIDIA claims it is rated for 8K 60fps gaming (the above Marbles at Night demo is running in realtime on one RTX 3090 at 1440p)
Fortnite will support RTX, and both NVIDIA Reflex and Broadcast
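To put the 3070-versus-2080 Ti claim above in perspective, here's a quick perf-per-dollar sketch (my own arithmetic using the prices quoted in this thread; the parity assumption comes from NVIDIA's claim, not independent benchmarks):

```python
# Perf-per-dollar sketch using the MSRPs quoted in this thread.
# Assumption (not a benchmark): the RTX 3070 roughly matches the RTX 2080 Ti,
# per the "outperforms" claim above.
msrp_usd = {"RTX 2080 Ti": 1200, "RTX 3070": 500}
relative_perf = {"RTX 2080 Ti": 1.0, "RTX 3070": 1.0}

perf_per_dollar = {name: relative_perf[name] / msrp_usd[name] for name in msrp_usd}
improvement = perf_per_dollar["RTX 3070"] / perf_per_dollar["RTX 2080 Ti"]
print(f"Perf per dollar vs 2080 Ti: {improvement:.1f}x")  # ~2.4x at assumed parity
```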
Comment-wise: three weeks ago I said I expected to wait five years before I could get a GPU that can run 4K120Hz and yet the RTX 3080 can apparently hit games at that resolution like my GTX 1070 does at 1080p.
Unbelievable. This is an insane jump in GPU power if NVIDIA's claims are true.
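For context on why 4K at high refresh rates is such a jump, some rough pixel-throughput arithmetic (my own illustration, not from the thread):

```python
# Rough pixel-throughput arithmetic for the 4K claim (illustration only).
pixels_1080p = 1920 * 1080   # ~2.07 million pixels per frame
pixels_4k = 3840 * 2160      # ~8.29 million pixels per frame

print(f"4K has {pixels_4k / pixels_1080p:.0f}x the pixels of 1080p")            # 4x
print(f"4K@120 vs 1080p@60 is {pixels_4k * 120 / (pixels_1080p * 60):.0f}x "
      f"the pixels per second")                                                 # 8x
```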
I had been toying with the idea of upgrading my current 1070. By the looks of it, the 3080 seems a fantastic replacement!
While I still have to wait for actual benchmarks, the 3080 is looking like an amazing upgrade for my 980 Ti.