I'm not sure that I buy this argument, particularly towards the end of the article when it goes into economic ramifications of slowing progress in miniaturization. How much is really riding on massive year-over-year improvements in transistor counts? What, exactly, do we need to prepare for?
If economists are right, and much of the growth in the 1990s and early 2000s was a result of microchips—and if, as some suggest, the sluggish productivity growth that began in the mid-2000s reflects the slowdown in computational progress—then, says Thompson, “it follows you should invest enormous amounts of money to find the successor technology. We’re not doing it. And it’s a public policy failure.”
I think that's a bit of a logical leap. If the slower productivity growth is even related to computerization, I think it's much more likely due to all of the low-hanging fruit having been picked, rather than slower gains in raw computational power.
A worker with a computer twice as fast is not going to be twice as productive, and I think the vast majority of the gains are either from productivity software or embedded chips. The former doesn't really get better with more powerful computers, once you're past a certain baseline that was achieved a long time ago, even with modern software bloat. The latter is mostly ancient chips on ancient process nodes, churned out for pennies.
Plus we can deploy more computers. Cloud data centers are growing and they're deploying GPUs and TPUs. Hell, the growth in parallel computing in TPU-style chips means we're still going to be building ever more powerful chips with these newer architectures. Sure, single-threaded performance might not grow, but I assume there's lots of room to make beefier machine learning chips.
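Just to put rough numbers on that (totally made-up figures, not benchmarks): aggregate throughput keeps climbing as long as you deploy more chips and pack more parallel units into each one, even with per-core speed frozen:

    # Back-of-the-envelope math with made-up numbers (not benchmarks):
    # aggregate throughput can keep growing even if per-core speed stays flat.
    FLOPS_PER_CORE = 50e9  # held constant: no single-threaded gains

    def total_flops(chips, cores_per_chip):
        # Total throughput scales with how many chips you deploy and how
        # wide (how parallel) each chip is, not with per-core speed.
        return chips * cores_per_chip * FLOPS_PER_CORE

    gen1 = total_flops(chips=1_000, cores_per_chip=64)
    gen2 = total_flops(chips=4_000, cores_per_chip=128)  # more chips, wider chips

    print(f"gen1: {gen1:.2e} FLOP/s")
    print(f"gen2: {gen2:.2e} FLOP/s ({gen2 / gen1:.0f}x, with zero per-core gain)")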
I've been hearing hand-wringing over the end of Moore's law for years, and I don't really get the big deal for the majority of consumers. I'd bet most of us write a note, maybe do some spreadsheets, browse the web, and play games. The gaming might take more power, but we've had specialized hardware for that for ages. Moore's law ending might be bad for cutting-edge research, but I don't see it being a huge deal for regular consumers.
Honestly, even for games I feel like you reach a point where it's not even worth it to take advantage of available GPU power, just because designing all that photorealistic stuff takes a lot of time, money, and effort, and might look like crap in a generation or two anyway, whereas a distinctive art style is cheaper, lower on requirements, and ages better.
That last bit's a super interesting thought. I've wondered before about whether the end of Moore's law could bring about the 'age of optimization', where we work to eke out every last drop of performance from stagnant nodes through more and more efficient programming and processor design. Looking back at just how much old computers could do, and on a microscopic fraction of today's transistor budgets, really makes you appreciate the gains that could be had there. The people who designed and coded those things were some goddamn wizards.
I'd like to add that Blender has had OptiX denoising for a few versions now. Essentially it removes noise from a 3D render with a neural-net-based denoiser, which lets you make pretty good-looking renders in a fraction of the time it used to take, as the algorithm "guesses" what should be in place of the missing/noisy pixels.
I think this has also been implemented in the editor itself (maybe in a beta), so near-realtime photorealistic renders are also possible.
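For anyone curious, here's roughly what turning it on looks like from Blender's Python console. Property names are from recent Cycles builds and may differ a bit between versions, so treat this as a sketch:

    # Rough sketch of enabling OptiX denoising via bpy (recent Cycles versions).
    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'

    # Denoise final renders with the OptiX (RTX) denoiser
    scene.cycles.use_denoising = True
    scene.cycles.denoiser = 'OPTIX'

    # Denoise viewport previews too, for near-realtime "final-looking" renders
    scene.cycles.use_preview_denoising = True
    scene.cycles.preview_denoiser = 'OPTIX'

    # Far fewer samples than a clean, un-denoised render would need
    scene.cycles.samples = 32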
I was watching a WWDC video on ray tracing with Metal, and it turns out that Metal has built-in shaders for doing this type of denoising! I thought that was kind of crazy because I just saw this type of work presented at SIGGRAPH last year.
That's a good point. I'm not a huge gamer, but I do really love the pixel art style.
I'm a fan of well-done cel shading. Wind Waker is almost 20 years old, and it still looks like it could've been released yesterday (at least when some resolution hacks are applied). Not too many titles of the same era can say the same.
I agree. I think the only place where this has a real chance of becoming a serious roadblock is for really heavy simulation games like Dwarf Fortress or HoI4.
Fearing the end of Moore's Law seems like being afraid of not finding any stronger metals after the discovery of titanium. As if such an end means that all industries dependent upon metal will collapse, or new companies can't start up using tried-and-true materials. Sure, we can't create a steel-toed boot that can protect you from Mt. Everest getting dropped on your foot, but the important difficult problems are ones that can scale up. You can reach the moon; you just need thick walls on your ship. We can crunch a ton of data; it will just take a lot of parallel processing.
And if I'm wrong, maybe it's a good thing that we are stopped short of creating AGI.
Edit:
They mention 3D transistors in the article. Would chips with an arbitrary number of layers nearly triple the number of transistors every 2 years? Under Moore's Law a 2D chip doubles in density every 2 years. Across 1 dimension that's a 2^(1/2) increase in density, and I suppose 2^(3/2) across 3 (approx. 2.8).
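A quick sanity check of that arithmetic, under the assumption that each linear dimension scales by the same factor every period:

    # If 2D density doubles every 2 years, each linear dimension scales by
    # 2**(1/2) per period; a third scalable dimension (layer count) would
    # give 2**(3/2) per period.
    per_dim = 2 ** 0.5       # ~1.41x per 2 years, per dimension
    print(per_dim ** 2)      # 2.0   -> classic Moore's law (2D)
    print(per_dim ** 3)      # ~2.83 -> "nearly triple" (3D)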