13 votes

We’re not prepared for the end of Moore’s Law

12 comments

  1. [2]
    spctrvl
    Link
    I'm not sure that I buy this argument, particularly towards the end of the article when it goes into economic ramifications of slowing progress in miniaturization. How much is really riding on massive year-over-year improvements in transistor counts? What, exactly, do we need to prepare for?

    If economists are right, and much of the growth in the 1990s and early 2000s was a result of microchips—and if, as some suggest, the sluggish productivity growth that began in the mid-2000s reflects the slowdown in computational progress—then, says Thompson, “it follows you should invest enormous amounts of money to find the successor technology. We’re not doing it. And it’s a public policy failure.”

    I think that's a bit of a logical leap. If the slower productivity growth is even related to computerization, I think it's much more likely to do with all of the low hanging fruit being picked, rather than slower gains in raw computational power.

    A worker with a computer twice as fast is not going to be twice as productive, and I think the vast majority of the gains are either from productivity software or embedded chips. The former doesn't really get better with more powerful computers, once you're past a certain baseline that was achieved a long time ago, even with modern software bloat. The latter is mostly ancient chips on ancient process nodes, churned out for pennies.

    12 votes
    1. tlalexander
      Link Parent
      Plus we can deploy more computers. Cloud data centers are growing, and they’re deploying GPUs and TPUs. Hell, the growth in parallel computing on TPU-style chips means we’re still going to be building ever more powerful chips with these newer architectures. Sure, single-threaded performance might not grow, but I assume there’s lots of room to make beefier machine-learning chips.

      3 votes
  2. [9]
    acdw
    Link
    I've been hearing hand-wringing over the end of Moore's law for years, and I don't really get the big deal for the majority of consumers. I'd bet most of us write a note, maybe do some spreadsheets, browse the web, and play games. The gaming might take more power, but we've had specialized computers for that for ages. Moore's law ending might be bad for cutting-edge research, but I don't see it being a huge deal for regular consumers.

    8 votes
    1. [8]
      spctrvl
      Link Parent
      Honestly, even for games I feel like you reach a point where it's not even worth it to take advantage of available GPU power, just because designing all that photorealistic stuff takes a lot of time, money, and effort, and might look like crap in a generation or two anyway, whereas a distinctive art style is cheaper, lower on requirements, and ages better.

      11 votes
      1. [4]
        hungariantoast
        Link Parent
        You're not wrong: designing photorealistic graphics simply isn't feasible even for most AAA game companies; they have neither the time nor the money to push their products to such a ridiculously high standard (and it absolutely would not be worth it in a business sense either).

        However, we are slowly moving towards an era when words like "machine learning" and "AI" are starting to get thrown around as being responsible for generating photorealistic graphics.

        Andrew Price, also known as Blender Guru, also known as the person who makes the Blender donut tutorial videos, did an excellent talk a year or two ago about the potential of "AI" in Blender and graphics in general.

        It's a really great talk and is easy enough to consume, being thirty minutes in length. I cannot recommend watching it enough.

        I think, over the next few decades, our ability to generate photorealistic graphics with the help of "AI" is going to almost become the norm. What I am really interested in though, and what I have not watched or read anything about yet, is the potential for "AI" to perform optimization work on graphics or code bases in games and software. How much potential could there be for an "AI" to be fed a bunch of source code as input, and spit out other source code in an incredibly optimized format?

        5 votes
        1. spctrvl
          Link Parent
          That last bit's a super interesting thought. I've wondered before about whether the end of Moore's law could bring about the 'age of optimization', where we work to eke out every last drop of performance from stagnant nodes through more and more efficient programming and processor design. Looking back at just how much old computers could do, on a microscopic fraction of today's transistor budgets, really makes you appreciate the gains that could be had there. The people who designed and coded those things were some goddamn wizards.

          7 votes
        2. [2]
          asoftbird
          Link Parent
          I'd like to add that Blender has had OptiX denoising for a few versions now. Essentially this removes noise from a 3D render through a neural-net-powered model, which enables you to make pretty good-looking renders in a fraction of the time it used to take, as the algorithm "guesses" what should be in place of the missing/noisy pixels.

          I think this has also been implemented in the editor itself (maybe in a beta), so near-real-time photorealistic renders are also possible.
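
          As a crude, hypothetical illustration of what a denoiser does (OptiX actually uses a trained neural network; this stand-in is just a 3x3 mean filter), each noisy pixel is replaced by an estimate built from its neighborhood:

```python
def denoise_mean(img):
    """Naive denoiser: 3x3 mean filter over a 2D list of floats.

    Real denoisers like NVIDIA OptiX use trained neural networks to
    infer what belongs in a noisy pixel; averaging the neighborhood
    is the crudest version of the same idea.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Gather the pixel and its in-bounds neighbors.
            vals = [img[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out
```

          A lone bright "firefly" pixel in a dark render gets averaged down toward its neighbors, at the cost of some blur; the neural-net approach avoids the blur by learning what plausible image content looks like.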

          3 votes
          1. joplin
            Link Parent
            I was watching a WWDC video on ray tracing with Metal, and it turns out that Metal has built-in shaders for doing this type of denoising! I thought that was kind of crazy because I just saw this type of work presented at SIGGRAPH last year.

            2 votes
      2. [2]
        acdw
        Link Parent
        That's a good point. I'm not a huge gamer, but I do really love the pixel art style.

        2 votes
        1. spctrvl
          Link Parent
          I'm a fan of well-done cel shading. Wind Waker is almost 20 years old, and it still looks like it could've been released yesterday (at least when some resolution hacks are applied). Not too many titles of the same era can say the same.

          4 votes
      3. Kuromantis
        Link Parent
        I agree. I think the only place where this has a real chance of becoming a serious roadblock is for really heavy simulation games like Dwarf Fortress or HoI4.

        1 vote
  3. teaearlgraycold
    (edited )
    Link
    Fearing the end of Moore's Law seems like being afraid of not finding any stronger metals after the discovery of titanium. As if such an end means that all industries dependent upon metal will collapse, or new companies can't start up using tried-and-true materials. Sure, we can't create a steel-toed boot that can protect you from Mt. Everest getting dropped on your foot, but the important difficult problems are ones that can scale out. You can reach the moon, you just need thick walls on your ship. We can crunch a ton of data, it will just take a lot of parallel processing.

    And if I'm wrong, maybe it's a good thing that we are stopped short of creating AGI.

    Edit:

    They mention 3D transistors in the article. Would chips with an arbitrary number of layers nearly triple their transistor count every 2 years? Under Moore's Law a 2D chip doubles in density every 2 years. Across 1 dimension that's a 2^(1/2) increase in density, so I suppose 2^(3/2) across 3 (approx. 2.8).
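
    A quick sanity check of that arithmetic (a hypothetical snippet, not from the article):

```python
# If 2D density doubles every two years, the per-axis linear scaling
# factor is 2**(1/2); applying that same factor along all three axes
# of a 3D chip gives 2**(3/2), i.e. a near-tripling every two years.
per_axis = 2 ** 0.5        # ~1.414 per linear dimension
three_d = per_axis ** 3    # ~2.828 across three dimensions
print(per_axis, three_d)
```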

    7 votes