15 votes

Intel insider claims it finally lost Apple because Skylake QA 'was abnormally bad'

11 comments

  1. [11]
    gpl
    Link
    What the hell is going on at Intel? The past few years have been so bad for them. Complacency? Leadership?

    7 votes
    1. [10]
      Ellimist
      Link Parent
      I'd wager it's probably a combination. Their only competition has been AMD, and for a long time AMD focused on being the "budget" alternative, with CPUs/GPUs that were good, but not at Intel's level, though you paid half the price.

      Lately, it seems like AMD is really pushing their quality to match Intel, and Intel seems to be struggling to keep its advantage.

      That being said, my opinion is that of a pure enthusiast PC builder, so I could be very wrong here.

      7 votes
      1. [3]
        Akir
        Link Parent
        Well, it's not like they were trying to focus on the low end; they simply didn't have that good of a product at the time. They offered more cores than Intel, to be sure, but single-thread performance is really important given the way most applications are written at the moment, and AMD really struggled there.
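
        A minimal sketch of the math behind that, assuming (purely for illustration) a desktop app where only 40% of the work parallelizes; Amdahl's law caps what extra cores can buy:

            #include <stdio.h>

            /* Amdahl's law: if a fraction p of the work parallelizes, the
               serial remainder (1 - p) never gets faster, no matter how
               many cores you add. */
            static double amdahl_speedup(double p, int cores) {
                return 1.0 / ((1.0 - p) + p / cores);
            }

            int main(void) {
                double p = 0.40; /* assumed parallel fraction, illustrative only */
                for (int cores = 1; cores <= 16; cores *= 2)
                    printf("%2d cores -> %.2fx speedup\n", cores, amdahl_speedup(p, cores));
                return 0;
            }

        With those assumed numbers, 16 cores buy only about a 1.6x speedup, which is why per-core speed mattered so much.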

        And then beyond that, Intel had more resources to deal with hardware OEMs and provide them with some less tangible benefits (things like support for engineers making computer boards and software support packages).

        I would assume that part of the problem with Intel's strategy (on top of complacency and leadership issues) was that they stopped making improvements on their CPUs and started focusing their attention on other things like manufacturing processes, cellular radios, discrete graphics cards, and storage technology.

        6 votes
        1. [2]
          Comment deleted by author
          Link Parent
          1. frostycakes
            Link Parent
            > So basically, we're getting really good integrated graphics from Gen12.

            Are they going to be competitive with the integrated Vega graphics on current Ryzens, or RDNA on newer ones? I thought those were more or less as good as it gets for integrated graphics right now. I mean, Intel released a couple of chips with integrated Vega relatively recently, which I can't imagine them ever doing if their graphics solution were actually competitive.

            1 vote
        2. arghdos
          (edited)
          Link Parent
          > They offered more cores than Intel, to be sure, but single-thread performance is really important given the way most applications are written at the moment, and AMD really struggled there.

          For workstation users, I mostly agree — most important applications are single-threaded (or use, at most, a handful of threads). But for datacenters and HPC workloads, more cores is (usually) king. Running 2x instances of a thing for a similar (or lower) price than running 1x instances that are 25% faster (or whatever, not counting those semi-annual microcode updates that nerf performance) will win at scale. Hence the success of Rome in datacenters (even if Intel still probably owns the bulk of the market). Chiplets paid off in a big way.
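
          A back-of-the-envelope sketch of that trade-off (the request rates are made up purely for illustration):

              #include <stdio.h>

              int main(void) {
                  /* hypothetical baseline: one instance serves 100 req/s */
                  double base_rate   = 100.0;
                  double fast_single = 1.25 * base_rate; /* 1x instance, 25% faster */
                  double two_cheap   = 2.0  * base_rate; /* 2x baseline instances */
                  printf("one faster instance:   %.0f req/s\n", fast_single);
                  printf("two cheaper instances: %.0f req/s\n", two_cheap);
                  return 0;
              }

          At similar cost, 200 req/s beats 125 req/s; for throughput-bound fleets, more cores win.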

          1 vote
      2. [6]
        DeFaced
        Link Parent
        I've always figured Intel has been trying to cut corners to keep up with AMD at this point. With the next Ryzen chips coming out and basically filling a market that Intel has seemingly ignored for years and years because they could afford to, they've had to drop some QA to keep up. Intel has no true $250 CPU to compete directly with the 3600X or even the refreshes. You have the i5s, which are marginally better with fewer cores and threads but faster single-thread performance, and that will be pointless in the next year or two. Intel is in trouble, and losing Apple is going to put some serious heat in their corner.

        1 vote
        1. frostycakes
          Link Parent
          Not to mention with bugs like Meltdown, MDS, and the like, it seems like a lot of Intel's performance advantage over the Heavy Equipment processors from AMD was due to insecure corner-cutting.

          I'd be intrigued to see a comparison of a Broadwell and a similarly-specced Excavator chip, with all the relevant security mitigations applied, as compared to without. I bet the delta has gotten significantly smaller by now.

          3 votes
        2. [5]
          Comment deleted by author
          Link Parent
          1. [4]
            DeFaced
            Link Parent
            More applications are moving towards multi-threaded processing than ever before, thanks to dropping 32-bit support. With macOS 11, Apple is killing off 32-bit altogether; Ubuntu tried to kill off 32-bit with 20.04; and MSFT is pushing UWP hard these days with Xbox, which the Series X is going to push even further. There's never been a true need to move to fully 64-bit OSes until recently (and by recently I mean within the past 2 or 3 years). Not to mention the push for DX12 and more developers utilizing Vulkan, so access to bare-metal resources is only going to get better with 64-bit operating systems. Of course, this could all just be nothing, and 32-bit could forever be a requirement in modern OSes.

            EDIT: essentially I feel like we've hit a limit that we've never achieved before in computing. Performance right now is so damn good no matter what you buy, AMD or Intel, that the only thing holding back the hardware is the software. It's always been the other way around, the hardware has always held back the software, now things are turning on their heads.

            4 votes
            1. stu2b50
              Link Parent
              I don't see what 32-bit has to do with it. The size of your registers doesn't change whether or not you can write parallel code.

              It does allow you to write faster programs via larger SIMD instructions, but that's by definition on a single core.
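
              For instance, a sketch of that distinction using SSE intrinsics (baseline on any x86-64 chip); the four additions happen in one instruction, on one core:

                  #include <emmintrin.h> /* SSE/SSE2 intrinsics */
                  #include <stdio.h>

                  int main(void) {
                      float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40}, out[4];
                      __m128 va = _mm_loadu_ps(a); /* pack 4 floats into one 128-bit register */
                      __m128 vb = _mm_loadu_ps(b);
                      _mm_storeu_ps(out, _mm_add_ps(va, vb)); /* 4 adds, 1 instruction */
                      printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
                      return 0;
                  }

              Wider registers (SSE's 128 bits, AVX's 256) make that one core faster; they don't add cores. Threading across cores is a separate axis entirely.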

              3 votes
            2. frostycakes
              Link Parent
              I'd imagine the ubiquity of multicore/multithread processors these days (I haven't had a 1C/1T PC since a 2002-vintage Dell with a Pentium 3, and every Android phone I've owned since 2012 has had multiple cores), especially in the mobile space with big.LITTLE, would have a lot to do with it as well. When a 6-8 core chip is standard in a power- and temperature-constrained device, people are going to look towards optimizing for what the hardware has.

              1 vote
            3. babypuncher
              Link Parent
              Apple actually killed 32-bit support in macOS last year with Catalina.

              The only valid reason for any platform to keep it around at this point is support for legacy apps. Developers should not be offering 32-bit builds of their software.