15 votes

Intel insider claims it finally lost Apple because Skylake QA 'was abnormally bad'

15 comments

  1. [15]
    gpl
    Link

    What the hell is going on at Intel? The past few years have been so bad for them. Complacency? Leadership?

    7 votes
    1. [14]
      Ellimist
      Link Parent

I'd wager it's probably a combination. Their only competition has been AMD, and AMD had, for a long time, focused on being the "budget" alternative, with CPUs/GPUs that were good but not at the same level as Intel's, at half the price.

Lately, it seems like AMD is really pushing their quality to match Intel, and Intel seems to be struggling to keep their advantage.

That being said, my opinion is purely that of an enthusiast PC builder, so I could be very wrong here.

      7 votes
      1. [5]
        Akir
        Link Parent

Well, it's not like they were trying to focus on the low end; they simply didn't have that good of a product at the time. They offered more cores than Intel, to be sure, but single-threaded performance is really important given the way most applications are written at the moment, and AMD really struggled there.

And then beyond that, Intel had more resources to deal with hardware OEMs and provide them with some less tangible benefits (things like support for the engineers designing their boards, and software support packages).

I would assume that part of the problem with Intel's strategy (on top of complacency and leadership issues) was that they stopped making improvements to their CPUs and started focusing their attention on other things, like manufacturing processes, cellular radios, discrete graphics cards, and storage technology.

        6 votes
        1. [3]
          hungariantoast
          Link Parent

> discrete graphics cards, and storage technology

          One of these I like, the other... not so much.

          Intel's discrete graphics work is actually (hopefully) going to provide some very real benefits for consumers. The upcoming Tiger Lake chips should have vastly improved graphics performance over Ice Lake (which was already a good improvement over <previous-14nm-revision>), while being more efficient than the competing low-end Nvidia chips.

          So basically, we're getting really good integrated graphics from Gen12.

          Discrete graphics... not so much. Still a lot of catching up to do there.

As for their storage technology, specifically Optane: it seems like more of a pain than anything else. As far as I can tell, getting an Optane-enabled SSD means getting a slower SSD paired with a hyper-performant 32GB Optane cache. That can provide serious performance benefits if you let the Optane drive do its thing, at the cost of CPU time and battery life...

          Oh, and it doesn't work on Linux?

So if you're like me, and are shopping around for a laptop that you want to run Linux on, all Optane guarantees you is a slower SSD and a separate, hyper-fast 32GB drive.

I'd rather just have a regular NVMe drive with better speeds, personally.

          4 votes
          1. [2]
            frostycakes
            Link Parent

> So basically, we're getting really good integrated graphics from Gen12.

Are they going to be competitive with the integrated Vega graphics on current Ryzens, or RDNA on newer ones? I thought those were more or less as good as it gets for integrated graphics right now. I mean, Intel released a couple chips with integrated Vega relatively recently, which I can't imagine them ever doing if their own graphics solution was actually competitive.

            1 vote
            1. hungariantoast
              (edited )
              Link Parent

> Are they going to be competitive with the integrated Vega graphics on current Ryzens

              Yes. Keeping in mind that there are no official benchmarks and all we have to go on right now are leaks, it so far appears that the upcoming Gen12 integrated graphics will be neck-and-neck with the Vega 8 chips, like those in the Ryzen 7 4800U.

              So yeah, Intel's next-generation integrated graphics are keeping up with AMD's current offerings. Tiger Lake's multi-core CPU performance probably won't beat the current Ryzen Renoir chips though.

> Intel released a couple chips with integrated Vega relatively recently

              Yes, the Kaby Lake G processors that were released at the beginning of 2018. However, these were not integrated graphics, but discrete graphics. They were basically AMD alternatives to Nvidia's MX series chips, and they still perform quite well today, handily beating current Vega 8/10/11 integrated chips in benchmarks.


Honestly, if you're shopping around for an ultrabook or another laptop with integrated graphics, the only reasons I can think of for picking an Intel-based laptop over an AMD one right now would be subjective stuff like design, or wanting a Thunderbolt 3 port.

So hopefully by this time next year we will be seeing a plethora of AMD laptops with USB4, Thunderbolt 3-compatible ports, knocking out one of those reasons.

              (Oh and also, AMD and Intel need to get on the ball with supporting their mobile CPUs on Linux. AMD's CPU support is just plain lacking. Ryzen Renoir chips still don't have accelerometer support in the kernel. Intel's integrated graphics support is split between two different drivers. For everyday use the Linux situation is fine, but they both need to up their game and fix these corner cases.)

        2. arghdos
          (edited )
          Link Parent

> They offered more cores than Intel, to be sure, but single-threaded performance is really important given the way most applications are written at the moment, and AMD really struggled there.

For workstation users, I mostly agree: most important applications are single-threaded (or use, at most, a handful of threads). But for datacenter and HPC workloads, more cores is (usually) king. Running 2x instances of a thing for a similar (or lower) price than running 1x instances that are 25% faster (or whatever, not counting those semi-annual microcode updates that nerf performance) will win at scale. Hence the success of Rome in datacenters (even if Intel still probably owns the bulk of the market). Chiplets paid off in a big way.
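
          To make the scale argument concrete, here's a minimal back-of-the-envelope sketch in C. The 25% figure comes from the paragraph above; the budget and instance counts are made-up numbers, not real pricing:

          ```c
          /* Back-of-the-envelope: with a fixed budget, compare aggregate
           * throughput of two baseline instances vs one instance that is
           * 25% faster per core. All numbers here are hypothetical. */
          #include <stdio.h>

          int main(void) {
              double budget   = 1000.0;     /* same hypothetical spend either way */
              double two_slow = 2.0 * 1.00; /* 2 instances x baseline speed       */
              double one_fast = 1.0 * 1.25; /* 1 instance  x 1.25x speed          */

              /* For throughput-bound, embarrassingly parallel work, aggregate
               * throughput is what matters: 2.00x beats 1.25x for the money. */
              printf("per $%.0f: %.2fx vs %.2fx aggregate throughput\n",
                     budget, two_slow, one_fast);
              return 0;
          }
          ```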

          1 vote
      2. [8]
        DeFaced
        Link Parent

I've always figured Intel was trying to cut corners to keep up with AMD at this point. With the next Ryzen chips coming out and basically filling a market that Intel has seemingly ignored for years and years because they could afford to, they've had to drop some QA to keep up. Intel has no true $250 CPU to compete directly with the 3600X or even the refreshes. You have the i5s, which are marginally better despite fewer cores and threads, but their faster single-thread performance will eventually be pointless in the next year or two. Intel is in trouble, and losing Apple is going to put some serious heat in their corner.

        1 vote
        1. frostycakes
          Link Parent

          Not to mention with bugs like Meltdown, MDS, and the like, it seems like a lot of Intel's performance advantage over the Heavy Equipment processors from AMD was due to insecure corner-cutting.

I'd be intrigued to see a comparison of a Broadwell chip and a similarly-specced Excavator chip, with all their relevant security mitigations applied, versus without. I bet the delta has gotten significantly smaller by now.

          3 votes
        2. [6]
          hungariantoast
          Link Parent

> but their faster single-thread performance will eventually be pointless in the next year or two

          Why specifically the next year or two?

          2 votes
          1. [5]
            DeFaced
            Link Parent

More applications are moving toward multi-threaded processing than ever before, thanks to dropping 32-bit support. With macOS 11 Apple is killing off 32-bit altogether, Ubuntu tried to kill off 32-bit altogether with 20.04, and Microsoft is pushing UWP hard these days with Xbox, with the Series X set to push that even further. There's never been a true need to move to fully 64-bit OSes until recently (and by recently I mean within the past 2 or 3 years). Not to mention the push for DX12, and more developers utilizing Vulkan, so access to bare-metal resources is going to get even better with 64-bit operating systems. Of course, this could all just be nothing, and 32-bit could forever be a requirement in modern OSes.

EDIT: Essentially, I feel like we've hit a point we've never reached before in computing. Performance right now is so damn good, no matter what you buy, AMD or Intel, that the only thing holding back the hardware is the software. It's always been the other way around, with the hardware holding back the software; now things are turning on their head.

            4 votes
            1. stu2b50
              Link Parent

I don't see what 32-bit has to do with it. The size of your registers doesn't change whether or not you can write parallel code.

It does allow you to write faster programs via wider SIMD instructions, but that's, by definition, on a single core.
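
              To illustrate the distinction, here's a minimal C sketch (my own illustration, with made-up array sizes and thread counts): the pthreads half parallelizes across cores, which works the same regardless of register width, while the SSE half does four float adds per instruction on a single core, and SSE exists on 32-bit x86 too:

              ```c
              /* Sketch: summing an array two ways. Threads = parallelism across
               * cores (orthogonal to register width); SSE = wider work per
               * instruction on one core. Build with: cc -pthread sum.c */
              #include <pthread.h>
              #include <stdio.h>
              #include <immintrin.h>

              #define N 1000000
              #define NTHREADS 4
              static float data[N];

              typedef struct { int lo, hi; float sum; } chunk;

              /* Thread-level parallelism: each thread sums a slice of the array. */
              static void *partial_sum(void *arg) {
                  chunk *c = arg;
                  for (int i = c->lo; i < c->hi; i++)
                      c->sum += data[i];
                  return NULL;
              }

              /* SIMD: four float adds per instruction, still a single thread. */
              static float simd_sum(const float *a, int n) {
                  __m128 acc = _mm_setzero_ps();
                  int i;
                  for (i = 0; i + 4 <= n; i += 4)
                      acc = _mm_add_ps(acc, _mm_loadu_ps(a + i));
                  float lanes[4];
                  _mm_storeu_ps(lanes, acc);
                  float s = lanes[0] + lanes[1] + lanes[2] + lanes[3];
                  for (; i < n; i++)
                      s += a[i];  /* leftover elements */
                  return s;
              }

              int main(void) {
                  for (int i = 0; i < N; i++)
                      data[i] = 1.0f;

                  pthread_t t[NTHREADS];
                  chunk c[NTHREADS];
                  for (int k = 0; k < NTHREADS; k++) {
                      c[k] = (chunk){ k * (N / NTHREADS), (k + 1) * (N / NTHREADS), 0.0f };
                      pthread_create(&t[k], NULL, partial_sum, &c[k]);
                  }
                  float threaded = 0.0f;
                  for (int k = 0; k < NTHREADS; k++) {
                      pthread_join(t[k], NULL);
                      threaded += c[k].sum;
                  }

                  printf("threads: %.0f  simd: %.0f\n", threaded, simd_sum(data, N));
                  return 0;
              }
              ```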

              3 votes
            2. hungariantoast
              Link Parent

              I did a quick bit of searching and found nothing, so I ask this as a sincere question:

              What does 64 vs. 32-bit have to do with multi-threading? Is it just the lack of memory constraints on 64-bit?

              2 votes
            3. frostycakes
              Link Parent

I'd imagine the ubiquity of multicore/multithreaded processors these days (I haven't had a 1C/1T PC since a 2002-vintage Dell with a Pentium 3, and every Android phone I've owned since 2012 has had multiple cores), especially in the mobile space with big.LITTLE, would have a lot to do with it as well. When a 6-8 core chip is standard in a power- and temperature-constrained device, people are going to look toward optimizing for what the hardware has.

              1 vote
            4. babypuncher
              Link Parent

              Apple actually killed 32-bit support in macOS last year with Catalina.

The only valid reason for any platform to keep it around at this point is support for legacy apps. Developers should not be offering 32-bit builds of their software.