39 votes

Elon Musk threatens to ban iPhones and MacBooks at his companies after Apple announces OpenAI partnership

29 comments

  1. [19]
    DeaconBlue
    Link

    Apple has no clue what's actually going on once they hand your data over to OpenAI

    One of the only things he's said that I agree with him on. Of course, the same can be said for every other application you have installed (including Twitter).

    40 votes
    1. [5]
      babypuncher
      Link Parent

      Which is probably why Apple's AI features only use ChatGPT as a last resort and require user confirmation every time it happens.

      I have a feeling Musk cares about none of this however, and is just mad that Apple is working with one of his competitors.

      58 votes
      1. [4]
        elight
        Link Parent

        And it's this guy's ego that assuredly keeps my car from working nearly as well as it could with my phone.

Personally, if I'd known how much of a megalomaniacal misogynistic ass he was, we'd have held out for another electric vehicle.

        25 votes
        1. [2]
          pyeri
          Link Parent

          I think Elon Musk is trying to emulate the "Iron Man" Tony Stark and his Expo!

          5 votes
          1. babypuncher
            Link Parent

He just doesn't realize that a big part of what made Tony Stark "Iron Man" and not just another insufferable billionaire asshole was being trapped in the desert for a few months and having a little "come to Jesus" moment.

            17 votes
        2. dpkonofa
          Link Parent

          My biggest complaint with our 2017 Tesla is that I can’t get CarPlay on it. I cannot stand the interface on it. The sound system is the best I’ve ever had in any car but I can’t stand the UI.

          3 votes
    2. [11]
      stu2b50
      Link Parent

      It's only kinda true. Apple doesn't technically have any idea what's going on, but they do have contracts, and a contract with Apple is not a contract you breach lightly. This is the same with any cloud service; if you use GSuite, you have no idea what Google is doing with your data, but you do have a business contract with them, and you do know you can sue them for a bajillion dollars if you find they're doing something nefarious with it.

      25 votes
      1. [9]
        Eji1700
        Link Parent

        and a contract with Apple is not a contract you breach lightly.

        I wouldn't bet a ton on that right now.

Everyone's aware that AI is a BIG thing, and they'll risk whatever lawsuit from whoever to make sure they're relevant players if they can. This is in part due to what they see as a massive potential market to break into, but perhaps more importantly, because there's not going to be a lot of relevant case law.

        The same obfuscating nonsense that has allowed tech companies to throw phone book sized terms and conditions at users for years is now, once again, playing for and against them in the courts. There's legit unsettled questions that will have to be interpreted, and there's a lot of "probably settled but enjoy spending the 4 years fighting about it while we set up" stuff as well.

And that's with EVERY company's lawyers probably responding to every complaint from various levels of the company with something akin to "hmmm yes... it's an interesting case... we'll have to look into it."

        14 votes
        1. [8]
          sparksbet
          Link Parent

          This is in part due to what they see as a massive potential market to break into, but perhaps more importantly, because there's not going to be a lot of relevant case law.

There's not a lot of relevant case law on the copyright side of things, but there is absolutely PLENTY of case law on breach of contract and misuse of customer data. Playing fast and loose with copyright law that hasn't been settled in court is one thing. Playing fast and loose with the terms of a contract you have with another company -- particularly one like Apple, which has more money than you to spend on a lawsuit? That's a completely different ball game, and it's absolutely not one with the same types of unsettled questions that the various copyright issues have.

          17 votes
          1. [7]
            Eji1700
            Link Parent

            there is absolutely PLENTY of caselaw on breach of contract and misuse of customer data.

            Not in relation to AI training models.

If they say "don't use our data for a model" and then you don't, but you do sell it to a company (which was allowed) who provides data for training models, which you use, are you in violation? There's thousands of questions like that, and I think people overestimate how on the cutting edge even top legal teams are. Sure, the Disney team literally sets copyright law, but that's because they've basically got THE team for it, who studies and is aware of nothing else.

That team doesn't exist for AI because it hasn't been around long enough. Frankly, Musk is the ONLY idiot outside of Trump who I think could lose a case like this quickly and comprehensively. I will not be shocked if tons is settled out of court between all the major players over the coming years.

            2 votes
            1. sparksbet
              Link Parent

This absolutely is not new, legally, even in relation to AI training models. I work in this space as a data scientist and dealing with "am I allowed to use this data for training" isn't new (and even when you're allowed to train on customer data, there are usually further agreements that affect how you store their data and who has access). "Am I allowed to give this data to third parties and for what purpose" is even less new, and will absolutely be spelled out in a contract -- especially a contract with a company like OpenAI whose whole shtick is doing AI stuff with your data. The company I work for has a contract with OpenAI (and other providers like them) and one of the principal things we're paying for is for them to not train on our inputs. Our customers are financial institutions, so we'd be ruined if they found out we were giving OpenAI their data to train on. We would absolutely have grounds to sue OpenAI if they broke our contract and trained on our input data anyway, and it would not be a novel legal question.

              "Do what you said you'd do in the contract and don't do things you promise not to do in the contract" is pretty basic contract law, and AI barely affects the actual legal issue there. It's possible there are contracts out there between businesses that contain language that allows for some loopholes around AI training, but any company with competent lawyers will have already updated those to cover their asses. A company like Apple, that definitely has a very competent, well-paid in-house legal team, making a contract with an AI-focused company like OpenAI? They absolutely have language in the contract covering what OpenAI is allowed to do with the data.

              9 votes
            2. [5]
              stu2b50
              Link Parent

              If they say "don't use our data for a model" and then you don't, but you do sell it to a company(which was allowed) who provides data for training models, which you use, are you in violation?

              That has nothing to do with AI. I'm sure Apple knows how to craft a contract. There is nothing about the deal Apple has with OpenAI that has any relevance to AI as a technology. It simply would need to state that OpenAI does not store nor transfer any user data from this engagement.

              These are very common; any store you go to that accepts credit card payments has one with Visa/MC.

There are no untested legal questions about having user data agreements.

              5 votes
              1. [4]
                Eji1700
                Link Parent

                That has nothing to do with AI.

                Not sure how you're arriving at that conclusion given this is 100% what they want the data for.

                There is nothing about the deal Apple has with OpenAI that has any relevance to AI as a technology.

Also not sure how you're arriving at that conclusion either. At the end of the day, modern LLMs are all about data and models. ANY kind of data, even indirect data, is valuable and related. Just the data of "here's the kind of users who are using our integration, and here's how they're using it" is 100% useful and related, and that's before you get into any larger data that may or may not be acceptable for them to get their hands on.

                It simply would need to state that OpenAI does not store nor transfer any user data from this engagement.

I'm willing to bet a decent amount the TOS on this, despite being a War and Peace-length novel, is not going to be that cut and dried. Their model can't work if you can't transfer any user data. The mere act of interacting with the LLM is a transfer of user data. You can say "don't store it", but this gets back to "well, we didn't store anything, we did however run data analysis on it to figure out the profiles of everyone who's using it", much like what Facebook has been doing forever.

                These are very common; any store you go to that accepts credit card payments has one with Visa/MC.
                There is no untested legal questions about having user data agreements.

                Probably not the best example: https://news.bloomberglaw.com/litigation/consumer-advances-info-broker-suit-against-mastercards-finicity

                1. [3]
                  stu2b50
                  Link Parent

My point is that the "AI" part of it has nothing to do with the agreements. An agreement between company X and company Y where company X trains models on company Y's data is related to "AI", because the details there matter for how company Y's rights to their data work out.

                  A simple "hands off our user data" agreement has nothing to do with it.

At the end of the day, modern LLMs are all about data and models. ANY kind of data, even indirect data, is valuable and related. Just the data of "here's the kind of users who are using our integration, and here's how they're using it" is 100% useful and related, and that's before you get into any larger data that may or may not be acceptable for them to get their hands on.

This is like a complete non sequitur. Data is useful in many contexts; again, it does not change anything about the very common B2B contracts where one company requires that the other company stay hands off with their data. How do you think corporate GSuite deals work? Interacting with an LLM requires no more transfer of data than any other piece of cloud software.

                  but this gets back to "well we didn't store anything, we did however run data analysis on it to figure out the profiles of everyone who's using it" much like what facebook has been doing forever.

                  Yeah, so a common thing that all B2B corporate software contracts have covered since the dawn of server-based enterprise software that has nothing to do with LLMs?

                  Probably not the best example:

This is again a non sequitur. What I'm talking about is PCI requirements on CVCs, which Visa and MC very stringently require merchants never to store.

                  5 votes
                  1. [2]
                    Eji1700
                    Link Parent

                    A simple "hands off our user data" agreement has nothing to do with it.

To summarize: I'm disagreeing because the definition of "was this hands off" is absolutely going to be litigated to death in the coming years because of AI. I think you're vastly oversimplifying the issue as well, but even at that level it's not cut and dried.

This is like a complete non sequitur.

No, it's not, because if you say "hands off my data" and then I use the metadata and analytics to improve my LLM, was that hands off? By normal definitions right now, it is. This is common practice on the data side of many an agreement, but the effect it has with LLMs is going to be argued all over the place. Including corporate GSuite deals.

                    Yeah, so a common thing that all B2B corporate software contracts have covered since the dawn of server-based enterprise software that has nothing to do with LLMs?

Except, again, LLMs absolutely change the equation on how this will be viewed, and given the original argument was "no one was going to risk breaching this contract", there are plenty of examples of companies doing exactly that, which brings me to:

This is again a non sequitur. What I'm talking about is PCI requirements on CVCs, which Visa and MC very stringently require merchants never to store.

Yes, that thing that MC was storing and selling illegally instead, and saw a risk worth taking in doing so.

                    1. stu2b50
                      Link Parent

                      By normal definitions right now, it is.

                      How? What is the difference with LLMs? There is none, this is a type of corporate contract. The LLM part makes literally no difference. LLMs are not novel in any way from this type of B2B data contract. What is the novel part?

                      Yes that thing that MC was storing and selling illegally instead, and saw a risk worth taking in doing so.

                      Because there is an imbalance of power, and it was a grey area. In the other direction, where an individual merchant is both much smaller and dependent on MC, you do not see PCI violations very often, at least intentional ones from legitimate merchants.

In this agreement, Apple is the $3T company with the liquidity that OpenAI needs.

                      6 votes
      2. dpkonofa
        Link Parent

        I don’t even know if that’s true. Apple bought an AI company a few years ago and I’m pretty sure they’re working on the private cloud servers exclusively.

    3. JCAPER
      Link Parent

      Important context:

• these devices will have a hybrid system: some commands run locally, some on the cloud. If Siri needs to call out to the cloud, it will ask the user if they want to do it, every time it happens;

• ChatGPT is the first supported service, but they said that they will support other services (I presume the competing services from Google, Anthropic, Meta, etc.)
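That hybrid flow can be sketched in a few lines. This is purely illustrative: the intent heuristics, function names, and prompt wording are my assumptions, not Apple's actual routing logic; the only point it demonstrates is "handle what you can on-device, and require explicit per-request confirmation before anything goes to the cloud":

```python
# Toy sketch of a hybrid assistant router: simple requests stay on-device,
# anything else requires explicit user confirmation before leaving for the
# cloud. Heuristics and names are invented for illustration only.

def can_handle_locally(request: str) -> bool:
    # Toy heuristic: short, known-intent commands are handled on-device.
    local_intents = ("set timer", "open app", "play music")
    return any(request.lower().startswith(i) for i in local_intents)

def route(request: str, confirm) -> str:
    """confirm is a callable standing in for the per-request UI prompt."""
    if can_handle_locally(request):
        return f"(on-device) handled: {request}"
    if confirm(request):
        return f"(cloud) forwarded: {request}"
    return "(declined) request not sent to the cloud"
```

The key design point the commenters are describing is that the confirmation step runs on every cloud-bound request, not once per session.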

      19 votes
    4. pyeri
      Link Parent

When your personal data crosses institutional boundaries, all bets are off, as the danger of data theft and misuse increases manyfold. Not to mention, this very phenomenon (integration of data from multiple sources) is what sustains the power of big tech oligarchies. For this same reason, even Zuckerberg can't integrate his WhatsApp user data with that of Facebook, as there are strict privacy laws like GDPR which prevent that (except in Asian emerging economies where legislation is poor and Zuck no doubt manages to steal data there).

      Of course, the same can be said for every other application you have installed (including Twitter).

On this point, a strong distinction needs to be made between a user's public data sharing (sites like Twitter, Facebook, etc., which the user knows will be public and/or contained in the browser sandbox) and private data sharing (operating-system-level personal stuff that iOS/Android has access to). A breach of the latter kind of data is far more devastating for a user than the former.

      6 votes
  2. [5]
    blindmikey
    Link

Can someone please replace this crybaby of a man? Apple explicitly stated it would ask your permission before reaching out to cloud services for a query. I don't like Apple products and I used to like Elon; threatening to ban your employees from using Apple devices at work is just a new level of petty.

    20 votes
    1. [4]
      Deely
      Link Parent

      Apple explicitly stated it would ask your permission before reaching out to cloud services for a query.

From reading HN comments it's a bit different: they will ask you before reaching OpenAI's clouds, but reaching Apple's clouds is fine. Correct me if I'm wrong.

      4 votes
      1. [3]
        pumpkin-eater
        Link Parent

        That's right, although they have some interesting and verifiable approaches to let the queries that run in Apple's cloud be a private extension of your devices

        4 votes
        1. [2]
          jackson
          Link Parent

          Agreed, it's pretty interesting. Here's the preliminary writeup, but there's something more detailed coming when it enters beta: https://security.apple.com/blog/private-cloud-compute/

          5 votes
          1. BitsMcBytes
            Link Parent

            Our commitment to verifiable transparency includes:
• Publishing the measurements of all code running on PCC in an append-only and cryptographically tamper-proof transparency log.
• Making the log and associated binary software images publicly available for inspection and validation by privacy and security experts.
• Publishing and maintaining an official set of tools for researchers analyzing PCC node software.
• Rewarding important research findings through the Apple Security Bounty program.

            So Apple basically launched a blockchain to verify PCC events.
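For anyone wondering what "append-only and cryptographically tamper-proof" means in practice, here's a minimal hash-chained log in the same spirit. To be clear, this is an illustrative toy (Apple's actual log format and verification tooling are not public in this form); it just shows why such a chain makes silent rewrites of history detectable:

```python
import hashlib

# Toy append-only transparency log using hash chaining. Each entry's hash
# covers the previous head, so altering any earlier entry invalidates
# every later hash. Illustration only; not Apple's code or format.

class TransparencyLog:
    GENESIS = b"\x00" * 32

    def __init__(self):
        self.entries = []        # list of (payload, entry_hash) pairs
        self.head = self.GENESIS

    def append(self, payload: bytes) -> str:
        h = hashlib.sha256(self.head + payload).digest()
        self.entries.append((payload, h))
        self.head = h
        return h.hex()

    def verify(self) -> bool:
        # Recompute the chain from genesis; any edit breaks the match.
        head = self.GENESIS
        for payload, h in self.entries:
            if hashlib.sha256(head + payload).digest() != h:
                return False
            head = h
        return head == self.head
```

This is the same basic construction as Certificate Transparency logs (and, yes, blockchains): appending is cheap, but rewriting an old entry without detection requires recomputing and republishing every hash after it.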

            1 vote
  3. Cldfire
    Link

    He's clearly only saying this because he has a grudge against OpenAI. Incredibly poor take.

    9 votes
  4. [3]
    gil
    Link

I don't like Musk at all, but I was disappointed to see Apple integrating with ChatGPT. They always promote privacy as one of their core values and as a reason to use their services instead of competitors'. Even if it's optional, I see it a bit like your doctor giving out discount coupons for cigarettes.

    9 votes
    1. [2]
      Tirion
      Link Parent

      They wanted the best possible alternative LLM on their devices. I don’t think the deal is exclusive though – other alternative models are coming too.

      6 votes
      1. gil
        Link Parent

Yeah, I get it. I think it's a good business decision; OpenAI is far ahead of the others AFAIK. But it's just not aligned with the story they tell us.

        2 votes