I would say the title is clickbait, but he keeps repeating it throughout the video so...
At a couple points he admits that actually "Apple stole access to AI from everyone else".
I kind of think the whole thing is BS, but the gist is:
Apple controls your phone and your apps
You use your phone a lot
By putting AI features in the phone UI, users won't look to other companies for AI functionality
but...
Stealing implies taking something from someone else such that the other party no longer has it. That's not happening here in any capacity. Apple's system seems to run on ChatGPT, and this in no way diminishes OpenAI's ability to offer AI, prevents other companies from using ChatGPT, stops them from creating their own generative or other models, or keeps them from deploying AI features in their own products. Just because Siri has ChatGPT now doesn't mean that I, as a user, won't get value out of apps also leveraging AI for in-app usage.
Terrible title.
Apple is integrating AI into their phones and systems, but is planning to outsource to other models rather than create a new one.
So the same thing that lots of other apps do. Basically what Google/Bing do now.
So not really "stealing from everyone else" so much as just using AI in the standard way that everyone else already does.
They're also going to be adding some hook-in points so app developers can make external calls to AI models. That's probably pretty useful, I don't know. I just wanted to point out how awful I think the title is.
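If those hook-in points do ship, I'd guess the developer-facing side ends up looking something like the sketch below. To be clear, this is invented for illustration; `AssistantModelSession` and `NoteSummarizer` are made-up names, not anything Apple has actually announced:

```swift
import Foundation

// Hypothetical sketch only: none of these types exist in Apple's SDKs.
// "AssistantModelSession" stands in for whatever hook Apple actually ships.
protocol AssistantModelSession {
    /// Sends a prompt to the system-provided model and returns its reply.
    func reply(to prompt: String) async throws -> String
}

/// Toy in-app feature that leans on the system model instead of the app
/// bundling or calling a third-party AI service itself.
struct NoteSummarizer {
    let session: any AssistantModelSession

    func summarize(_ note: String) async throws -> String {
        // The app stays model-agnostic; the OS decides what actually
        // services the request (on-device model, Apple's servers, etc.).
        try await session.reply(to: "Summarize this note in two sentences:\n\(note)")
    }
}
```

The point being that app code like this never has to care which model answers, which is presumably why developers would bother with the hook at all.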
I don't think that's an accurate representation of Apple's "AI" integration. 90% of it is internal, Apple-developed models. Like 50% of the features are on-device models, which is fairly unique to Apple at the moment.
The only thing that kicks to an external partner and model (OpenAI) is that Siri can, as a last resort, make requests to ChatGPT, and it comes with one of those scary "Your data is being sent to OpenAI" messages EVERY TIME it does this (à la the "you are leaving Steam for an external webpage" message).
It is fundamentally very different from how the other companies are doing "AI" integration.
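To make the contrast concrete, the flow as described boils down to something like this. Everything here is invented Swift pseudocode, not Apple's actual implementation:

```swift
// Illustrative only: every type and function here is invented to mirror
// the flow described above, not Apple's real code or API.
struct AppleModels {
    // Stand-in for Apple's own (mostly on-device) models.
    // Returns nil when they can't produce a good answer.
    func handle(_ request: String) async -> String? { nil }
}

struct ChatGPTClient {
    func respond(to request: String) async -> String { "answer from ChatGPT" }
}

// Stand-in for the "Your data is being sent to OpenAI" dialog,
// which is shown every single time, not once.
func userConfirmsSendingToOpenAI(_ request: String) async -> Bool { false }

func handleSiriRequest(_ request: String) async -> String {
    let apple = AppleModels()
    // 1. The overwhelming majority of requests: Apple's own models.
    if let answer = await apple.handle(request) {
        return answer
    }
    // 2. Last resort: ChatGPT, and only after explicit per-request consent.
    guard await userConfirmsSendingToOpenAI(request) else {
        return "Okay, I won't send that to ChatGPT."
    }
    return await ChatGPTClient().respond(to: request)
}
```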
At the risk of being uncharitable, it feels like a lot of people immediately wrote off Apple's LLM stuff as yet another ChatGPT wrapper the second they saw the word "OpenAI" when skimming.
I don't know if it's accurate to reality; it was more a summary of what it sounded like the video was saying.
Would be interested in getting other people's takes.
My initial thought is that all Google has to do to compete is integrate Gemini into their voice assistant, but maybe there's some nuance that could prevent them from doing so.
They're trying, but it's a solid step backwards. Gemini's half-baked in obnoxious ways.
Assistant plays songs on your app of choice; Gemini only supports YouTube Music. There's a variety of voices for Assistant, but Gemini has just one voice. Assistant's capable of asking follow-up questions or asking for confirmation and automatically firing your mic back up to listen for your response, but if your request for Gemini isn't one-shot, you'll have to pick up and fiddle with your phone.
If you do anything more advanced than "remind me to X in Y minutes," Gemini falls flat as an assistant. And even when something is within its limited capabilities, sometimes it gets the idea that it's an impossible task and you cannot convince it otherwise.
So glad I've managed to shut down the requests to add Gemini so far. I really hate the insistence on shoving unfinished and unwanted products at me.
Gemini becomes the assistant: it replaces the original if you install it on your phone. For now, it's just an option.
One thing I haven't seen mentioned here or really anywhere is how Apple suggested the flow would work. Users ask Siri a question; if the answer is not good or Siri doesn't know, ChatGPT or whatever other third party will be suggested. Apple will get all the data on what kinds of questions are being asked and falling through to ChatGPT. They will also get all the data on the answers that users are receiving and how they work with it.
Except that's not how it works...
https://www.apple.com/apple-intelligence/
Processing happens on-device; on the off chance it has to leave the device, it goes to third-party-verified private compute nodes that do not retain the data. If you leverage ChatGPT (which is optional), they may get some of your data, but they're clear about when that will happen and explicitly warn you.
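In rough pseudocode, the privacy model they describe looks like this; all of the types are invented for illustration and aren't Apple's real implementation:

```swift
// Invented sketch of the routing described above, not Apple's real code.
struct Request {
    let fitsOnDevice: Bool          // most requests: handled entirely on the phone
    let wantsChatGPT: Bool          // Siri thinks ChatGPT would answer this better
    let chatGPTEnabled: Bool        // the OpenAI integration is opt-in
    let userConfirmedWarning: Bool  // the explicit "sent to OpenAI" prompt
}

enum Destination {
    case onDevice            // default: data never leaves the phone
    case privateCloudCompute // Apple-run, auditable nodes that don't retain data
    case chatGPT             // only reached with opt-in plus confirmation
}

func route(_ request: Request) -> Destination? {
    // 1. Default: on-device models.
    if request.fitsOnDevice { return .onDevice }
    // 2. If it has to leave the device, it goes to Private Cloud Compute,
    //    not to a third party.
    if !request.wantsChatGPT { return .privateCloudCompute }
    // 3. ChatGPT only with the feature enabled AND a per-request confirmation;
    //    otherwise nothing is sent to a third party (nil).
    guard request.chatGPTEnabled, request.userConfirmedWarning else { return nil }
    return .chatGPT
}
```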
I think you fundamentally misunderstand how it works - I’d rewatch the videos revealing it, as they go into detail.