Apple Intelligence doesn't work the way I want it to
Recently I updated my MacBook and it started showing alerts about Apple Intelligence. I'd heard a little marketing about it, but I hadn't really spent any time figuring out whether it was just hype. Well, I've tried it a few times now and I'm completely underwhelmed.
One of the marketed features is that Siri is much improved. That would be nice, I thought, because there are only a few use cases, like "Set an alarm," where Siri could ever do anything besides a Google search.
So there are two times recently that I tried to use this improved Siri to solve a problem. My background with AI: I use Copilot at work. I get mixed results from it, but it does use my local context (open files, etc.) and is able to ask follow-up questions if my prompt is too vague.
First Use Case: I want to solve a technical problem on my laptop
- My Prompt: "Can you help me fix Discord so that audio is shared when I share a video stream"
- My Expectation: Maybe an AI summary of the cause of the issue. Maybe open up system settings or open up Discord or give an explanation of why this is a technical problem on Macs.
- Actual Siri Response: Does an internet search and shows some links. Essentially it just did a Google search, which I could have done by typing the same prompt into a browser.
Second Use Case: I want help finding a file on my laptop
In this case, I made a summary of my finances on my laptop a few months ago. I can't remember what I named the file or what kind of file it was. Maybe a spreadsheet? I know it was on my local computer.
- My 1st Prompt: Can you help me find a specific file on my computer
- My Expectation: Maybe some follow-up questions asking me for a date range or something that's inside the file. Yes, I know I can do this in Finder, but I want Apple Intelligence to save me a few minutes.
- Siri: Shows the results of a web search on how to find files on a computer. The first few results are for Microsoft Windows.
- 2nd Prompt: Can you help me find a specific file on my mac
- Siri: Tells me to press Command-Space and use Spotlight search
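For what it's worth, the kind of narrowed-down search I was hoping Siri would run isn't hard to express in code. Here's a rough Python sketch of a "find my finances file" search by keyword, file type, and date window. The keyword list and suffix filter are my own assumptions, not anything Apple ships:

```python
from pathlib import Path
from datetime import datetime, timedelta

def find_candidates(root, keywords, newer_than_days=None,
                    suffixes=(".xlsx", ".numbers", ".csv", ".pdf")):
    """Walk `root` and return files whose name contains any keyword,
    optionally restricted to files modified in the last N days."""
    cutoff = None
    if newer_than_days is not None:
        cutoff = datetime.now() - timedelta(days=newer_than_days)
    hits = []
    for path in Path(root).rglob("*"):
        if not path.is_file() or path.suffix.lower() not in suffixes:
            continue
        if not any(k.lower() in path.name.lower() for k in keywords):
            continue
        if cutoff and datetime.fromtimestamp(path.stat().st_mtime) < cutoff:
            continue
        hits.append(path)
    # Most recently modified first, since recent files are the likely target
    return sorted(hits, key=lambda p: p.stat().st_mtime, reverse=True)
```

On a Mac you can also query the Spotlight index directly from Terminal with `mdfind`, e.g. `mdfind -onlyin ~ finances`, which is roughly what Command-Space does under the hood.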
In both cases, Siri just acted like a shortcut to a Google search. It didn't even recognize that I was asking the question on a Mac. This is the same as Siri has always been. I assume it can still figure out how to set a timer and do a few other things, but it doesn't seem to be working the way I would expect an AI to work at all.
Just as an aside, Siri used to be much better. It could do more and answer more questions without resorting to the web, and it had a connection to the Wolfram Alpha API. I assume the loss of Wolfram Alpha was some kind of licensing thing, but I could never understand why or how Siri lost so much of its basic functionality, especially compared to Google and Alexa.
Even its basic functionality seems to be somehow getting worse over time. I don't understand how it's somehow taking longer and failing more often when I'm just trying to set an alarm or play a song.
They've really bungled the launch of Apple Intelligence. They still haven't updated Siri beyond giving it the ability to pass your request to ChatGPT instead of searching Google, and even that is opt-in.
It's an absolute throw if you ask me. The big change to Siri with Apple Intelligence is on-screen intelligence. Rolling out the new UI and whatnot without on-screen intelligence was a total misplay.
I work in this space, but on totally different tech.
Siri should get much better, but it may never feel like the magic that is ChatGPT etc.
ChatGPT trained off huge amounts of text-based data, and when you apply huge amounts of processing power to huge amounts of data, the results are magical.
Apple takes privacy seriously, so I doubt they will ever train off huge amounts of your data.
In fact, Apple went in a totally different direction: instead of one giant brain that burned a lot of trees, they created two mini brains, each about a hundredth the size of OpenAI's and hyper-focused on Apple's use cases. One is for Siri; the other is for Writing Tools, summaries, etc.
Apple should eventually be able to make Siri work more seamlessly with the devices, using things called agents and tools.
I doubt Apple will ever allow you to tell your device to do absolutely anything: firstly because of Apple's philosophy of only doing a few things really, really well, and secondly because the current state of the tech, agentic workflows, are built by humans and as a result also only do a few things somewhat well.
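To make the "agents and tools" idea concrete, here's a toy sketch. In a real system the language model decides which tool fits the request; here a crude keyword match stands in for that decision, and all of the tool names are made up for illustration:

```python
# Hypothetical sketch: the assistant routes a request to a small registry
# of device-local tools instead of falling back to a web search.
def set_alarm(time):
    return f"Alarm set for {time}"

def search_files(term):
    return f"Searching local files for '{term}'"

def web_search(query):
    return f"Web search for '{query}'"

TOOLS = {
    "alarm": lambda req: set_alarm(req.split()[-1]),
    "file": lambda req: search_files(req),
}

def route(request):
    """A real agent would use the model to pick a tool and its arguments;
    keyword matching here just stands in for that step."""
    for keyword, tool in TOOLS.items():
        if keyword in request.lower():
            return tool(request)
    return web_search(request)  # the fallback, which is all old Siri did
```

The point of the pattern is the registry: each tool only does one thing somewhat well, and the quality of the whole assistant depends on how many tools someone sat down and wrote.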
OK, but they seem to be advertising that this is advanced, runs on your own device, and needs a fairly new and powerful device with an M2 chip or whatever to work properly. And the kind of thing I want to do on my device is control the device itself. If I ask it to help me find a file, why is it going to the internet to find out about it? My device has files and a configuration on it, and it would be cool if there were an AI on it that did the boring work of poking around in the system apps or config files. The people who programmed macOS and created the hardware work at Apple, and they have access to the documentation in Cupertino. They don't have to scour the complete writings of man to help me look up a file on my hard drive.
A similar thing bothered me when Microsoft put "Cortana" on Windows 10. I would type a search for something (a program or a file) and it would start searching the internet. I kept turning off internet search but updates kept turning it back on. If I want to search the internet I'll use a damn browser.
Summaries would actually be pretty useful for email when scanning through a list of them. Maybe you have 100 emails from someone but you can't remember which one has the info you're looking for. A good summary could jog your memory and get you to pick one out of a hundred rows more quickly.
But for text messages I don't see the value.
I’ve come to find aspects of it to be really useful. I mean, Image Playground is pretty bad, but I use the writing tools a lot. They’re great for taking a bland phrase and making it more fun while texting a friend, or when I’m too tired to revise and shorten a message/email. And it’s been helpful for quick proofreads of passages (though I wish changes were highlighted). I’ve also had good success editing out small imperfections or parts of photos. So I think the on-device features are working well. Passing items through Siri to ChatGPT just feels like using ChatGPT, which is honestly mediocre for what I want to do. I’d much rather see a Claude plugin for conversational and contextual work.
Personally I’m offended when it tries to suggest responses to someone when I’m texting them. I would be upset if I knew a friend or someone I’m dating was using an AI to message me.
I do use the responses sometimes. Sometimes all I want to say is "Yep!" and it's there for me to click a single button and have it sent. Sometimes my wife has sent me the twelfth Instagram clip of a pet doing something cute, and I click the "So cute!"/validate-her-sharing button. I feel a little bad about it, so I often go with the cat-heart-eyes or dog-heart-eyes emoji instead so that I'm not faking a written response. But a lot of the emoji and short phrases we use are just there to convey a concept, and having a single button to convey that is nice.
But if I heard that the witty quip that someone said to me was actually AI generated, I would feel bothered. So much of banter and other small talk is a way to get a better feel for and make a connection with another person. If half of what I'm getting is AI, we're halfway to a parasocial relationship that I couldn't know was developing.
The small responses are less bad. But I’m even upset when my phone rewrites “OMW” to “On my way!”. It’s not a good user agent if it takes away my control. To me writing an acronym is not the same as writing out the words.
Edit: I have removed the default expansion for OMW
My guess is you already know this, but just in case: go to Settings > General > Keyboard > Text Replacement. You can add or remove anything you want. I have a bunch for things like ‽, AT&T, or TL;DR. You can also add a phrase without a shortcut to prevent the keyboard from changing it when you type it. For example, I have sql, webp, zigbee, and others. Those are all very annoying to type out without those “replacements” set.
Oh, yeah. I have no rewrites. Absolutely not. Response prompts based on my previous behavior? Sure. Changing my written voice? No thanks.
I don’t see the harm. It’s admittedly rare that I would ever send a phrase like “Sure, sounds great!” over text. But for the times I would, I don’t see how it makes a difference whether I wrote it or a robot wrote it. If the meaning, tone, intent, and words sound like something I’d say, then I’m unbothered by it. Some recommendations are better than what I’d come up with anyway, because I’m tired or can’t get the words out and forget to respond.
One of the apparent purposes of Apple Intelligence is to give a reason to buy a new more powerful phone or computer. If it is just passing LLM prompts to an online service like ChatGPT then it could have been done on hardware from 15 years ago.
Yeah definitely agreed. I find the on-device models to be nice, but I can’t say I’d recommend someone upgrade their hardware just for it.
Probably not what you're asking, but I've solved many problems like your first scenario with ChatGPT and the Google AI in the search bar. I don't have any experience with the Apple version, but those seem like simple questions to solve.