Apple changes the meaning of AI

June 10, 2024

That's quite an announcement: Apple is claiming the acronym "AI" for itself, as Apple Intelligence instead of Artificial Intelligence. And then a bold claim: "AI for the rest of us." We're used to that from Apple. But we are also used to companies showing fantastic examples in their product presentations while the reality turns out to be completely different; Google, in particular, has proven this several times. That's why we don't like to write about products or features that aren't available yet. But of course, we can't ignore Apple.

What stuck with us the most from the presentation:

  1. The move from the app to the use case: I have something in mind that I want to get done, and I tell my iPhone; it does the rest. That is much closer to the ideal way of using a smart device than the chatbots and other tools we have today. It is the intelligent agent that other vendors have described, but no one has fleshed it out the way Apple has.
  2. How privacy is sold: Apple addresses the issue on its own, not only when asked. Everything should be processed locally; where that is not possible, requests go to an Apple cloud, where they are processed but not stored. No one should have access to this cloud, and external parties should be able to verify that. It sounds like a coherent concept. We can only speculate about backdoors at this point. Maybe this is naive, but we should allow Apple to convince us that they are serious. There is one weakness, though: since Apple's own language models are not yet mature enough, they have brought OpenAI on board. With explicit prior consent, users can communicate with its language model. The usual rules are supposed to apply to this communication, although it is currently unclear how this will work in practice. Once again, OpenAI's CTO did not cut a good figure when asked how this user data would be handled. With ChatGPT, it is relatively easy and no longer a disadvantage to …

The Announcements

Apple is building a variety of AI capabilities into iOS, iPadOS, and macOS to provide a more personalized experience. Apple Intelligence can understand spoken language, take minutes, create images, and draw on personal context. As with other vendors, text processing is a core component. It will be available in applications such as Mail, Notes, Safari, Pages, and Keynote, as well as in third-party applications. Users will be able to compose text, change tone and style, and correct spelling. It will also be possible to generate images on-device and create custom emoji.

Siri is becoming what we all hoped it would be. It will be able to process natural language better and retain the context of an interaction, for example when users correct themselves. In the future, Siri will also take the device's context into account: similar to Microsoft's Recall, it should be able to "look" at the screen and bring that information into the conversation. Siri and other apps will also be able to use OpenAI's language model, but only with the user's explicit consent.

Apple Intelligence will be available this fall, but only in English, and may initially be limited to the United States. Those who want to use Apple Intelligence in German will have to wait until 2025. Since it is integrated into the operating system, an iPhone 15 Pro or an iPad or Mac with an M-series chip is required.
