Apple Intelligence, the iPhone maker’s new suite of AI capabilities coming to iOS 18, lays the foundation for a new way to use apps.
Today, the old app store model is under constant regulatory attack. Meanwhile, users can accomplish many tasks simply by putting questions to an AI assistant like ChatGPT. Proponents believe AI could become the preferred way to find answers, get work done, and explore creativity.
Where does this leave the world of apps, and the growing services revenue (more than $6 billion in the last quarter) they produce for Apple?
The answer lies at the heart of Apple’s AI strategy.
Apple Intelligence itself offers a small set of ready-to-use capabilities, such as writing aids, summarization tools, generative art, and other basic features.
But at its Worldwide Developers Conference (WWDC) in June, Apple introduced new features that will allow developers’ apps to connect more deeply with both Siri and Apple Intelligence.
The improvements to the assistant will allow Siri to call up any item from an app’s menus without requiring additional work on the developer’s part. This means users can ask Siri to “show me my presenter notes” in a slideshow app, for example, and Siri will know what to do. Siri will also be able to access any text displayed on the screen, allowing users to point to what they’re looking at and act on it.
So, if you’re looking at your reminder to wish a family member a “Happy Birthday,” you can say something like “FaceTime them” and Siri will know what action to take.
This is already an upgrade to the core functionality Siri offers today, but it doesn’t end there. Apple is also providing developers with tools to use Apple Intelligence in their own apps. At WWDC, the company noted that Apple Intelligence will first be available for specific categories of apps, including books, browsers, cameras, document readers, file managers, magazines, mail, photos, presentations, spreadsheets, whiteboards, and word processors. Over time, Apple will likely open these capabilities to all developers via the App Store.
The AI functionality will be built on the App Intents framework, which is being expanded with new intents for developers. The ultimate goal is to allow users to interact with Siri not just to open their apps, but also to use them.
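For developers, adopting one of these intents is mostly a matter of describing an action in code. As a rough sketch of the shape this takes, with hypothetical names rather than Apple’s own sample code, a slideshow app might expose its presenter-notes view like so:

```swift
import AppIntents

// Hypothetical sketch: a slideshow app exposing one feature to Siri
// via the App Intents framework. All names here are illustrative.
struct ShowPresenterNotesIntent: AppIntent {
    // The title the system displays for this action.
    static var title: LocalizedStringResource = "Show Presenter Notes"

    // Bring the app to the foreground when the intent runs,
    // since this action navigates to a screen.
    static var openAppWhenRun: Bool = true

    // Called when the user asks Siri for the feature.
    func perform() async throws -> some IntentResult {
        // The app's own navigation to the presenter-notes view
        // would happen here.
        return .result()
    }
}
```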
This means users won’t have to dig through an app’s menus to find the feature they need to perform a task; they can simply ask Siri. They can also phrase these requests naturally and conversationally, and can refer to things tied to their personal context.
For example, you could ask a photo editing app like Darkroom to “apply a cinematic filter to the photo I took of Ian yesterday.” The current version of Siri would reject that type of request, but AI-powered Siri would instead figure out how to invoke the app’s filter intent, as well as which photo you’re asking it to use.
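As a sketch of how an app might expose such an action, not Darkroom’s actual code, a filter intent could declare the photo and filter as parameters that Siri resolves from the conversation:

```swift
import AppIntents

// Hypothetical sketch of a photo editor's filter intent; this is
// illustrative, not Darkroom's real implementation.
struct ApplyFilterIntent: AppIntent {
    static var title: LocalizedStringResource = "Apply Filter"

    // Siri fills these in from the request: "a cinematic filter"
    // and "the photo I took of Ian yesterday."
    @Parameter(title: "Filter")
    var filterName: String

    @Parameter(title: "Photo")
    var photo: IntentFile

    func perform() async throws -> some IntentResult {
        // The app's editing pipeline would apply the named filter
        // to the supplied image here.
        return .result()
    }
}
```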
Apple said Siri will be able to take action even if you stumble over your words or refer to an earlier part of the conversation in your instructions.
You can also take action across apps. For example, after editing your photo, you can ask Siri to move it to another app, like Notes, without having to tap anything.
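Cross-app chaining like this is possible because an intent can hand a typed result back to the system, which can then pass it along to the next action. A hypothetical sketch, building on the filter example above:

```swift
import AppIntents

// Hypothetical sketch: returning a typed value lets the system pass
// the edited photo along to another app's intent (say, Notes) without
// the user tapping anything.
struct ExportEditedPhotoIntent: AppIntent {
    static var title: LocalizedStringResource = "Export Edited Photo"

    @Parameter(title: "Photo")
    var photo: IntentFile

    func perform() async throws -> some IntentResult & ReturnsValue<IntentFile> {
        return .result(value: photo)
    }
}
```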
Additionally, the iPhone’s search feature, Spotlight, will be able to surface data from apps by incorporating app entities into its index. App entities are Apple Intelligence’s structured understanding of in-app content, things like photos, messages, files, calendar events, and more.
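In code terms, an app entity is a small, typed description of a piece of in-app content, plus a query the system can use to look it up. A rough sketch with hypothetical types, where the IndexedEntity conformance (new in iOS 18) is what opts the content into Spotlight’s index:

```swift
import AppIntents
import CoreSpotlight

// Hypothetical sketch: a document entity that Siri can reference and
// Spotlight can index. IndexedEntity is the iOS 18 addition that
// donates app entities to the Spotlight index.
struct DocumentEntity: AppEntity, IndexedEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Document"
    static var defaultQuery = DocumentQuery()

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

// The query the system uses to resolve entities by identifier.
struct DocumentQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [DocumentEntity] {
        // Fetch matching documents from the app's own store (stubbed here).
        return []
    }
}

// Donating entities to Spotlight might then look like:
// try await CSSearchableIndex.default().indexAppEntities(documents)
```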
This more nuanced use of AI, of course, requires developers to embrace it. Apple has alienated some of its larger developers, and even some independents, over the years with its revenue-sharing rules, which generally allow the company to keep 30% of the revenue from products and services sold through an app. But developers could be lured back as Siri takes apps that were previously buried in the App Library at the end of the home screen and makes them easily accessible with voice commands.
Instead of building onboarding screens to teach users how to navigate and use an app, developers can focus on making sure Siri understands how their app works and how users might phrase the things they want to do in it. That way, users can interact with the app through Siri, by speaking or typing commands, much as they interact with an intelligent chatbot like ChatGPT today.
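In practice, a lot of that “training” comes down to registering the natural-language phrases users might say. A sketch, reusing the hypothetical filter intent from earlier (each phrase must include the app’s name, which the system substitutes via \(.applicationName)):

```swift
import AppIntents

// Hypothetical sketch: telling the system how users might phrase
// requests for the app's features.
struct PhotoAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ApplyFilterIntent(),
            phrases: [
                "Apply a filter in \(.applicationName)",
                "Edit my photo with \(.applicationName)"
            ],
            shortTitle: "Apply Filter",
            systemImageName: "camera.filters"
        )
    }
}
```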
Third-party developers will also gain from other parts of Apple’s new AI architecture.
Thanks to its partnership with OpenAI, Siri will be able to hand off queries to ChatGPT when it doesn’t have the answer. And with the iPhone 16 lineup’s visual search feature, Apple will also let users turn what they see through the camera viewfinder into an actionable query for OpenAI’s chatbot or Google Search, simply by pressing the new Camera Control button on the side of the phone.
These developments won’t feel as immediately revolutionary as the debut of something like ChatGPT, in part because the pace at which developers adopt the technology is likely to vary.
Furthermore, these promised features still seem a long way off. In the latest iOS 18 beta, the functionality feels incomplete. As impressed as I was by what the new Siri can do, I was equally confused by what it can’t, and that includes in Apple’s own apps. For example, you can ask Siri in the Photos app to send someone a photo you’re looking at, but you can’t ask it to do something more complex, like turn the image into a sticker. Until Siri stops running into such roadblocks, the functionality could end up being frustrating to use.