What is Apple Intelligence, and what does it mean for the future of Apple’s products? Apple is no stranger to artificial intelligence. The tech giant has been building machine learning into its products for years, from Siri to biometrics like Face ID.
However, Apple ushered in a new era for its AI strategy at WWDC 2024 in June, then doubled down at its “It’s Glowtime” iPhone event on September 9th. Now, the Cupertino company is diving deeper into cutting-edge AI than ever before (particularly generative AI), with a host of new features for Apple device users.
The new “Apple Intelligence” era brings a range of generative AI tools for proofreading, editing, summarizing content, manipulating images, and so much more. While these new capabilities will be rolling out slowly, they’re sure to have a big impact on the future of Apple devices.
Even better? Apple doesn’t seem to have any plans to charge customers for access to LLM-powered features – beyond the price you’ll pay for your Mac, iPad, or iPhone. Here’s everything you need to know about Apple Intelligence.
What is Apple Intelligence?
Apple Intelligence is the tech giant’s “personal intelligence” system, designed to put the power of multimodal, cross-platform generative models at the heart of your Apple devices. It’s coming to virtually every Apple platform, as well as future Apple devices (like new iPhones).
Apple says its “Intelligence” toolkit will feature everything users need to write, express themselves, edit content, and search the web more efficiently. It also draws on “personal context” – such as the data stored on your phone or iPad. The AI tools will be able to read all of your messages, follow your location and maps, record your phone calls, look at your photos, and check your calendar.
If you find that a little worrying, don’t panic. Apple has said that it’s committed to protecting user privacy with an innovative new approach to security.
Basically, this means you’ll maintain control over most of your data, which is good news for employees using Apple devices. When Apple’s AI tools need to access the cloud, they use “Private Cloud Compute” to keep your data under lock and key.
Overall, Apple Intelligence promises users a powerful, intuitive, and integrated AI solution for their devices, one that adapts to their personal needs while maintaining privacy. The company believes no other company (including Google with Gemini) offers the same combination. However, it’s still early days for this new AI strategy, so we’ll have to wait and see.
What is Apple Intelligence Used For? What Can it Do?
The launch of the “Apple Intelligence” suite marks a new era of investment in generative AI and multimodal solutions for Apple products. It seems likely that the company will continue to build on the functionality of the kit going forward, to stay competitive with brands like Microsoft.
In the meantime, Apple Intelligence promises access to all of the most popular generative AI capabilities directly within your devices. For instance, you’ll get:
Advanced AI Writing Tools
System-wide, Apple Intelligence writing tools will give you access to a copilot on all of your devices to help you create more compelling content. The selection of “Writing Tools”, according to Apple, will help you find the right words for everything from emails and SMS messages to documents.
You can use these tools to summarize lectures or meetings in seconds, get the shorter version of a long group messaging thread, and more. On any Apple device, users will be able to ask AI to proofread and rewrite their text, with various tones and language options. These tools will even work in third-party apps on Apple devices.
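For developers, Apple has indicated that standard system text controls pick up Writing Tools with little extra work. The sketch below is illustrative only: it assumes the writingToolsBehavior modifier previewed in the iOS 18 SDKs, and “DraftEditorView” is a made-up example view, not an Apple API.

```swift
import SwiftUI

// A minimal sketch of how a third-party app could opt into Apple's Writing Tools.
// Standard text views are expected to surface Writing Tools automatically; the
// writingToolsBehavior(_:) modifier (as previewed in the iOS 18 SDKs) is assumed
// here as the way to request the full inline experience.
struct DraftEditorView: View {
    @State private var draft: String = ""

    var body: some View {
        TextEditor(text: $draft)
            // Ask for the complete Writing Tools experience (proofread,
            // rewrite, summarize) rather than a limited panel.
            .writingToolsBehavior(.complete)
            .padding()
    }
}
```

Apps that use fully custom text engines are expected to need more integration work, but for standard controls the system is said to handle the Writing Tools UI itself.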
Plus, you’ll get the option to “prioritize” messages and notifications with AI. Based on context gathered from your phone, Apple’s assistant will push messages and notifications to the top of the stack on your iPhone or Mac, so you know what to respond to first. This feature also works with your Apple email inbox.
You can tap on an icon in an email or messaging thread to see an instant summary of key points and even record content in the Notes app with instant transcription. Plus, Apple is introducing “Smart Reply”, so you can rapidly respond to messages, with context, on the go.
Perhaps the most compelling feature is the new “Focus” mode, enhanced by AI, which understands the content of your notifications and only shows you the ones that really need your attention. I can see this being extremely useful for employees and enterprise users with Apple devices.
Image Playground and Editing Features
As a multimodal AI solution, Apple Intelligence can work with more than just text.
One of the most exciting new capabilities available for users with Apple devices is the new “Image Playground,” which allows users to create original images in seconds based on a description, suggested concepts, or data from their Photos library. You can even adjust the style to match a slide in Keynote, a Freeform board, or a Messages thread.
The Image Playground app, ideal for marketing use cases, also allows users to create new animations, illustrations, or sketches, which they can share with users across various apps and social media channels. There’s a handy “Image Wand” tool too, which allows you to transform rough sketches into related images in the Apple Notes app. Just draw a circle around a sketch, and the Apple AI will get to work, creating a complementary visual.
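Apple has also previewed a developer API so third-party apps can present the same Image Playground experience. The sketch below is a hedged illustration: it assumes the imagePlaygroundSheet(isPresented:concept:onCompletion:) SwiftUI modifier from the ImagePlayground framework, and “PlaygroundButtonView” plus the concept text are made-up examples.

```swift
import SwiftUI
import ImagePlayground

// A rough sketch of surfacing Image Playground from a third-party app,
// assuming the imagePlaygroundSheet modifier Apple has previewed for developers.
struct PlaygroundButtonView: View {
    @State private var showPlayground = false
    @State private var generatedImageURL: URL?

    var body: some View {
        Button("Create an illustration") {
            showPlayground = true
        }
        // Presents the system Image Playground sheet seeded with a text concept;
        // the completion handler returns a file URL for the generated image.
        .imagePlaygroundSheet(
            isPresented: $showPlayground,
            concept: "a friendly robot sketch for a team slide"
        ) { url in
            generatedImageURL = url
        }
    }
}
```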
Additional Visual Features
Other compelling photo and image-related features powered by Apple Intelligence include:
- Genmojis: Forget relying on standard emojis; Apple’s AI tools can create a brand-new emoji from scratch within your keyboard to match any conversation. You can simply give the AI a description of what you want to see, or share a picture of someone from your Photos app, then edit the resulting icon until you’re happy.
- Memory movies: In the Apple Photos app, AI can help you create a memory movie, packed with all the images you want to see. Add a description, and Apple Intelligence will find the videos and photos that match to create an impressive storyline.
- Image and video search: The new search feature in the Photos app for Apple devices will also be able to track down relevant images based on text prompts. It can even find a specific moment in a video clip, and direct you to it instantly.
- Clean up: The new “Clean Up” tool in the Photos app uses AI to identify background objects instantly, so you can remove them with a tap, without editing tools.
On top of all that, Apple devices will also feature a new “Visual Intelligence” tool, which works with the iPhone 16’s Camera Control button. When you click the button on the side of the phone, you can use the multimodal features of Apple Intelligence to get search results quickly. The system will be able to recognize images of objects and rapidly direct you to relevant search pages.
Siri and Apple Intelligence
Unsurprisingly, Siri is also getting a massive upgrade with Apple Intelligence. First, the entire Siri design is being refreshed. Whenever a user taps the Siri icon, the screen will ripple to show you’re interacting with the assistant, and a glowing light wraps around the edge of your display.
With a double tap at the bottom of your screen, you can type messages to Siri from anywhere in the system when you don’t want to talk to your assistant out loud. Siri can search for content for you, complete tasks, and even tell you more about your device. You can ask questions about how to schedule a text for later, or how to update your device’s iOS.
When you do choose to speak to Siri, Apple says the conversational app will be a lot better at understanding what you say. You can refer to things you said in a previous request, and Siri will be able to remember them. For instance, you might ask about the weather forecast for a place you asked for directions to.
Apple Intelligence also gives Siri “on-screen” awareness, so it can understand what’s going on within your screen. For instance, if a colleague sends you a message with an address for a meeting, you can ask Siri to add that to your calendar notes. You can also take actions across various apps with Siri, such as asking the assistant to send an email you previously drafted for a specific colleague.
The upgrades to Siri should also lead to an increased number of developers creating apps with specific prompts and actions for Apple’s assistant, so there’s no limit to the solution’s future potential.
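In practice, this kind of integration is expected to run through Apple’s existing App Intents framework, which lets an app describe actions Siri can invoke. Here’s a minimal, hypothetical sketch – “SendStatusUpdateIntent” and its message parameter are illustrative names, not part of any Apple API:

```swift
import AppIntents

// A minimal App Intents sketch showing how a third-party app could expose an
// action that Siri (and, by extension, Apple Intelligence) can invoke.
struct SendStatusUpdateIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Status Update"
    static var description = IntentDescription("Sends a short status update to your team channel.")

    // Siri can ask the user to fill in this parameter by voice or text.
    @Parameter(title: "Message")
    var message: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // In a real app, this is where you'd call your own messaging service.
        // Here we simply confirm the action back to Siri.
        return .result(dialog: "Sent: \(message)")
    }
}
```

Intents like this can also be bundled into App Shortcuts so they’re discoverable without any setup, which is presumably how Apple expects Siri’s cross-app actions to scale.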
When Will Apple Intelligence be Available?
Apple Intelligence is still in its early stages. Following the launch event on September 9th, we do have some insight into when new features will start rolling out. Most of the advanced capabilities offered by Apple won’t be available instantly on the new iPhone 16. However, Apple has said that some capabilities will arrive with the iOS 18.1 release in October this year.
Notably, iOS 18.1 won’t include all the major Siri upgrades—you’ll have to wait until 2025, when Apple upgrades its iOS yet again.
The downside is that the first release of Apple Intelligence is exclusively available to users in the US, and features will only be available in English. However, later in the year, Apple will be rolling out support for localized English in Australia, South Africa, Canada, New Zealand, and the UK.
In 2025, the company will be adding new language options, like Chinese, French, Japanese, and Spanish. However, we’re not sure yet exactly when these languages will be available.
For macOS Sequoia, it’s likely the final version of the initial Apple Intelligence platform will be available by the end of this year. Once you have the right operating system version installed, you may need to head into Settings on your device to access the features.
For now, you can join the Apple Intelligence waitlist on your smartphone by installing the iOS 18.1 developer beta and signing up from your Settings.
Will Apple Intelligence Use ChatGPT?
There’s been a lot of hype surrounding the partnership between OpenAI and Apple, announced in June 2024. In the past, a lot of people assumed that OpenAI was committed to working exclusively with Microsoft. However, following various antitrust investigations, it seems that the ChatGPT creator is keen to dispel those rumors.
Thanks to its partnership with OpenAI, Apple will be offering free access to ChatGPT on most of its devices. The GPT technology powering ChatGPT will be integrated seamlessly into Siri and Apple’s writing tools, so you don’t have to jump between different AI apps while you work.
According to Apple, this functionality will be rolling out to all three major platforms – iOS 18, iPadOS 18, and macOS Sequoia – later this year. However, we’re not sure yet exactly when all of the OpenAI functionality will be available.
Notably, Apple has also said that it’s not necessarily just partnering with OpenAI. The company plans on forging connections with creators of other AI models going forward too.
Will Apple Intelligence Protect your Data?
A focus on privacy is certainly one of the top things that separates Apple from its competitors. Although Apple is giving its AI tools access to your personal data, it promises that your information will still be protected.
For a start, Apple will be running as much of the processing as it can directly on your device. This means while tools like Siri might know what you said in your messages to colleagues, they won’t share that information with anyone over the cloud.
When you create or edit images in the Image Playground, Apple doesn’t track what you create – even if your iPhone or device needs to tap into the cloud, Apple says it doesn’t get any reports. That’s because complex cloud requests are handled in a Private Cloud Compute environment. This environment cryptographically ensures that devices don’t talk to a server unless its software has been publicly logged for inspection.
Which Devices will Include Apple Intelligence?
Apple is planning on making AI a core feature of every product it sells, across every platform. However, you’ll need the latest operating systems for your device to access the functionality. That means iOS 18 and above for iPhones, iPadOS 18 and above for tablets, and macOS 15 Sequoia and above for computers.
The iPhone 16, recently released by the tech giant, is the first phone in Apple’s collection that was specifically designed from the ground up for Apple Intelligence. This means you will be able to access the AI in every different model, from the iPhone 16 Plus, to the Pro Max.
Plus, Apple Intelligence will be introduced to all devices using Apple’s M1, M2, M3, and M4 chips – which could mean we see these features on the Apple Vision Pro too. The current iPad devices that use the M1 and M2 chips will also be supported.
Unfortunately, older iPhone models (anything before the iPhone 15 Pro) don’t feature the same chipsets, so they won’t be suitable for Apple Intelligence. If you have a MacBook Air, Pro, iMac, or Mac mini with an M1 chip or newer, you should be able to access these features, too.
The Future of Apple Intelligence
Apple Intelligence marks an exciting step forward in Apple’s long-term journey with artificial intelligence. Going forward, Apple will continue to invest more heavily in LLMs, generative AI, and unique models that transform the way we work with everyday devices, from smartphones to tablets.
Apple seems committed to taking AI into the mainstream, and it’s more than willing to partner with other innovators in the AI space to make this dream a reality too. We’ve already seen evidence of this with the OpenAI partnership, so it will be interesting to see who Apple targets next.
In my opinion, Apple Intelligence could be the key to cementing Apple’s position as one of the most intuitive and innovative device makers of all time. Time and time again, the company proves that it’s always looking for new ways to upgrade the way we communicate and work.