While the iPhone 16 was built for Apple Intelligence, it has been on the market in the UK for a few months without any of those headline features available. That is finally changing with the release of iOS 18.2 — Apple's artificial intelligence features are now finally rolling out to the UK as well as the US, following an extensive beta testing period.
Apple Intelligence aims to transform how we interact with our iPhones, iPads and Mac computers. With new AI-powered tools and features, Apple is blending advanced machine learning and natural language processing with practical functionality.
In layman's terms? Your phone gets smarter, anticipating your needs to make using your device and getting tasks done simpler.
If you’ve been wondering what’s coming so you know what to try out when the Apple Intelligence update hits your phone, here’s a breakdown of the top ten Apple Intelligence features you should try first.
1. Writing Tools
One of the most practical additions is the suite of system-wide writing tools. Apple says that its writing tools will “refine their words by rewriting, proofreading, and summarizing text nearly everywhere they write, including Mail, Notes, Pages, and third-party apps.” Whether you’re typing a message, drafting an email, or taking notes, Apple Intelligence can suggest grammar corrections, offer rephrasing options, and help you polish your text. The promise is a writing assistant that ensures accuracy, clarity, and style across all of your apps.
Of course, this kind of AI tool isn’t anything new. Companies have been offering it for a while already: OpenAI’s ChatGPT, Microsoft’s Bing, Google’s Gemini, and Anthropic’s Claude all provide writing assistance. However, Apple’s ability to integrate these writing tools into all of your apps is unique. So far, only Google has a realistic chance of replicating this level of integration.
2. Clean Up in Photos
Like Google before it, Apple is introducing a new Clean Up tool in the Photos app that “can identify and remove distracting objects in the background of a photo — without accidentally altering the subject.” If you’ve ever taken a photo on vacation and realised that someone photo-bombed your perfect moment, Apple now offers a built-in tool to fix it.
Whether it be a person, an animal, or simply an object that you want cleaned up or removed, Clean Up is the integrated tool to do it. While it’s not going to replace Photoshop, it is going to give most people 90% of what they need when editing photos within the Photos app itself.
3. Memory Movie Creation
One of the most interesting ideas in Apple’s Photos app is the Memories feature. While the feature historically put together movies on its own, Apple Intelligence is now letting you use prompts to generate your own videos. Apple says that “Memories feature now enables users to create the movies they want to see by simply typing a description.”
The feature allows users to type a theme or description, like “summer vacation”, and Apple Intelligence will automatically compile related photos and videos into a polished movie. If you’ve enjoyed the Memories feature but found it missed moments it shouldn’t have, you can now fix that by generating your own.
4. Natural Language Search in Photos
Finding photos in the Photos app has historically been…not great. While machine learning has improved the search function more recently, Google Photos’ search has been much better at finding specific things in your photos for a while.
Thankfully, Apple is utilising its natural language understanding with Apple Intelligence to improve this feature. Using descriptions like “birthday party in January” or “dog at the park,” Apple Intelligence can help you locate specific memories. The AI processes visual content and metadata, making photo searches more intuitive and precise.
The company says that “natural language can be used to search for specific photos, and search in videos gets more powerful with the ability to find specific moments in clips.” The second one is especially great. Being able to search for something buried in a video makes search in Photos much more powerful than what we’ve had before.
5. Priority Notifications
If there’s one thing Android has still had over iOS, it’s better notification management. Instead of giving us an endless list of settings to modify and fine-tune, Apple is hoping that AI can solve this issue.
With Priority Notifications, your device should help “surface what’s most important, and summaries help users scan long or stacked notifications to show key details right on the Lock Screen.” The feature scans your messages and mail to understand context and bumps the most important communication into its own section.
Apple is also now using AI to summarise your notifications. Instead of seeing the entire contents of a long message in your notification, Apple will show a summary of each message or piece of mail. So, if you’re in a very active group chat, you can get caught up faster.
6. Smart Reply
Another feature where Apple is finally catching up with Google is Smart Reply. Gmail has offered quick replies that attempt to understand the context of your email exchange for years now, and Apple is looking to use Apple Intelligence to offer the same functionality.
The company says “Smart Reply in Mail provides users with suggestions for a quick response and identifies questions in an email to ensure everything is answered.” While the first feature is table stakes for email in the 21st century, that second one is interesting. While Gmail always reminds me if I’ve forgotten to attach a file, I’d love it if Apple Mail reminded me that I’d missed addressing something someone asked me about.
7. Focus Modes: Reduce Interruptions
I’m a huge fan of Focus Modes. From letting people know I’m driving to suppressing non-urgent notifications while I’m asleep, Focus Modes is one of my favourite features for staying on track and reducing anxiety when using my devices.
Now, Apple wants to bring AI smarts to Focus Modes with the new Reduce Interruptions mode. It helps users maintain their concentration by filtering out non-essential notifications. Apple says that it “surfaces only the notifications that might need immediate attention.”
Whether you’re working or relaxing, this feature ensures that only the most critical alerts reach you, allowing for more uninterrupted time. Of course, this is also the feature I am most nervous about Apple getting right. If it suppresses a notification that is actually urgent in error, that could cause a real issue. We’ll see how smart it is!
8. Safari Summaries
I personally love the Reader Mode in Safari when I’m trying to read articles. It gets rid of all of the clutter, which is mostly ads, and lets me focus on the actual content. Now, Apple is using AI to make scanning an article even faster.
Safari Summaries is a standout feature for fans of Reader Mode. When using the feature, users can request a summary of a long article or document. Apple Intelligence distills the key points into a concise summary, helping you get the most essential information without having to scan through the entire thing.
9. Transcription in Notes and Phone
Taking notes from meetings and interviews, whether written or spoken, is now simpler with Transcription in the Notes and Phone apps. Apple says “In the Notes and Phone apps, users can record, transcribe, and summarise audio. When a recording is initiated while on a call in the Phone app, participants are automatically notified, and once the call ends, Apple Intelligence also generates a summary to help recall key points.”
So, if you’ve ever had a phone call you needed to recall information from, or needed to record and revisit your thoughts, the Phone and Notes apps can finally assist with that. I’ve long been jealous of how good Google’s Voice Recorder app has been for journalists and writers, so I’m excited about this one: I can get my first drafts down by speaking to the Notes app and having it transcribe for me.
10. Enhanced Siri
While every other feature is going to be great, this is the one I’ve been most feverishly anticipating. The promise of Siri as a helpful assistant has existed for years, but it has never fully lived up to that promise. Apple seems confident that Apple Intelligence is finally going to substantially close that gap.
Now, Siri “can follow along when users stumble over their words and can maintain context from one request to the next.” You can also now finally type to Siri and switch between typing and voice to interact with the assistant. I’m really hopeful that Siri starts to actually understand what I am saying to it.
More to come
While those are all of the main AI features rolling out now, there’s even more to come in the coming months, especially with Siri. It is going to be able to use personal context and onscreen awareness alongside OpenAI’s ChatGPT to further enhance the voice assistant’s capabilities. The company also plans to add more AI chatbots, like Google Gemini, in the future.
There’s a lot to look forward to with Apple Intelligence over the coming weeks and months, so get ready for a lot of change in the way we use our Apple devices.