Apple Intelligence is the backbone of nearly every OS 26 upgrade this year. It’s powering Live Translation in Messages and FaceTime, unlocking smarter Shortcuts, and giving developers access to on-device large language models for the first time. We’ll also get revamped Visual Intelligence, more Genmoji tools, and expanded language support.
But for all its ambition, a few major features are still missing, especially the one everyone expected: personalized Siri. Confused about how to feel? Honestly, same. But before we pass judgment, let’s break down what’s actually here, and what’s still MIA.
What’s New in Apple Intelligence for iOS 26, iPadOS 26, visionOS, and macOS 26
Live Translation in Messages, FaceTime, and Phone

Live Translation is Apple’s most practical use of on-device AI so far. It works in real time across Messages, FaceTime, and even the Phone app. Whether you’re texting someone abroad or taking a live call, Apple Intelligence can translate text and audio without needing to send data to a server.
Everything happens on your device using Apple’s own language models. That means faster response times and better privacy. No data leaves your phone, which makes it more secure than many other translation tools. It’s one of the clearest examples of how Apple is using AI to make communication easier without compromising your personal information.
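If you’re curious what this on-device approach looks like in code, the public Translation framework Apple shipped with iOS 18 works the same way. Here’s a minimal sketch using its translationTask modifier; this illustrates the underlying idea, not the private machinery behind Live Translation in Messages:

```swift
import SwiftUI
import Translation

// Translates a phrase on-device with the public Translation framework.
// Once the language models are downloaded, no text leaves the device.
struct GreetingTranslator: View {
    @State private var original = "Where is the train station?"
    @State private var translated = ""

    var body: some View {
        Text(translated.isEmpty ? original : translated)
            .translationTask(
                source: Locale.Language(identifier: "en"),
                target: Locale.Language(identifier: "es")
            ) { session in
                do {
                    let response = try await session.translate(original)
                    translated = response.targetText
                } catch {
                    // The language pair may be unsupported, or the model
                    // may still need to be downloaded.
                }
            }
    }
}
```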
Visual Intelligence for On-Screen Actions
Apple Intelligence now understands what’s on your screen, not just what you say or type. That’s a big leap for Visual Intelligence, which previously focused on object recognition and image understanding. Now, it can scan screenshots, detect embedded event info, or help you shop for similar items by connecting with Google, Etsy, and other supported apps.
You can also summon ChatGPT directly to analyze or explain what’s on-screen—no need to copy and paste anything. If you’re looking at a flyer with a date and time, your phone might even suggest creating a calendar event with all the details filled in. This brings practical automation to everyday tasks, right from your screen.
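The event-detection half of this resembles what Foundation’s long-standing NSDataDetector already does for plain text. A tiny sketch of the concept (Apple’s actual Visual Intelligence pipeline is private, so treat this as an analogy only):

```swift
import Foundation

// Pull a date out of flyer-style text, the raw ingredient for a
// "create calendar event" suggestion.
let flyerText = "Summer book fair: June 21 at 3:00 PM, City Library."
let detector = try! NSDataDetector(
    types: NSTextCheckingResult.CheckingType.date.rawValue
)
let fullRange = NSRange(flyerText.startIndex..., in: flyerText)

for match in detector.matches(in: flyerText, options: [], range: fullRange) {
    if let date = match.date {
        print("Suggested event start: \(date)") // hand off to EventKit here
    }
}
```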
Smart Summaries for Order Tracking in Mail
Apple Intelligence now scans your inbox for order confirmation emails and automatically compiles tracking updates. These summaries aren’t limited to Apple Pay transactions, either: they work with any merchant or carrier email, making it easier to keep tabs on all your deliveries.
You’ll find these unified order summaries directly in the Mail app, helping reduce clutter and save time. It’s not flashy, but it’s useful, especially if you’re tired of jumping between emails just to track a package. Apple’s LLM handles this quietly in the background, streamlining your inbox in a way that feels effortless.
Genmoji and Image Playground
Apple has reimagined how you create and use Genmoji. You can now combine existing emoji, your own creations, and plain text to generate custom visuals that are more expressive than ever. It’s like giving your emoji a personality of its own.
Meanwhile, Image Playground lets you generate full illustrations from scratch using text prompts and creative tools. Everything runs on-device, and the new Shortcuts integration makes it easy to trigger these tools without jumping through menus. While this update is mostly for fun, it shows how Apple Intelligence can fuel creativity as much as productivity.
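Developers can already hook into this from their own apps. A minimal sketch using the public imagePlaygroundSheet modifier from the ImagePlayground framework (the prompt text here is just an example):

```swift
import SwiftUI
import ImagePlayground

// Presents the system Image Playground sheet and receives a URL to the
// generated image. Generation happens entirely on-device.
struct IllustrationButton: View {
    @State private var showPlayground = false
    @State private var createdImageURL: URL?

    var body: some View {
        Button("Create illustration") { showPlayground = true }
            .imagePlaygroundSheet(
                isPresented: $showPlayground,
                concepts: [.text("a corgi astronaut planting a flag")]
            ) { url in
                createdImageURL = url // local file URL for the new image
            }
    }
}
```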
Smarter, More Capable Shortcuts
The Shortcuts app is getting a serious AI boost in iOS 26. Apple Intelligence introduces intelligent actions that suggest what you might want to do next based on context, like rewriting a note, summarizing a message, or launching Image Playground from within another app.
You’ll also see new dedicated triggers for Writing Tools, making it easier to fold proofreading or rewriting into your existing workflows. These actions adapt to what you’re doing and suggest meaningful shortcuts, which could save you real time in your daily routine.
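Under the hood, apps surface these actions to Shortcuts through the App Intents framework. A minimal sketch of what that looks like; the one-line summarizer is a stand-in for real app logic:

```swift
import Foundation
import AppIntents

// Exposes a "Summarize Note" action that Shortcuts (and Apple
// Intelligence's suggestions) can discover and run.
struct SummarizeNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Note"

    @Parameter(title: "Note Text")
    var noteText: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // Placeholder logic: return the first sentence as the "summary".
        let summary = noteText.components(separatedBy: ". ").first ?? noteText
        return .result(value: summary)
    }
}
```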
Foundation Models API for Developers
Apple is opening up its on-device large language model to third-party developers with the new Foundation Models framework. That means apps of every kind, from productivity to health to social media, can tap into Apple Intelligence’s capabilities without compromising user privacy.
What’s more, it all runs offline, stays completely private, and is free to use. Apple is betting that opening up its core AI engine will kickstart a new generation of apps built around local intelligence. If it works, iOS could see a wave of smarter tools that don’t rely on cloud services or constant connectivity.
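Based on what Apple has shown so far, calling the model takes only a few lines. A minimal sketch assuming the FoundationModels API as presented at WWDC25 (names and availability checks could shift before final release):

```swift
import Foundation
import FoundationModels

// Summarizes text with the on-device system language model.
// No network, no API key, no per-token cost.
func summarize(_ text: String) async throws -> String {
    // The model only exists on Apple Intelligence-capable hardware,
    // so check availability before opening a session.
    guard SystemLanguageModel.default.isAvailable else {
        throw CocoaError(.featureUnsupported)
    }
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one short sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```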
What’s Missing From Apple Intelligence for iOS 26, iPadOS 26, visionOS, and macOS 26
Siri’s Big Upgrade
Let’s talk about the elephant in the room. As many predicted, we’re not getting the revamped, personalized Siri that Apple previewed last year. It will now likely arrive in 2026.
This means Siri won’t be able to remember what you asked earlier, help with multi-step tasks, or respond based on personal context. You’re getting some updates, but not the conversational, memory-enhanced Siri Apple showcased. For now, the version of Siri in iOS 26 is still largely based on traditional voice assistant rules.
Limited Device Support
Apple Intelligence is still limited to the iPhone 15 Pro, Pro Max, and newer models. That’s a huge cutoff. Even the regular iPhone 15, which isn’t even two years old, doesn’t qualify. I was hoping for broader support this year. If you own anything older, or even the base model, you won’t get on-device Apple Intelligence at all.
Real Cross-App Context
Apple Intelligence can identify dates or events on-screen and suggest adding them to Calendar, but only if the format is clear and the content is directly recognized. If you manually copy a flight number or a time and then open Calendar, nothing happens. The system doesn’t retain session memory across apps.
That’s a key limitation. While Visual Intelligence can scan what’s visible and act on it, Apple’s AI doesn’t yet track your task flow. It doesn’t remember what you copied, searched, or planned across apps unless it fits a narrow recognition pattern. True cross-app context awareness is still missing.
That said, these are just my observations. I suggest downloading iOS 26 or iPadOS 26 and testing them yourself; the Developer Beta versions are out right now. The changes aren’t all bad, and maybe I’m just being pedantic. But you have to admit some of them try to solve problems that don’t exist, like the macOS UI changes that complicate navigation.