Apple Cracks Open Its AI Toolbox for App Makers

Foundation Models Framework logo
Image credit: Apple

Apple is now letting outside developers use the same AI models that power Apple Intelligence. Announced by Craig Federighi at WWDC 2025, the new Foundation Models Framework opens the door for third-party apps to build in on-device AI features without relying on cloud services or paying API fees.

Until now, Apple’s on-device AI was restricted to its own apps: Mail could summarize emails, Notes could use Genmoji, and Messages could auto-suggest polls. The new framework changes that. Apps for quizzes, visuals, or productivity tools can tap directly into Apple’s on-device foundation model, a roughly 3-billion-parameter model tuned for everyday tasks like summarization and classification. Inference runs locally on iPhone and iPad, preserving privacy and enabling offline use.

Federighi highlighted example use cases during his demo: study apps can generate personalized quizzes; outdoor apps can run natural-language search entirely offline. The setup is designed to be easy. With just a few lines of Swift code, developers can call on guided generation, classification, summarization, and more.
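As a sketch of what those "few lines of Swift" look like, here is a guided-generation example in the style of the Foundation Models framework Apple presented. The `QuizQuestion` type and `makeQuiz` helper are illustrative names, not part of the framework; the API shape (`LanguageModelSession`, `respond(to:generating:)`, the `@Generable` and `@Guide` macros) follows Apple's WWDC 2025 announcement but may differ in the shipping SDK.

```swift
import FoundationModels

// Guided generation: marking a type @Generable lets the framework
// constrain the model's output to this exact structure, so there is
// no manual JSON parsing. QuizQuestion is a hypothetical example type.
@Generable
struct QuizQuestion {
    @Guide(description: "A short multiple-choice question")
    var question: String
    @Guide(description: "Four possible answers")
    var choices: [String]
    @Guide(description: "Index of the correct answer in choices")
    var correctIndex: Int
}

// Hypothetical helper: turn a student's notes into one quiz question,
// entirely on device.
func makeQuiz(from notes: String) async throws -> QuizQuestion {
    let session = LanguageModelSession(
        instructions: "You generate study quizzes from the user's notes."
    )
    let response = try await session.respond(
        to: "Create one quiz question from these notes: \(notes)",
        generating: QuizQuestion.self
    )
    return response.content
}
```

Because the model runs locally, a call like this works offline and sends no user data to a server, which is the privacy argument Apple is making with this framework.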

Kahoot app using AI on iPhone
Image credit: Apple

The move is Apple’s answer to intensifying AI competition from OpenAI, Google, and third-party LLM providers, especially after its delays in upgrading Siri and shipping broader generative features. At the same time, macOS 26, iOS 26, iPadOS 26, and Apple's other platforms will debut built-in AI features such as live translation, smarter Shortcuts, image creation tools, and expanded Genmoji. But letting third-party developers access the models themselves is the most notable change, one that could genuinely spread intelligent features across the Apple ecosystem.

Developers can begin testing today as part of the WWDC beta cycle, ahead of a public beta next month and the full release this fall, alongside the annual OS updates. It’s a modest step, but one with meaningful implications: smart assistants, translation tools, and productivity helpers can now become common across apps rather than confined to core Apple services.

Why This Matters

Apple’s approach centers on privacy-first AI by keeping all inference local to the device. User data never leaves the phone or tablet, which preserves privacy and enables offline functionality. There’s also no cost barrier: developers can integrate these AI features without paying API fees. That makes it easier for smaller teams and indie developers to build intelligent features into their apps without relying on cloud infrastructure.

While the demos focused on educational and productivity scenarios, the implications stretch far wider. With the tools now available, we’re likely to see an uptick in smart assistants, translation helpers, and automated workflows across third-party apps, many of which could rival Apple’s own in functionality and user reach.
