Apple is launching something called the Foundation Models Framework, which the company says will let developers tap its AI models in an offline, on-device fashion.
On the WWDC 2025 stage on Monday, Apple SVP of Software Engineering Craig Federighi said the Foundation Models Framework allows apps to use the on-device AI models created by Apple to power new experiences. These models ship as part of Apple Intelligence, the company’s family of models that power a range of iOS features.
“For example, if you’re getting ready for an exam, an app like Kahoot can create a personalized quiz from your notes to make studying more engaging,” said Federighi. “And because it happens using on-device models, this happens without cloud API costs (…) We couldn’t be more excited about how developers can build on Apple Intelligence to bring you new experiences that are smart and available when you’re offline.”
In a blog post, Apple said that the Foundation Models Framework has native support for Swift, Apple’s programming language for building apps across its platforms. The company claims that developers can access Apple Intelligence models with as few as three lines of code.
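The article doesn’t reproduce Apple’s snippet, but based on the framework’s published developer documentation, the “three lines of code” look roughly like the sketch below; exact type and property names (`LanguageModelSession`, `respond(to:)`, `content`) are drawn from that documentation and may differ in the shipping SDK.

```swift
import FoundationModels

// Create a session backed by the on-device Apple Intelligence model.
let session = LanguageModelSession()

// Prompt the model; no network round trip or cloud API key is involved.
let response = try await session.respond(to: "Write a one-sentence study tip.")

print(response.content)
```

Because the model runs locally, the same code works with no connectivity, which is the offline story Federighi emphasized on stage.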
According to Apple, guided generation, tool calling, and more are built into the Foundation Models Framework. Automattic is already using the framework in its Day One journaling app, Apple says, while mapping app AllTrails is tapping the framework to recommend hiking routes.
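Guided generation means asking the model to fill in a typed Swift value instead of returning free-form text. A minimal sketch, assuming the `@Generable` and `@Guide` macros and the `respond(to:generating:)` overload described in Apple’s developer documentation (the quiz type here is purely illustrative):

```swift
import FoundationModels

// A type the model is constrained to produce, rather than raw prose.
@Generable
struct QuizQuestion {
    @Guide(description: "A short question drawn from the user's notes")
    var question: String

    @Guide(description: "Four possible answers, one of them correct")
    var answers: [String]
}

let session = LanguageModelSession()
let response = try await session.respond(
    to: "Create a quiz question about the Swift programming language.",
    generating: QuizQuestion.self
)

// response.content is a typed QuizQuestion — no string parsing needed.
print(response.content.question)
```

This is the mechanism an app like Kahoot would lean on to turn notes into structured quiz items.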
The Foundation Models Framework can be tested starting today through the Apple Developer Program, with a public beta available early next month.