Apple is opening its on-device Apple Intelligence foundation model to third-party developers through a new Foundation Models framework, aiming to make private, offline AI features easier to build into apps.
The announcement, shared during WWDC on June 9, 2025, positions Apple’s on-device approach as a way for developers to add generative AI without cloud API fees and with privacy protections built in.
What Apple announced at WWDC 2025
Apple says the Foundation Models framework is a new API that lets developers tap directly into the on-device large language model at the core of Apple Intelligence from their own apps.
Craig Federighi, Apple’s senior vice president of Software Engineering, said Apple is taking “the huge step of giving developers direct access to the on-device foundation model powering Apple Intelligence,” describing it as “powerful, fast, built with privacy, and available even when users are offline.”
Apple also described this as the first time it has given developers direct access to the on-device foundation models that power Apple Intelligence features on iPhone and iPad.
What developers can do with the on-device model
Apple says developers can use the Foundation Models framework to create intelligent experiences that work offline, protect privacy, and use inference that is “free of cost,” because the model runs on device.
Apple and early coverage of the announcement highlighted example use cases such as an education app generating a personalized quiz from a user’s notes, or an outdoors app adding natural language search that still works when the user is offline.
Apple says the framework includes built-in features such as guided generation and tool calling, designed to make it easier to integrate generative capabilities into existing apps.
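Apple’s announcement does not include sample code for these features, but a guided-generation call might look roughly like the sketch below, which asks the on-device model to produce a typed quiz question, echoing the education-app use case above. The specific names here, including LanguageModelSession, @Generable, @Guide, and respond(to:generating:), are assumptions about the framework’s Swift API rather than details confirmed in the announcement, and may differ in the shipping SDK.

```swift
import FoundationModels

// Assumed @Generable type: guided generation constrains the model's output
// to this Swift structure instead of returning free-form text.
@Generable
struct QuizQuestion {
    @Guide(description: "A single question drawn from the student's notes")
    var question: String

    @Guide(description: "Exactly four answer choices, one of them correct")
    var choices: [String]

    @Guide(description: "Index into choices for the correct answer")
    var correctIndex: Int
}

func makeQuizQuestion(from notes: String) async throws -> QuizQuestion {
    // A session wraps the on-device Apple Intelligence model; no network call is made.
    let session = LanguageModelSession()

    // Guided generation: ask for output shaped as QuizQuestion rather than raw text.
    let response = try await session.respond(
        to: "Write one multiple-choice quiz question based on these notes:\n\(notes)",
        generating: QuizQuestion.self
    )
    return response.content
}
```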
Apple also says the Foundation Models framework has native support for Swift, and developers can access the Apple Intelligence model with as few as three lines of code.
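Apple’s “three lines of code” framing presumably refers to an import-session-respond pattern. A minimal sketch of that pattern follows; again, LanguageModelSession and respond(to:) are assumed names, not code published with the announcement.

```swift
import FoundationModels

// Roughly the minimal path Apple describes: import the framework,
// open a session with the on-device model, and request a response.
func summary(of notes: String) async throws -> String {
    let session = LanguageModelSession()
    let response = try await session.respond(to: "Summarize these notes in one sentence:\n\(notes)")
    return response.content
}
```

Because the model runs entirely on device, a call like this would work offline and would not incur per-request API costs, in line with Apple’s “free of cost” framing.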
How this fits into Apple’s broader developer push
In a separate developer-focused WWDC announcement, Apple said it is rolling out new tools and platform updates across iOS 26, iPadOS 26, macOS Tahoe 26, watchOS 26, and tvOS 26.
That developer release also links the Foundation Models framework to other updates meant to help teams build “more beautiful, intelligent, and engaging app experiences,” including Xcode 26 features that integrate large language models such as ChatGPT into the coding workflow.
Apple said Xcode 26 includes built-in support for ChatGPT, while also allowing developers to use API keys from other providers or run local models on a Mac with Apple silicon.
Apple also said developers can start using ChatGPT in Xcode without creating an account, with subscribers able to connect their accounts to access more requests.
Earlier reports pointed to this direction
Before WWDC, SiliconANGLE summarized a Bloomberg report saying Apple planned to let third-party developers build apps using its large language models, and that Apple was expected to announce the effort at WWDC.
That report said Apple Intelligence is powered by internally developed models described as Apple Intelligence Foundation Models, and that Apple planned to make some of those models available through a software development kit and related frameworks.
SiliconANGLE also said the report indicated Apple would initially make only on-device algorithms available to developers, and that it was unclear whether cloud-hosted models would follow.
Apple Intelligence context and availability details
Apple’s June 9, 2025 Apple Intelligence announcement said developers will be able to tap into the on-device large language model at the core of Apple Intelligence, with that intelligence available even when users are offline.
Apple said these Apple Intelligence features are available for testing starting that day through the Apple Developer Program, with a public beta coming the next month through the Apple Beta Software Program.
Apple said the features will be available to users in the fall on supported devices set to a supported language.
The same Apple Intelligence release also outlined broader user-facing updates, including Live Translation integrated into Messages, FaceTime, and Phone, enabled by Apple-built models that run entirely on device.
Apple said Shortcuts can now tap into Apple Intelligence directly, letting users call Apple Intelligence models either on-device or through Private Cloud Compute to generate responses that feed into the rest of a shortcut.
