Apple is officially allowing third-party developers to build applications using its foundational artificial intelligence technology. The company announced the major shift during its Worldwide Developers Conference on June 9, marking a departure from its traditionally closed software ecosystem. By opening access to the core AI models that power Apple Intelligence, the tech giant aims to accelerate the creation of new applications and keep pace with competitors in the rapidly evolving artificial intelligence sector.
Previously, external developers could integrate Apple’s AI technology into their applications, but they were restricted from using the company’s underlying models to build and power their own unique AI features. The new initiative changes this dynamic, offering an artificial intelligence software development kit and related frameworks that let programmers weave Apple’s large language models directly into specific app functions or across entire software platforms.
On-Device Processing and New Frameworks
Initially, Apple is limiting developer access to its smaller, three-billion-parameter model. This model runs entirely on-device, in line with the company’s long-standing emphasis on user privacy. While the on-device approach imposes some technical limitations compared with the more advanced cloud-based AI models offered by competitors, it ensures that user data never leaves the device. For now, the company is withholding developer access to its more powerful cloud-based AI models.
To facilitate this integration, Apple introduced the Foundation Models framework. The new system allows developers to incorporate Apple Intelligence features into their software using just three lines of Swift code. The framework offers guided generation and built-in tool-calling capabilities, providing privacy-focused AI inference at no cost to developers.
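The “three lines of Swift” claim refers to the Foundation Models API Apple demonstrated on stage. A minimal sketch of what that integration looks like is below; the prompt text is illustrative, and the exact API surface may differ in the shipping SDK:

```swift
import FoundationModels

// Open a session with the on-device model and request a response.
// (Illustrative sketch based on Apple's announced API; details may vary.)
let session = LanguageModelSession()
let response = try await session.respond(to: "Suggest a one-line journal prompt.")
print(response.content)
```

Because inference happens on-device, calls like this incur no per-request cost and no data leaves the user’s hardware, which is the trade-off Apple is emphasizing against larger cloud-hosted models.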
According to Reuters, Apple’s software chief Craig Federighi highlighted the shift during his presentation, stating, “We’re opening up access for any app to tap directly into the on-device, large language model at the core of Apple.”
Early Adopters and Upgraded Coding Tools
Several major companies are already experimenting with the newly available technology. Automattic has integrated the framework into its Day One journaling application. Paul Mayne, head of Day One at Automattic, noted that the Foundation Models framework allowed the team to rethink journaling possibilities, bringing intelligence and privacy together in ways that deeply respect users.
Apple also revealed significant updates to its development environment with Xcode 26. The updated software embeds large language models directly into the coding experience. Developers can now use ChatGPT within Xcode without creating an account, and can alternatively connect API keys from other providers or run local models directly on Apple silicon Macs. A new Coding Tools feature further assists developers by suggesting actions to generate previews, build playgrounds, or fix code issues.
Furthermore, Apple is extending its Visual Intelligence capabilities to external developers through enhanced App Intents. This allows third-party applications to deliver search results directly within Apple’s visual intelligence interface. The e-commerce platform Etsy is currently exploring these features for product discovery. Etsy Chief Technology Officer Rafe Colburn described the ability to reach shoppers on their iPhones using visual intelligence as a meaningful unlock for the company.
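App Intents is an existing Swift framework, and the visual intelligence integration builds on the same mechanism apps already use to expose actions to the system. A minimal, hypothetical intent of the kind a shopping app might declare is sketched below; the type name, parameter, and behavior are illustrative, not Etsy’s actual implementation:

```swift
import AppIntents

// A hypothetical intent exposing an in-app product search to the system.
struct SearchProductsIntent: AppIntent {
    static var title: LocalizedStringResource = "Search Products"

    // The query the system passes in, e.g. derived from what the camera sees.
    @Parameter(title: "Query")
    var query: String

    func perform() async throws -> some IntentResult {
        // A real app would run its own search here and return matching items.
        return .result()
    }
}
```

Declaring intents like this is what lets third-party results surface inside system surfaces such as the visual intelligence interface, rather than requiring users to open the app first.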
Market Reaction and Future Updates
The strategic move is designed to help Apple compete against rivals like Samsung Electronics, which has successfully enhanced its devices using Google’s artificial intelligence features. Apple is also looking to boost the appeal of Apple Intelligence among both developers and users following an initial rollout last year that was marked by delays.
Despite the new developer tools, financial markets and industry analysts expressed skepticism regarding the announcements. Following the conference, Apple’s stock closed 1.2 percent lower. Some analysts questioned the measured approach, viewing the new features as an incremental step rather than a major breakthrough.
According to Investing.com senior analyst Thomas Monteiro, the announced features felt incremental at a time when the market is questioning Apple’s ability to take a lead in the artificial intelligence space. Technalysis Research chief analyst Bob O’Donnell observed that Apple has shifted from sharing visionary concepts about AI agents last year to simply focusing on delivering the technology they previously promised. Meanwhile, Ben Bajarin, chief executive of Creative Strategies, noted that Apple’s current priority appears to be on back-end infrastructure rather than front-end consumer features.
Beyond artificial intelligence, the Worldwide Developers Conference featured significant operating system updates for the iPhone, iPad, and Mac. The company is expected to introduce a new tool designed to optimize battery performance across its devices. Apple is also developing a revamped, AI-driven Health application, though reports indicate it will not be officially launched until next year.
The new Apple AI developer tools are available immediately for testing through the Apple Developer Program, with a broader public beta release expected next month.
