Meta has released Muse Spark, the first AI model from Meta Superintelligence Labs, as the company moves to compete more aggressively in artificial intelligence. The launch comes as Meta is also expanding its computing capacity through a new $21 billion AI cloud infrastructure deal with CoreWeave, adding to an earlier $14.2 billion agreement and bringing its total commitment to the provider to $35.2 billion.
Muse Spark is now available on the Meta AI app and on meta.ai, and Meta said the model will power its Meta AI assistant. The company described the release as the first model in a new Muse series. At the same time, Meta Superintelligence Labs chief Alexandr Wang said the team rebuilt its AI stack from scratch over nine months with new infrastructure, architecture, and data pipelines.
What Muse Spark adds
Meta said Muse Spark is a natively multimodal reasoning model designed to handle complex personal tasks, and it called the system the first step on its path to personal superintelligence. The company said the model can help with tasks such as estimating the calories in a meal from a photo or placing an image of a mug onto a shelf to preview how it might look.
The model also introduces different levels of reasoning, including Instant, Thinking, and Contemplating modes. Meta and Business Insider said the Contemplating mode can run multiple agents at the same time, with one agent able to build a travel plan while another looks for kid-friendly activities.
Meta said Muse Spark also includes a Shopping mode that draws on styling inspiration and brand storytelling across its apps to surface ideas from creators and communities people already follow. Business Insider reported that Meta also highlighted improved health responses developed in collaboration with 1,000 physicians, while acknowledging that coding workflows remain an area where the company sees performance gaps.
The company said Muse Spark will initially be available on the Meta AI app and website, and in the coming weeks it will replace the Llama models currently powering chatbots on WhatsApp, Instagram, Facebook, and Meta smart glasses. Meta did not disclose the model’s size, a figure commonly used to compare AI systems.
Pressure to catch up
The release lands after a period of intense investment and pressure inside Meta’s AI effort. Times of India reported earlier that Meta’s next big AI model, then codenamed Avocado, had fallen short of rivals on internal benchmarks and been delayed by at least two months; originally targeted for a mid-March release, it was unlikely to ship before May.
That earlier report said the model outperformed Meta’s Llama 4 and beat Google’s older Gemini 2.5, but it did not match Gemini 3.0 from November. The same report said leaders within Meta’s AI division had discussed temporarily licensing Google’s Gemini to power products while Avocado caught up, though no decision had been made.
The newer reports on Muse Spark connect that codename to the model now being rolled out: both Times of India and Business Insider said Muse Spark was known internally as Avocado during development.
Meta has tried to move faster by reshaping its AI organization around Meta Superintelligence Labs and giving researchers more autonomy. Times of India said Zuckerberg assembled the Superintelligence Labs team under Chief AI Officer Alexandr Wang, and Business Insider reported that Meta spent months building the division after deciding its AI efforts needed an overhaul.
CoreWeave deal expands capacity
At the same time, Meta is locking in more outside infrastructure as demand for AI computing grows. Times of India reported that the new CoreWeave deal, announced on April 9, runs from 2027 to 2032 and sits on top of the previously disclosed agreement that runs through 2031.
CoreWeave’s data centers are packed with hundreds of thousands of Nvidia GPUs, which are used to train and run large AI models. The report said Meta has been a CoreWeave customer since 2023, and CoreWeave chief executive Mike Intrator said the company’s infrastructure has helped Meta make better use of the AI talent it has hired.
Meta is also building its own AI facilities, including a $10 billion data center in Texas that it announced in March, but the company still needs resources now as the pace of AI development accelerates. A Meta spokesperson described the CoreWeave agreement as part of the company’s portfolio-based approach to infrastructure as it invests in capacity for its AI ambitions.
Muse Spark now becomes the clearest public sign yet of what those talent and infrastructure investments are meant to deliver. Meta is betting that a faster, more capable assistant for everyday tasks can help it reach users across a network of more than 3.5 billion people while it continues to build larger models behind the scenes.
