Elon Musk’s social media platform X has released code tied to its recommendation system, part of a push for more transparency around how posts are ranked and shown to users. Musk has also said X would open-source its “new algorithm,” including code for organic and advertising post recommendations, and would repeat the release every four weeks with developer notes, though reports differ on the timing of the open-sourcing.
News9Live reported that on January 20, X released the core logic of its new recommendation algorithm as open source and quoted Musk calling the system “dumb” and still evolving, while arguing that transparency is the point. In a separate report, Khaleej Times said Musk posted that X would open-source its new algorithm in seven days and would repeat the release every four weeks with comprehensive developer notes explaining changes.
What X has published
A GitHub repository titled “the-algorithm” describes “X’s Recommendation Algorithm” as a set of services and jobs that serve feeds of posts and other content across X product surfaces, including the For You Timeline, Search, Explore, and Notifications. The same repository notes, however, that the surfaces actually covered by the code in the repo are the For You Timeline and Recommended Notifications.
In its overview, the repository lists shared components across X, including data services and signals (such as a “unified-user-actions” stream and a “user-signal-service”), as well as models and frameworks used to build and serve recommendations. It also invites the community to submit GitHub issues and pull requests to improve the recommendation algorithm and notes the project is licensed under AGPL-3.0.
How Phoenix changes ranking
News9Live reported that X’s new recommendation system is called Phoenix and described it as a rebuild that shifts away from manual ranking rules toward AI-driven prediction. According to News9Live, Phoenix is “powered by the same transformer architecture as xAI’s Grok model,” and the new system abandons the older approach where engineers manually defined many features and weights.
News9Live said Phoenix focuses on predicting how an individual user is likely to react when they see a post, using that user’s behavior history such as likes, replies, time spent reading, profile visits, and accounts muted or blocked. The report added that the model predicts 15 possible reactions, assigns each a probability, and combines those probabilities with internal weights to produce a final ranking score that affects reach and visibility. Positive reactions include likes, replies, reposts, profile visits, long dwell time, and follows; negative ones include “not interested,” mute, block, and report.
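The ranking step described above amounts to a weighted sum over predicted reaction probabilities. The sketch below illustrates that general idea only; the reaction names, probabilities, and weights are entirely hypothetical, since X has not published the actual values.

```python
# Hypothetical sketch of probability-weighted ranking as described in coverage
# of Phoenix. All reaction names, probabilities, and weights are illustrative
# assumptions, not values from X's repository.

# Model output: predicted probability that this user performs each action.
PREDICTED_REACTIONS = {
    # positive signals
    "like": 0.30, "reply": 0.05, "repost": 0.10,
    "profile_visit": 0.08, "long_dwell": 0.40, "follow": 0.01,
    # negative signals
    "not_interested": 0.02, "mute": 0.01, "block": 0.005, "report": 0.001,
}

# Assumed internal weights: positive actions add to the score,
# negative actions subtract from it.
WEIGHTS = {
    "like": 1.0, "reply": 3.0, "repost": 2.0,
    "profile_visit": 1.5, "long_dwell": 0.5, "follow": 4.0,
    "not_interested": -2.0, "mute": -5.0, "block": -8.0, "report": -10.0,
}

def rank_score(probs: dict[str, float], weights: dict[str, float]) -> float:
    """Combine per-reaction probabilities with weights into one score."""
    return sum(weights[r] * p for r, p in probs.items())

score = rank_score(PREDICTED_REACTIONS, WEIGHTS)
print(round(score, 3))  # a single number deciding reach and visibility
```

In this toy version, a post likely to trigger mutes or reports loses score even if its like probability is high, which matches the reported behavior of negative actions dragging down visibility.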
What creators may take from it
News9Live said the open-source repository points to practical lessons for creators, including that replying to commenters matters because author replies are treated as a strong positive signal. The same report said posts that push users toward muting or blocking can be penalized, and these negative actions can affect future posts, not only the post that triggered the reaction.
News9Live also reported that the system deprioritizes posts that send users off-platform and suggested that external links may perform better when placed in comments rather than the main post. It further said Phoenix includes an “author diversity” approach that can downrank consecutive posts from the same account and that time-based posting features no longer exist in the model, making “best posting time” tactics less central in this framework.
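The “author diversity” behavior reported above can be illustrated with a simple re-ranking pass that attenuates each additional post from an author already placed higher in the feed. This is a generic sketch of the technique, with an assumed decay factor; X has not published its actual mechanism.

```python
# Hypothetical author-diversity re-ranking: each additional post from an
# author already seen in the feed has its score multiplied by a decay
# factor, so consecutive posts from one account get pushed down.
# The decay value of 0.5 is an illustrative assumption.

def apply_author_diversity(ranked, decay=0.5):
    """ranked: list of (author, score) pairs sorted by score descending.
    Returns a re-scored list, re-sorted so repeat authors sink."""
    seen = {}  # author -> number of posts already placed
    rescored = []
    for author, score in ranked:
        n = seen.get(author, 0)
        rescored.append((author, score * (decay ** n)))
        seen[author] = n + 1
    return sorted(rescored, key=lambda pair: pair[1], reverse=True)

feed = [("alice", 10.0), ("alice", 9.0), ("bob", 8.0), ("alice", 7.0)]
print(apply_author_diversity(feed))
```

In the example, “alice” would otherwise occupy the top two slots; after the pass, her second post decays below “bob,” breaking up the run of consecutive posts from one account.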
What’s still unclear, and why scrutiny remains
Even with open-sourcing, News9Live reported that key details are not revealed, including the exact weights for each action, the internal parameters of the Grok model, and the training data. The same report framed the release as showing the structure of ranking decisions without exposing every underlying value.
Separately, Khaleej Times reported that the European Commission extended a retention order on X tied to algorithms and the dissemination of illegal content through the end of 2026, citing a commission spokesperson. Khaleej Times also reported that in July 2025, Paris prosecutors investigated X for suspected algorithmic bias and fraudulent data extraction, and said Musk’s X called it a “politically motivated criminal investigation” that threatens users’ free speech.
