CRBC News

Sky News Study Finds X’s Algorithm Amplifies Right-Wing and 'Extreme' Content

Overview: A Sky News experiment in 2025 used nine fresh X accounts (three left, three right, three neutral) to track the For You feed for one month. The study found X’s algorithm disproportionately surfaced right‑wing and “extreme” content, with neutral accounts seeing about twice as much right‑wing material as left‑wing material.

Sky controlled for engagement and follower counts and still observed lower visibility for some left‑leaning figures compared with certain right‑wing authors. At least half of the items shown came from authors judged to use “extreme” language. The findings raise questions about algorithmic bias, platform influence on political discourse, and the need for further independent analysis.

Sky News analysis suggests algorithmic tilt toward right‑wing and 'extreme' posts

A Sky News investigation conducted over a one‑month period in 2025 found that X (formerly Twitter) systematically surfaces more right‑leaning and “extreme” content in users’ For You feeds. The outlet created nine brand‑new X accounts—three with left‑wing orientations, three right‑wing, and three politically neutral—and worked with political analysts and data scientists to label authors as left, right, or “extreme.”

Key findings:

  • The three right‑wing test accounts were shown almost exclusively right‑wing posts.
  • All account types—including the politically neutral accounts—were shown more right‑wing content than left‑wing or neutral content. Neutral accounts saw roughly twice as much right‑wing material as left‑wing material.
  • Sky News controlled for popularity and engagement: even when left‑leaning figures had equal or higher engagement and substantially larger follower counts, their posts were displayed far less often than those of some right‑wing authors. Sky cites the example of far‑right independent MP Rupert Lowe, who has received visible interactions from platform owner Elon Musk.
  • At least 50% of the items shown across accounts came from authors judged to use “extreme” language on one side of the political spectrum or the other.

Sky’s conclusion: given comparable popularity and engagement, X’s algorithm appears more likely to surface content from right‑wing authors than from those on the left.

Context and implications. The report arrives amid ongoing scrutiny of the platform’s behavior since Elon Musk’s takeover. Sky notes that Musk’s high‑profile interventions in political debates in both the U.K. and the U.S. provide relevant context, and the findings suggest the platform may be amplifying partisan and polarizing messages despite its public claims about promoting free speech.

Limitations. Sky’s experiment used a small, deliberate set of new accounts and focused mainly on U.K. politics. Algorithms change frequently, and results may vary by time period, geography, and account behavior. While the study controlled for engagement and follower counts, further independent research with larger samples would strengthen confidence in the findings.

Why it matters. If algorithmic systems systematically favor one political orientation or disproportionately promote extreme language, that can shape public visibility for certain viewpoints and influence political discourse. The Sky News study adds to broader debates about platform transparency, moderation, and algorithmic accountability.