Numerous scientific studies show that X is harmful both to its users and to democracy as a whole.
Harm to X Users’ Well-Being
The "For You" feed is one of the primary sources of information for X users, yet it is heavily biased toward toxic content. Researchers from the CNRS demonstrated that it contains 49% more toxic content (insults, personal attacks, obscenities, etc.) than what the accounts a user follows actually produce (as measured in early 2023) [2]. For some accounts, the perceived toxicity of their social environment is amplified by up to 200%.
In other words, X highlights the worst of your social environment and distorts your perception of it. Notably, Musk measurably worsened the situation: before his takeover, the amplification was "only" 32% [2].
Studies have shown that modifying users' news feeds directly affects their emotional state and communication style: when the proportion of positive messages in a feed decreases, users themselves produce fewer positive messages and more negative ones [1]. Twitter use has also been found to be associated with decreased well-being and increased outrage [4].
Harm to Social Cohesion
The amplification of hostility and the effects of algorithms recommending toxic content and users have a global impact on social structures.
It has been demonstrated that X increases political polarization and hostility between groups, which undermines healthy debate and harms social cohesion [4, 5].
Modeling has also shown that, in the medium term, social networks designed around engagement, such as X, concentrate social influence in the hands of the most toxic users: toxic accounts are overrepresented among the top 1% of influencers by an estimated 40% [3]. In effect, users who promote open-mindedness and calm debate are systematically disadvantaged on X and, over time, rendered invisible.
Proliferation of Hate Speech and Disinformation
The lack of moderation on a social network like X can have severe consequences, as seen on other platforms.
For example, Amnesty International revealed that Facebook’s failure to moderate content enabled the brutal ethnic cleansing campaign against Rohingya Muslims in Myanmar in 2017 and fostered severe human rights violations against the Tigrayan population in Ethiopia from 2020 to 2022.
Under Elon Musk, X's moderation teams have been decimated, and accounts previously banned for inciting violence, promoting Nazism, or spreading homophobic abuse have been reinstated.
X openly admits it no longer intends to combat disinformation. For instance, the company ended its policy against COVID-19 misinformation. Should another pandemic arise while X remains central to global information flows, the human cost could be severe.
Furthermore, X removed the verification badge that previously authenticated accounts and repurposed it to mark accounts that pay for visibility. The consequences were immediate and extended beyond politics: a fake account impersonating the pharmaceutical company Eli Lilly announced that its insulin would be free, and Lilly's market capitalization fell 4.37% that day.
Selection of Scientific Publications
- [1] Kramer, Adam D. I., Jamie E. Guillory, and Jeffrey T. Hancock. 2014. “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks.” *Proceedings of the National Academy of Sciences of the United States of America* 111 (24): 8788‑90. https://doi.org/10.1073/pnas.1320040111.
- [2] Bouchaud, Paul, David Chavalarias, and Maziyar Panahi. 2023. “Crowdsourced Audit of Twitter’s Recommender Systems.” *Scientific Reports* 13 (1): 16815. https://doi.org/10.1038/s41598-023-43980-4.
- [3] Chavalarias, David, Paul Bouchaud, and Maziyar Panahi. 2024. “Can a Single Line of Code Change Society? The Systemic Risks of Optimizing Engagement in Recommender Systems on Global Information Flow, Opinion Dynamics, and Social Structures.” *Journal of Artificial Societies and Social Simulation* 27 (1): 9. https://www.jasss.org/27/1/9.html.
- [4] Oldemburgo de Mello, Victoria, Felix Cheung, and Michael Inzlicht. 2024. “Twitter (X) Use Predicts Substantial Changes in Well-Being, Polarization, Sense of Belonging, and Outrage.” *Communications Psychology* 2 (1): 1‑11. https://doi.org/10.1038/s44271-024-00062-z.
- [5] Yarchi, Moran, Christian Baden, and Neta Kligler-Vilenchik. 2021. “Political Polarization on the Digital Sphere: A Cross-Platform, Over-Time Analysis of Interactional, Positional, and Affective Polarization on Social Media.” *Political Communication* 38 (1‑2): 98‑139. https://doi.org/10.1080/10584609.2020.1785067.