Hello Quit X - Free our digital spaces

What Is the Evidence of X’s Toxicity?


Numerous scientific studies show that X is toxic for its users and democracy as a whole.

Harm to X Users’ Well-Being

The "For You" feed is one of the primary sources of information for X users. However, this feed is heavily biased toward toxic content. Researchers from CNRS demonstrated that it contains 49% more toxic content (insults, personal attacks, obscenities, etc.) compared to what is produced by your followees (as measured in early 2023). For some accounts, the perceived toxicity of the environment can be amplified by up to +200%!

This means X highlights the worst of your social environment, distorting your perception of it. Notably, Musk's takeover measurably worsened the situation: before it, the amplification was "only" 32%.

Studies have shown that modifying users’ news feeds directly affects their emotional state and communication style. When the proportion of positive messages decreases, users produce fewer positive messages and more negative ones. Research has also linked Twitter use to decreased well-being and increased outrage.

Harm to Social Cohesion

The amplification of hostility and the effects of algorithms recommending toxic content and users have a global impact on social structures.

It has been demonstrated that X increases political polarization and hostility between groups, which clearly does not foster healthy debate and harms social cohesion [4, 5].

Modeling has also shown that, in the medium term, social networks designed around engagement, such as X, concentrate social influence in the hands of the most toxic users: toxic accounts are overrepresented among the top 1% of influencers by an estimated 40%. In effect, users who promote open-mindedness and calm debate are systematically disadvantaged on X and, over time, rendered invisible.

Proliferation of Hate Speech and Disinformation

The lack of moderation on a social network like X can have severe consequences, as seen on other platforms.

For example, Amnesty International revealed that Facebook’s failure to moderate content enabled the brutal ethnic cleansing campaign against Rohingya Muslims in Myanmar in 2017 and fostered severe human rights violations against the Tigrayan population in Ethiopia from 2020 to 2022.

Under Elon Musk, X’s moderation teams have been decimated, and accounts previously banned for inciting violence, promoting Nazism, or homophobia have been reinstated.

X openly admits it no longer intends to combat disinformation. For instance, the company ended its policy against COVID-19 misinformation. Should another pandemic arise while X remains central to global information, the human cost would be catastrophic.

Furthermore, X has removed the "verification badge" that previously authenticated accounts; the badge was repurposed to indicate that an account has paid for visibility on X. The consequences were immediate and extended beyond politics: a fake account impersonating the pharmaceutical company Eli Lilly announced that its insulin would be free. That day, Lilly lost 4.37% of its market capitalization.

Selection of Scientific Publications

Follow the trend!

On #January20, our migration tools will be ready to help us all move, and to find your threads and followees on other networks.

Leave X on #January20, not before, not after, and mobilize your friends, family, and network on all platforms to join the movement and help us leave X!
