Recommending Toxicity: How TikTok and YouTube Shorts are bombarding boys and men with misogynist content

A new study from Dublin City University’s Anti-Bullying Centre shows that the recommender algorithms used by social media platforms are rapidly amplifying misogynistic and male supremacist content.

The study, conducted by Professor Debbie Ging, Dr Catherine Baker and Dr Maja Andreasen, tracked, recorded and coded the content recommended to 10 experimental or ‘sockpuppet’ accounts on 10 blank smartphones – five on YouTube Shorts and five on TikTok. The researchers found that all of the male-identified accounts were fed masculinist, anti-feminist and other extremist content, irrespective of whether they sought out general or male supremacist-related content, and that they all received this content within the first 23 minutes of the experiment.

Once an account showed interest by watching this sort of content, the amount recommended rapidly increased. By the last round of the experiment (after 400 videos, or two to three hours of viewing), the vast majority of the content being recommended to the phones was toxic (76% on TikTok and 78% on YouTube Shorts), primarily falling into the manosphere (alpha male and anti-feminist) category. Much of this content rails against equality and promotes the submission of women. There was also a large amount of content devoted to male motivation, money-making and mental health. This material strategically taps into boys’ financial and emotional insecurities and is particularly dangerous in relation to mental health, as it frequently claims that depression is a sign of weakness and that therapy is ineffective. The other toxic categories were reactionary right and conspiracy content, which accounted for 13.6% of recommended content on TikTok and 5.2% on YouTube Shorts. Much of this was anti-transgender content.

Overall, the YouTube Shorts accounts were recommended a larger proportion of toxic content (on average 61.5% of total recommended content) than the TikTok accounts (34.7%). Content featuring ‘manfluencers’ (male influencers) accounted for the vast majority of recommended videos in the dataset, demonstrating their centrality in the current manosphere ecosystem. By far the most prevalent of these was Andrew Tate, who featured 582 times on the YouTube Shorts accounts and 93 times on the TikTok accounts.

According to Prof. Ging, “Our study shows that shutting down influencers’ accounts does not necessarily remove their content. The overwhelming presence of Andrew Tate content in our dataset at a time when he was de-platformed means that social media companies must tackle harmful content in more sophisticated ways.”

The findings of the report point to urgent and concerning issues for parents, teachers, policymakers, and society as a whole. Among the authors’ recommendations are improved content moderation, recommender algorithms that are turned off by default, and cooperation with trusted flaggers to highlight illegal, harmful, and borderline content. They also stress the need for teacher education and the teaching of critical digital literacy skills in schools to equip young people with a better understanding of how influencer culture and algorithms work.

According to Prof. Ging, “Ultimately, girls and women are the most severely impacted by these beliefs, but they are also damaging to the boys and men who consume them, especially in relation to mental wellbeing. The social media companies must come under increased pressure from the government to prioritise the safety and wellbeing of young people over profit.”

The full-length report is available at: https://antibullyingcentre.ie/recommending-toxicity/
