Social media algorithms push and normalise extreme misogynistic content among young people, study finds
A recent report by The Guardian sheds light on the alarming trend of social media algorithms rapidly propagating extreme misogynistic content, which is infiltrating school environments and normalising harmful attitudes towards women.
The report is based on a study conducted by teams from University College London and the University of Kent, which reveals a concerning surge in misogynistic content suggested by TikTok’s algorithm over a five-day monitoring period.
This content, often centred on anger and blame directed at women, increased four-fold over the monitoring period, indicating a worrying trend in the platform’s recommendation system.
While the focus of this research was TikTok, experts suggest that similar patterns likely extend to other social media platforms. They advocate for a nuanced approach to addressing the issue, promoting a “healthy digital diet” over blanket bans on phones or social media platforms, which are deemed ineffective.
The report comes amidst growing concerns over the impact of social media on young people, with recent studies indicating a generational divide in attitudes towards feminism. Additionally, calls for stricter regulations on smartphone use among minors have gained momentum following tragic incidents linked to online activities.
According to the findings, social media algorithms play a pivotal role in presenting harmful content as entertainment, thereby influencing young users’ perceptions and behaviours. The researchers stress that toxic ideologies, once confined to online spaces, are now permeating school environments and mainstream youth cultures.
Geoff Barton, from the Association of School and College Leaders, highlights the insidious nature of algorithmic processes, urging social media platforms to reassess their algorithms and strengthen safeguards to counter the proliferation of harmful content.
Andy Burrows, adviser to the Molly Rose Foundation, echoes these sentiments, emphasising the urgent need for regulatory intervention to curb the dissemination of harmful content targeting vulnerable teens.
Responding to these concerns, Prime Minister Rishi Sunak reaffirmed the government’s commitment to online safety, citing the recently passed Online Safety Act as a crucial step in holding social media companies accountable for protecting children from harmful content.
In light of the report’s findings, TikTok asserts its commitment to combating misogyny on its platform, pointing to its proactive content moderation measures. It has, however, argued that the methodology employed in the report does not reflect how real users experience the platform.
With the Online Safety Act set to empower regulators to tackle online harms, including misogyny, authorities emphasize the importance of addressing content that disproportionately affects women and girls, signalling a concerted effort to safeguard users’ safety and rights in the digital realm.