
EU descends on TikTok, YouTube, Snapchat over harmful content


The European Commission has intensified its scrutiny of tech giants, requesting detailed information from YouTube, Snapchat, and TikTok about the algorithms driving their content recommendations.

This inquiry, announced on Wednesday, aims to assess how these platforms’ algorithms contribute to systemic risks, particularly in areas like election integrity, mental health, and the protection of minors.

The requests, made under the EU’s Digital Services Act (DSA), also address the platforms’ efforts to mitigate the amplification of illegal content, including the promotion of hate speech and illegal drugs.

In a statement, the EU Commission emphasized its concern over the influence of recommender systems on the spread of harmful content and asked the companies to disclose measures they have taken to counteract these risks.

The commission’s focus on TikTok stems from concerns over the platform’s role in elections and the potential manipulation of its algorithms by bad actors.

The platform was asked to provide additional details about steps it has taken to reduce risks tied to election interference and the spread of disinformation.

Dr. Maria Mendes, a professor of digital media governance at the University of Amsterdam, remarked, “These platforms have become central to political communication, especially among younger audiences.


“The EU’s move is a clear signal that they want greater transparency, particularly around how these algorithms might distort civic discourse or amplify misinformation during election periods.”

The EU’s inquiry also highlights growing concerns about the impact of social media algorithms on mental health, particularly among teenagers and children. Recommender systems are designed to keep users engaged, but critics argue that this can lead to overexposure to harmful content, including material that may negatively affect self-esteem, body image, and emotional well-being.

Dr. Lara Schmidt, a digital safety expert, explained, “There’s mounting evidence linking prolonged exposure to certain types of content with deteriorating mental health, especially in young users.

“The EU’s effort to force transparency could be a crucial step in understanding—and mitigating—those effects.”

The companies have been given until November 15 to respond with the requested information. After reviewing the responses, the EU Commission will determine further actions, which may include issuing fines if platforms are found to be in violation of the DSA.

The law, which took effect earlier this year, mandates that large tech firms do more to address illegal and harmful content on their platforms.

This inquiry follows ongoing non-compliance proceedings against other major players like Meta (Facebook and Instagram), AliExpress, and TikTok, all of which have been questioned about their adherence to the DSA.

“The EU Commission is signaling that it won’t shy away from holding tech giants accountable,” said Alexandre Dupont, a Brussels-based tech policy analyst.

“These inquiries are part of a broader trend to curb Big Tech’s influence, especially when it comes to issues like misinformation, illegal content, and user safety.”

As these platforms prepare to disclose their data, experts warn that the EU’s response will likely set the tone for how digital regulation evolves across the globe, with other regions closely watching how the bloc enforces its groundbreaking legislation.
