Friday, October 10, 2025

EU Demands Answers from Snapchat, YouTube, Others on Child Safety Amid Push for Social Media Age Limits

The European Union (EU) has demanded explanations from major tech companies, including Snapchat, YouTube, Apple, and Google, on how they are protecting children from online harm, as the bloc considers new restrictions on minors’ access to social media.

The move comes amid growing concern across Europe about the exposure of young people to harmful or illegal online content, with member states such as France and Spain pushing for stronger safeguards.

Inspired by Australia’s social media ban for under-16s, the European Commission is assessing whether similar restrictions could be introduced across the 27-member bloc.

Under the Digital Services Act (DSA), Europe’s powerful tool for regulating online platforms, the Commission has launched “investigative actions”, requesting information from Snapchat about its measures to block users under 13 and to prevent access to illegal goods such as drugs and vapes.

The Commission has also sought details from Apple’s App Store and Google Play on how they prevent children from downloading harmful or illegal apps, including those offering gambling or sexual content, as well as so-called “nudify apps” that generate non-consensual sexualised images.

“Privacy, security and safety have to be ensured, and this is not always the case,” said EU tech chief Henna Virkkunen, ahead of a meeting of European ministers in Denmark, where child safety online topped the agenda.

The Commission clarified that requests for information do not imply any legal violation but can lead to formal investigations and potential fines under the DSA if non-compliance is found.

Brussels also asked YouTube to explain how its recommender system prevents minors from being exposed to harmful content, following multiple reports of inappropriate material circulating on the platform.

This latest demand adds to ongoing EU probes into Meta’s Facebook and Instagram, as well as TikTok, over concerns that their algorithms may contribute to addictive behaviour among young users.

In a broader policy move, EU telecom ministers are discussing the creation of a bloc-wide digital age of majority, which would set a consistent minimum age for social media access across member states.

European Commission President Ursula von der Leyen has backed the proposal, pledging to form an expert panel to study potential legislative steps.

Denmark, currently holding the EU’s rotating presidency, is leading the charge, with Prime Minister Mette Frederiksen announcing plans to ban social media access for children under 15.

France, for its part, has already enacted a law requiring parental consent for social media users under 15.

As the EU tightens its grip on digital regulation, the debate over balancing safety, privacy, and freedom online continues to intensify across Europe.

Victoria Emeto
A bright and self-driven graduate trainee at AV1 News, she brings fresh energy and curiosity to her role. With a strong academic background in Mass Communication, she has a solid foundation in storytelling, audience engagement, and media ethics. Her passion lies in the evolving media landscape, particularly how emerging technologies are reshaping content creation and distribution. She is already carving a niche for herself as a skilled journalist, honing her reporting, writing, and research abilities through hands-on experience. She actively explores the intersection of digital innovation and traditional journalism.