European School Education Platform
News item

New EU guidelines on the protection of minors

On 14 July 2025, the European Commission published new guidelines on the protection of minors under the Digital Services Act (DSA), to ensure a safe online experience for children and young people.
Image: "Digital Services Act" wording on a futuristic background

The guidelines set out a non-exhaustive list of measures to protect children from online risks such as grooming, harmful content, problematic and addictive behaviours, cyberbullying, and harmful commercial practices.

The guidelines will apply to all online platforms accessible to minors, with the exception of micro and small enterprises. Key recommendations include the following:

  • Setting minors’ accounts to private by default, so that their personal information, data, and social media content are hidden from users they are not connected with, reducing the risk of unsolicited contact by strangers.
  • Modifying the platforms’ recommender systems to lower the risk of children encountering harmful content or getting stuck in rabbit holes of specific content, including by advising platforms to prioritise explicit signals from children over behavioural signals as well as empowering children to be more in control of their feeds.
  • Empowering children to be able to block and mute any user and ensuring they can't be added to groups without their explicit consent, which could help prevent cyberbullying.
  • Prohibiting accounts from downloading or taking screenshots of content posted by minors to prevent the unwanted distribution of sexualised or intimate content and sexual extortion.
  • Disabling by default features that contribute to excessive use, like communication "streaks," ephemeral content, "read receipts," autoplay, or push notifications, as well as removing persuasive design features aimed predominantly at engagement and putting safeguards around AI chatbots integrated into online platforms.
  • Ensuring that children’s lack of commercial literacy is not exploited and that they are not exposed to commercial practices that may be manipulative, lead to unwanted spending or addictive behaviours, including certain virtual currencies or loot-boxes.
  • Introducing measures to improve moderation and reporting tools, requiring prompt feedback, and setting minimum requirements for parental control tools.

The guidelines also recommend the use of effective age assurance methods provided that they are accurate, reliable, robust, non-intrusive, and non-discriminatory. In particular, they recommend age verification methods to restrict access to adult content such as pornography and gambling, or when national rules set a minimum age to access certain services such as defined categories of online social media services. 


Additional information

  • Education type:
    School Education