

Ireland's new age check system kicks in and seeks to stop children accessing ‘adult’ video

It’s contained in Part B of the Online Safety Code, which came into effect today.

A NEW AGE verification system which seeks to prevent children from accessing “adult-only” video content on sites such as Instagram and TikTok has come into effect.

The age verification system is contained in Part B of the Online Safety Code from Ireland’s media regulator Coimisiún na Meán.

The code, which aims to address harmful and illegal content, applies to video-sharing platforms whose EU headquarters are in Ireland.

Many of these platforms are household names and include Facebook and Instagram, as well as YouTube, TikTok, X, and Reddit.

However, this means that other major platforms, such as Snapchat, are outside the remit of the Code and are instead subject to UK online safety legislation.

CyberSafeKids meanwhile noted that children will still have unrestricted access to harmful or pornographic content provided by other commercial operators outside of the Code’s remit, which falls instead under the EU’s Digital Services Act (DSA).  

The new age verification system seeks to provide an “effective method of age assurance” that will prevent children from accessing pornography or extreme violence.

Other restricted categories include cyberbullying, promotion of eating and feeding disorders, promotion of self-harm and suicide, dangerous challenges, and incitement to hatred or violence.

In the Code, Coimisiún na Meán notes that “merely asking users whether they are over 18 will not be enough”.

It added that platforms will “need to use appropriate forms of age verification to protect children from video and associated content which may impair their physical, mental or moral development”.

The Code requires video sharing platforms to implement “effective age assurance measures” to ensure that “adult-only video content cannot normally be seen by children”.

The Code does not mandate a specific type of age verification method but notes that an age assurance measure “based solely on self-declaration of age by users of the service shall not be an effective measure”.

The video sharing platform is also required to have an “easy-to-use and effective procedure for the handling and resolution of complaints” around age verification and related issues.

Platforms are also required to provide parental controls that enable parents or guardians to set time limits in respect of video content and to restrict children from viewing video uploaded or shared by users unknown to the child.

The Code was formally adopted last November but platforms were given nine months to make any changes that were needed to their online systems.

CyberSafeKids described today as a “milestone that formally shifts legal responsibility onto tech companies to protect children online”.

It added that this move “finally places a clear obligation on platforms to face the reality that underage users are accessing harmful content daily on their platforms, and to implement effective safeguards”. 

While Coimisiún na Meán has not mandated a specific type of age verification system, CyberSafeKids said the nine-month implementation period has allowed “more than enough time to develop robust age verification systems other than self-declaration”.

Meanwhile, CyberSafeKids expressed concern that the Code does not cover recommender systems.

The recommender system is an algorithm that uses data to suggest items that a social media user might be interested in.

CyberSafeKids warned that “much harmful content coming through a child’s feed originates from this”.

The Irish Council for Civil Liberties previously warned that recommender systems “push hate and extremism into people’s feeds and inject content that glorifies self-harm and suicide into children’s feeds”.

CyberSafeKids has called for the Code to be reviewed within 12 to 24 months.

“If the Code has failed to reduce underage access to harmful content, stronger measures must be implemented to keep children safe,” said CyberSafeKids.

“Financial penalties should be quickly and fully imposed for non-compliance, in line with the legislation.”
