From cyberbullying to terrorism: Ireland imposes tough rules on social media

The new rules for video platforms will come into force next month.

The Irish internet and media regulator Coimisiún na Meán has adopted and published an Online Safety Code, which will come into force next month. The document will affect the largest video platforms headquartered in the country, including ByteDance's TikTok, Google's YouTube, and Meta's Instagram and Facebook Reels.

Under the provisions of the Code, platforms are required to include in their terms of use a ban on publishing and distributing certain harmful content, such as cyberbullying, the promotion of suicide or eating disorders, incitement to hatred, violence or terrorism, child sexual abuse material, racism and xenophobia.

The representative of Coimisiún na Meán, Adam Hurley, explained that the new Code complements the European Digital Services Act (DSA). In contrast to the pan-European legislation, which focuses on combating illegal content, the Irish document covers a wider range of potentially dangerous materials.

While the Code formally applies only to video services serving users in Ireland, technology companies may implement similar measures across the region to streamline compliance and avoid inconsistencies in content standards.

It is important to note that EU law prohibits imposing general content-monitoring obligations on platforms. According to Hurley, the Irish Code does not require upload filters; instead, it builds on the existing notice-and-takedown approach, allowing users to report harmful content for later review.

Age verification for pornography

The document pays special attention to the protection of minors. Video platforms that allow the publication of pornographic content or scenes of gratuitous violence must implement "appropriate" age verification systems. The regulator will evaluate the technologies used on a case-by-case basis.

In addition, platforms are required to provide accessible content rating systems and parental controls for material that could harm the physical, mental or moral development of children under 16.

Recommendation systems

Initially, the regulator considered obliging video platforms to disable profiling-based content recommendations by default. However, after consultations last year, the measure was not included in the final version of the Code. Instead, issues related to recommendation systems will be governed by the pan-European Digital Services Act.

The Online Safety Code is part of Ireland's overall digital regulatory framework aimed at protecting users from online threats. The document operates in parallel with the European Digital Services Act, compliance with which is also overseen by Coimisiún na Meán.

According to Ireland's Online Safety Commissioner Niamh Hodnett, the adoption of the Code marks the end of the era of social media self-regulation. The regulator intends to inform citizens about their rights on the Internet and hold platforms accountable for non-compliance with the established requirements.
