Ireland’s media regulator, which oversees a number of tech giants’ compliance with the EU’s Digital Services Act (DSA) rules, said it is reviewing how major platforms allow users to report illegal content, following a large number of complaints.
On Thursday, Coimisiún na Meán said that one in three DSA complaints it has received since the general rules came into effect in February related to difficulties in reporting illegal content online.
The review looks at tools and processes offered by Dropbox, Etsy, LinkedIn, Meta (Facebook and Instagram), Pinterest, Shein, Temu, TikTok, Tumblr, YouTube and X. Another lesser-known service, called Hostelworld, is included in the sweep.
The DSA is the European Union’s new online content governance and moderation framework. It aims to ensure that digital services and platforms have effective tools and processes in place to enforce their own rules and respond to reports of illegal content, such as terrorist content and child sexual abuse material (CSAM).
Penalties for breaching this regime can reach up to 6% of global annual turnover, so any violation of the rules for reporting illegal content could end up being costly for large technology companies.
Content Reporting Tools and Points of Contact
The DSA stipulates that platforms must have easily accessible and user-friendly systems for reporting problems (Article 16). They must also provide a clear and accessible point of contact for users to raise their concerns (Article 12).
Some platforms have already had problems in both areas. For example, the European Commission has been investigating X’s compliance with Article 16 of the DSA since December, and the EU is also investigating Meta over its handling of illegal content reports. In July, X lost a lawsuit brought by a Dutch citizen that included a complaint about the platform’s violation of Article 12 of the DSA.
The Coimisiún na Meán review matters because it could lead to further regulatory action against X if the regulator confirms that the platform has breached the rules.
The Irish regulator’s review is examining Article 12 compliance by all of the above-mentioned platforms. It will also examine Article 16 compliance for all of them except Meta, Shein and X, as the European Commission has already opened DSA investigations or reviews into those companies.
While Coimisiún na Meán said its review is still in the “information gathering phase,” the move appears significant considering the number of major services involved. It is also the regulator’s first wide-ranging action under the DSA.
“The Commission is now initiating a formal review of the online platforms’ systems, to ensure that the platforms comply with their obligations under the EU Digital Services Act (DSA),” it wrote in a press release, adding that the review could lead to “possible formal enforcement and investigative actions.”
Ireland’s media watchdog plays an outsized role in monitoring major tech platforms because many of them have chosen to locate their EU headquarters in the country.
“Once the information gathering phase is complete, Coimisiún na Meán will communicate with platforms to ensure that their reporting mechanisms and points of contact comply with DSA requirements,” it wrote, adding that it could issue a “compliance notice” ordering platforms to address any identified deficiencies.
“If this does not lead to changes and improvements, the Commission may open a formal investigation. If the investigation leads to a finding of non-compliance, the Commission may impose sanctions, such as a fine,” it said.
Niamh Hodnett, Coimisiún na Meán’s online safety commissioner, said in the statement: “We are committed to using the full range of powers available under our Online Safety Framework to hold platforms to account for keeping people safe online.
“Through the DSA, our upcoming Internet Safety Code and the EU Regulation on Terrorist Content online, we are working towards a digital landscape where adults and children can browse the internet without fear of harm from the content or behaviour they encounter. When people see illegal content, they should report it to the platform they saw it on, and if they are not satisfied with the platform’s response or cannot find an easy way to report the content, they should contact us.”