

Dublin riots sentiment “hasn’t gone away” says Online Safety Commissioner



Launching a public consultation for the Online Safety Code today, Niamh Hodnett described the task of monitoring incendiary content online as “an ongoing issue”. She also said platforms with adult content could ask children for photo ID to prove their age.

Speaking about the launch of Coimisiún na Meán’s new draft safety code today, Niamh Hodnett told the Irish Independent that the anger associated with the riots “hasn’t gone away”.

Ms Hodnett, whose Online Safety Commissioner role is part of Coimisiún na Meán, also said that she was worried about social media platforms’ ‘recommender’ systems and how they might portray videos from the scene of the stabbings in different ways to different groups.

“A number of us spoke in the afternoon of that Thursday, to try and get a picture of what exactly people were seeing,” she said.

“There was a concern that there was going to be a pile-on in relation to incitement to hatred or violence against migrants. And that’s what we were concerned about, that we could start seeing a picture of. We didn’t anticipate it was going to lead to the riots that it did lead to, but we did think it could lead to incitement to hatred, and that that would have had real world harm. So that’s why we contacted the platforms.”

She said that her team contacted big tech platforms soon after news of the stabbings on Parnell Square broke.

Under the draft code, Coimisiún na Meán’s powers of enforcement, which include fines of up to 10pc of annual turnover or €20m, do not begin until February of next year.

Nevertheless, Ms Hodnett said that all platforms, including Elon Musk’s X, “engaged” with the regulatory body when it contacted them on Thursday, November 23rd. She said that meetings with the big platforms took place the following day via video conference calls, with the European Commission present on the calls.

“Engagement and cooperation is not the same as compliance, necessarily, and the European Commission is currently looking into that,” she said.

Earlier this week, X, formerly Twitter, described as “inaccurate” remarks made by Minister for Justice Helen McEntee, who said that X had not co-operated with Gardaí in the aftermath of the riots.

“The Gardai did not make any formal requests to us until late Monday 27th November,” the company’s global governance account said in a public tweet, castigating Ms McEntee and adding that it had taken action against 1,230 pieces of content relating to the riots.

“We responded promptly. The only appeal we have received from the Gardai relating to the enforcement of our rules is for a single post.”

The video-sharing tech platforms, said Ms Hodnett, activated response teams on the evening of the riots to deal with potentially inciteful content. The companies subsequently shared some of the actions they took with the Online Safety Commissioner’s office, although Ms Hodnett declined to say what those specific actions were.

She said that because her office is still “informal” until February, when enforcement of the Safety Code kicks in, it is the European Commission which is taking the lead on responding formally with the tech platforms about the matter.

“They [European Commission] are assessing information and it’s up to them as to what steps they’re going to take further in relation to that,” she said.

Ms Hodnett said that she remained concerned about content being shared in the aftermath of the riots.

“I think this is going to be an ongoing issue,” she said.

“There was a lot of anger on the streets that night and it hasn’t gone away. We met last week with the platforms and the European Commission and with the Gardai to offer what assistance we could.”

‘Recommender’ systems, which use algorithms to show content the system thinks a user might like, are not included in the just-published draft safety code, which is now open to public consultation until January 19th.

However, Coimisiún na Meán says that it would be “appropriate” that supplementary measures to the code should require platforms “to prepare, publish and implement a recommender system safety plan” that includes “measures to mitigate the main risks” and “explain the choices that have been made about whether and how they have implemented a number of specified measures”.

Ms Hodnett said that WhatsApp, which was used extensively during the riots, was not covered under the new draft code because it is considered to be a messaging service rather than a video-sharing platform.

She said that this would be kept “under review”, though.

“The question is the degree to which those types of services have public channels so that they’re no longer end to end interpersonal communications,” she said, adding that if they shared terrorist content, they could come under additional enforcement measures.

Ms Hodnett said that the new safety code’s “robust” and “effective” age-verification requirements, to stop children accessing inappropriate or adult content, could include requests for photo identification or camera shots from children.

However, because adult sites such as Pornhub and OnlyFans are not based in Ireland, Coimisiún na Meán will not enforce the measures on those sites. Sites such as X, which allow adult material and pornography, may nonetheless be required to ask for age verification, she said.

“We are not mandating any particular type of technology or approach,” said Ms Hodnett. “That can include photo ID. It’s up to the platforms to satisfy themselves as to what robust age verification is, in a way that’s GDPR-compliant, and to report to us on it.”

She said that the watchdog may interrogate the self-reporting process “if we don’t think those reports are okay”.

After consultation, the finalised Code will form part of Ireland’s overall online safety framework, making digital services legally accountable for how they keep people safe online. This framework will also include the EU Digital Services Act and the EU Terrorist Content Online Regulation, enforced in Ireland by Coimisiún na Meán.

The draft Code sets out measures that designated video-sharing platforms will be obliged to implement to keep their users, especially children, safe online. These platforms will have to protect children from specific types of harmful content. This includes cyberbullying, online content that promotes or encourages a feeding or eating disorder and online content that promotes or encourages self-harm or suicide.

Platforms will have to “prevent the uploading or sharing of a range of illegal content, including incitement to hatred or violence”. They will also have to provide “media literacy tools for users, which can help people recognise disinformation and misinformation”.

Public consultation on the draft digital safety code can be accessed at
