EU’s DSA enforcers send more questions to Snapchat, TikTok and YouTube about AI risks

On Wednesday the European Union requested more information from Snapchat, TikTok and YouTube about their respective content recommendation algorithms under the bloc’s online governance rulebook, the Digital Services Act (DSA).

In a press release the Commission said it has sent requests for information (RFI) to the three social media platforms asking for more details about the design and functioning of their algorithms. The trio have been given until November 15 to provide the requested data.

The EU said their responses will inform any next steps — such as, potentially, opening formal investigations.

The bloc’s online governance framework, which contains tough penalties for violations (of up to 6% of global annual turnover), applies an extra layer of systemic risk mitigation rules to the three platforms — owing to their designations as VLOPs (aka very large online platforms).

These rules require larger platforms to identify and mitigate risks that could result from their use of AI as a content recommendation tool, with the law stipulating they must take action to prevent negative impacts in a range of areas including users’ mental health and civic discourse. The EU has also warned specifically that the dissemination of harmful content could result from the use of algorithms with an engagement-based design — which looks to be where its latest RFIs are focused.
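As a purely illustrative sketch of why regulators worry about engagement-based design — this is not any platform's actual system, and every name, signal, and weight below is hypothetical — a ranker that scores items only on predicted engagement, with no safety or risk term, will mechanically surface high-arousal content first:

```python
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    predicted_watch_time: float  # hypothetical engagement signal (seconds)
    predicted_shares: float      # hypothetical engagement signal

def engagement_score(item: Item, w_watch: float = 1.0, w_share: float = 5.0) -> float:
    """Toy engagement-only ranking score: a weighted sum of predicted
    engagement signals, with no term penalising harmful content."""
    return w_watch * item.predicted_watch_time + w_share * item.predicted_shares

feed = [
    Item("calm documentary clip", predicted_watch_time=40.0, predicted_shares=0.5),
    Item("outrage-bait clip", predicted_watch_time=55.0, predicted_shares=4.0),
]

# Ranking purely by engagement puts the high-arousal item on top — the
# amplification dynamic the DSA's systemic-risk rules are aimed at.
ranked = sorted(feed, key=engagement_score, reverse=True)
print([item.name for item in ranked])
```

The mitigation the DSA pushes platforms toward amounts to adding risk-aware terms or constraints to such objectives, rather than optimising engagement alone.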

The Commission said it’s asking Snapchat and YouTube to provide “detailed information” on algorithmic parameters they use to recommend content to their users. It also wants more data on these AIs’ role in “amplifying certain systemic risks” — including risks related to the electoral process and civic discourse; users’ mental well-being (e.g. addictive behaviour and content ‘rabbit holes’); and the protection of minors.

“The questions also concern the platforms’ measures to mitigate the potential influence of their recommender systems on the spread of illegal content, such as promoting illegal drugs and hate speech,” the EU added.

For TikTok, the Commission is seeking more detail on anti-manipulation measures deployed to try to prevent malicious actors from gaming the platform to spread harmful content. The EU is also asking TikTok for more on how it mitigates risks related to elections, pluralism of media, and civic discourse — systemic risks which it said may be amplified by recommender systems.

These latest RFIs are not the first the Commission has sent to the three platforms. Earlier DSA requests have included questions for the trio (and several other VLOPs) about election risks ahead of the EU elections earlier this year. It has also previously quizzed all three about child safeguarding issues. Plus, the Commission sent an RFI to TikTok last year asking how it was responding to content risks related to the Israel-Hamas war.

However ByteDance’s platform is the only one of the three social media products to be under formal DSA investigation so far. In February the bloc opened a probe of TikTok’s DSA compliance — saying it is concerned about a range of issues including the platform’s approach to minor protection and the risk management of addictive design and harmful content. That investigation is ongoing.

TikTok spokesperson Paolo Ganino emailed a statement in which the company wrote: “This morning, we received a request for information from the European Commission, which we will now review. We will cooperate with the Commission throughout the RFI process.” 

We’ve also reached out to Snap and YouTube for responses to the Commission’s latest RFIs.

While the DSA’s rules for VLOPs have been in force since late last summer, the bloc has yet to conclude any of the several probes it has open on larger platforms. In July, however, the Commission put out preliminary findings in its investigation of X, saying it suspects the social network of breaching the DSA’s rules on dark pattern design, providing data access to researchers, and ad transparency.
