Platform(s) by row:
Bing, Facebook, Google Search, Instagram, Snapchat, TikTok, X, YouTube
AliExpress
Facebook, Instagram
TikTok
AliExpress, Amazon Store, AppStore, Bing, Booking.com, Facebook, Google Maps, Google Play, Google Search, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, YouTube, Zalando
X
AppStore, Google Play
Facebook, Instagram, Snapchat
TikTok, YouTube
(38 rows in total)
Columns: Platform(s) · Authority · Type · Starting Date · Period · Enforcement status · Concerned DSA articles · Enforcement action · Source
Platform: TikTok
Authority: EU Commission
Type: Investigation
Period: December 17, 2024 → TBD
Enforcement status: In progress
Concerned DSA articles: 34(1) - Risk Assessment - Systemic risks; 34(2) - Risk Assessment - Frequency of risk assessments; 35(1) - Mitigation of Risks - Implementation of mitigation measures
Enforcement action: The Commission has opened formal proceedings against TikTok for a suspected breach of the Digital Services Act (DSA) in relation to TikTok's obligation to properly assess and mitigate systemic risks linked to election integrity, notably in the context of the recent Romanian presidential elections on 24 November.
The Commission has issued a 'retention order' to TikTok under the DSA, ordering the platform to freeze and preserve data related to actual or foreseeable systemic risks its service could pose on electoral processes and civic discourse in the EU.
The Commission has sent TikTok a request for information (RFI) under the Digital Services Act (DSA), seeking details about how it managed risks of information manipulation during the Romanian elections. The request focuses on TikTok's analysis and mitigation of risks from inauthentic or automated activity, its recommender systems, and its efforts to enable third-party public scrutiny and access to data related to systemic risks in electoral processes.
The Commission has opened formal proceedings to assess whether Temu may have breached the Digital Services Act (DSA) in areas linked to the sale of illegal products, the potentially addictive design of the service, the systems used to recommend purchases to users, as well as data access for researchers.
The Commission is requesting the providers of Pornhub, Stripchat and XVideos to provide more information related to their transparency reporting obligations, as the Commission suspects they lack clear and easily comprehensible information on their content moderation practices.
The European Commission on Friday ordered Temu to provide detailed information and internal documents by Oct. 21 to show how it is respecting European rules to limit the spread of illegal content under the bloc’s content-moderation law, the Digital Services Act (DSA). The firm also has to give information about its recommendation algorithm and the risk to the protection of users’ personal data.
The Commission has sent a request for information to YouTube, Snapchat, and TikTok under the Digital Services Act (DSA), asking the platforms to provide more information on the design and functioning of their recommender systems.
Coimisiún na Meán has issued formal requests for information seeking further comprehensive detail on the platform's approach to reporting options for illegal content (Article 16 of the DSA).
Coimisiún na Meán has issued formal requests for information seeking further comprehensive detail on the measures enabling users to report illegal content (Article 12 of the DSA).
Following the discontinuation of CrowdTangle on 14 August 2024, the Commission is requesting Meta to provide more information on the measures it has taken to comply with its obligation to give researchers access to data that is publicly accessible on the online interfaces of Facebook and Instagram.
The Commission has informed X of its preliminary view that it is in breach of the Digital Services Act (DSA) in areas linked to dark patterns, advertising transparency and data access for researchers.
The Commission has sent Amazon a request for information (RFI) under the Digital Services Act (DSA). The Commission is requesting Amazon to provide more information on the measures the platform has taken to comply with the DSA obligations related to the transparency of recommender systems and their parameters, as well as to the provisions on maintaining an ad repository and its risk assessment report.
The European Commission asked the e-commerce companies to provide information before July 12 about how they respect obligations to allow users to notify them about illegal products and ensure their platforms aren’t designed in a way that may manipulate and deceive consumers.
The Commission is requesting the companies to provide more detailed information on the measures they have taken to diligently assess and mitigate risks related to the protection of minors online, as well as the measures taken to prevent the amplification of illegal content and gender-based violence. Among other things, the Commission is requiring details on the age assurance mechanisms adopted by these pornographic platforms.
The request for information is based on the suspicion that Bing may have breached the DSA for risks linked to generative AI, such as so-called 'hallucinations', the viral dissemination of deepfakes, as well as the automated manipulation of services that can mislead voters.
Platform: Facebook, Instagram
Authority: EU Commission
Type: Investigation
Period: Not available
Enforcement status: In progress
Concerned DSA articles: 34 - Risk Assessment; 28 - Online Protection of Minors; 35 - Mitigation of Risks
Enforcement action: The Commission has opened formal proceedings to assess whether Meta, the provider of Facebook and Instagram, may have breached the Digital Services Act (DSA) in areas linked to the protection of minors.
The Commission has requested X to provide detailed information and internal documents on its content moderation resources in light of its latest Transparency report under the DSA, which revealed that X has curtailed its team of content moderators by almost 20% since the preceding report in October 2023, reducing linguistic coverage within the European Union from 11 EU languages to 7. The Commission also sought further details on the risk assessments and mitigation measures linked to the impact of generative AI tools on electoral processes, dissemination of illegal content, and protection of fundamental rights.
25(1) - Online Interface Design and Organisation - Dark patterns
20(3) - Internal Complaint-Handling System - Accessible and user-friendly
14(1) - Terms and Conditions - Transparency of content moderation policies
16(6) - Notice and Action Mechanisms - Prompt acknowledgment of notices
16(1) - Notice and Action Mechanisms - User-friendly notification systems
24(5) - Transparency Reporting Obligations for Providers of Online Platforms - Submission of content moderation statements to the EC
20(1) - Internal Complaint-Handling System - User appeals
17(1) - Statement of Reasons - Justification for content restrictions
16(5) - Notice and Action Mechanisms
34(2) - Risk Assessment - Frequency of risk assessments
35(1) - Mitigation of Risks - Implementation of mitigation measures
The suspected infringements cover Meta's policies and practices relating to deceptive advertising and political content on its services. They also concern the non-availability of an effective third-party real-time civic discourse and election-monitoring tool ahead of the elections to the European Parliament, against the background of Meta's deprecation of its real-time public insights tool CrowdTangle without an adequate replacement.
This investigation will thus focus on the following areas:
• TikTok's compliance with the DSA obligation to conduct and submit a risk assessment report prior to deploying functionalities, in this case the “Task and Reward Lite” program, that are likely to have a critical impact on systemic risks, in particular negative effects on mental health, including minors' mental health, especially as a result of the new feature stimulating addictive behaviour.
• The measures taken by TikTok to mitigate those risks.
The Commission has sent TikTok a request for information under the Digital Services Act (DSA), asking for more details on the risk assessment the provider of TikTok should have carried out before deploying the new app TikTok Lite in the EU. This concerns the potential impact of the new “Task and Reward Lite” programme on the protection of minors, as well as on the mental health of users, in particular in relation to the potential stimulation of addictive behaviour. The Commission is also requesting information about the measures the platform has put in place to mitigate such systemic risks.
Australia's eSafety commissioner has put the big social media companies on notice, demanding they do better to stop the proliferation of violent extremist material and activity on their platforms.
RFI requesting details on how the service complies with the prohibition on presenting advertisements based on profiling using special categories of personal data. LinkedIn is also required to provide information about how it ensures that all necessary transparency requirements for advertisements are provided to its users.
The Commission is requesting these services to provide more information on their respective mitigation measures for risks linked to generative AI, such as so-called 'hallucinations' where AI provides false information, the viral dissemination of deepfakes, as well as the automated manipulation of services that can mislead voters.
The Commission is also requesting information and internal documents on the risk assessments and mitigation measures linked to the impact of generative AI on electoral processes, dissemination of illegal content, protection of fundamental rights, gender-based violence, protection of minors, mental well-being, protection of personal data, consumer protection and intellectual property. The questions relate to both the dissemination and the creation of generative AI content.
The Commission is requesting Meta to provide more information related to the Subscription for no Ads options for both Facebook and Instagram. In particular, Meta should provide additional information on the measures it has taken to comply with its obligations concerning Facebook and Instagram's advertising practices, recommender systems and risk assessments related to the introduction of that subscription option.
Provide more information on the measures they have taken to comply with the obligation to give access, without undue delay, to the data that is publicly accessible on their online interface to eligible researchers.
Provide more information on compliance with the rules applicable to online marketplaces and to transparency related to recommender systems and online advertisements.
The Commission is requesting the companies to provide more information on the measures they have taken to comply with their obligations related to the protection of minors under the DSA, including the obligations related to risk assessments and mitigation measures to protect minors online, in particular with regard to the risks to mental health and physical health, and on the use of their services by minors.
The Commission is requesting the companies to provide more information on the measures they have taken to comply with their obligations related to the protection of minors under the DSA, including the obligations related to risk assessments and mitigation measures to protect minors online, in particular with regard to the risks to mental health and physical health, and on the use of their services by minors.