Platform(s):
Bing, Facebook, Google Search, Instagram, Snapchat, TikTok, X, YouTube
AliExpress
Facebook, Instagram
TikTok
AliExpress, Amazon Store, AppStore, Bing, Booking.com, Facebook, Google Maps, Google Play, Google Search, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, YouTube, Zalando
X
AppStore, Google Play
Facebook, Instagram, Snapchat
TikTok, YouTube
AliExpress
Instagram
Amazon Store
Platform(s) | Authority | Type | Starting Date | Period | Enforcement status | Enforcement action | Source
TikTok | EU Commission | Investigation | December 17, 2024 -> TBD | In progress
The Commission has opened formal proceedings against TikTok for a suspected breach of the Digital Services Act (DSA) in relation to TikTok's obligation to properly assess and mitigate systemic risks linked to election integrity, notably in the context of the recent Romanian presidential elections on 24 November.
The Commission has opened formal proceedings to assess whether Temu may have breached the Digital Services Act (DSA) in areas linked to the sale of illegal products, the potentially addictive design of the service, the systems used to recommend purchases to users, as well as data access for researchers.
The Commission is requesting the providers of Pornhub, Stripchat and XVideos to provide more information related to their transparency reporting obligations, as the Commission suspects they lack clear and easily comprehensible information on their content moderation practices.
The European Commission on Friday ordered Temu to provide detailed information and internal documents by Oct. 21 to show how it is respecting European rules to limit the spread of illegal content under the bloc’s content-moderation law, the Digital Services Act (DSA). The firm also has to give information about its recommendation algorithm and the risk to the protection of users’ personal data.
TikTok, YouTube, Snap | EU Commission | RFI | October 2, 2024 -> November 15, 2024 | Done
The Commission has sent a request for information to YouTube, Snapchat, and TikTok under the Digital Services Act (DSA), asking the platforms to provide more information on the design and functioning of their recommender systems.
Coimisiún na Meán has issued formal requests for information seeking further detail on the platforms' approach to reporting options for illegal content (Article 16 of the DSA).
Coimisiún na Meán has issued formal requests for information seeking further detail on the measures that allow users to report illegal content (Article 12 of the DSA).
Following the discontinuation of CrowdTangle on 14 August 2024, the Commission is requesting Meta to provide more information on the measures it has taken to comply with its obligations to give researchers access to data that is publicly accessible on the online interface of Facebook and Instagram.
The Commission has informed X of its preliminary view that it is in breach of the Digital Services Act (DSA) in areas linked to dark patterns, advertising transparency and data access for researchers.
The Commission has sent Amazon a request for information (RFI) under the Digital Services Act (DSA). The Commission is requesting Amazon to provide more information on the measures the platform has taken to comply with the DSA obligations related to the transparency of recommender systems and their parameters, as well as to the provisions on maintaining an ad repository and its risk assessment report.
The European Commission asked the e-commerce companies to provide information before July 12 about how they respect obligations to allow users to notify them about illegal products and ensure their platforms aren’t designed in a way that may manipulate and deceive consumers.
Pornhub, XVideos, Stripchat | EU Commission | RFI | June 13, 2024 -> July 4, 2024 | Done
The Commission is requesting the companies to provide more detailed information on the measures they have taken to diligently assess and mitigate risks related to the protection of minors online, as well as the measures taken to prevent the amplification of illegal content and gender-based violence. Among other things, the Commission requires details on the age assurance mechanisms adopted by these pornographic platforms.
The request for information is based on the suspicion that Bing may have breached the DSA for risks linked to generative AI, such as so-called ‘hallucinations', the viral dissemination of deepfakes, as well as the automated manipulation of services that can mislead voters.
The Commission has opened formal proceedings to assess whether Meta, the provider of Facebook and Instagram, may have breached the Digital Services Act (DSA) in areas linked to the protection of minors.
The suspected infringements cover Meta's policies and practices relating to deceptive advertising and political content on its services. They also concern the non-availability of an effective third-party real-time civic discourse and election-monitoring tool ahead of the elections to the European Parliament, against the background of Meta's deprecation of its real-time public insights tool CrowdTangle without an adequate replacement.
This investigation will thus focus on the following areas:
• TikTok's compliance with the DSA obligation to conduct and submit a risk assessment report prior to deploying functionalities, in this case the “Task and Reward Lite” program, that are likely to have a critical impact on systemic risks, in particular negative effects on mental health, including minors' mental health, especially as a result of the new feature stimulating addictive behavior.
• The measures taken by TikTok to mitigate those risks.
The Commission has sent TikTok a request for information under the Digital Services Act (DSA), asking for more details on the risk assessment the provider of TikTok should have carried out before deploying the new app TikTok Lite in the EU. This concerns the potential impact of the new “Task and Reward Lite” programme on the protection of minors, as well as on the mental health of users, in particular in relation to the potential stimulation of addictive behaviour. The Commission is also requesting information about the measures the platform has put in place to mitigate such systemic risks.
Australia's eSafety commissioner has put the big social media companies on notice, demanding they do better to stop the proliferation of violent extremist material and activity on their platforms.
RFI requesting details on how its service complies with the prohibition on presenting advertisements based on profiling using special categories of personal data. LinkedIn is also required to provide information about how it ensures that all necessary transparency information for advertisements is provided to its users.
The Commission is requesting these services to provide more information on their respective mitigation measures for risks linked to generative AI, such as so-called ‘hallucinations' where AI provides false information, the viral dissemination of deepfakes, as well as the automated manipulation of services that can mislead voters.
The Commission is also requesting information and internal documents on the risk assessments and mitigation measures linked to the impact of generative AI on electoral processes, dissemination of illegal content, protection of fundamental rights, gender-based violence, protection of minors, mental well-being, protection of personal data, consumer protection and intellectual property. The questions relate to both the dissemination and the creation of generative AI content.
The Commission is requesting Meta to provide more information related to the Subscription for no Ads options for both Facebook and Instagram. In particular, Meta should provide additional information on the measures it has taken to comply with its obligations concerning Facebook and Instagram's advertising practices, recommender systems and risk assessments related to the introduction of that subscription option.
Provide more information on the measures they have taken to comply with the obligation to give eligible researchers access, without undue delay, to the data that is publicly accessible on their online interface.
Provide more information on compliance with the rules applicable to online marketplaces and to transparency related to recommender systems and online advertisements.
The Commission is requesting the companies to provide more information on the measures they have taken to comply with their obligations related to the protection of minors under the DSA, including the obligations related to risk assessments and mitigation measures to protect minors online, in particular with regard to the risks to mental health and physical health, and on the use of their services by minors.
The Commission is requesting the companies to provide more information on the measures they have taken to comply with their obligations related to protection of minors under the DSA, including the obligations related to risk assessments and mitigation measures to protect minors online, in particular with regard to the risks to mental health and physical health, and on the use of their services by minors.
The Commission is requesting AliExpress to provide more information on the measures it has taken to comply with obligations related to risk assessments and mitigation measures to protect consumers online, in particular with regard to the dissemination of illegal products online such as fake medicines.
The Commission is requesting Meta to provide additional information on the measures it has taken to comply with its obligations to assess risks and take effective mitigation measures linked to the protection of minors, including regarding the circulation of self-generated child sexual abuse material (SG-CSAM).
The Commission is requesting Amazon to provide more information on the measures it has taken to comply with obligations related to risk assessments and mitigation measures to protect consumers online, in particular with regard to the dissemination of illegal products and the protection of fundamental rights, as well as on compliance of recommender systems with the relevant provisions of the DSA.