Columns: Platform(s); Authority; Type; Summary; Enforcement status; Date sent; End Date (if relevant); Concerned DSA articles; Source
111 rows

1. Platform(s): X
Authority: EU Commission
Type: Fine
Summary: The Commission imposed a €120 million fine on X in its first non-compliance decision under the DSA, for breaching platform transparency obligations. The decision focuses on three areas: the deceptive design of its “blue checkmark” and other dark patterns that mislead users (Article 25); the lack of transparency and accessibility of its advertising repository (Article 39); and the failure to provide vetted researchers with access to public data (Article 40).
Enforcement status: Done
Concerned DSA articles: 40 - Data Access; 39 - Additional Online Advertising Transparency; 25 - Online Interface Design and Organisation
Source: https://digital-strategy.ec.europa.eu/en/news/commission-fines-x-eu120-million-under-digital-services-act

2. Platform(s): TikTok
Authority: EU Commission
Type: Decision accepting binding commitments
Summary: Following its preliminary findings on shortcomings in TikTok’s advertising transparency, the Commission has accepted binding commitments from TikTok to bring its ad repository into full compliance with the DSA’s requirements for very large online platforms. TikTok will now provide the full content of ads (including URLs), update the repository within 24 hours, disclose targeting criteria and aggregated reach data (by age group, gender and Member State), and offer improved search and filtering so that regulators, researchers and users can more easily scrutinise advertising on the service.
Enforcement status: Done
Concerned DSA articles: 39 - Additional Online Advertising Transparency
Source: https://digital-strategy.ec.europa.eu/en/news/commission-accepts-tiktoks-commitments-advertising-transparency-under-digital-services-act

3. Platform(s): TikTok
Authority: 🇮🇪 Coimisiún na Meán
Type: Investigation
Summary: The investigations will look into: whether the illegal content reporting mechanisms implemented by TikTok and LinkedIn are easy to access and user-friendly – Article 16(1); whether the illegal content reporting mechanisms provided by TikTok and LinkedIn allow people to report suspected child sexual abuse material anonymously – Article 16(2)(c); and whether the illegal content reporting mechanisms provided by TikTok and LinkedIn are designed in a way that deceives or discourages people from reporting content as illegal – Article 25.
Enforcement status: In progress
Concerned DSA articles: 16(1) - Notice and Action Mechanisms - User-friendly notification systems; 25 - Online Interface Design and Organisation; 16(2) - Notice and Action Mechanisms
Source: https://www.cnam.ie/coimisiun-na-mean-commences-investigations-into-tiktok-and-linkedin/

4. Platform(s): LinkedIn
Authority: 🇮🇪 Coimisiún na Meán
Type: Investigation
Summary: The investigations will look into: whether the illegal content reporting mechanisms implemented by TikTok and LinkedIn are easy to access and user-friendly – Article 16(1); whether the illegal content reporting mechanisms provided by TikTok and LinkedIn allow people to report suspected child sexual abuse material anonymously – Article 16(2)(c); and whether the illegal content reporting mechanisms provided by TikTok and LinkedIn are designed in a way that deceives or discourages people from reporting content as illegal – Article 25.
Enforcement status: In progress
Concerned DSA articles: 16(1) - Notice and Action Mechanisms - User-friendly notification systems; 25 - Online Interface Design and Organisation; 16(2) - Notice and Action Mechanisms
Source: https://www.cnam.ie/coimisiun-na-mean-commences-investigations-into-tiktok-and-linkedin/

5. Platform(s): Shein
Authority: EU Commission
Type: RFI
Summary: The Commission requested the platform to provide detailed information and internal documents on how it ensures that minors are not exposed to age-inappropriate content, in particular through age assurance measures, as well as how it prevents the circulation of illegal products on its platform. The Commission is also inquiring about the effectiveness of such mitigation measures adopted by Shein.
Enforcement status: In progress
Concerned DSA articles: No concerned DSA article specified
Source: https://digital-strategy.ec.europa.eu/en/news/commission-requests-shein-provide-information-sale-illegal-products-under-digital-services-act

6. Platform(s): X
Authority: 🇮🇪 Coimisiún na Meán
Type: Investigation
Summary: The investigation will look into: whether people are able to appeal X’s decisions not to remove content when they report something that they think breaches X’s terms of service; whether people are properly informed of the outcome of a report they make and about their right to appeal the decision; and whether X has an internal complaints-handling mechanism that is easy to access and user-friendly.
Enforcement status: In progress
Concerned DSA articles: 20 - Internal Complaint-Handling System
Source: https://www.cnam.ie/coimisiun-na-mean-investigation-into-x/

7. Platform(s): TikTok
Authority: EU Commission
Type: Preliminary findings
Summary: The Commission preliminarily found TikTok in breach of the DSA, as it may have put in place burdensome procedures and tools for researchers to request access to public data.
Enforcement status: In progress
Concerned DSA articles: 40 - Data Access
Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_25_2503

8. Platform(s): Instagram
Authority: EU Commission
Type: Preliminary findings
Summary: The Commission preliminarily found Instagram in breach of its obligations to provide users with simple mechanisms to notify illegal content and to allow them to effectively challenge content moderation decisions. The Commission also found the platform in breach of its obligation to grant researchers adequate access to public data.
Enforcement status: In progress
Concerned DSA articles: 40 - Data Access; 16 - Notice and Action Mechanisms; 20 - Internal Complaint-Handling System
Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_25_2503

9. Platform(s): Facebook
Authority: EU Commission
Type: Preliminary findings
Summary: The Commission preliminarily found Facebook in breach of its obligations to provide users with simple mechanisms to notify illegal content and to allow them to effectively challenge content moderation decisions. The Commission also found the platform in breach of its obligation to grant researchers adequate access to public data.
Enforcement status: In progress
Concerned DSA articles: 40 - Data Access; 16 - Notice and Action Mechanisms; 20 - Internal Complaint-Handling System
Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_25_2503

10. Platform(s): YouTube
Authority: EU Commission
Type: RFI
Summary: The Commission requested information on YouTube’s age assurance system. Additionally, the Commission is seeking more details on its recommender system, following reports of harmful content being disseminated to minors.
Enforcement status: In progress
Concerned DSA articles: No concerned DSA article specified
Source: https://digital-strategy.ec.europa.eu/en/news/commission-scrutinises-safeguards-minors-snapchat-youtube-apple-app-store-and-google-play-under

11. Platform(s): Snapchat
Authority: EU Commission
Type: RFI
Summary: The Commission requested Snapchat to provide information about how it prevents children under 13 years of age from accessing its services, as prohibited by the platform’s own terms of service. The Commission also requested Snapchat to provide information on the features it has in place to prevent the sale to children of illegal goods such as vapes or drugs.
Enforcement status: In progress
Concerned DSA articles: No concerned DSA article specified
Source: https://digital-strategy.ec.europa.eu/en/news/commission-scrutinises-safeguards-minors-snapchat-youtube-apple-app-store-and-google-play-under

12. Platform(s): Google Play
Authority: EU Commission
Type: RFI
Summary: The Commission requested information on how Google Play manages the risk of users, including minors, being able to download illegal or otherwise harmful apps, including gambling apps and tools to create non-consensual sexualised content, the so-called ‘nudify apps’. The Commission also aims to understand how Google Play applies apps' age ratings.
Enforcement status: In progress
Concerned DSA articles: No concerned DSA article specified
Source: https://digital-strategy.ec.europa.eu/en/news/commission-scrutinises-safeguards-minors-snapchat-youtube-apple-app-store-and-google-play-under

13. Platform(s): AppStore
Authority: EU Commission
Type: RFI
Summary: The Commission requested information on how the AppStore manages the risk of users, including minors, being able to download illegal or otherwise harmful apps, including gambling apps and tools to create non-consensual sexualised content, the so-called ‘nudify apps’. The Commission also aims to understand how the AppStore applies apps' age ratings.
Enforcement status: In progress
Concerned DSA articles: No concerned DSA article specified
Source: https://digital-strategy.ec.europa.eu/en/news/commission-scrutinises-safeguards-minors-snapchat-youtube-apple-app-store-and-google-play-under

14. Platform(s): Google Search
Authority: EU Commission
Type: RFI
Summary: The Commission asked Google Search to provide detailed information on how it assesses the presence of fraudulent content and what measures it takes to reduce the risks of financial scams.
Enforcement status: Done
Concerned DSA articles: No concerned DSA article specified
Source: https://digital-strategy.ec.europa.eu/en/news/commission-requests-information-under-digital-services-act-apple-bookingcom-google-and-microsoft

15. Platform(s): Google Play
Authority: EU Commission
Type: RFI
Summary: The Commission asked Google Play to provide detailed information on how it assesses the presence of fraudulent content and what measures it takes to reduce the risks of financial scams.
Enforcement status: Done
Concerned DSA articles: No concerned DSA article specified
Source: https://digital-strategy.ec.europa.eu/en/news/commission-requests-information-under-digital-services-act-apple-bookingcom-google-and-microsoft

16. Platform(s): Booking
Authority: EU Commission
Type: RFI
Summary: The Commission asked Booking to provide detailed information on how it assesses the presence of fraudulent content and what measures it takes to reduce the risks of financial scams.
Enforcement status: Done
Concerned DSA articles: No concerned DSA article specified
Source: https://digital-strategy.ec.europa.eu/en/news/commission-requests-information-under-digital-services-act-apple-bookingcom-google-and-microsoft

17. Platform(s): Bing
Authority: EU Commission
Type: RFI
Summary: The Commission asked Bing to provide detailed information on how it assesses the presence of fraudulent content and what measures it takes to reduce the risks of financial scams.
Enforcement status: Done
Concerned DSA articles: No concerned DSA article specified
Source: https://digital-strategy.ec.europa.eu/en/news/commission-requests-information-under-digital-services-act-apple-bookingcom-google-and-microsoft

18. Platform(s): AppStore
Authority: EU Commission
Type: RFI
Summary: The Commission asked the AppStore to provide detailed information on how it assesses the presence of fraudulent content and what measures it takes to reduce the risks of financial scams.
Enforcement status: Done
Concerned DSA articles: No concerned DSA article specified
Source: https://digital-strategy.ec.europa.eu/en/news/commission-requests-information-under-digital-services-act-apple-bookingcom-google-and-microsoft

19. Platform(s): Temu
Authority: EU Commission
Type: Preliminary findings
Summary: The Commission preliminarily found Temu in breach of the obligation under the Digital Services Act (DSA) to properly assess the risks of illegal products being disseminated on its marketplace.
Enforcement status: Done
Concerned DSA articles: 34 - Risk Assessment
Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_25_1913

20. Platform(s): AliExpress
Authority: EU Commission
Type: Preliminary findings
Summary: The Commission preliminarily found AliExpress in breach of its obligation to assess and mitigate risks related to the dissemination of illegal products under the DSA.
Enforcement status: Done
Concerned DSA articles: 34 - Risk Assessment; 35 - Mitigation of Risks
Source: https://digital-strategy.ec.europa.eu/en/news/commission-accepts-commitments-offered-aliexpress-under-digital-services-act-and-takes-further

21. Platform(s): AliExpress
Authority: EU Commission
Type: Decision accepting binding commitments
Summary: The Commission made binding the commitments offered by AliExpress, relating to: the assessment and mitigation of specific risks of dissemination of illegal content and content affecting health and minors; the notice and action and internal complaint-handling mechanisms; the transparency of its advertising systems; the transparency of its recommender systems; the traceability of traders; and access to public data for researchers.
Enforcement status: Done
Concerned DSA articles: 27 - Recommender System Transparency; 30 - Traceability of Traders; 38 - Recommender Systems; 16 - Notice and Action Mechanisms; 20 - Internal Complaint-Handling System; 26 - Advertising on Online Platforms; 39 - Additional Online Advertising Transparency
Source: https://digital-strategy.ec.europa.eu/en/news/commission-makes-aliexpress-commitments-under-digital-services-act-binding

22. Platform(s): XVideos
Authority: EU Commission
Type: Investigation
Summary: The Commission's investigation focuses on the risks for the protection of minors, including those linked to the absence of effective age verification measures. The Commission preliminarily found that the platform does not comply with its obligations to put in place: appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, in particular age verification tools to safeguard minors from adult content; and risk assessment and mitigation measures addressing any negative effects on the rights of the child and on the mental and physical well-being of users, and preventing minors from accessing adult content, notably via appropriate age verification tools.
Enforcement status: In progress
Concerned DSA articles: No concerned DSA article specified
Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_25_1339

23. Platform(s): XNXX
Authority: EU Commission
Type: Investigation
Summary: The Commission's investigation focuses on the risks for the protection of minors, including those linked to the absence of effective age verification measures. The Commission preliminarily found that the platform does not comply with its obligations to put in place: appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, in particular age verification tools to safeguard minors from adult content; and risk assessment and mitigation measures addressing any negative effects on the rights of the child and on the mental and physical well-being of users, and preventing minors from accessing adult content, notably via appropriate age verification tools.
Enforcement status: In progress
Concerned DSA articles: No concerned DSA article specified
Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_25_1339

24. Platform(s): Stripchat
Authority: EU Commission
Type: Investigation
Summary: The Commission's investigation focuses on the risks for the protection of minors, including those linked to the absence of effective age verification measures. The Commission preliminarily found that the platform does not comply with its obligations to put in place: appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, in particular age verification tools to safeguard minors from adult content; and risk assessment and mitigation measures addressing any negative effects on the rights of the child and on the mental and physical well-being of users, and preventing minors from accessing adult content, notably via appropriate age verification tools.
Enforcement status: In progress
Concerned DSA articles: No concerned DSA article specified
Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_25_1339

25. Platform(s): Pornhub
Authority: EU Commission
Type: Investigation
Summary: The Commission's investigation focuses on the risks for the protection of minors, including those linked to the absence of effective age verification measures. The Commission preliminarily found that the platform does not comply with its obligations to put in place: appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, in particular age verification tools to safeguard minors from adult content; and risk assessment and mitigation measures addressing any negative effects on the rights of the child and on the mental and physical well-being of users, and preventing minors from accessing adult content, notably via appropriate age verification tools.
Enforcement status: In progress
Concerned DSA articles: No concerned DSA article specified
Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_25_1339

26. Platform(s): TikTok
Authority: EU Commission
Type: Preliminary findings
Summary: The Commission informed TikTok of its preliminary view that the company does not fulfil the Digital Services Act (DSA)'s obligation to publish an advertisement repository.
Enforcement status: Done
Concerned DSA articles: 39 - Additional Online Advertising Transparency
Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_25_1223

27. Platform(s): Shein
Authority: EU Commission
Type: RFI
Summary: The Commission requested Shein to provide internal documents and more detailed information on risks linked to the presence of illegal content and goods on its marketplace, on the transparency of its recommender systems, and on the access to data for qualified researchers. Moreover, the Commission is requesting Shein to provide detailed information on the measures adopted to mitigate risks relating to consumer protection, public health and users' wellbeing. The Commission is also requesting details on the protection of users' personal data.
Enforcement status: Done
Concerned DSA articles: No concerned DSA article specified
Source: https://ec.europa.eu/commission/presscorner/detail/en/mex_25_430

28. Platform(s): X
Authority: EU Commission
Type: Retention Order
Summary: The Commission issued a ‘retention order’ requiring the platform to preserve internal documents and information regarding future changes to the design and functioning of its recommender algorithms, for the period between 17 January 2025 and 31 December 2025, unless the Commission’s ongoing investigation is concluded beforehand.
Enforcement status: Done
Concerned DSA articles: 25 - Online Interface Design and Organisation; 39 - Additional Online Advertising Transparency
Source: https://digital-strategy.ec.europa.eu/en/news/commission-addresses-additional-investigatory-measures-x-ongoing-proceedings-under-digital-services

29. Platform(s): TikTok
Authority: EU Commission
Type: Investigation
Summary: The Commission opened formal proceedings against TikTok for a suspected breach of the Digital Services Act (DSA) in relation to TikTok's obligation to properly assess and mitigate systemic risks linked to election integrity, notably in the context of the recent Romanian presidential elections on 24 November.
Enforcement status: In progress
Concerned DSA articles: 34(1) - Risk Assessment - Systemic risks; 34(2) - Risk Assessment - Frequency of risk assessments; 35(1) - Mitigation of Risks - Implementation of mitigation measures
Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_24_6487

30. Platform(s): TikTok
Authority: EU Commission
Type: Retention Order
Summary: The Commission issued a ‘retention order' to TikTok under the DSA, ordering the platform to freeze and preserve data related to actual or foreseeable systemic risks its service could pose on electoral processes and civic discourse in the EU.
Enforcement status: Done
Concerned DSA articles: No concerned DSA article specified
Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_24_6243

31. Platform(s): TikTok
Authority: EU Commission
Type: RFI
Summary: The Commission sent TikTok a request for information (RFI) under the Digital Services Act (DSA), seeking details about how it managed risks of information manipulation during the Romanian elections. The request focuses on TikTok's analysis and mitigation of risks from inauthentic or automated activity, its recommender systems, and its efforts to enable third-party public scrutiny and access to data related to systemic risks in electoral processes.
Enforcement status: Done
Concerned DSA articles: No concerned DSA article specified
Source: https://digital-strategy.ec.europa.eu/en/news/commission-sends-additional-request-information-tiktok-under-digital-services-act