On February 9, 2025, the European Commission announced preliminary findings that TikTok and Meta have breached transparency rules set forth under the Digital Services Act (DSA). The Commission accused both tech giants of failing to provide “adequate access” to public data for researchers, access that is crucial for understanding the societal impacts of their platforms. If confirmed, the findings could lead to significant fines and additional scrutiny of the companies’ data handling practices.
| Article Subheadings |
|---|
| 1) Overview of the Charges Against TikTok and Meta |
| 2) Implications Under the Digital Services Act |
| 3) Company Responses to the Findings |
| 4) Broader Context of EU Regulations on Big Tech |
| 5) Next Steps and Potential Consequences |
Overview of the Charges Against TikTok and Meta
The European Commission, the executive body of the European Union, has emerged as a watchdog over Big Tech, recently issuing preliminary findings against TikTok and Meta for violating transparency rules. Under the DSA, these companies are mandated to ensure that researchers have “adequate access” to public data on their platforms. This access is critical for enabling scrutiny of the platforms’ roles in shaping public behavior and health, particularly concerning exposure to illegal or harmful content.
The accusations highlight systemic issues within the operational practices of both tech giants. In its statement, the Commission said Facebook, Instagram, and TikTok have established procedures that are overly cumbersome, hindering researchers from obtaining reliable data. Such barriers may significantly restrict researchers’ ability to conduct meaningful assessments of user experiences and content moderation practices that affect millions, including vulnerable groups like minors.
Implications Under the Digital Services Act
The Digital Services Act represents a significant legislative effort by the EU to regulate online services and fortify protections for users. By mandating that platforms maintain transparency about their data practices, the DSA aims to curb misinformation and ensure user safety. The preliminary findings against TikTok and Meta may catalyze greater enforcement of this legislation, particularly regarding transparency in content moderation and data access for academic and public research.
If the European Commission’s findings are confirmed, TikTok and Meta could face a “non-compliance decision,” leading to fines that can reach up to 6% of their annual global revenue. This could translate to billions of euros given the substantial revenues these companies generate. Such penalties are designed not just as punitive measures, but also as effective deterrents to encourage compliance with European regulations.
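To illustrate the scale of the penalty cap described above, the 6% ceiling can be sketched as a simple calculation. The function name and the revenue figure below are hypothetical placeholders for illustration, not actual company financials:

```python
# Illustrative sketch of the DSA's maximum fine: up to 6% of a
# platform's annual global revenue. The revenue figure used in the
# example is hypothetical, not a real company's reported figure.

DSA_MAX_FINE_RATE = 0.06  # DSA cap: up to 6% of annual global revenue


def max_dsa_fine(annual_global_revenue_eur: float) -> float:
    """Return the maximum possible DSA fine for a given annual revenue."""
    return annual_global_revenue_eur * DSA_MAX_FINE_RATE


# Hypothetical example: a platform with EUR 100 billion in annual revenue
revenue = 100e9
print(f"Maximum fine: EUR {max_dsa_fine(revenue):,.0f}")
# prints: Maximum fine: EUR 6,000,000,000
```

Even at this hypothetical revenue level, the cap runs to billions of euros, which is why the article characterizes the penalties as deterrents rather than merely punitive measures.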
Company Responses to the Findings
In light of the accusations, both TikTok and Meta have issued responses defending their practices. Meta’s spokesperson pointed to an ongoing commitment to improve content reporting options and the mechanisms users rely on to flag illegal content. The representative emphasized the company’s belief that changes implemented since the DSA’s enactment meet EU expectations, stating,
“We disagree with any suggestion that we have breached the DSA.”
Conversely, representatives from TikTok acknowledged the importance of the researchers’ role in ensuring transparency but raised concerns over potential conflicts between the DSA and existing GDPR (General Data Protection Regulation) protections. In a statement, they remarked,
“If it is not possible to fully comply with both, we urge regulators to provide clarity on how these obligations should be reconciled.”
This positions TikTok as cautious about adhering to the DSA while also safeguarding user privacy, asserting its commitment to transparency amid the scrutiny.
Broader Context of EU Regulations on Big Tech
The recent actions taken by the European Commission fit within a broader set of regulatory frameworks aimed at reining in the operations of Big Tech. The DSA, along with the Digital Markets Act (DMA), is designed to address the power imbalance in the tech landscape, emphasizing user safety and fair competition. With these regulations, the EU seeks to address the potential harms posed by unchecked technological advancements that affect personal privacy and data management.
Furthermore, the EU’s rigorous stance highlights the growing awareness and concerns regarding the potential societal implications of social media usage. As instances of misinformation, online harassment, and mental health complications associated with tech platforms continue to rise, regulators are increasingly focused on holding companies accountable for the environments they foster online.
Next Steps and Potential Consequences
Both TikTok and Meta are now invited to engage with the European Commission to address the preliminary findings. This initial phase allows the companies to present counterarguments and evidence before any formal non-compliance decisions are made. Should the findings be substantiated, the resulting fines could affect not only their bottom line but also reshape their policies on data accessibility and user reporting mechanisms.
The long-term implications may extend beyond financial penalties. As the DSA and the DMA produce a clearer set of rules, both companies will likely need to be proactive in reformulating their internal structures to meet evolving standards of transparency. This situation showcases the tension between technological advancement and regulatory efforts aimed at ensuring that such progress does not come at the cost of user safety and integrity.
| No. | Key Points |
|---|---|
| 1 | The European Commission accused TikTok and Meta of breaching transparency rules regarding public data access. |
| 2 | The Digital Services Act mandates that platforms provide adequate access to public data for researchers. |
| 3 | Both companies face potential fines of up to 6% of their global annual revenue for non-compliance. |
| 4 | TikTok and Meta have asserted their commitment to making necessary changes but raised concerns over conflicting regulations. |
| 5 | The situation reflects greater regulatory scrutiny of Big Tech as the EU pushes for accountability and transparency. |
Summary
The European Commission’s preliminary findings regarding TikTok and Meta’s breaches of the Digital Services Act underscore a significant shift in regulatory practices aimed at Big Tech. As these companies face potential fines and increased scrutiny, the case exemplifies the ongoing tension between technological innovation and the need for regulatory oversight to protect public interest. The implications of these findings could redefine compliance standards that shape the future of social media governance in Europe and beyond.
Frequently Asked Questions
Question: What is the Digital Services Act?
The Digital Services Act is EU legislation designed to regulate online platforms and ensure user safety through transparency and accountability in data practices.
Question: What could happen if TikTok and Meta do not comply with the European Commission’s findings?
If TikTok and Meta do not comply, they may face significant fines of up to 6% of their global annual revenue and possible adjustments in their operational practices to meet regulatory standards.
Question: How do the findings relate to user safety and content moderation?
The findings emphasize the need for social media platforms to provide reliable data access for researchers, enabling them to study the impacts of online content and ensure user safety.

