Comments

On October 28, 2025, the European Commission announced preliminary findings that TikTok and Meta (for both Facebook and Instagram) breached their transparency and user-protection obligations under the Digital Services Act (DSA). The findings remain preliminary and do not prejudge the outcome of the investigation.
A key transparency obligation under the DSA is to provide researchers with access to platform data. The Commission's preliminary findings, however, indicate that both companies failed to grant researchers adequate access to public data.
Further, regarding Meta, the Commission's preliminary assessment indicates that Facebook and Instagram lack user-friendly mechanisms for notifying illegal content. The existing tools reportedly require multiple steps and impose unnecessary burdens on users. The Commission also noted the use of "dark patterns", deceptive interface designs that may confuse or discourage users from flagging illegal material. Such practices risk undermining the effectiveness of content moderation.
The DSA also grants users the right to appeal content moderation decisions, such as content removals and account suspensions. However, Meta's current appeals process, used for both Facebook and Instagram, reportedly does not allow users to submit explanations or supporting evidence with their appeals.