{"id":109912,"date":"2025-09-25T15:46:38","date_gmt":"2025-09-25T13:46:38","guid":{"rendered":"https:\/\/www.hiig.de\/?p=109912"},"modified":"2025-12-10T17:36:17","modified_gmt":"2025-12-10T16:36:17","slug":"analysis-of-the-dsas-transparency-reports","status":"publish","type":"post","link":"https:\/\/www.hiig.de\/en\/analysis-of-the-dsas-transparency-reports\/","title":{"rendered":"Counting without accountability? An analysis of the DSA\u2019s transparency reports"},"content":{"rendered":"\n<p><strong>The Digital Services Act aims to hold platforms more accountable for illegal content by demanding greater transparency. Platforms like Facebook, Instagram, Tiktok or X must now publish detailed reports, showing, for example, how many posts they removed and how quickly. These reports are meant to give regulators, researchers and the public insight into how well platforms are enforcing the rules. But do the reports really deliver what they promise? Or is this measure just a new but ultimately useless addition to the flood of EU reports without any actual improvements in practice?<\/strong><\/p>\n\n\n\n<p>Online platforms like Instagram, TikTok or X have become an integral part of everyday life. However, despite their societal relevance, these privately owned platform companies remain largely opaque when it comes to understanding how they work. They rarely explain how they choose which content to distribute, remove or suppress, even though their decisions determine what we see online.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Holding platforms accountable<\/strong><\/h2>\n\n\n\n<p>One of the core aims of the new EU digital rulebook, including the Digital Services Act (DSA), is to regulate how platforms handle illegal content and enable more effective action against it. What constitutes illegal content is not explicitly codified in the DSA but rather ultimately depends on what is illegal under Union or Member State law. 
This often includes child sexual abuse material, incitement to terrorism, illegal hate speech or infringement of intellectual property rights.&nbsp;<\/p>\n\n\n\n<p>Platforms are to be held accountable for how they react once they are made aware of illegal content. For example, if a user reports a post or video that they believe contains illegal hate speech, the platform must review the report and decide on an action, such as deleting or restricting the post. Later, in a transparency report, the platform must also disclose how many such cases it handled within a given timeframe and what it did about the reported content.<\/p>\n\n\n\n<p>One of the DSA\u2019s main aims is to increase the accountability of platforms by promoting transparency. A key assumption underlying the DSA is that digital services may pose systemic risks to society. Examples of such risks include widespread disinformation and the undermining of electoral integrity. To identify and limit these risks at an early stage, maximum transparency is necessary. This is supposed to enable public authorities, researchers and civil society to recognise potential systemic risks, and to allow individual users to understand and assert their rights.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">The DSA\u2019s transparency reports<\/h2>\n\n\n\n<p>Transparency reports are a key accountability measure under the DSA. Under Articles 15, 24 and 42, platforms must publish comprehensible reports on their content moderation activities. These reports must include information on the types of illegal content moderated and the actions taken. Content moderation actions can include deleting the content, demoting it or geo-blocking it in a specific country. The reports must be publicly available in a machine-readable format (Art. 15(1) DSA). Most platforms provide them as PDF documents or HTML pages on their websites. 
They are also collected and linked on an <a href=\"https:\/\/digital-strategy.ec.europa.eu\/en\/policies\/dsa-brings-transparency\" target=\"_blank\" rel=\"noreferrer noopener\">EU website<\/a> (European Commission, 2025).\u00a0<\/p>\n\n\n\n<p>But how much transparency do these reports actually provide? Are they really suited to revealing how platforms decide what to take down and what to leave up? And most importantly, are they an adequate measure to increase platform accountability? By analysing the transparency reports of selected online platforms, I argue that current transparency reports fall short of delivering true accountability with regard to the moderation of illegal content. Even though a new standardised reporting template has recently been introduced, and many hope it will improve the situation, I argue that the template can only address some of the inadequacies of the current reports while potentially creating new problems.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Transparency reports in practice: Three observations<\/strong><\/h2>\n\n\n\n<p>What do the transparency reports actually reveal about how platforms handle illegal content? Based on a qualitative analysis of the two rounds of DSA transparency reports published in 2024 by seven very large online platforms (VLOPs, platforms with more than 45 million average monthly active users), namely Instagram, Facebook, LinkedIn, Pinterest, Snapchat, TikTok and X, I examined how these VLOPs fulfil their transparency requirements.<\/p>\n\n\n\n<p>Although the European Commission (EC) provides guidance on the content of transparency reports, it does not specify their structure or level of detail. 
The general idea seems to have been to give platforms some leeway on the one hand and, on the other, to see whether it would be possible to build on best practices and refine the specifications later.<\/p>\n\n\n\n<p>The new template is intended to clarify the expected form, content and level of detail in the reports (European Commission, 2024). However, the regulation has only been in force since July 2025 and has not yet been applied in practice. In the available reports, each platform has interpreted the DSA specifications independently. My analysis revealed three key findings that highlight the variety of approaches platforms use to meet their transparency reporting obligations and the limitations of the current reporting format.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Observation 1: Disconnected data points<\/strong><\/h3>\n\n\n\n<p>One major issue is the lack of connection between different data points within the individual reports. For example, figures relating to moderation decisions, user complaints or automatic deletions are often presented only as individual values, with no reference to one another. This makes it hard to establish relationships between data points, and harder still for researchers to interpret the figures or integrate them into meaningful analyses.&nbsp;<\/p>\n\n\n\n<p>An example of this is the reporting of authority orders under Article 9 of the DSA, which sets out how Member State authorities, such as national courts or Digital Services Coordinators (in Germany, the Bundesnetzagentur), can request that platforms take action against illegal content. The term \u201corder\u201d can be misleading, though, as platforms are not required to delete content that is referred to them by a Member State authority. 
Instead, they review the content independently and then decide whether to take action.<\/p>\n\n\n\n<p>In its 2024 transparency reports, Facebook provided two separate tables: one lists the number of orders received per Member State, the other the number of orders by content type, such as terrorist content, illegal speech, etc. (Facebook, 2024, p.3-5; Facebook, 2025, p.4-6). However, these two tables are not linked, which makes it impossible to find out, for example, how many orders to act against terrorist activity were issued by Italy against Facebook. This lack of cross-referencing severely limits the analytical value of the data.&nbsp;<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Observation 2: Arbitrary and inconsistent categories<\/strong><\/h3>\n\n\n\n<p>There is no standardised categorisation of illegal content across platforms, sometimes not even within the individual reports. Each platform has created its own set of categories, loosely based on EU or Member State law, but ultimately inconsistent and often arbitrary.<\/p>\n\n\n\n<p>LinkedIn, for example, uses the label \u201cIllegal or harmful speech\u201d (LinkedIn, 2024, p.17), while Facebook uses terms like \u201cHate speech\u201d and \u201cMisinformation\u201d (Facebook, 2024, p.4). Pinterest takes a different approach and directly refers to the specific laws that a piece of content is said to violate.<\/p>\n\n\n\n<p>Most platforms also include a vague catch-all category such as \u201cOther illegal content\u201d. For example, Instagram\u2019s transparency report for April to September 2024 states that it received 91 orders from authorities to act against \u201cother\u201d types of illegal content, accounting for almost one third of all cases (Instagram, 2024, p.4). 
Facebook received 113,638 user notices for \u201cother illegal content\u201d, accounting for around 45% of the total 248,748 notices received.&nbsp;<\/p>\n\n\n\n<p>Furthermore, the fact that some platforms use different categories for authority orders and user-submitted reports of illegal content adds unnecessary complexity. This makes it difficult to compare platforms and further muddies the overall picture of how illegal content is handled.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Observation 3: Opaque decision-making<\/strong><\/h3>\n\n\n\n<p>The most striking issue, perhaps, is the lack of clarity surrounding what happens after a platform receives a user notice or an authority order. In many reports, it is unclear what action was taken, on what basis, and whether any action was taken at all. While most platforms report how many orders they have received from authorities, they do not report whether they responded by deleting the reported post, deleting the account, geo-blocking the content in a specific country or reducing a post\u2019s visibility.&nbsp;<\/p>\n\n\n\n<p>LinkedIn merely indicates whether \u201cat least some action was taken\u201d (LinkedIn, 2025, p.16-17) in response to an authority order, without providing further details. Pinterest is the notable exception: it clearly indicates whether it deactivated content, restricted it geographically or limited its distribution (Pinterest, 2025).<\/p>\n\n\n\n<p>Another issue is the considerable confusion surrounding the two types of reporting mechanisms: the specific mechanism for reporting illegal content under Article 16 of the DSA, and the general channels that platforms have in place for reporting any type of rule violation (e.g. content that violates a platform\u2019s advertising policy but not any laws). Article 16 requires platforms to have a reporting mechanism through which users can report potentially illegal content in a precise and substantiated way. 
For example, if a user believes that a post incites terrorism, they must be able to report it to the platform in a way that clearly indicates the potential illegality of the content.&nbsp;<\/p>\n\n\n\n<p>In practice, all user reports, regardless of the reason given for the report, are first reviewed for violations of the platforms\u2019 own rules, such as community guidelines or advertising policies. At the outset of the review process, it is therefore irrelevant whether a report is an Article 16 notice or a different kind of user report. If the content flagged in an Art. 16 notice is found to violate a platform rule and is deleted globally as a result, it is never checked to see whether it also violated the law \u2013 even if it was originally reported for that reason. So, if the terrorism-inciting post in our example also violated a platform\u2019s advertising policy, it would never be reviewed for breaching any anti-terrorism laws. This means the content would disappear globally, not just in the country where it might be unlawful.&nbsp;<\/p>\n\n\n\n<p>While this approach is clearly efficient \u2013 why bother blocking a post in only one country if it violates a platform rule and would be deleted globally in any case? \u2013 it raises questions about who decides how public debate takes place and on the basis of which rules.<\/p>\n\n\n\n<p>Neither LinkedIn nor Snapchat explicitly distinguishes between actions taken following a user report based on the law and those based on internal policies. Snapchat even argues that breaches of the law are automatically covered by its own rules, as a violation of its Community Guidelines includes \u201creasons of illegality\u201d (Snapchat, 2024). 
This seems to be in tension with Article 15 of the DSA, which clearly states that providers must specify whether an action was taken on the basis of the law or their own terms and conditions.<\/p>\n\n\n\n<p>In summary, the reports analysed here reveal major inconsistencies and blind spots. From disconnected data points and arbitrary categories to the opaque reasoning behind content moderation decisions, the reports currently fall short of offering real transparency. Against this backdrop, the following section outlines three key criticisms of the current reporting system and considers whether the new template might address some of these shortcomings.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Three key flaws in platform transparency reports<\/strong><\/h2>\n\n\n\n<ol class=\"wp-block-list\">\n<li><em><strong>The provided data is borderline unusable.<\/strong><\/em> The large differences in the amount, level of detail, operationalisation and presentation of data across reports make it hard to compare the platforms and assess which measures are most effective in combating illegal content. In many cases, it is not even possible to establish connections between data points within individual reports, which further hinders any meaningful evaluation. The new template, an Excel spreadsheet into which platforms enter their data, should help to address some of these problems. For one thing, comparability is likely to improve if all platforms provide their information in the same format. The template also introduces fixed categories of illegal content and requires that the category \u201cother\u201d be described. However, the template only asks platforms to report the \u201cnumber of items moderated\u201d (European Commission, 2025), without specifying what type of content moderation action was taken. 
This is surprising, given that Article 15 of the DSA requires platforms to report how many notices they received under Article 16 and to categorise these by \u201cany action taken pursuant to the notices\u201d (Art. 15(1)(b) DSA). The new template, however, does not seem to include this requirement.&nbsp;<\/li>\n<\/ol>\n\n\n\n<ol start=\"2\" class=\"wp-block-list\">\n<li><em><strong>The process of moderating illegal content remains largely opaque. <\/strong><\/em>Even when report data does show which actions came in response to which notices or reports, the underlying process remains a black box. Questions such as \u201cWhat criteria are used to decide cases?\u201d and \u201cWhat legal expertise do content moderators have?\u201d remain unanswered. Snapchat, for example, received 82,011 Art. 16 notices for content potentially violating rules against false information during the reporting period from January to June 2024. Out of these, 106 pieces of content were deleted, 255 accounts were issued warnings and 12 accounts were locked. Setting aside the absurdly low number of actual actions, it is impossible to know why Snapchat deleted content in some cases and not in others, or why it locked accounts in some cases and merely issued warnings in others. The new template does not ask for that kind of information, so it is unlikely that we will see an improvement in this regard.<\/li>\n<\/ol>\n\n\n\n<ol start=\"3\" class=\"wp-block-list\">\n<li><em><strong>Platform rules remain the gold standard.<\/strong><\/em> While this approach makes sense for efficiency reasons and is completely in line with the DSA, platforms still primarily check whether content violates their own rules before, or instead of, checking it against EU or Member State law. However, this renders the separate mechanism for reporting illegal content practically obsolete. 
It also raises questions about which rules are considered more important: those set by a private platform company or democratically legitimised laws. The new template cannot fundamentally challenge this hierarchy, nor does it make more visible the rules and criteria that platforms use to make content moderation decisions. Instead, the template further entrenches the already limited amount of information that platforms make available.<\/li>\n<\/ol>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>The template is not the saviour some make it out to be<\/strong><\/h2>\n\n\n\n<p>The analysis of the 2024 transparency reports from seven very large online platforms (Instagram, Facebook, LinkedIn, Pinterest, Snapchat, TikTok and X) reveals significant shortcomings in the way these companies report on the moderation of illegal content. Key data points are not connected, categories of illegal content are applied inconsistently and arbitrarily, and the reasoning behind content moderation remains largely opaque.&nbsp;<\/p>\n\n\n\n<p>These gaps make it difficult to assess how platforms actually respond to illegal content and thus to measure the adequacy and effectiveness of their response. Still, the newly introduced EU template for transparency reports is a step towards greater comparability and clarity, as it standardises reporting formats and categories. It may thereby help reduce the inconsistency observed so far. However, the template leaves important blind spots unaddressed. Most notably, it does not require platforms to explain the reasoning behind their moderation decisions or to distinguish clearly between enforcement based on the law and enforcement based on internal rules. Platforms may also adhere only to the template\u2019s minimum requirements, which could reinforce existing shortcomings and further limit the availability of meaningful information. 
Thus, neither the transparency reports nor the new template currently achieve accountability through transparency.&nbsp;<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>References<\/strong><\/h2>\n\n\n\n<p><strong>Transparency reports&nbsp;<\/strong><\/p>\n\n\n\n<p>Facebook. (2024). <em>Regulation (EU) 2022\/2065 Digital Services Act Transparency Report for Facebook. April\u2014September 2024<\/em>. Facebook.<\/p>\n\n\n\n<p>Facebook. (2025). <em>Regulation (EU) 2022\/2065 Digital Services Act Transparency Report for Facebook. October\u2014December 2024<\/em>. Facebook.<\/p>\n\n\n\n<p>Instagram. (2024). <em>Regulation (EU) 2022\/2065 Digital Services Act Transparency Report for Instagram. April\u2014September 2024<\/em>. Instagram.<\/p>\n\n\n\n<p>Instagram. (2025). <em>Regulation (EU) 2022\/2065 Digital Services Act Transparency Report for Instagram. October\u2014December 2024<\/em>. Instagram.<\/p>\n\n\n\n<p>LinkedIn. (2024). <em>Digital Services Act Transparency Report. January\u2014June 2024<\/em>. LinkedIn.<a href=\"https:\/\/www.linkedin.com\/help\/linkedin\/answer\/a1678508\" target=\"_blank\" rel=\"noreferrer noopener\"> https:\/\/www.linkedin.com\/help\/linkedin\/answer\/a1678508<\/a><\/p>\n\n\n\n<p>LinkedIn. (2025). <em>Digital Services Act Transparency Report. July\u2014December 2024<\/em>. LinkedIn.<a href=\"https:\/\/content.linkedin.com\/content\/dam\/help\/tns\/en\/February-2025-DSA-Transparency-Report.pdf\" target=\"_blank\" rel=\"noreferrer noopener\"> https:\/\/content.linkedin.com\/content\/dam\/help\/tns\/en\/February-2025-DSA-Transparency-Report.pdf<\/a><\/p>\n\n\n\n<p>Pinterest. (2024). <em>Digital Services Act Transparency Report. January\u2014June 2024<\/em>. Pinterest.<a href=\"https:\/\/policy.pinterest.com\/en\/transparency-report-h1-2024\" target=\"_blank\" rel=\"noreferrer noopener\"> https:\/\/policy.pinterest.com\/en\/transparency-report-h1-2024<\/a><\/p>\n\n\n\n<p>Pinterest. (2025). <em>Digital Services Act Transparency Report. 
July\u2014December 2024<\/em>. Pinterest.<a href=\"https:\/\/policy.pinterest.com\/en\/digital-services-act-transparency-report-jul-2024-dec-2024\" target=\"_blank\" rel=\"noreferrer noopener\"> https:\/\/policy.pinterest.com\/en\/digital-services-act-transparency-report-jul-2024-dec-2024<\/a><\/p>\n\n\n\n<p>Snapchat. (2024). <em>European Union Transparency | Snapchat Transparency. January\u2014June 2024<\/em>. Snapchat.<a href=\"https:\/\/values.snap.com\/privacy\/transparency\/european-union\" target=\"_blank\" rel=\"noreferrer noopener\"> https:\/\/values.snap.com\/privacy\/transparency\/european-union<\/a><\/p>\n\n\n\n<p>TikTok. (2024). <em>TikTok\u2019s DSA Transparency report. January\u2014June 2024<\/em>. TikTok.<a href=\"https:\/\/sf16-va.tiktokcdn.com\/obj\/eden-va2\/zayvwlY_fjulyhwzuhy[\/ljhwZthlaukjlkulzlp\/DSA_H2_2024\/TikTok-DSA-Transparency-Report-Jan-to-Jun-2024.pdf\" target=\"_blank\" rel=\"noreferrer noopener\"> https:\/\/sf16-va.tiktokcdn.com\/obj\/eden-va2\/zayvwlY_fjulyhwzuhy[\/ljhwZthlaukjlkulzlp\/DSA_H2_2024\/TikTok-DSA-Transparency-Report-Jan-to-Jun-2024.pdf<\/a><\/p>\n\n\n\n<p>TikTok. (2025). <em>TikTok\u2019s DSA Transparency report. July\u2014December 2024<\/em>. TikTok.<a href=\"https:\/\/sf16-va.tiktokcdn.com\/obj\/eden-va2\/zayvwlY_fjulyhwzuhy[\/ljhwZthlaukjlkulzlp\/DSA_H2_2024\/Corrected%20Data\/TikTok%20-%20DSA%20Transparency%20report%20-%20July%20-%20December%202024%20-21.03.2025.pdf\" target=\"_blank\" rel=\"noreferrer noopener\"> https:\/\/sf16-va.tiktokcdn.com\/obj\/eden-va2\/zayvwlY_fjulyhwzuhy[\/ljhwZthlaukjlkulzlp\/DSA_H2_2024\/Corrected%20Data\/TikTok%20-%20DSA%20Transparency%20report%20-%20July%20-%20December%202024%20-21.03.2025.pdf<\/a><\/p>\n\n\n\n<p>X. (2024). <em>DSA Transparency Report. April\u2014September 2024<\/em>. X.<a href=\"https:\/\/transparency.x.com\/dsa-transparency-report.html\" target=\"_blank\" rel=\"noreferrer noopener\"> https:\/\/transparency.x.com\/dsa-transparency-report.html<\/a><\/p>\n\n\n\n<p>X. 
(2025). <em>DSA Transparency Report. October 2024\u2014March 2025<\/em>. X.<a href=\"https:\/\/transparency.x.com\/assets\/dsa\/transparency-report\/dsa-transparency-report-april-2025.pdf\" target=\"_blank\" rel=\"noreferrer noopener\"> https:\/\/transparency.x.com\/assets\/dsa\/transparency-report\/dsa-transparency-report-april-2025.pdf<\/a><\/p>\n\n\n\n<p><strong>EU regulations and template<\/strong><\/p>\n\n\n\n<p>European Commission (2025). <em>Annex I \u2013 Transparency reports template<\/em> [Microsoft Excel file]. Publications Office of the European Union.<\/p>\n\n\n\n<p>European Commission (2024). <em>Commission Implementing Regulation (EU) laying down templates concerning the transparency reporting obligations of providers of intermediary services and of providers of online platforms under Regulation (EU) 2022\/2065 of the European Parliament and of the Council<\/em>. <a href=\"https:\/\/eur-lex.europa.eu\/eli\/reg_impl\/2024\/2835\/oj\/eng\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/eur-lex.europa.eu\/eli\/reg_impl\/2024\/2835\/oj\/eng<\/a><\/p>\n\n\n\n<p>European Commission (2022). <em>Regulation (EU) 2022\/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000\/31\/EC (DSA)<\/em>. 
<a href=\"https:\/\/eur-lex.europa.eu\/eli\/reg\/2022\/2065\/oj\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/eur-lex.europa.eu\/eli\/reg\/2022\/2065\/oj<\/a><\/p>\n<div class=\"shariff shariff-align-flex-start shariff-widget-align-flex-start\"><ul class=\"shariff-buttons theme-round orientation-horizontal buttonsize-medium\"><li class=\"shariff-button linkedin shariff-nocustomcolor\" style=\"background-color:#1488bf\"><a href=\"https:\/\/www.linkedin.com\/sharing\/share-offsite\/?url=https%3A%2F%2Fwww.hiig.de%2Fen%2Fanalysis-of-the-dsas-transparency-reports%2F\" title=\"Share on LinkedIn\" aria-label=\"Share on LinkedIn\" role=\"button\" rel=\"noopener nofollow\" class=\"shariff-link\" style=\"; background-color:#0077b5; color:#fff\" target=\"_blank\"><span class=\"shariff-icon\" style=\"\"><svg width=\"32px\" height=\"20px\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewBox=\"0 0 27 32\"><path fill=\"#0077b5\" d=\"M6.2 11.2v17.7h-5.9v-17.7h5.9zM6.6 5.7q0 1.3-0.9 2.2t-2.4 0.9h0q-1.5 0-2.4-0.9t-0.9-2.2 0.9-2.2 2.4-0.9 2.4 0.9 0.9 2.2zM27.4 18.7v10.1h-5.9v-9.5q0-1.9-0.7-2.9t-2.3-1.1q-1.1 0-1.9 0.6t-1.2 1.5q-0.2 0.5-0.2 1.4v9.9h-5.9q0-7.1 0-11.6t0-5.3l0-0.9h5.9v2.6h0q0.4-0.6 0.7-1t1-0.9 1.6-0.8 2-0.3q3 0 4.9 2t1.9 6z\"\/><\/svg><\/span><\/a><\/li><li class=\"shariff-button bluesky shariff-nocustomcolor\" style=\"background-color:#84c4ff\"><a href=\"https:\/\/bsky.app\/intent\/compose?text=Counting%20without%20accountability%3F%20An%20analysis%20of%20the%20DSA%E2%80%99s%20transparency%20reports https%3A%2F%2Fwww.hiig.de%2Fen%2Fanalysis-of-the-dsas-transparency-reports%2F  via @hiigberlin.bsky.social\" title=\"Share on Bluesky\" aria-label=\"Share on Bluesky\" role=\"button\" rel=\"noopener nofollow\" class=\"shariff-link\" style=\"; background-color:#0085ff; color:#fff\" target=\"_blank\"><span class=\"shariff-icon\" style=\"\"><svg width=\"20\" height=\"20\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewBox=\"0 0 20 20\"><path class=\"st0\" 
d=\"M4.89,3.12c2.07,1.55,4.3,4.71,5.11,6.4.82-1.69,3.04-4.84,5.11-6.4,1.49-1.12,3.91-1.99,3.91.77,0,.55-.32,4.63-.5,5.3-.64,2.3-2.99,2.89-5.08,2.54,3.65.62,4.58,2.68,2.57,4.74-3.81,3.91-5.48-.98-5.9-2.23-.08-.23-.11-.34-.12-.25,0-.09-.04.02-.12.25-.43,1.25-2.09,6.14-5.9,2.23-2.01-2.06-1.08-4.12,2.57-4.74-2.09.36-4.44-.23-5.08-2.54-.19-.66-.5-4.74-.5-5.3,0-2.76,2.42-1.89,3.91-.77h0Z\"\/><\/svg><\/span><\/a><\/li><li class=\"shariff-button mailto shariff-nocustomcolor\" style=\"background-color:#a8a8a8\"><a href=\"mailto:?body=https%3A%2F%2Fwww.hiig.de%2Fen%2Fanalysis-of-the-dsas-transparency-reports%2F&subject=Counting%20without%20accountability%3F%20An%20analysis%20of%20the%20DSA%E2%80%99s%20transparency%20reports\" title=\"Send by email\" aria-label=\"Send by email\" role=\"button\" rel=\"noopener nofollow\" class=\"shariff-link\" style=\"; background-color:#999; color:#fff\"><span class=\"shariff-icon\" style=\"\"><svg width=\"32px\" height=\"20px\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewBox=\"0 0 32 32\"><path fill=\"#999\" d=\"M32 12.7v14.2q0 1.2-0.8 2t-2 0.9h-26.3q-1.2 0-2-0.9t-0.8-2v-14.2q0.8 0.9 1.8 1.6 6.5 4.4 8.9 6.1 1 0.8 1.6 1.2t1.7 0.9 2 0.4h0.1q0.9 0 2-0.4t1.7-0.9 1.6-1.2q3-2.2 8.9-6.1 1-0.7 1.8-1.6zM32 7.4q0 1.4-0.9 2.7t-2.2 2.2q-6.7 4.7-8.4 5.8-0.2 0.1-0.7 0.5t-1 0.7-0.9 0.6-1.1 0.5-0.9 0.2h-0.1q-0.4 0-0.9-0.2t-1.1-0.5-0.9-0.6-1-0.7-0.7-0.5q-1.6-1.1-4.7-3.2t-3.6-2.6q-1.1-0.7-2.1-2t-1-2.5q0-1.4 0.7-2.3t2.1-0.9h26.3q1.2 0 2 0.8t0.9 2z\"\/><\/svg><\/span><\/a><\/li><\/ul><\/div>","protected":false},"excerpt":{"rendered":"<p>Are the DSA&#8217;s transparency reports really holding platforms accountable? 
A critical analyses of reports from major platforms reveals gaps and raises doubts.<\/p>\n","protected":false},"author":313,"featured_media":109917,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1577,1579],"tags":[],"class_list":["post-109912","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-digital-so","category-ftif-plattformen-governance"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.6 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>The DSA\u2019s transparency reports &#8211; Digital Society Blog<\/title>\n<meta name=\"description\" content=\"Are the DSA&#039;s transparency reports really holding platforms accountable? A critical analyses of reports from major platforms reveals gaps and raises doubts.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.hiig.de\/en\/analysis-of-the-dsas-transparency-reports\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"The DSA\u2019s transparency reports &#8211; Digital Society Blog\" \/>\n<meta property=\"og:description\" content=\"Are the DSA&#039;s transparency reports really holding platforms accountable? 
A critical analyses of reports from major platforms reveals gaps and raises doubts.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.hiig.de\/en\/analysis-of-the-dsas-transparency-reports\/\" \/>\n<meta property=\"og:site_name\" content=\"HIIG\" \/>\n<meta property=\"article:published_time\" content=\"2025-09-25T13:46:38+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-12-10T16:36:17+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2025\/09\/Titelbild_DSAtransparancy-1.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1144\" \/>\n\t<meta property=\"og:image:height\" content=\"643\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Digital Society Blog\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Digital Society Blog\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"16 minutes\" \/>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"The DSA\u2019s transparency reports &#8211; Digital Society Blog","description":"Are the DSA's transparency reports really holding platforms accountable? A critical analyses of reports from major platforms reveals gaps and raises doubts.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.hiig.de\/en\/analysis-of-the-dsas-transparency-reports\/","og_locale":"en_US","og_type":"article","og_title":"The DSA\u2019s transparency reports &#8211; Digital Society Blog","og_description":"Are the DSA's transparency reports really holding platforms accountable? 
<p><em>Are the DSA&#8217;s transparency reports really holding platforms accountable? A critical analysis of reports from major platforms reveals gaps and raises doubts.</em></p>