The Global Disinformation Index


The Global Disinformation Index is a not-for-profit organisation that operates on the three principles of neutrality, independence and transparency.

Lobbying Activity

Meeting with Christel Schaldemose (Member of the European Parliament) and Hertie School

11 Nov 2025 · Digital sovereignty, democratic resilience and regulatory simplicity under the future EU budget

Meeting with Alexandra Geese (Member of the European Parliament)

15 Sept 2025 · Disinformation & tech policies

Meeting with Marco Giorello (Head of Unit Communications Networks, Content and Technology)

10 Sept 2025 · DSA implementation, Code of Conduct on Disinformation, Commission cooperation with civil society, and the Democracy Shield

Response to EU Civil Society Strategy

4 Sept 2025

The Global Disinformation Index welcomes the European Commission's initiative to develop a comprehensive EU Civil Society Strategy, particularly its recognition of the deteriorating environment for civil society organisations. As part of a group of organisations specialising in combating disinformation and foreign information manipulation and interference (FIMI) and promoting advertising transparency, we see several escalating threats that demand specific attention within this strategy. Civil society has been increasingly targeted by hostile actors, political interference, and systematic campaigns designed to undermine the sector's credibility and operational capacity. The recent experiences of disinformation researchers facing congressional investigations, legal harassment, and coordinated attacks demonstrate the urgent need for EU-level protection and support mechanisms. Please find our recommendations attached.

Meeting with Nathalie Loiseau (Member of the European Parliament, Committee chair)

6 Jun 2025 · Combating foreign interference and information manipulation

Meeting with Rita Wezenbeek (Director, Communications Networks, Content and Technology)

11 Feb 2025 · Opening Session: DSA roundtable discussions on online advertising (Article 46 DSA)

Response to Delegated Regulation on data access provided for in the Digital Services Act

10 Dec 2024

Please find attached the Global Disinformation Index's (GDI) contribution to the public consultation on the draft Delegated Regulation on data access for researchers provided for in the Digital Services Act.

Meeting with Kim Van Sparrentak (Member of the European Parliament)

6 Mar 2024 · DSA implementation and enforcement

Meeting with Alexandra Geese (Member of the European Parliament)

6 Mar 2024 · Disinformation on social media platforms

Response to Implementing Act on transparency reports under Regulation (EU) 2022/2065 (Digital Services Act)

24 Jan 2024

This analysis focuses on the VLOP transparency reports from TikTok, LinkedIn, Bing, and Snapchat. Overall, the VLOP reports are less standardised than the COP transparency reports: instead of a common reporting framework template, the VLOPs were free to organise their reporting as they saw fit. Although this flexibility may be appropriate, since VLOPs cover many different services, it makes cross-report comparison more difficult. A broad overview of compliance can be found in the attached Table 1.

Concerningly, none of the VLOPs provided demonetisation metrics in their transparency reports. This gap suggests that the COP reports, which hold demonetisation metrics to a higher standard, will play a pivotal role in accountability for disrupting the business of disinformation. In sum, while VLOP reports may offer greater insight into the composition and action of content moderation, COP reports will remain civil society's main path to transparency around the demonetisation of disinformation.

TikTok and LinkedIn provided the most comprehensive reporting, with an explanation of content moderation policies, a breakdown of content removed under each policy, and even a member state breakdown of how moderation labour is assigned across the EU. This provided insight into where content moderation action is happening and how focus is distributed. Yet it also highlighted unexplained imbalances within each team. For example, LinkedIn had no content moderation specialists with proficiency in the languages of some Eastern European countries, and while TikTok's numbers were much larger overall, they still slanted towards Western European countries. Other VLOPs provided only minimal data. Bing gave neither a breakdown of policy violations nor an explanation of how content moderation staff are allocated across EU member states, making its report little more than a reiteration of public data; Bing justified its sparse reporting by pointing to the lack of a concrete minimum baseline for transparency. Snapchat did provide a breakdown of policy violations but gave no insight into its content moderation team.

Furthermore, other DSA obligations were not reflected in the transparency reports, including bans on targeted advertising to children; regular systemic risk assessments and independent audits; options for users to opt out of recommender systems; and increased data sharing with authorities and researchers.

Recommendations:
- A standardised minimum-requirement template for VLOP transparency reports would facilitate cross-comparison and greater accountability by providing a common benchmark. Beyond the minimum requirements, VLOPs should retain the freedom to reflect their differing natures.
- Content moderation teams broadly have more expertise in Western European languages, with less coverage of Eastern European regions. This could lead to an enforcement gap.
- More granular data is needed to make the VLOP transparency reports effective at empowering public accountability.
Specifically, VLOP transparency reports should have more specific requirements concerning:
- Engagement and view data: when content is amplified by algorithms, how many of the views and how much of the engagement came from platform recommendations; a specific breakdown for each quantitative category of illegal content (per the Annex); violations of community guidelines; and priority questions around the monetisation of disinformation (see attached Table 2).
- Whether a moderation action was proactive or taken in response to a flag from someone else (e.g. user flags, trusted flaggers, governments).
- In addition to metrics on the accuracy of algorithmic moderation systems, metrics on the accuracy of human content moderation, such as agreement rates of moderators by policy (a minimal sketch of this metric follows below).
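To make the last recommendation concrete, here is a minimal sketch of the suggested agreement-rate metric. All data, field names, and policy labels are hypothetical; this is one plausible way to operationalise the metric, not a prescribed or official method.

from collections import defaultdict

# Hypothetical double-review records: each item was independently
# reviewed by two moderators under one policy.
reviews = [
    # (policy, moderator_a_decision, moderator_b_decision)
    ("hate_speech", "remove", "remove"),
    ("hate_speech", "remove", "keep"),
    ("disinformation", "demonetise", "demonetise"),
    ("disinformation", "keep", "demonetise"),
    ("disinformation", "demonetise", "demonetise"),
]

totals = defaultdict(int)
agreements = defaultdict(int)
for policy, a, b in reviews:
    totals[policy] += 1
    agreements[policy] += (a == b)  # True counts as 1

# Per-policy agreement rate: share of double-reviewed items where
# both moderators reached the same decision.
for policy in totals:
    rate = agreements[policy] / totals[policy]
    print(f"{policy}: {rate:.0%} agreement over {totals[policy]} items")

A report built on such a metric could be disaggregated further, e.g. by language or member state, which would speak directly to the enforcement-gap concern raised above.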

Meeting with Wojtek Talko (Cabinet of Vice-President Věra Jourová)

30 Aug 2023 · Disinformation

Response to Performance of independent audits provided for in the Digital Services Act

2 Jun 2023

The Global Disinformation Index (GDI) is a not-for-profit organisation focused on defunding and disrupting disinformation. We welcome the opportunity to submit the following response as a proposal for strengthening Article 37 of the DSA through a delegated act. Our proposal focuses on two areas to strengthen the independence and clarity of the audit process: 1) external standard setting and 2) data disclosure.

The success of the Digital Services Act hinges upon the effectiveness of independent audits. Without robust oversight, digital services will have strong economic incentives to skirt their new responsibilities, rendering the Act incapable of ensuring a safe and accountable online environment. For example, GDI has documented since the start of the COVID-19 pandemic the monetisation of disinformation and harmful content by ad-tech platforms with insufficient and unenforced publishing policies. GDI estimates that in the EU alone, tech companies pay out more than US$76 million annually in ad revenues to known disinformation sites targeting member states (a simplified version of this kind of estimate is sketched after the recommendations below). It is clear that delegating the enforcement of standards to the platforms will not be a viable strategy.

GDI proposes the following amendments be codified into the Article 37 delegated act:
- Formalise civil society input and third-party data reporting into the audit process.
- Data disclosure during an audit should meet holistic and flexible guidelines, considering the varied stakeholders and systemic risks of a service.
- Auditors should conduct a designated industry-independent process of mapping stakeholders, identifying systemic risks, and developing consistent metrics to understand those risks.
- Specific consideration of how to operationalise and measure the impact of disinformation content on the behavioural grouping and sorting of users in tech platforms' product offerings.
- Ad-tech services of an auditee should be assessed against metrics concerning the monetisation of disinformation and the promotion of harmful content.

For more detailed analysis and evidence supporting these amendments, please see the attached PDF.
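As context for the US$76 million figure cited above, revenue estimates of this kind are typically built from traffic, ad-density, and CPM assumptions. The sketch below uses entirely hypothetical numbers and is not GDI's published methodology; it only illustrates the arithmetic shape of an impressions-times-CPM estimate.

# Hypothetical inputs: (monthly page views, ads per page, assumed CPM in USD)
sites = [
    (4_000_000, 3, 2.50),
    (1_200_000, 5, 1.80),
]

# CPM is revenue per 1,000 ad impressions, so divide by 1,000;
# multiply by 12 to annualise the monthly figure.
annual_revenue_usd = sum(
    views * ads_per_page * cpm / 1000 * 12
    for views, ads_per_page, cpm in sites
)
print(f"Estimated annual ad revenue: ${annual_revenue_usd:,.0f}")  # $489,600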

Response to Delegated Regulation on data access provided for in the Digital Services Act

31 May 2023

The Global Disinformation Index (GDI) is a not-for-profit organisation focused on defunding and disrupting disinformation. We welcome the opportunity to submit the following response as a proposal for strengthening Article 40 of the DSA through a delegated act. Our proposal focuses on three areas to strengthen the independence and clarity of the data sharing process: 1) necessary data sharing standards for compliance, 2) data access, and 3) independent advisory mechanisms.

The success of the Digital Services Act hinges upon effective data sharing for compliance. Without robust oversight, digital services will have strong economic incentives to skirt their new responsibilities, rendering the Act incapable of ensuring a safe and accountable online environment. For example, GDI has documented since the start of the COVID-19 pandemic the monetisation of disinformation and harmful content by ad-tech platforms with insufficient and unenforced publishing policies. GDI estimates that in the EU alone, tech companies pay out more than US$76 million annually in ad revenues to known disinformation sites targeting member states. It is clear that delegating the enforcement of standards to the platforms will not be a viable strategy. Furthermore, researcher data will be critical for informing auditors' assessments by indicating risks and gaps in compliance.

GDI proposes the following amendments be codified into the Article 40 delegated act:
- Data disclosure should meet holistic and flexible guidelines, considering the varied stakeholders and systemic risks of a service. For example, there should be a designated industry-independent process of mapping stakeholders, identifying systemic risks, and then developing measurements to understand those risks.
- For different modes of data access, such as APIs, data providers need to account for potential information security risks.
- There should be a modular approach to researcher access to data (a minimal sketch follows these recommendations).
- Ad-tech services of an auditee must be assessed against metrics concerning the monetisation of disinformation and the promotion of harmful content.
- Formalise civil society input and third-party data reporting as independent advisory mechanisms. This forum should vet researchers, assess their proposals, and verify whether data withheld by platforms as confidential is truly confidential, to ensure services are meeting their commitments. This would be a key safeguard against industry capture of the process for vetting researchers and proposals.

For more detailed analysis and evidence supporting these amendments, please see the attached PDF, which includes a table of priority research questions concerning online disinformation and example data sets necessary to answer them.
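To illustrate the modular-access recommendation, the sketch below shows one way vetting tiers could map to data modules. Every tier name, module name, and function here is hypothetical; neither the DSA nor the draft delegated regulation prescribes this scheme.

from dataclasses import dataclass

# Hypothetical mapping from a researcher's vetting tier (assigned by an
# independent advisory body, per the recommendation above) to the data
# modules a platform would expose at that tier.
MODULES_BY_TIER = {
    "public": {"public_posts", "ad_library"},
    "vetted": {"public_posts", "ad_library", "engagement_metrics",
               "moderation_actions"},
    "vetted_secure_enclave": {"public_posts", "ad_library",
                              "engagement_metrics", "moderation_actions",
                              "recommender_logs"},
}

@dataclass
class AccessRequest:
    researcher_tier: str        # assigned by the independent vetting body
    requested_modules: set      # data modules named in the research proposal

def review(request: AccessRequest) -> tuple[bool, set]:
    """Grant only the modules allowed at the researcher's vetting tier."""
    allowed = MODULES_BY_TIER.get(request.researcher_tier, set())
    denied = request.requested_modules - allowed
    return (not denied, denied)

ok, denied = review(AccessRequest("vetted", {"ad_library", "recommender_logs"}))
print(ok, denied)  # False {'recommender_logs'}: needs the secure-enclave tier

A tiered scheme like this keeps lower-risk data broadly accessible while routing sensitive modules (e.g. recommender logs) through stricter security controls, which matches the information-security concern noted above.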

Meeting with Eleonora Ocello (Cabinet of Commissioner Thierry Breton)

29 Mar 2023 · Disinformation and DSA