EU DisinfoLab

EU DisinfoLab is a non-profit organisation focused on researching and tackling disinformation campaigns targeting the European Union.

Lobbying Activity

Meeting with Dan Barna (Member of the European Parliament)

9 Dec 2025 · Conference: Election Integrity and Foreign Interference in Romania, Moldova and Poland: Is There a Better Way to Shield Our Democracy?

EU DisinfoLab: Digital Fairness Act must protect fundamental rights

22 Oct 2025
Message — The group urges the Commission to rethink the scope of this initiative. They want digital harms addressed as fundamental rights issues rather than consumer problems.
Why — This change would allow the group to better tackle harmful disinformation campaigns.

EU DisinfoLab warns against weakening digital rules for simplification

14 Oct 2025
Message — The group demands that existing laws be thoroughly implemented across the European Union. They reject labeling digital regulations as burdens, viewing them as essential for democracy.
Why — Consistent enforcement helps the organisation protect citizens and democratic values from information disorders.
Impact — Citizens would face higher risks, while media companies would suffer from unchecked tech influence.

EU DisinfoLab urges sustainable funding and anti-SLAPP protections

4 Sept 2025
Message — They request mandatory compensation for expertise and simplified EU grant processes. They also seek multi-year financial visibility and robust legal support.
Why — These changes would improve their financial sustainability and protect their operational independence.
Impact — Litigants who use abusive lawsuits to silence civil society actors would lose.

Response to European Democracy Shield

26 May 2025

We welcome the opportunity to provide input to the Commission. As our raison d'être is to study disinformation, we have almost a decade's worth of analysis on our website disinfo.eu. For example, we have a resource page on the coordinated disinformation campaign known as Doppelganger: https://www.disinfo.eu/doppelganger-operation/. We also have a general resource page at https://www.disinfo.eu/resources/. Rather than list all of our reports and analyses here, we recommend that you consult our site and/or contact us with any additional questions.
Read full response

Meeting with Marie-Helene Boulanger (Head of Unit Justice and Consumers) and Make.org

29 Apr 2025 · Consultation of civil society representatives in the context of the preparation of the upcoming European Democracy Shield (“focus group”)

Meeting with Nathalie Loiseau (Member of the European Parliament, Committee chair)

24 Mar 2025 · Fight against disinformation

EU DisinfoLab: Prioritise Countering Disinformation in Security Strategy

11 Mar 2025
Message — Coordinated campaigns designed to mislead and misinform must be prioritised by the Internal Security Strategy. Disinformation networks are systematically infiltrating AI training data to propagate false narratives through a tactic called LLM grooming.
Why — This would strengthen the organisation's efforts to protect core democratic values from foreign interference.
Impact — Foreign disinformation actors would lose the ability to manipulate users by exploiting artificial intelligence training data.

Meeting with Irene Roche Laguna (Head of Unit Communications Networks, Content and Technology)

11 Mar 2025 · Meeting to discuss recent activities of DG Connect in the implementation of the DSA related to information integrity.

Meeting with Diana Vlad-Calcic (Cabinet of Vice-President Věra Jourová), Marie Frenay (Cabinet of Vice-President Věra Jourová)

20 Sept 2024 · Preparation event

Meeting with Marie Frenay (Cabinet of Vice-President Věra Jourová)

7 Sept 2023 · Disinformation

Response to Delegated Regulation on data access provided for in the Digital Services Act

31 May 2023

Following is a summary of the EU DisinfoLab contribution. Please consult the attachment for more details and explanations.

Data access must be provided in a way that is straightforward and user-friendly. Researchers need to understand how sharing mechanisms on the platforms work. Online interface models such as CrowdTangle should be enhanced, replicated across different platforms, and made truly accessible to the public. We also need to understand if, why, and how a piece of content has been recommended. In particularly urgent cases, researchers should be able to request access to specific information contained in private accounts, always maintaining strict protocols that safeguard privacy in accordance with the GDPR. Data access should also include comments (and not simply posts). When requested to do so, platforms should map and publish threat indicators. Data access should also shed light on content moderation practices, including the overall number of reported content items that were not moderated and the reasons why, and it should include aggregated statistics on monetisation practices. Furthermore, it is of the utmost importance to create a repository of archived takedowns, removed ads, and any relevant harmful content linked to disinformation campaigns, so that researchers can access it for future investigations. Access to a strike-history archive would also be relevant, and, in order to identify repeat offenders, it would be helpful to know whether a particular user has multiple accounts.

The Digital Services Coordinators (DSCs) will validate requests by evaluating both the researchers and the viability of the data request itself. In case of rejection, researchers must be clearly notified of the reasons so that they can improve future requests, and they must be given a right to appeal. We also encourage the creation of a centralised database that collects all data requests, including rejected ones, together with a justification of why data access was denied and the final DSC decision. A template clarifying the elements a request must contain should be developed in order to standardise data access requests and ensure objectivity in evaluations. A two-track system should be introduced in which vetted researchers maintain their status for a longer period, while others can still be vetted on a project basis. Civil society organisations, but also journalists, could qualify as vetted researchers. We strongly favour the creation of an independent intermediary body that could help build a community of experts, develop common review standards, and run a certification process for the platforms' databases, codebooks, and technical systems.

Questions arise with regard to Article 40(8)(b). We strongly believe that independence from commercial interests should not imply a limitation for CSOs that conduct research because they received public or private funding to do so. Another point that calls for caution is Article 40(8)(c), regarding the need to disclose research funding in the application: we believe that this sensitive information should be shared exclusively with the DSCs or the independent body performing vetting, rather than being made publicly available. As for greater opportunities for accessing data, we favour a broader interpretation of Article 40(4) that allows researchers to obtain information not linked to the EU; as the internet is without borders, we need to consider the cross-national nature of foreign interference.
Regarding the technical specifications of data access interfaces, we support the standardisation of API access across platforms, the adoption of a common machine-readable data exchange format, and the anonymisation of sensitive data. Scraping should also be recognised as a valid method of obtaining data in the public interest. In conclusion, we want to stress the importance of a flexible approach in this delegated act, one that allows the structure to evolve and adapt to the needs of those doing research.
Read full response
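
To make the submission's call for a common, machine-readable request template and standardised API access more concrete, here is a minimal illustrative sketch. The DataAccessRequest structure, its field names, and the example values are hypothetical assumptions of ours, not part of the delegated act or of EU DisinfoLab's submission; they simply show the kind of elements a shared template might capture so that Digital Services Coordinators could evaluate requests against the same objective criteria.

    # Hypothetical illustration only: a minimal, machine-readable data access
    # request template of the kind the submission argues for. Field names and
    # structure are assumptions, not the DSA delegated act's actual schema.
    from dataclasses import dataclass, asdict
    from typing import List
    import json

    @dataclass
    class DataAccessRequest:
        researcher_name: str
        affiliation: str                       # university, CSO, or newsroom
        research_purpose: str                  # systemic-risk question studied
        platforms: List[str]                   # services the data is requested from
        data_scope: List[str]                  # e.g. posts, comments, moderation logs
        time_range: str                        # ISO 8601 interval
        privacy_safeguards: str                # GDPR-compliant handling measures
        funding_disclosed_to_dsc: bool = True  # shared with the DSC, not published
        appeal_contact: str = ""               # where a rejection can be appealed

        def to_json(self) -> str:
            """Serialise to a common machine-readable exchange format."""
            return json.dumps(asdict(self), indent=2)

    # Example request a vetted researcher might file with a Digital Services
    # Coordinator under such a template (all values invented for illustration).
    request = DataAccessRequest(
        researcher_name="Jane Doe",
        affiliation="Example Civil Society Organisation",
        research_purpose="Spread of a coordinated disinformation campaign",
        platforms=["platform-a", "platform-b"],
        data_scope=["posts", "comments", "removed ads", "takedown archive"],
        time_range="2023-01-01/2023-03-31",
        privacy_safeguards="Pseudonymised identifiers; no private-account content",
    )
    print(request.to_json())

A shared structure along these lines would also make it straightforward to build the centralised database of accepted and rejected requests that the submission recommends.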

Meeting with Didier Reynders (Commissioner)

24 Apr 2023 · Defence of democracy

Meeting with Eleonora Ocello (Cabinet of Commissioner Thierry Breton)

12 Apr 2023 · DSA, EMFA

Meeting with Marie Frenay (Cabinet of Vice-President Věra Jourová)

12 Apr 2023 · European Media Freedom Act

Meeting with Werner Stengg (Cabinet of Executive Vice-President Margrethe Vestager)

5 Apr 2023 · Media Freedom Act

Meeting with João Albuquerque (Member of the European Parliament)

4 Apr 2023 · EMFA Regulation

Meeting with Irena Joveva (Member of the European Parliament, Shadow rapporteur)

21 Mar 2023 · European Media Freedom Act

Meeting with Věra Jourová (Vice-President) and Transparency International Liaison Office to the European Union

17 Mar 2023 · Defence of democracy package

Meeting with Alex Agius Saliba (Member of the European Parliament, Shadow rapporteur)

16 Mar 2023 · EMFA

Meeting with Tiemo Wölken (Member of the European Parliament)

1 Mar 2023 · European Media Freedom Act (staff level)

Meeting with Andrus Ansip (Member of the European Parliament, Shadow rapporteur)

1 Mar 2023 · Media Freedom Act

Meeting with Pedro Marques (Member of the European Parliament)

15 Feb 2023 · Transparency in the EU Institutions

Meeting with Bart Groothuis (Member of the European Parliament, Shadow rapporteur)

23 Jan 2023 · APA meeting on ING2

Meeting with Ramona Strugariu (Member of the European Parliament, Rapporteur for opinion)

8 Dec 2022 · European Media Freedom Act

Meeting with Frances Fitzgerald (Member of the European Parliament, Rapporteur) and EUROPEAN TRADE UNION CONFEDERATION

27 Oct 2022 · Proposal for a Directive on combatting violence against women and domestic violence

Meeting with Věra Jourová (Vice-President)

25 Oct 2022 · Disinformation

Meeting with Alexandra Geese (Member of the European Parliament)

19 Oct 2022 · Disinformation and the Russian invasion of Ukraine

Meeting with Daniel Braun (Cabinet of Vice-President Věra Jourová)

11 Oct 2022 · Disinformation

Meeting with Agnieszka Skonieczna (Cabinet of Commissioner Thierry Breton)

24 Jun 2022 · Disinformation; Media Freedom Act

Meeting with Lesia Radelicki (Cabinet of Commissioner Helena Dalli)

16 May 2022 · Gender-based disinformation issues

Meeting with Filomena Chirico (Cabinet of Commissioner Thierry Breton)

29 Mar 2022 · Fight against disinformation and DSA

Meeting with Arba Kokalari (Member of the European Parliament, Shadow rapporteur)

29 Mar 2022 · Digital Services Act and disinformation

Meeting with Stéphane Séjourné (Member of the European Parliament)

28 Mar 2022 · DSA (team)

Meeting with Daniel Braun (Cabinet of Vice-President Věra Jourová), Marie Frenay (Cabinet of Vice-President Věra Jourová)

15 Mar 2022 · Disinformation

Meeting with Christel Schaldemose (Member of the European Parliament, Rapporteur)

10 Mar 2022 · fight against disinformation

Meeting with Alexandra Geese (Member of the European Parliament, Shadow rapporteur) and Bureau Européen des Unions de Consommateurs

10 Mar 2022 · Digital Services Act

Response to Digital Services Act: deepening the Internal Market and clarifying responsibilities for digital services

31 Mar 2021

How the Digital Services Act (DSA) Can Tackle Disinformation (March 2021)

Executive Summary

The Digital Services Act (DSA) proposal is a step forward in the EU’s approach to creating a safer online space. The harmonisation of regulatory oversight and the introduction of due diligence obligations for online platforms will create a stronger incentive structure for companies to tackle illegal and harmful content. However, the DSA does not do enough to tackle disinformation. Disinformation is not merely a content moderation issue for social media platforms but a challenge endemic to many digital services. While the current regulation appears to understand disinformation-related risks as ‘intentional manipulation’ or ‘exploitation’ of the service, research continues to show that disinformation is linked to the design and innate characteristics of these services (virality, velocity, content optimisation, network effects), including their interplay with one another and with other internet services. By focusing so heavily on the largest actors, the DSA risks overlooking harmful content on smaller, fringe, alternative, and emerging services, as well as the ways in which these services are systematically combined and abused in disinformation campaigns.

Our key points:

➢ Bolster Risk Assessments. The Commission should consider altering Article 26 (“risk assessments for systemic risks”) to further specify the risks related to disinformation, together with a framework for assessing systemic risks related to disinformation, and to provide further obligations for all services faced with such risks.

➢ Expand “Know Your Business Customer” Obligations. The Commission should consider expanding the “Know Your Business Customer” (KYBC) rules to cover a wider range of digital services.

➢ More nuanced understanding of “vetted researchers”. Over the past five years, the field of disinformation research has grown beyond the university lab; the DSA risks excluding a large number of competent researchers by giving a monopoly on “vetted researcher” status to academics affiliated with a university.
Read full response

Response to Preventing and combatting gender-based violence

12 Jan 2021

EU DisinfoLab is an independent non-profit organisation focused on tackling sophisticated disinformation campaigns and documenting the disinformation phenomenon in Europe. Beyond our research, we aim to serve as a gathering place for experts and organisations to exchange best practices, cooperate, and develop new approaches to countering disinformation and misinformation. Drawing on open source intelligence techniques and social media monitoring, our researchers have paid close attention to gender-based and, in particular, misogynistic disinformation.

Violence against women must be prevented online as well as offline. Misogynistic disinformation exemplifies the porosity between the online and offline spaces and the need to eradicate violence against women in all its forms. Gender themes such as misogyny are strategic and recurrent in the disinformation landscape. Misogynistic disinformation can be understood as the dissemination of false or misleading information attacking women (especially political leaders, journalists and public figures), basing the attack on their identity as women. The techniques for diffusing gendered and misogynistic disinformation are diverse and include misogynist comments that reinforce gender stereotypes, sexualisation and the diffusion of graphic content, online harassment, trolling, and cyber-attacks. Misogynistic disinformation has the effect of perpetuating a negative perception of women in society: it undermines women’s credibility in occupying positions of power, discourages women from participating in public debate, and serves to silence women in general. Like other forms of disinformation, misogynistic and gendered disinformation take away agency from the groups they attack, whether by silencing dissenting voices or by suppressing the possibility to recognise, measure and address real problems.

In countries across the world, there are numerous examples of how online harassment is linked to gender-based violence. The COVID-19 pandemic has had a disproportionately negative impact on women’s rights and has seen a surge in domestic violence. According to a global poll by the Web Foundation, even before COVID-19, more than half of girls and young women had experienced online abuse. During the pandemic, one in five young women quit or reduced their use of social media, according to a survey last year by Plan International. Recent months have seen a surge in spyware, stalkerware and other online monitoring software used to abuse women. COVID-19 has pushed more people online, a trend unlikely to reverse, which means digital abuse against women will only intensify. Gendered disinformation is yet another form of digital violence that pushes girls and women out of digital spaces and hinders digital inclusion and digital opportunities.

Our researchers have tracked how misogynistic narratives have been retrieved and adapted to fit within the European mis- and disinformation landscape (https://www.disinfo.eu/publications/misogyny-and-misinformation%3A-an-analysis-of-gendered-disinformation-tactics-during-the-covid-19-pandemic). We are forced to conclude that disinformation taking aim at women is not merely a “women’s issue”; it has a detrimental effect on civil rights and democratic institutions as a whole. It also means a chilling of freedom of expression for women, gender-diverse people, and gender equity advocates.

A crucial first step in fighting misogynistic and gendered disinformation is to recognise these campaigns as a coherent and consistent phenomenon rather than a series of isolated events. Researchers should be attentive to this phenomenon and its implications. Meanwhile, European legislation addressing violence against women and domestic violence must take the disinformation threat into account among other forms of digital violence. There is a need for capacity building and support at the intersection of violence against women, gender equity, and the fight against disinformation.
Read full response

Meeting with Věra Jourová (Vice-President)

2 Oct 2020 · Conference - interview: Protecting European democracy

Response to European Democracy Action Plan

26 Aug 2020

EU DisinfoLab welcomes the Commission’s initiative to present a European Democracy Action Plan (EDAP), in particular the efforts to address disinformation. We call for a renewed EU-wide approach, moving from a sole focus on foreign interference during electoral periods to a holistic approach to disinformation that includes information manipulation by domestic and pan-European actors. When attribution can be established, it should be made public by EU institutions, and relevant sanctions should be considered. Any new regulatory initiatives against online information operations should be considered with great caution. Instead of regulating content, we should create the right conditions for resilience in society.

A. Funding a decentralised network of journalists, academics, fact-checkers and open-source investigators

This would be best achieved through a decentralised approach for a civil society ecosystem comprising journalists, academics, fact-checkers, and open-source investigators. New types of expertise, such as open source investigations, do not fit within the existing funding schemes for fact-checking or independent media. The EU must have an ambitious framework to fund this civil society ecosystem. This would guarantee the healthy participation and empowerment of independent organisations to both counter disinformation and hold platforms accountable to democratic principles. Their findings would eventually provide evidence for the competent authorities to act. We support renewing the Rights, Equality and Citizenship programme for this objective. This could take the form of small grants, similar to those available to startups, which would enable an active ecosystem tackling disinformation across Europe. In particular, funding should guarantee editorial freedom to prevent confirmation bias as well as any form of conclusions that could be perceived as censorship.

B. Guaranteeing the physical and psychological well-being of those tackling disinformation

Producing disinformation is cheap, but tackling it is expensive. Debunking disinformation through research, investigations and advocacy campaigns exposes staff to real threats, both physical and psychological, online and offline. For organisations working on disinformation, financial and physical security go hand in hand; therefore the EDAP, and any accompanying funding, must account for the risks involved in working in such a sensitive field.

C. Setting standards on data access and enforcing consistent definitions for platforms to respond to cases of information manipulation

On tackling disinformation, the Commission should apply the lessons learned from the implementation and assessment of the Code of Practice. In particular, standards should be set regarding access to data from the online platforms, along with consistent definitions and processes, such as a mechanism for online platforms to respond to reported cases of information manipulation. Regarding consumer protection, users should be notified when they have been exposed to dis- or misinformation. This should be accompanied by a clear oversight mechanism and possible sanctioning powers for the regulator.

D. Defining best practices for political campaigning and a clear distinction between disinformation and strategic communications

Disinformation should not become a regular political campaigning strategy. Political candidates should commit to respecting best practices for online campaigning, and funding should be conditioned on fair and transparent online campaigning.

Finally, the present roadmap conflates action against disinformation with strategic communications. Counter-narratives and strategic communications must be kept distinct from the overall fight against disinformation. Mixing the two, or even politicising the fight against disinformation, sets a dangerous precedent for the rule of law in Europe.
Read full response

Meeting with Filomena Chirico (Cabinet of Commissioner Thierry Breton) and Amnesty International Limited

15 Jul 2020 · Responsibility and accountability of online platforms

Meeting with Věra Jourová (Vice-President)

22 Jun 2020 · Supporting civil society in the fight against disinformation