Fundacja Obserwatorium Demokracji Cyfrowej

CEE Digital Democracy Watch

CEE Digital Democracy Watch is a Warsaw-based NGO focused on political content moderation and transparency.

Lobbying Activity

Response to Format, template and technical specifications of the labels and transparency notices of political advertisements

28 May 2025

We welcome the opportunity to provide feedback on the implementation of TTPAR (Transparency and Targeting of Political Advertising Regulation) labels and related transparency requirements for political advertising. We offer the following recommendations to ensure the clarity and effectiveness of the digital aspects of implementation, based on the experience of the current presidential campaign in Poland.

1. Integrity and Consistency of Labelling
To maintain trust in the labelling system, we recommend including guarantees that labels are not altered between the time of account verification by the platform and the broadcast of the ads. Furthermore, the publicly visible label should consistently display the same data as was provided during the verification period. This will help prevent discrepancies and ensure transparency for the public and regulators.

2. Transparency of Removed Ads
For the sake of research and public accountability, we urge that ads taken down by platforms for rule violations remain accessible in ad repositories, together with information on the amount of spending. This will support independent research and transparency, allowing for a better understanding of enforcement practices and the nature of infringing content.

3. Real-Time Visibility and Silence Periods
It is important that the visibility of labels is maximised in real time, not only on the ads themselves but also through ad repositories. This visibility should extend to election silence periods. We encourage the implementation of automatic shutdown mechanisms for restricted ads during these periods to uphold electoral integrity.

4. Alignment of TTPAR Labels with National Requirements
A clear relationship must be established between TTPAR labels and national requirements for labelling electoral content. This is essential to avoid the need for double labelling, where the same content would be labelled both within the ad itself and in online platform systems. Guidance should be provided to delineate the labelling responsibilities of platforms versus buyers.

5. Human Factor in Verification
We advise that the technical specifications clarify the Commission's position on the use of automated systems for vetting labels. We strongly encourage the inclusion of a human content moderator in the verification process, especially for ads with significant spending. Ideally, this moderator should speak the local language and understand the local context to ensure accurate and culturally sensitive labelling.

Meeting with Marie-Helene Boulanger (Head of Unit, Justice and Consumers) and EU DisinfoLab

29 Apr 2025 · Consultation of civil society representatives in the context of the preparation of the upcoming European Democracy Shield (“focus group”)

Meeting with Thomas Schmitz (Cabinet of Executive Vice-President Henna Virkkunen) and Make.org

4 Apr 2025 · Exchange of views on the recommendations published by the Democratic Shield Civil Society Task Force on the European Democracy Shield initiative

Meeting with Simona Constantin (Cabinet of Commissioner Michael McGrath) and Make.org

4 Apr 2025 · Exchange of views on the recommendations published by the Democratic Shield Civil Society Task Force on the European Democracy Shield initiative

Response to Protection of Minors Guidelines

30 Sept 2024

CEE Digital Democracy Watch is a Polish non-profit organisation dedicated to promoting responsible online discourse and advocating for a democratic future where regulation and free expression go hand in hand. We strive to ensure that fundamental online values are protected, particularly for young people. We recognise the significant effort involved in expanding the regulatory framework under the Digital Services Act (DSA) and the European Commission's emerging priorities, as indicated in the Political Guidelines, including the focus on mental health in digital spaces and the regulation of dark patterns. In this context, we believe that the overarching goal of European digital policy should be to ensure that young people are not excluded from meaningful connectivity. Such connectivity must allow for communication, entertainment, information access, online education, public services, and financial activities. To achieve this, the following areas must be addressed:

Quality of Information | One of the most pressing risks with long-term societal implications is the declining quality of online information. A Pew Research Center study showed that social media platforms increasingly serve as the primary news source for individuals aged 18 to 29. Furthermore, research by the Center for Countering Digital Hate revealed that a significant proportion of teenagers aged 13 to 17 exhibit concerning levels of belief in conspiracy theories, particularly regarding health and racialised narratives. For this reason, the development of a European Democracy Shield and any related framework addressing misinformation must prioritise the protection of minors. Such efforts should ensure impartial and transparent content reviews without depriving young people of access to valuable resources.

Safeguarding Communities | The internet should be recognised as a platform for community-building among young people, serving as an outlet for creativity and a means to engage in rich digital lives. Online connections are especially crucial for dispersed groups, such as LGBTQ+ youth, young community organisers, and providers of credible health information. These groups are at heightened risk of being deplatformed, shadow-banned, or subjected to non-transparent moderation practices. Their ability to connect and explore should not be curtailed, but rather encouraged. In addition, their culture and expressions face constant threats from national regulations that seek to suppress education and expression, as seen in recent years in countries like Bulgaria, Hungary, Lithuania or Poland.

Responsible Age Verification | The expansion of age verification is often presented as a catch-all solution for protecting minors online. However, this may not be effective if such measures require the collection of sensitive data, thereby infringing on the privacy and safety of both minors and adults. Proposals that seek to weaken encryption or implement age verification systems that track users and link them to specific content undermine the protections currently guaranteed by the EU legal framework. This also applies to national exceptions being proposed in several Member States.