Access Now Europe

Access Now is a global non-profit defending digital rights and secure communications for users.

Lobbying Activity

Meeting with Cecilia Strada (Member of the European Parliament)

17 Nov 2025 · Digital surveillance aspects within the proposal for Return Regulation and the upcoming revision of Europol/Frontex mandates

Access Now Warns Digital Omnibus Threatens Fundamental Rights

14 Oct 2025
Message — Access Now requests a full impact assessment and immediate clarity on proposed AI Act changes. They oppose any deregulation that weakens existing privacy and data protection standards.
Why — Preventing these changes preserves their ability to defend digital rights and privacy.
Impact — Users will face increased surveillance and lose the power to refuse tracking.

Meeting with Sergey Lagodinsky (Member of the European Parliament) and Future of Life Institute and Internet Society

19 Sept 2025 · Exchange of Views

Response to European Border and Coast Guard – update of EU rules

11 Sept 2025

We thank the European Commission for the opportunity to share our serious concerns regarding the proposed reform of the mandate of the European Border and Coast Guard Agency (hereinafter Frontex).

We see the gradual reinforcement of Frontex as an integral part of the problematic expansion of the EU security and surveillance industrial complex, which thrives by developing technologies of control and repression against migrant people. Efforts to increase state control over people on the move, including migrants, refugees and racialised people, have led to systemic and massive human rights violations at and within EU borders:

- In the past eleven years, the Missing Migrants Project has recorded the deaths of tens of thousands of people in the Mediterranean Sea.
- Pushbacks, a practice prohibited under international and EU law, have increased so sharply at Europe's external borders in recent years that they are described as a systematic practice and a central part of the European migration regime.
- Degrading, violent, inhumane and life-endangering treatment by national border guards is seriously neglected by the EU and concealed from the public.

Frontex is directly complicit in these human rights violations through its surveillance operations. The technologies it employs thus play a key role in the EU's violent and harmful system of border management and migration control. The level of responsibility of Frontex in human rights violations should call into question the very existence of this agency as part of the EU's migration policy framework. We are therefore extremely concerned about the stated aim of the reform to equip it with state-of-the-art technology for surveillance and situational awareness. The submission attached to this consultation contains analysis and recommendations regarding the initiatives mentioned in the call for evidence that fall within our scope of work, i.e. the rights and safety of racialised communities and migrants in the digital environment.

Our joint answer specifically touches on three points: 1) the dangerous proposal to weaken Frontex's data protection rules; 2) Frontex's structural and agenda-setting role in EU-funded research; and 3) the inherently unfit nature of any governance structure that legitimises Frontex's proven unlawful practices. The absence of comments on certain parts of the call shall not be interpreted as an endorsement.

Overall, we want to highlight that Frontex suffers from severe distrust and a lack of transparency, accountability and legitimacy, issues that are completely absent from the list of problems the initiative aims to tackle. It is particularly striking that most of the feedback shared by migrant rights organisations during the evaluation of the Regulation does not feature in the list of gaps. We strongly encourage the Commission to take into account the input and calls of the communities and groups affected by Frontex's operations as part of this consultation process. These views on the current and emerging problems and challenges related to Frontex's tasks, activities, deployments, structure and governance should be centred in the Commission's impact assessment.

This answer was written with the contribution of the following human rights organisations, members of the #ProtectNotSurveil coalition: European Digital Rights (EDRi), Equinox Initiative for Racial Justice, Access Now, AlgorithmWatch, Platform for International Cooperation on Undocumented Migrants (PICUM), and Border Violence Monitoring Network (BVMN). The #ProtectNotSurveil coalition's mission is to challenge the use of digital technologies at different levels of EU policy and to advocate for people's ability to move and to seek safety and opportunity without risking harm, surveillance or discrimination.

Meeting with Alex Agius Saliba (Member of the European Parliament)

16 Jul 2025 · Democratic Tech Alliance

Meeting with Cecilia Strada (Member of the European Parliament)

3 Jun 2025 · Digital and surveillance issues in the reform of the Europol Regulation and the new Return proposal

Meeting with Birgit Sippel (Member of the European Parliament, Rapporteur) and European Digital Rights

14 May 2025 · Democratic Tech Alliance

Meeting with Alexandra Geese (Member of the European Parliament, Shadow rapporteur) and Bureau Européen des Unions de Consommateurs

14 May 2025 · Event: Democratic Tech Alliance

Meeting with Markéta Gregorová (Member of the European Parliament, Rapporteur)

20 Feb 2025 · GDPR and human rights protection

Meeting with Markéta Gregorová (Member of the European Parliament, Rapporteur) and noyb - European Center for Digital Rights

20 Jan 2025 · GDPR enforcement regulation discussion

Meeting with Alexandra Geese (Member of the European Parliament)

27 Nov 2024 · Digital policies, human rights agenda

Meeting with Tineke Strik (Member of the European Parliament) and Amnesty International Limited

16 Oct 2024 · LIBE Civil Society Meeting

Meeting with Daniel Freund (Member of the European Parliament) and Amnesty International Limited

16 Oct 2024 · Cooperation on LIBE related matters

Meeting with Martin Hojsík (Member of the European Parliament)

4 Oct 2024 · Digital legislation

Meeting with Alexandra Geese (Member of the European Parliament) and European Digital Rights and ARTICLE 19

26 Sept 2024 · Digital Rights policy

Meeting with Leila Chaibi (Member of the European Parliament, Shadow rapporteur) and European Digital Rights and ARTICLE 19

26 Sept 2024 · Speaker at an event on digital policies in the EU

Meeting with Bruno Gencarelli (Cabinet of Commissioner Didier Reynders) and European Digital Rights

12 Sept 2024 · GDPR and data flows - state of play and future prospects.

Meeting with Věra Jourová (Vice-President)

8 Apr 2024 · Tech and Russian independent media

Meeting with Mohammed Chahim (Member of the European Parliament) and The German Marshall Fund of the United States - The Transatlantic Foundation

19 Mar 2024 · Transatlantic Tech Exchange

Meeting with Erik Marquardt (Member of the European Parliament, Shadow rapporteur) and European Digital Rights

13 Feb 2024 · Facilitators Package - stakeholders’ consultation meeting

Meeting with Lucrezia Busa (Cabinet of Commissioner Didier Reynders)

6 Feb 2024 · data and data protection

Access Now demands stricter Digital Services Act transparency standards

24 Jan 2024
Message — The group demands granular specifications for content types and more precise definitions of illegal content. They request platforms disclose the exact legal basis for all content restrictions.
Why — Standardised reporting helps the organization monitor human rights impacts and platform compliance.
Impact — Tech companies lose reporting flexibility and must reveal internal methodology for counting human resources.

Meeting with Eleonora Ocello (Cabinet of Commissioner Thierry Breton) and European Digital Rights and Electronic Frontier Foundation

13 Dec 2023 · Digital policy

Meeting with Maurits-Jan Prinz (Cabinet of Commissioner Thierry Breton)

9 Nov 2023 · AI Act and fundamental rights

Meeting with Sergey Lagodinsky (Member of the European Parliament, Rapporteur) and European Digital Rights

25 Oct 2023 · GDPR Enforcement Regulation

Meeting with Maria-Manuel Leitão-Marques (Member of the European Parliament, Shadow rapporteur) and Centre for Democracy & Technology, Europe

15 Sept 2023 · Political Advertising

Meeting with Brando Benifei (Member of the European Parliament, Rapporteur) and Bureau Européen des Unions de Consommateurs

6 Sept 2023 · AI Act and fundamental rights impact assessment

Meeting with Birgit Sippel (Member of the European Parliament, Rapporteur)

13 Jul 2023 · ePrivacy (staff-level)

Meeting with Birgit Sippel (Member of the European Parliament, Rapporteur)

11 May 2023 · ePrivacy regulation (Staff-level)

Meeting with Petra Kammerevert (Member of the European Parliament, Shadow rapporteur) and European Digital Rights

27 Apr 2023 · APAs - European Media Freedom Act

Meeting with Wojtek Talko (Cabinet of Vice-President Věra Jourová)

13 Apr 2023 · Transparency of Political Advertising Regulation

Meeting with Alexandra Geese (Member of the European Parliament, Shadow rapporteur) and European Digital Rights

27 Mar 2023 · Political advertising

Access Now Urges Stronger Rules for Equal GDPR Enforcement

22 Mar 2023
Message — Access Now requests that new rules apply to both national and cross-border cases. They recommend creating a common complaint form and guaranteeing the fundamental right to be heard.
Why — Harmonized procedures would ensure all European citizens effectively enjoy the same data protection rights.
Impact — Tech companies would lose the ability to exploit procedural differences to delay enforcement actions.

Meeting with Marcel Kolaja (Member of the European Parliament, Shadow rapporteur for opinion) and Meta Platforms Ireland Limited and its various subsidiaries

8 Mar 2023 · Discussion about the protection of children online and encryption

Meeting with Alexandra Geese (Member of the European Parliament, Shadow rapporteur) and European Partnership for Democracy

28 Feb 2023 · Event: Transparency in the European Elections

Meeting with Birgit Sippel (Member of the European Parliament)

17 Feb 2023 · AI Act (Staff-level)

Meeting with Daniel Freund (Member of the European Parliament, Shadow rapporteur for opinion)

3 Feb 2023 · European Media Freedom Act (staff-level)

Access Now urges rejection of media privilege in EMFA

23 Jan 2023
Message — Access Now calls for the rejection of Article 17 to prevent rogue actors gaining privileged treatment. They recommend focusing on Digital Services Act enforcement and broadening the spyware definition.
Why — This would protect users by ensuring consistent content moderation rules under the Digital Services Act.
Impact — State-controlled propaganda outlets lose the ability to bypass moderation through the self-declaration system.

Meeting with Deirdre Clune (Member of the European Parliament, Shadow rapporteur) and European Digital Rights

12 Jan 2023 · Artificial Intelligence Act & Migration

Meeting with Paul Tang (Member of the European Parliament, Shadow rapporteur for opinion)

14 Nov 2022 · Staff Level: Political Advertising Regulation

Meeting with Birgit Sippel (Member of the European Parliament) and Chaos Computer Club e.V.

8 Nov 2022 · Biometric surveillance

Meeting with Gwendoline Delbos-Corfield (Member of the European Parliament) and noyb - European Center for Digital Rights

16 Sept 2022 · GDPR enforcement

Meeting with Marcel Kolaja (Member of the European Parliament, Shadow rapporteur for opinion)

28 Jun 2022 · discussion about political advertising

Meeting with Paul Tang (Member of the European Parliament, Shadow rapporteur for opinion) and European Partnership for Democracy

21 Jun 2022 · Shadows meeting on Political Advertising with stakeholders

Meeting with Maria-Manuel Leitão-Marques (Member of the European Parliament) and European Digital Rights and Platform for International Cooperation on Undocumented Migrants

11 May 2022 · AI Act

Meeting with Alexandra Geese (Member of the European Parliament, Shadow rapporteur) and Bureau Européen des Unions de Consommateurs

10 Mar 2022 · Digital Services Act

Meeting with David Cormand (Member of the European Parliament) and Human Rights Watch

3 Mar 2022 · AI

Meeting with Marc Tarabella (Member of the European Parliament)

18 Jan 2022 · Digital Services Act (DSA)

Meeting with Stéphane Séjourné (Member of the European Parliament)

27 Sept 2021 · DSA (staff-level)

Response to Requirements for Artificial Intelligence

5 Aug 2021

Access Now welcomes the European Commission’s pioneering proposal for a regulatory framework for artificial intelligence. We have consistently pointed to the insufficiency of ethics guidelines and self-regulatory approaches, and have long called for regulatory intervention in the field of AI. The current Proposal provides a workable framework to ensure the protection of fundamental rights, but it requires significant modifications in a number of areas or it risks failing to achieve that objective.

Broadly speaking, the main aim of the Proposal is to regulate so-called high-risk AI systems, namely those which pose a risk of harm to health and safety or a risk of adverse impact on fundamental rights. Title III of the Proposal “contains specific rules for AI systems that create a high risk to the health and safety or fundamental rights of natural persons,” and a list of high-risk AI systems is provided in Annex III. The Proposal also contains provisions to prohibit certain AI practices in Article 5, which the Explanatory Memorandum says “comprises all those AI systems whose use is considered unacceptable as contravening Union values.” Additionally, the Proposal contains transparency obligations for certain AI systems such as chatbots and emotion recognition systems.

As we argue in the attached PDF document, a number of shortcomings prevent the Proposal from adequately achieving the aim of protecting fundamental rights. The definitions of emotion recognition and biometric categorisation contain technical flaws which must be addressed: without defining these applications of AI correctly, we cannot hope to address them adequately in regulation. Section II therefore proposes alternative definitions of both of these terms, as well as a discussion of the severe risks they pose to fundamental rights.

Having defined these two terms correctly, we then argue in Section III that they should both be prohibited. Section III begins with a discussion of the four existing prohibitions in Article 5 of the Proposal. We point out flaws in the formulations of the prohibitions in Article 5, paragraphs 1(a), 1(b), and 1(c), and propose alternative formulations. We then provide justifications and proposed formulations for a number of additional prohibitions. Finally, we argue that Article 5 must be supplemented with a list of criteria defining ‘unacceptable risk’, so that other applications of AI can be added to the list of prohibited practices if evidence emerges that they pose unacceptable risks, and that a mechanism must be added to Article 7 to allow additional practices to be added to Article 5.

In Section IV we address the relative lack of obligations placed on ‘users’ of AI systems as compared to those placed upon ‘providers’. We suggest that additional obligations be placed upon users, including mandating that some form of impact assessment is carried out for all high-risk AI systems. In Section V, we discuss the need to extend the scope of the publicly viewable database of high-risk AI systems proposed in Article 60. Section VI raises a number of concerns about the current proposal for AI regulatory sandboxes in Articles 53 and 54, and makes recommendations for a number of modifications, including that the use cases under paragraph (i), namely those related to law enforcement applications of AI, be removed from Article 54. Section VII addresses a number of gaps in the enforcement and redress mechanisms provided by the Proposal and, finally, in Section VIII we note a number of additional concerns which need to be addressed.

While the current Proposal provides a workable framework for regulating harmful applications of AI, it requires serious modifications in a number of areas.
We look forward to working with the co-legislators and other stakeholders in the coming months to ensure that these issues are addressed.

Response to Declaration of Digital Principles

9 Jun 2021

We welcome the opportunity to provide feedback to help shape the future "Declaration of Digital Principles – the ‘European way’ for the digital society". Please find Access Now's submission attached.

Response to Digital Services Act package: ex ante regulatory instrument of very large online platforms acting as gatekeepers

29 Apr 2021

Access Now welcomes the opportunity to provide comments on the European Commission’s proposal for a Digital Markets Act (DMA). We share the Commission’s assessment that gatekeeper platforms’ behaviour leads to “inefficient market outcomes in terms of higher prices, lower quality, as well as less choice and innovation to the detriment of European consumers.” We support the goal set by the Commission to address these problems through legislation. However, we find that the proposed DMA does not respond adequately to the challenges at stake and leaves some important gaps. In particular, we recommend that lawmakers: 1. extend the scope of the DMA beyond digital markets; 2. address the impact of gatekeepers not only on consumer rights but also on users’ human rights; and 3. provide remedies that address, and do not entrench, potential data protection violations by platforms.

With regard to the scope of the DMA, we support the objective of setting up measures for gatekeepers to comply with up front, rather than letting detrimental market conditions develop. Several online platforms hold significant market power as a result of massive data collection and harvesting practices, which they leverage to assert power in one or more markets, exacerbating lock-in and network effects as well as information asymmetries. This power is not limited to digital markets. In fact, the boundaries between digital and non-digital markets are increasingly hard to draw, and platforms often operate in both. Some platforms may use the large market power they hold in one or more traditional markets to further entrench their dominance in a digital market, and vice versa. By limiting the scope of the DMA to digital markets, the EU may not be able to comprehensively address the power and impact that gatekeepers hold. We therefore recommend ensuring that the DMA has a horizontal scope.

Second, the DMA largely focuses on addressing issues arising from the impact of gatekeepers on business users and, to a lesser extent, on consumers. The focus should be on protecting consumers’ rights, and consumers should not be treated as passive actors in markets. In addition, the DMA must consider the impact of gatekeepers on people’s human rights. A few large platforms act not only as “economic” gatekeepers but also as “human rights” gatekeepers: the rules they put in place for the use of their services shape how people exercise their rights in the digital ecosystem, in particular the rights to freedom of expression and information and to privacy. Moreover, various practices by gatekeepers exploit people’s data, which, given the market power they hold, leads to lower levels of protection across markets. It also affects business users’ compliance with privacy and data protection requirements, as they become dependent on data-harvesting gatekeepers to be able to offer goods and services. Legislators should amend the DMA to strengthen consumer and user rights and ensure that compliance with human rights regulations is taken into account when assessing gatekeepers’ impact on markets.

Finally, the remedies proposed under the DMA to address gatekeepers’ power and impact should strengthen human rights. We welcome the proposal in Article 6.1.a that would limit gatekeepers’ ability to use data from business users, although this language could be strengthened to limit both access and use. However, we are concerned by the proposal in Article 6.1.i that would create an incentive for further disclosure of users’ data between gatekeepers and business users as a way to improve competition. Many gatekeepers’ data practices are contrary to the principles of data minimisation and purpose limitation, despite these being guaranteed by EU law. Encouraging disclosure of data, rather than addressing the underlying data violations that gave gatekeepers an unfair competitive advantage, would perpetuate violations of users’ rights and limit the incentive to change privacy-invasive business models.

Response to Legislative framework for the governance of common European data spaces

29 Jan 2021

Access Now welcomes the opportunity to provide comments on the European Commission’s proposal for a Data Governance Act (DGA), to inform the upcoming legislative debate in the European Parliament and in Council. The DGA marks an objectionable shift in the European Commission’s approach to the governance of personal data, in which the focus moves from empowering people to empowering the data economy. With the General Data Protection Regulation (GDPR) and the ePrivacy Directive, the European Union designed a data governance model rooted in privacy and data protection principles, including data minimisation and the protection of people’s fundamental rights. With the DGA, the Commission proposes an unjustified change towards a data-driven economy and opens the door to the monetisation of certain practices.

To avoid conflicting rules and the risk of undermining the protection of personal data, we recommend removing personal data from the scope of the DGA, as the GDPR already provides avenues and mechanisms for the sharing of data. If legislators choose to go forward with the law’s current scope, we ask lawmakers in the Council and the European Parliament to increase privacy and data protection safeguards in the DGA throughout the upcoming legislative process. The provisions on data altruism and re-use currently overlap with or contradict the GDPR, and must be aligned with the data protection acquis. Particular attention must be given to the definition of data altruism in Article 2(10) and the measures on re-use in Article 5. While the Commission seeks to increase the exchange of data, the GDPR’s mechanisms for control, empowerment, and the information rights granted to data subjects must continue to apply and prevail.

Regarding Article 5, we note that the ‘de-identification release-and-forget model’ outlined in Article 5(3) cannot satisfy the GDPR’s standards for anonymisation, and that even the secure environments proposed in Article 5(4) have been shown to be vulnerable to re-identification methods. Anonymisation, pseudonymisation, and even the use of secure environments are never free from vulnerabilities: today’s state of the art will always be vulnerable to developments in re-identification techniques, meaning that personal data cannot be securely shared for re-use by any of these means in the long term. We therefore recommend that personal data, or data from which personal data may be inferred, not be shared for re-use under the measures set out in Article 5.

Following the shift we note in the European Commission’s approach to governing data, from data protection to a data-driven economy, Article 6 of the DGA authorises public bodies to charge a fee for the re-use of data. As authorising the re-use of personal data could become a source of income, it could reduce incentives to limit data processing or to better protect information. To limit these risks, at a minimum, data defined under Article 3.1.d (personal data) should not be covered by the scope of Article 6.

Finally, data protection authorities should be tasked with overseeing the measures concerning data altruism, data re-use, and the use of consent, as well as any provisions relating to the processing of personal data foreseen under the DGA. The competent bodies and authorities referred to in Articles 7, 12 and 20 should be the data protection authorities. We recall that the DPAs and the European Data Protection Board (EDPB) should be tasked with drawing up the ‘data altruism consent form’ described in recitals 39, 41 and 42 and Article 22, rather than the European Commission, which should instead be consulted. Lastly, as the DGA creates a European Data Innovation Board, which would include the EDPB, the EDPB and its members should be responsible for the tasks described under Article 27(a), as they relate to the application of data re-use measures.

Please find our detailed position attached.

Response to Commission Implementing Decision on standard contractual clauses for the transfer of personal data to third countries

10 Dec 2020

Access Now welcomes the opportunity to provide feedback to the draft standard contractual clauses for the transfer of personal data to third countries pursuant to Regulation (EU) 2016/679 of the European Parliament and of the Council, through the public consultation organised by the European Commission. Our detailed comments can be found attached. We remain available for any questions you may have.

Meeting with Thierry Breton (Commissioner)

9 Dec 2020 · Roundtable with NGOs on DSA and DMA

Meeting with Daniel Braun (Cabinet of Vice-President Věra Jourová)

7 Oct 2020 · European Democracy Action Plan

Response to Requirements for Artificial Intelligence

8 Sept 2020

Access Now opposes making the uptake of AI an objective of a potential regulatory intervention. Our opinion on a legislative proposal will be based on an assessment of whether it ensures adequate safeguards for the protection and promotion of fundamental rights, including societal impacts. All points outlined here are dealt with more substantially in our White Paper response, uploaded here.

Access Now recommends regulatory option 4, combining options 3.1 and 3.3. The EU legislative instrument should have one part covering all AI applications (3.3), in order to enforce a basic level of transparency and due diligence, and a further component focusing on specific categories of AI applications (3.1), for which there should be an outright ban. We expand on these distinctions below.

On the provisions covering all AI applications (3.3), we note that all private companies have existing responsibilities to carry out human rights due diligence. Large companies have the resources to do this, and through the establishment of national centres of expertise, SMEs can be helped to carry out due diligence procedures for all AI applications. For all public sector use of AI applications, human rights impact assessments (HRIAs) should be carried out, and the results and technical details made available in public registers of AI systems. Such registers should contain publicly accessible information about all AI systems being used or proposed for use in the public sector. Further, a mechanism must be established for contesting the results of impact assessments. In all cases, the possibility of scrapping or not using the system in question must be on the table if the result of a HRIA shows that the system undermines the fundamental rights of those affected by it.

The legislative instrument must also contain provisions for a ban on certain applications of AI. As Access Now has repeatedly stated, a ban is necessary where mitigating any potential risk or violation is not enough and no remedy or other safeguarding mechanism could fix the problem. We provide details in our uploaded document.

Additional points: The document overstates the case when it says that AI can “contribute to a wide array of economic and societal benefits across the entire spectrum of industries and social activities.” There is no evidence that AI can make meaningful contributions “across the entire spectrum”, and overselling the positive impact of this technology is ultimately harmful: it undermines trustworthiness, as those expectations cannot be met.

On the definition of ‘AI’, we believe that what matters is regulating the impacts of the technology, rather than the specific technical processes underlying machine learning or neural networks, for example. Where the specific functioning of a given system requires special consideration, it should be given; otherwise we must look at AI as a form of automation and apply generally applicable measures.

We support the acknowledgement that SMEs must also fall under the scope of legislation, due to the scalability of AI applications. We further note that some of the most harmful developments in AI have come from SMEs and ‘innovative startups’, such as Clearview AI and PimEyes, both of which develop surveillance systems that are incompatible with fundamental rights.

The document mentions that “not a lot of currently valid evidence is available at this stage” for the effects of AI systems. This is not the case. Access Now has already provided the Commission with such evidence, and a new report from AlgorithmWatch adds more in relation to COVID-19: ‘Automated Decision-Making Systems in the COVID-19 Pandemic.’
The document mentions targeted consultations organised with “technical experts, conformity assessment bodies, standardisation bodies and experts on biometric data.” We request that consultations also be arranged with civil society organisations and with representatives of groups likely to be adversely affected by these systems.

Response to External dimension of the EU policy on Passenger Name Records

7 Sept 2020

Access Now welcomes the opportunity to provide feedback on the European Commission’s roadmap on the external dimension of the EU policy on Passenger Name Records (PNR). We support the Commission’s assessment that “the Court of Justice’s Opinion on the EU-Canada PNR agreement constitutes a central element of the evidence base” for the upcoming Communication to update the EU policy framework in relation to international PNR data transfers. On this basis, we make recommendations for the Commission to take into account in the three domains it has identified.

- EU bilateral relations with third countries: We recommend that the Commission suspend the existing PNR agreements and pause further negotiations with additional third countries. The EU Pilot Project on the “Fundamental rights review of EU data collection instruments and programmes” found that the PNR measures of the international agreements in place in the EU do not follow the criteria established by the CJEU in Opinion C-1/15: 1. broad data retention mandates, in terms of retention duration and/or the scope of the data list covered; 2. unclear or unspecified measures to limit access to retained data by authorities, and/or a lack of appropriate safeguards on data security; 3. authorisation to process sensitive data, or failure to adequately prevent the possibility of such processing. (For more information, see: http://www.fondazionebrodolini.it/sites/default/files/final_report_0.pdf and https://www.accessnow.org/border-surveillance-europes-pnr-ruling-means-privacy/)

- International trends in the use of PNR: The Commission should ensure that the outcomes of the negotiations at the International Civil Aviation Organisation to draft new PNR standards are in line with EU law and do not lead to new obligations on the EU and its Member States. In particular, the EU cannot adhere to standards that would contradict Opinion C-1/15 of the CJEU or lower the level of data protection guaranteed in the EU, both for data processed in the EU and in the context of international transfers. As several cases regarding the legality and validity of PNR measures are before the CJEU, the EU should refrain from agreeing to new obligations on PNR until the Court has provided guidance as to which measures are lawful under EU primary law.

- EU internal legal framework: Several cases questioning the legality of specific measures of the PNR Directive, and of the instrument as a whole, are before the CJEU. Ahead of these rulings, the Commission could take action to address known shortcomings of the law. In particular, in its June 2020 Communication on the way forward on aligning the former third-pillar acquis with data protection rules, the Commission notes that, following the recommendations of the aforementioned Pilot Project, the EU PNR Directive should be amended to bring it in line with the Police Directive. Beyond this issue, the same Pilot Project noted significant shortcomings of the EU PNR Directive in relation to the protection of the rights to privacy and data protection; for instance, a number of measures fail to meet the standards of necessity and proportionality. In its report on the review of the PNR Directive, the Commission failed to demonstrate the necessity, or indeed the efficiency, of the PNR measures. It noted that “it may be difficult to single out the exact impact that the use of PNR data has had in each specific case”, yet still reaffirmed that authorities in Europe found the measures to be “useful”. “Usefulness”, however, is not a legal standard. In the absence of this evidence, and given the list of shortcomings identified, the Commission should repeal the EU PNR Directive.

We look forward to continuing to engage with the Commission to assist in the reform of the PNR instrument.
To complement our submission, we would also like to point to the joint feedback EDRi provided to this consultation, which Access Now endorses.
Read full response

Response to European Democracy Action Plan

27 Aug 2020

Access Now considers “nurturing, protecting and strengthening democracy” a primary goal of our policy work in the protection of digital rights. We agree with the Commission’s view that European democracies are seriously challenged, as became evident during the COVID-19 pandemic. Online misinformation and disinformation are not unique to this health crisis; the global crisis did, however, demonstrate how hasty and short-sighted solutions in content governance endanger fundamental rights. All objectives and problems described by the EDAP are intertwined. Our response focuses on how platform business models and content recommender systems affect media freedom and pluralism, the integrity of political opinion formation, and engagement with online content, including the amplification of disinformation.

Access Now urges the EU to shift its focus from swift removals of online content hosted by online gatekeepers to how content is actually distributed and amplified for platforms’ profit. Content recommender systems deployed by online platforms to personalise users’ experience contribute significantly to the amplification of potentially harmful content, including the spread of disinformation and misinformation online. Other commercial, for-profit, and strategic decisions are built into the algorithmic curation of content behind the scenes, without the awareness of online users or scrutiny by public authorities. Algorithms determine what users will see, which information will be prioritised, and which pieces of content will be excluded. Large online platforms increasingly rely on content recommender systems that study patterns of user behaviour to predict what someone will prefer among a collection of information. They personalise how content is offered to a user, which may have a detrimental impact on democratic discourse and the diversity of information.
Even if online platforms incorporate some level of diversity into content recommender systems, their primary goal is to engage the user and increase profits, not to promote democratic debate. Recommender systems may also have unintended consequences for society. The European Union should ensure that content distribution tools are sufficiently transparent and that effective remedy and redress mechanisms are always easily accessible and available to online users.

In this regard, Access Now proposes a tiered approach to transparency and control consisting of three main elements: 1) user-centric transparency that empowers users to control the information they receive and impart; 2) meaningful transparency necessary for effective public oversight; and 3) a data access framework allowing researchers, experts and civil society organisations to study how content is curated, informing evidence-based policy making. This approach must go hand in hand with robust enforcement of EU data protection law, in particular in the area of adtech.

The initiatives under the EDAP should be aligned with the reforms to be covered by the DSA legislative package, including minimum transparency and accountability requirements imposed on online platforms. The legislative framework should allow for third-party auditing of algorithmic decision making and mandate human rights impact assessments carried out by independent stakeholders, such as civil society organisations and public authorities. The EDAP should ensure that these requirements apply horizontally, to tackle the described issues in a systematic way. The EDAP should also include specific EU actions against propaganda by public officials and governments. Finally, to achieve content diversity, the main policy goal should be to create the conditions under which users can actually find and choose between diverse content themselves.
Meaningful transparency, as proposed in the position paper attached to this submission, is essential to achieving this goal.
Read full response

Response to Advance Information on Air Passengers

29 Jul 2020

Access Now welcomes the opportunity to provide feedback to the European Commission’s public consultation on the impact assessment on possible revised rules on advance air passenger information. We support the Commission’s objective of reforming the rules on the use of API, as the current Directive does not provide adequate safeguards for the protection of personal data. We also question the necessity of adding a new objective allowing the processing of API for “law enforcement” purposes. Our comments focus on these two main issues.

Regarding the need to update the API Directive to bring it in line with the EU data protection acquis, we agree with the gap analysis in the Commission’s inception impact assessment, which indicates that “the list of data fields included in the Directive is a minimum list; in addition, the method to collect and verify data, as well as the moment to transmit the data, are not specified.” For an interference with fundamental rights, including the right to data protection, to be justified, the European Court of Human Rights has established that legislation must meet the foreseeability criterion, which requires measures to be clear and precise. An open list of data fields does not fulfil this criterion, nor does the absence of specification as to how data shall be processed. The API Directive pre-dates the adoption of the EU’s modern data protection framework, which consists of the GDPR and the so-called Police Directive. The entry into application of these instruments may require changes in the way airlines and authorities process personal data, as clearer and more robust obligations now apply.

Second, the API Directive currently “allows” Member States to decide whether or not to use this data for law enforcement purposes, but it does not create rules around such use. The Commission seeks to add these purposes to future API rules, but it has not provided information, let alone evidence, regarding their necessity.
Yet this processing would restrict fundamental rights, and any such restriction must meet the criteria of the necessity and proportionality test. Adding these purposes to the scope of API rules also risks duplicating measures set forth in the EU Passenger Name Records Directive: several data fields falling under the scope of API are already processed under the PNR Directive for law enforcement purposes. The Commission is aware of these risks, as reflected in the inception impact assessment: “the PNR Directive includes provisions on the use of API data for law enforcement purposes. This partial overlap creates inconsistencies and uncertainty for both data subjects and national authorities on which data are collected and for which purpose.” Having two instruments creating obligations on the use of API data fields for law enforcement purposes would lead to further uncertainty and is contrary to the Commission’s better regulation guidelines.

The inception impact assessment states that “preliminary results of the currently ongoing review of the PNR Directive show the usefulness to combine API and PNR data in order to strengthen the reliability and effectiveness of PNR data as a law enforcement tool.” “Usefulness”, however, is not a legal standard, and the necessity and proportionality of these measures in the context of these purposes must be demonstrated. If the Commission has identified an issue with the processing of PNR data leading to false positives, it may need to reform the PNR Directive to address this and other serious shortcomings related to the protection of fundamental rights. Several cases are pending before the Court of Justice of the EU to evaluate the compliance of the PNR Directive with the EU Charter and the necessity of its measures; one of these cases also raises questions related to the API Directive.
We look forward to continuing to engage with the Commission to assist in upgrading the API and PNR rules and bringing them in line with EU data protection law.
Read full response

Response to Legislative framework for the governance of common European data spaces

29 Jul 2020

Access Now welcomes the opportunity to provide feedback to the European Commission’s public consultation on the inception impact assessment on the governance of common European data spaces. We support the Commission’s goal of making more data usable for the common good, for research and innovative uses, in compliance with EU data protection rules. Following up on our answer to the larger consultation on the European Data Strategy (attached), our comments focus on the concept of “data altruism” and the role of “data intermediaries”.

We reiterate that the EU already has “mechanisms for data governance at European level which may support data-driven innovation”, at least for the protection and the free flow of personal data under the General Data Protection Regulation. In particular, the requirements for data protection by design and by default enshrined in Article 25 of the GDPR are designed to enable models of sustainable data practices and to foster privacy- and data-protection-friendly innovation. The Commission should dedicate more resources to making this data governance model a reality.

The inception impact assessment notes that “There is a strong potential in the use of ‘consented data’ made available voluntarily by individual data subjects for the common good (‘data altruism’)”. It is unclear what the Commission means by “consented data” being “made available voluntarily”: is this an explanation of what consent means? A proposal to re-use “consented data” for another purpose? In any case, the GDPR sets forth conditions for consent to be valid, including that consent must be linked to a specific purpose, and it establishes limitations on the “further processing” of data. Beyond the question of compliance with the GDPR, the framing around the undefined concept of “data altruism” is problematic.
It suggests that personal data can be treated as a commodity for transactions, which stands at odds with the fact that personal data is protected not only under the GDPR but also under the EU Charter of Fundamental Rights. It is also at odds with the nature of data as information. The governance and regulatory models of data should follow the logic and principles of constitutional and fundamental rights rather than concepts borrowed from property law (see also Access Now’s submission to the consultations on the Digital Services Act package). We must also consider that data subjects are constantly forced into giving up their data (“forced consent”), knowingly or unknowingly, through the misleading data collection and data sharing practices of some companies. The EU must prioritise the application and enforcement of the GDPR and ensure that people can adequately give consent to the use of their data. This could entail further research into data governance models to complement individual enforcement of data subject rights, but it should not be conflated with the described concept of “data altruism”.

Finally, the role of “data intermediaries” is briefly discussed in the inception impact assessment. These intermediaries would potentially hold significant power over how data spaces are managed and information is disclosed. A previous consultation presented “data brokers” as innovative services that could help businesses grow in this area. A large number of data brokers, however, operate in the dark and/or have been linked to countless data protection and privacy abuses and human rights violations. Given the significant negative impact some of these companies have on human rights, it would be unacceptable for the Commission to rely on these actors to develop a data strategy for the “common good”. We look forward to continuing to engage with the Commission to assist in the implementation of data governance mechanisms deriving from EU data protection law.
To do so, we recommend relying on guidance from the European Data Protection Board and its members instead of developing hard law on the concept discussed above.
Read full response

Response to Digital Services Act: deepening the Internal Market and clarifying responsibilities for digital services

30 Jun 2020

Access Now supports the European Commission’s effort to establish a new legislative framework regulating online platforms’ procedural obligations in content governance, while modernising the current legal regime of the e-Commerce Directive. There is a strong need for an updated formal legal framework in the EU that guarantees legal certainty, legitimacy, and harmonisation of regimes across the Member States. To secure the protection of the right to freedom of expression, a formal legal instrument must contain protective safeguards established through a democratic process that respects the principles of multistakeholderism and transparency and is subject to public debate.

The future legislation regulating and harmonising content governance in the EU must be foreseeable and accessible. The formal legal framework should have a clearly defined scope, define the associated procedures, such as notice-and-action, and set high transparency standards for both states and online platforms. Most importantly, the legal framework must reinforce a clear distinction between the obligations of states and the responsibilities of private actors to protect users’ fundamental rights and freedoms.

Access Now welcomes the European Commission’s commitment to preserving the existing liability rules of the e-Commerce Directive for online platforms. Those private actors whose market dominance has elevated them to the role of online gatekeepers should bear greater responsibilities. This gradual scaling of responsibilities should be determined by a set of criteria identifying platforms that hold significant market power while being in an extraordinary position to shape and influence public discourse.
The European Commission should focus on regulating and holding gatekeepers accountable for how content is amplified and targeted, especially when this is accompanied by network and user lock-in effects that raise serious human rights concerns for users and negative implications for democratic discourse.

Access Now supports the idea of effective notice-and-action mechanisms for illegal content. Importantly, different types of illegal online content and activity will require different responses, specifically tailored to the type of user-generated content they are supposed to tackle. The DSA legislative package has to clearly define these procedures and provide appropriate safeguards for their implementation by the Member States. We urge the European Commission to uphold the prohibition on imposing general monitoring obligations on online platforms, and to establish minimum requirements for meaningful transparency for both the content moderation tools and the content recommendation systems deployed by online platforms, including user-centric algorithmic transparency and data access for public oversight bodies, researchers, and civil society.

‘Harmful’ content online should be left outside the scope of the DSA, as this concept is inherently vague and may lead to human rights abuses. Furthermore, the concept of ‘legal but harmful’ user-generated content poses a serious challenge to the legality principle. The European legislator should make sure that content moderation and content distribution tools are sufficiently transparent and that effective remedy and redress mechanisms are always easily accessible and available to online users. The DSA legislative package should establish minimum transparency and accountability requirements for online platforms, especially when platforms enforce their terms of service to regulate content.
More specifically, the legislative framework should allow for third-party auditing of algorithmic decision making and mandate human rights impact assessments carried out by independent stakeholders, such as civil society organisations and public authorities.
Read full response

Response to Digital Services Act package: ex ante regulatory instrument of very large online platforms acting as gatekeepers

30 Jun 2020

On behalf of the civil society organisations that have signed this statement, Access Now calls on the European Commission to consider human rights issues in its competition policies and potential regulatory actions. We believe that this small number of large online platforms act not only as economic gatekeepers, but also as ‘fundamental rights’ gatekeepers. Through their business models, terms of service and community guidelines, these platforms set standards in the market with regard to, among other things, consumers’ rights to privacy, data protection and freedom of expression. The impact of these platforms’ behaviour and business models on the guarantee of fundamental rights in the digital single market is a major challenge for the EU, and the European Commission should include it in its understanding of the problem it aims to fix with these welcome initiatives. Please find attached our “Joint statement in response to the inception impact assessments on a new competition tool and ex ante regulatory instrument for large online platforms acting as gatekeepers”.
Read full response

Response to New competition tool

30 Jun 2020

Access Now welcomes the opportunity to provide feedback to the European Commission’s public consultation on the impact assessment on the New Competition Tool (NCT). We support the Commission’s objective of exploring possibilities to intervene proactively in digital and non-digital markets to address anti-competitive behaviour that harms innovation and users’ rights, including the rights to privacy and data protection. We encourage the Commission to develop an NCT with a market-structure-based approach and a horizontal scope (Option 3).

Digital innovations have the potential to increase users’ benefits; however, a number of large platforms are becoming gatekeepers for many digital products and services, and even for access to users. These platforms have sometimes, but not always, established dominant positions. In any case, they hold significant market power as a result of massive data collection and harvesting practices that lead to a concentration of power in one or more markets, exacerbating lock-in and network effects as well as information asymmetries. By designing an NCT that focuses on market structure rather than dominance, the Commission will be able to address the competitive shortcomings of different markets more comprehensively. In addition, the NCT should have a horizontal scope instead of focusing on specific digital markets. The boundaries between digital and non-digital markets are increasingly hard to draw, and platforms often operate in both. What is more, some platforms may leverage the large market power they hold in one or more traditional markets (such as advertising) to further entrench their dominance in a digital market (such as social media).
We further encourage the Directorate-General for Competition and competition authorities to work closely with data protection authorities and the European Data Protection Board in determining the impact of mergers and acquisitions on markets and users’ rights. In accordance with EU primary and secondary legislation, personal data should be controlled by data subjects, not by a company. This means, for instance, that in the case of a merger between two companies, users of company A should be able to refuse to have their data shared with company B, and vice versa. By pooling databases together, mergers not only have an impact on competition but also affect users’ data protection rights. This is why we are asking the Commission to block the Google/Fitbit merger. This approach of considering both the data protection and competition impact of mergers is gaining ground among European regulators: recently, Germany’s Federal Supreme Court ruled in favour of Germany’s competition regulator, the Bundeskartellamt, which concluded that Facebook should stop combining users’ data from different sources, including WhatsApp and Instagram accounts.

We caution, however, against the development of any “data ownership” concept. We understand the appeal of treating personal data as property, given the asymmetrical relationship between people and the companies that profit from personal data. However, treating personal data like a commodity fails to recognise its nature as part of a fundamental right. We instead urge upholding data protection law. Finally, to complement our submission, we attached a joint letter in which we further highlighted the need for the Commission’s NCT to integrate criteria for measuring the impact on fundamental rights in the digital environment, beyond data protection and privacy.
Indeed, “we believe that (a) small number of large online platforms not only act as economic gatekeepers, but also as 'fundamental rights' gatekeepers.” We look forward to continuing to engage with the European Commission to assist in upgrading competition policies and tools in line with new market realities, in a way that better protects users’ rights and promotes innovation.
Read full response

Meeting with Věra Jourová (Vice-President) and

15 May 2020 · Fundamental Rights, EC response to COVID 19, GDPR, Artificial Intelligence

Meeting with Christiane Canenbley (Cabinet of Executive Vice-President Margrethe Vestager)

4 Feb 2020 · Digital policy

Meeting with Andrus Ansip (Vice-President) and

1 Mar 2018 · Privacy Shield, GDPR implementation and e-Privacy

Meeting with Carl-Christian Buhr (Cabinet of Commissioner Mariya Gabriel)

15 Feb 2018 · ePrivacy Directive and the EU cybersecurity strategy

Meeting with Věra Jourová (Commissioner) and

7 Mar 2017 · Privacy shield, GDPR

Meeting with Andrus Ansip (Vice-President) and

7 Mar 2017 · e-Privacy proposal, general data protection regulation-state of play, privacy shield and international transfer