Stiftung Digitale Chancen

SDC

The Stiftung Digitale Chancen (Digital Opportunities Foundation) is a non-profit, operationally active foundation.

Lobbying Activity

Response to Update of EU rules on audiovisual media services

18 Dec 2025

The German Digital Opportunities Foundation welcomes the European Commission's Call for Evidence for an Evaluation and Impact Assessment of the Audiovisual Media Services Directive (AVMSD). The aim of the AVMSD is to establish and ensure a level playing field in a single market for audiovisual media services in the European Union, while contributing to the promotion of cultural diversity and offering an adequate level of consumer and child protection. With this statement, we contribute particularly with regard to child protection. Our expertise and experience stem from the project "Child Protection and Children's Rights in the Digital World". Among other objectives, the AVMSD regulates (surreptitious) commercial communications. Since its introduction in 2010 and its last revision in 2018, the digital environment has developed considerably. Influencers, influencer marketing and the impact of influencing are on the rise. Children and adolescents increasingly follow influencers and their content, sometimes also seeing them as role models, trusting them and striving to become influencers themselves. A study among youths (aged 12 to 19) in Germany revealed that more than one third of the surveyed adolescents (35 per cent) had purchased a product promoted by an influencer. In this context, almost two thirds (64 per cent) mistakenly thought influencers would label product placements or recommendations as commercial communication. One in four young people (26 per cent) felt that influencers were their friends, and almost one in five adolescents (18 per cent) wanted to become an influencer themselves. Many influencers promote and endorse products or lifestyles in areas such as beauty, clothing, health and/or sports, which can lead to false assumptions and unhealthy habits. But more than one in three young people (37 per cent) also follow influencers covering politics and other current affairs.
Against that backdrop, it is obvious that rules and guidelines need to be established for influencer content produced and/or received in the European Union, especially content targeted at underage children and youth. Article 32 CRC obliges States Parties to protect children from commercial exploitation. In its General Comment No. 25 (2021) on children's rights in relation to the digital environment, the Committee on the Rights of the Child describes how the rights of minors should be realised online and how they can be protected from commercial exploitation without restricting their access to the internet. According to paragraph 41, States parties should make the best interests of the child a primary consideration when regulating advertising and marketing addressed to and accessible to children: "Sponsorship, product placement and all other forms of commercially driven content should be clearly distinguished from all other content [...]". Therefore, we encourage the European Commission to revise the AVMSD and make the protection of minors from influencing and other manipulative commercial practices a new objective of the Directive. In doing so, it also seems necessary to regulate influencer marketing not only from the perspective of the consumption of such content by children and young people, but also to set rules for children or adolescents being involved in such practices by adults or becoming influencers themselves.

Meeting with Cathrin Bauer-Bulst (Cabinet of Commissioner Magnus Brunner) and ECPAT Deutschland e.V.

1 Jul 2025 · Protection of children from sexual abuse – situation in the EU and in Germany

Response to European Democracy Shield

22 May 2025

The German Digital Opportunities Foundation welcomes the European Commission's initiative to establish a European Democracy Shield. Human rights and democracy are shared values that constitute our European society. Access to reliable information, freedom of expression and the right to freedom of assembly and association, as laid down in the United Nations Universal Declaration of Human Rights, are the foundation of social cohesion and a precondition for equal participation. Both human rights and democracy are under pressure nowadays. Disinformation, hate speech and influencing based on false assumptions manipulate opinion building and undermine trust and confidence in democratic processes and public institutions. Against this backdrop, it is necessary to increase efforts and investment in maintaining the integrity of our democracy and the protection of human rights. Growing up and living in a society increasingly steeped in digital media provides opportunities, but at the same time risks and dangers arise. To navigate safely and realise one's fundamental rights in such digital environments, media literacy is a precondition. Therefore, education and training in this regard for European citizens, children and youth as well as adults and elderly people, is mandatory and should be accompanied by awareness-raising campaigns informing, for example, on current manipulation strategies. Democracy needs citizens who are able to critically scrutinise media content and distinguish between reliable information and misleading or false content. Building one's own well-informed opinion, be it a whole worldview or an opinion on a single issue, enables active and responsible participation in society. That is why the German initiative "Growing up well with media" demands a media literacy education campaign now. The wave of disinformation, untrue allegations, manipulative storytelling and hate speech is growing, fuelled by manipulative content created by AI tools.
This has a huge negative impact and decreases active user participation online as well as confidence in institutions. As a consequence, wrong assumptions about attitudes and values in society evolve and may even shift the perception of reality. Therefore, content moderation and fact checking are important cornerstones of a democracy shield, enabling contextualisation of information and providing users with orientation and support, in addition to raising their skills and literacy to cope with such content. Financial support needs to be provided for initiatives and campaigns countering disinformation and strengthening the abilities of users, which so far are mainly run by under-funded civil society organisations. Prevention of anti-democratic phenomena such as hate speech and dis- and misinformation is mandatory. In addition, a strong legal framework to guarantee human rights and freedoms as a basis to counteract such trends builds the second strategic pillar of a democracy shield. Already existing legislation needs to be enforced, and further regulation should be considered to balance freedom of speech with protection against online threats and harm and to provide mitigation and remedy. As a third pillar of the strategy, the provision of reliable, trustworthy and diverse information and content, created and provided online by a diverse landscape of accountable media and online platforms as well as digitally literate users, should be supported by means of funding and a regulatory framework. A Europe-wide network of counselling services, help- and hotlines offering support, guidance and remedy to users, especially those from vulnerable groups such as children, the elderly, people with disabilities and those most often subject to hate and discreditation, should be (further) financially sustained and promoted. Berlin, 22.05.2025

Response to Protection of Minors Guidelines

24 Sept 2024

The German Digital Opportunities Foundation welcomes the European Commission's initiative to issue guidelines on the protection of minors on the internet. These guidelines should provide recommendations and best practices to support online platform providers in reducing risks while ensuring a high level of protection for children. The guidelines should therefore be based on the Convention on the Rights of the Child and build on General Comment No. 25 on children's rights in relation to the digital environment. The best interests of the child must serve as a guiding principle for online platform providers to improve the protection of minors in their services. While children use the services of online platform providers to stay in touch with family and friends, to play or engage, to learn or simply to enjoy their leisure time, they are exposed to various risks that can lead to harm, e.g. solicitation by strangers, confrontation with explicit (sexual) content or commercial exploitation. Such risks can be increased by different dimensions of online activity. The 5C typology established by Livingstone and Stoilova is a comprehensive scheme to reflect these multiple dimensions. We encourage the European Commission to keep this model at the heart of the guidelines. Against the background of the constantly evolving digital environment and the complexity of potential risks and harmful phenomena, the German federal legislator established the objective of personal integrity in the second amendment of the Youth Protection Act in 2021. Although the term does not come with a final definition, an understanding of personal integrity has been established that includes the protection of physical and psychological integrity and personal data. In particular, the age-appropriate and future-open development as well as the informational and sexual self-determination of minors must be ensured.
Conversely, processes or designs that exploit the inexperience and/or age of a child, economic exploitation and the commercial or otherwise inadequate processing and dissemination of user data are contrary to the protection of personal integrity. This legal concept may also serve the European Commission's guidelines in enforcing the protection of minors online in order to adhere to the principle of the best interests of the child in digital environments. Although Art. 28, para. 3 DSA does not require platform providers to collect additional data from their users, age assurance mechanisms can contribute to creating age-appropriate digital environments for all users, thus ensuring that persons of age can access adult content while children can enjoy their right to communicate with their peers without being at risk of sexual harassment and grooming. The VOICE study 2024 shows that children prefer measures of age verification that take their protection and their privacy into account. More than half of the children are open to the use of measures to verify their age, while some are concerned that the data required for this could also be used for other purposes. Therefore, it seems necessary to invest in the development of age verification methods that are reliable and accurate in verifying a user's age, while respecting and protecting the user's anonymity and privacy. To evaluate whether children's rights are properly taken into account by online platform providers, a Child Rights Impact Assessment tool should serve as the method of choice. This method will be able to reflect whether the protection of children in the digital environment is enhanced while respecting, or at the cost of violating, other children's rights, such as provision and participation. In addition, the CWA 18016 "Age appropriate digital services framework" provides recommendations and best practices for creating platforms and services which are child-friendly, safe and secure.

Response to Combating child sexual abuse

21 May 2024

The German Digital Opportunities Foundation welcomes the European Commission's initiative to recast and amend the existing Directive. Since it came into effect, the diction and perception of some circumstances have changed, and CSA online has increased rapidly. On the whole we support this proposal, but we want to stress some points and raise the awareness of the European Commission regarding further improvement. We support that Article 2(3)(d) defines reproductions and representations also as CSAM. Understanding representation also in the sense of an avatar, we encourage the EC to extend the scope of the regulation, e.g. in Article 3, to virtual sexual abuse against an avatar of a child. Children growing up today do not distinguish between a physical and a digital world, recognising both as real. About 40 per cent of children consider their digital representation more important than their real one. Life-like 3D avatars and phenomena like embodiment will amplify this. Offences and crimes against one's own avatar will be experienced as offences against oneself. In Germany, the Youth Protection Act established the protective objective of personal integrity of a child in 2021. This legal concept needs to be concretised in order to provide an understanding that entails the protection of the physical and psychological integrity of a child as well as the protection of a child's personal data as their individual representation in a digital environment. We welcome the extension of Art. 6 to proposing to meet a child either online or in person, as thereby cyber-grooming will also be addressed. We recommend a reference to General Comment No. 25 on children's rights in the digital environment, para. 81 f. Article 10 addresses consensual sexual activities. In this context we welcome the specification regarding self-generated sexual material by children. In accordance with General Comment No. 25, para. 118, we would like the EC to adopt this notion.
With Article 16, the EC proposed to extend the limitation period for offences concerning sexual abuse. However, recent research shows that survivors often need a long time to disclose such crimes, and survivor organisations are therefore calling for limitation periods to be abolished entirely. We welcome the EC's proposal to further establish the role of the EU Centre to prevent and combat child sexual abuse, as foreseen in the draft CSA-R, Art. 40, also in the European Commission's Proposal for a Directive of the European Parliament and of the Council on combating the sexual abuse and sexual exploitation of children and child sexual abuse material. Nonetheless, we suggest choosing a more structural approach instead of mentioning this important resource seemingly arbitrarily in Art. 21, 25, 28 and 31. We are also afraid that the intention of the EC's proposal would fail if the CSA-R is not adopted. Last but not least, we appreciate the fact that the EC has already partially taken into account the terminology guidelines for the protection of children from sexual exploitation and sexual abuse adopted by the Interagency Working Group in Luxembourg in 2016, and we encourage their more comprehensive application.

Response to Amending of temporary derogation from certain provisions of Directive 2002/58/EC for combating online child sexual abuse

9 Feb 2024

The German Digital Opportunities Foundation welcomes the European Commission's initiative to extend the existing temporary derogation from specific provisions of Directive 2002/58/EC, keeping open the possibility of combating child sexual abuse after August 3, 2024 for a further two years. The current report of the European Commission on the implementation of Regulation (EU) 2021/1232 on the temporary derogation reveals that voluntary reporting contributed significantly to the protection of a large number of children (see page 33). With regard to the decisions of the European Parliament and the European Council on the duration of the designated extension, we highly encourage all European legislative bodies to strive for a solution in the upcoming trilogue that opens up an appropriate window of opportunity to ensure a long-term regulation aimed at protecting children from sexual abuse online. The EC's report mentioned above shows that in 2022, 8.9 million corresponding content items or accounts were identified by providers (see pages 7-8). Given that 6.6 million of the reports in 2022 were submitted by Meta, their announcement in December 2023 of implementing end-to-end encryption by default gives rise to the assumption that the number of reports will then decrease significantly, although child sexual abuse will at least stay at the same level or even increase. The designated extension of the temporary derogation until August 3, 2026 will provide the opportunity to address more thoroughly the so far highly contentious issues of the European Commission's proposal to combat and prevent child sexual abuse online. This time should be used efficiently to overcome seemingly insuperable positions, e.g. regarding detection in encrypted environments or privacy-preserving age verification. The draft regulation proposed by the EC combines measures to prevent and combat child sexual abuse while balancing the fundamental rights of children as well as of all users.
Efforts to strive for commonly accepted measures and policies are worthwhile. Many Member States and other stakeholders refuse detection in encrypted digital environments due to the assumption that this might open a backdoor which could be unlawfully exploited by criminals or unauthorised persons. Against this backdrop, it seems obvious to invest in research and development to address these legitimate concerns and find better solutions: for example, a locked backdoor in an encrypted service which could be opened only for a single purpose with a special key that is kept safe from unwarranted usage, e.g. by involving different parties to store the key and to permit controlled usage, which shall be logged for transparency reasons. Concerns regarding potential breaches of such a backdoor should be addressed appropriately, but they should not prevent us from striving for solutions to keep children safe in encrypted environments as well. Age assurance is the silver bullet for creating safe spaces online for children while allowing respective freedoms for adults. Knowledge of the age range a user belongs to is essential to provide them with age-appropriate functionalities and settings and, where applicable, with precautionary measures and support. In such a secure environment, the need for issuing a detection order would be significantly reduced. Nevertheless, the known methods of age verification are often devalued due to their intrusiveness through the collection of biometric and/or personal data, a potentially inaccurate age estimation or their possible circumvention through a false self-declaration of age. It therefore seems necessary and obvious to further develop tools that fulfil essential requirements such as guaranteeing the anonymity of the user and preventing their internet activities or history from being linked to a specific person.
Assuming such tools can be brought to market maturity within the next two years, this would create conditions for a future-proof regulation to combat child sexual abuse online.

Response to Integrated child protection systems

19 Oct 2023

The German Digital Opportunities Foundation welcomes the EU initiative to support the development and strengthening of integrated child protection systems in the EU. Nowadays children are growing up in an all-pervasive digital environment where they exercise their rights to information, association and play, but are also exposed to risks and harm, e.g. cyberbullying, hate speech, fake news, inappropriate content and contact such as grooming. In light of the EU Charter of Fundamental Rights and its Strategy on the Rights of the Child, the EU is setting new rules with the DSA, the DMA, the CSA-R and the Better Internet for Kids+ strategy to create a safer digital environment. In 2021, Germany amended the Federal Youth Protection Act, which strikes a good balance between children's rights to protection, provision and participation, in line with the UN Committee on the Rights of the Child's General Comment No. 25. The law has defined new protection goals that guarantee the personal integrity of children and adolescents and promote guidance for young people, parents and educators in media use and media literacy. It also ensures child participation in decision-making on child protection issues through the Advisory Board of the Federal Agency for the Protection of Children and Young People. In implementation of the DSA, the Federal Agency will be established as an independent body responsible for enforcing children's rights in digital services. However, since the DSA partly overrides national law, certain participatory concepts in the Youth Protection Act, such as the obligation for self-regulatory bodies to consult with children when developing guidelines for child safety in online services, will be abolished. We therefore recommend that the EC assess whether regulation at EU level may have an impact on already existing elements of child protection systems at national level.
Furthermore, regulation at EU level should not create obstacles but pave the way for efficient measures for child online safety provided by industry. Since knowledge of the age of every user is the precondition for age-appropriate design, referring to Art. 28 DSA we suggest the EC assess the option of developing and implementing an EU-wide approach to age verification that guarantees anonymity and adheres to the principle of data minimisation. In Germany, the governmental process for a new law to combat digital violence was started in 2023 in order to empower every user to exercise his or her rights in the digital environment. By independent court decision, victims of online harm will be able to learn the identity of the offender and thus be able to claim remedy and hold the perpetrator accountable. An option is also foreseen to block the account of a perpetrator for a meaningful period of time, depending on the harm and the impact on other users. In our opinion, the chosen approach has the potential to become part of integrated child protection systems and should be taken into consideration. In addition, we recommend that institutions be established and supported to raise knowledge and awareness and to promote media literacy at EU, national, regional and local levels. An integrated child protection system must be based on the principle of the best interests of the child as laid down in the EU Charter of Fundamental Rights, Art. 24, put the child at the centre and ensure that all relevant stakeholders work together to prevent abuse, exploitation, neglect and other forms of violence against children, offline and online, and to protect and support children. However, actors in the third sector are generally underfunded and understaffed and thus unable to unfold their full potential for child protection. Against this backdrop, the European Union is called upon to ensure permanent, sufficient funding and to encourage Member States to do so as well.

Response to Virtual worlds, such as metaverse

3 May 2023

The Digital Opportunities Foundation is a non-profit, operational foundation based in Berlin. Since 2002, we have been researching the social consequences of digitisation and advocating equal-opportunity access to the internet for all people. With numerous projects at national and European level, the foundation pursues the goal of digital inclusion, participation and equal opportunities, thus counteracting a digital divide in society. Based on our mission to ensure equal opportunities for all in a digital world, we welcome the European Commission's initiative on virtual worlds. We acknowledge the huge potential of virtual worlds for an innovative economy and growth. But we are also aware of the risks and threats that come along with real-time, immersive and persistent environments that blend physical and virtual realities. Society will only be able to harness the benefits of the upcoming technological transition when proper safeguards are in place. Therefore, we suggest starting the new initiative on virtual worlds with a holistic technology assessment process, taking into consideration the positive and negative effects of the developments ahead.

Meeting with Ylva Johansson (Commissioner)

20 Oct 2022 · European Commission proposal to fight child sexual abuse online

Response to Child sexual abuse online: detection, removal and reporting

12 Sept 2022

We expressly welcome the draft regulation, since it is the first time a law in the EU follows a fundamental and comprehensive child rights approach focusing on the primacy of the best interests of the child according to the EU Charter of Fundamental Rights, Art. 24(2). Among the options under consideration, the EC has chosen the most far-reaching proposal (E) with regard to the protection of children, addressing detection, reporting and deletion of both known depictions of abuse and "new" material, as well as the solicitation of children with sexual intent (grooming). The obligation to assess the risk of grooming within apps is a milestone in combating child sexual abuse online. The fight against CSAM must start with a risk assessment and risk mitigation measures by the service providers, since prevention is crucial. Only if and when these preventive operations turn out to be ineffective will the process potentially leading to a detection order be initiated. The regulation transparently outlines the necessary steps before a detection order is issued and the safeguards to eliminate the violation of fundamental rights as far as possible. We appreciate that detection orders are issued by a court or national authority based on a thorough validation process; any order will be limited in time and address only a certain type of content on the respective service. Research as well as law enforcement investigations give evidence that sexualised violence follows escalation pathways; the earlier these paths can be stopped, the better. We acknowledge the need to ensure the privacy of interpersonal communications; nonetheless, the regulation needs to take private chats into account, because this is where perpetrators initiate contact with children. The EC suggests that scanning technologies look for behavioural patterns suggestive of abuse, but do not analyse the actual content of the communication at first.
We trust in the structural safeguards foreseen in the draft regulation to prevent surveillance of all personal communication without cause. The draft recognises end-to-end encryption as an effective means of ensuring the confidentiality of communications and explicitly does not exclude it as an instrument (para. 26). The draft leaves it up to providers to choose the appropriate technology, but makes it unequivocally clear that providers are obliged to detect CSAM and grooming in their services. Our expectation would be for service providers to invest in the development and deployment of such technologies right now, so as to have them operational when the parliamentary process is, hopefully, finished in mid-2024 and the new regulation comes into force. We welcome the cooperative approach for the EU Centre, which is an important component of the regulation. The tasks foreseen for the Centre must be addressed at a transnational level without calling into question the work of national authorities. Only the time horizon of eight years until it is fully operational is too long; we expect the EC to undertake preparatory measures for the Centre to be ready to start in 2024. The Centre must operate independently of law enforcement, although in close cooperation with Europol. The interim derogation is a strong example of how to overcome the only seemingly insurmountable contradiction between privacy and child protection. Like all people, children have the inalienable right to privacy as laid down in the UN-CRC, Art. 16. They also have the right to protection from any form of exploitation (Art. 34-36). The approach of the regulation is justified by General Comment No. 25 on children's rights in relation to the digital environment, especially by its call for special protective measures in chapter 12. It also requires states, in para. 118, not to criminalise self-generated sexual content that children possess or share with their consent and solely for their own private use.
We suggest the EC review the draft to ensure that it is fully compliant with this paragraph as well.

Response to Delivering for children: an EU strategy on the rights of the child

5 Aug 2020

Statement on the EU strategy on children's rights from the German Digital Opportunities Foundation. The remit of the German Digital Opportunities Foundation (established 2002) is to research the social impact of digitisation, to campaign for equal access to the internet for all people and to advance digital literacy. Our objective is digital inclusion for all societal groups and counteracting the digital divide. The German Federal Ministry for Economic Affairs and Energy and the German Federal Ministry for Family Affairs have taken patronage of the Foundation. The foundation is registered in the European Transparency Register at http://ec.europa.eu/transparencyregister/public/consultation/displaylobbyist.do?id=948042627375-19 Since 2017, the German Digital Opportunities Foundation has run the project "Child Protection and Children's Rights in the Digital World", funded by the German Federal Ministry for Family Affairs. In this framework, we are glad to have the opportunity to comment on the Commission's initiative "Delivering for children: an EU strategy on the rights of the child". The world, and with it the living environment of children, has changed in manifold ways since the enactment of the UN Convention on the Rights of the Child in 1989. Coinciding with the elaboration process of the UN-CRC, Tim Berners-Lee developed the World Wide Web at the CERN research centre in Geneva. The code, initially designed for the exchange of information among researchers, has made the usage of the internet possible for everybody and thus shaped their way of living. Therefore, it is necessary to take a closer look at the now 30-year-old guidelines of the Convention with regard to the changes in society due to digitisation. A precondition for analysing the implications of digitisation for the living environment of children is a consistent understanding of the terms "child" and "digital environment".
According to the UN Convention on the Rights of the Child, a child means any person under the age of 18 years. We understand the term digital environment as more than just the internet. By analogy with the natural environment that people live in, the digital environment is experienced as an "online world", accessed via digital devices. It encompasses the interaction of an evolving offer of connected digital services (content, software and applications) from commercial, public and other providers. This includes all computing and digitally networked technologies and services, often referred to as ICTs, the Internet, the World Wide Web, mobile devices and networks, "apps", social media platforms, electronic databases, "big data", the "Internet of Things", "information society services", the media environment, online gaming, and any developments resulting in access to or services for the digital environment.

Our recommendations

We suggest basing the EU strategy on children's rights on the triangle of children's rights according to the UN-CRC. The rights accorded to children by the UN Convention on the Rights of the Child are based on the three "P"s: Provision, Protection and Participation. These three "P"s constitute the corners of a triangle with the best interests of the child placed at its centre. For a holistic approach to children's rights, the digital environment should not be understood as an annex to the world children live in; rather, it is interwoven with their everyday life. As stated previously, digitisation has an impact on all areas of children's lives. Therefore, a strategy to better promote and protect children's rights must reflect the impact of digitisation and cover all child rights areas. For a more detailed argumentation, please refer to the full text of our comment as uploaded.

Response to EU strategy for a more effective fight against child sexual abuse

3 Jul 2020

Based on the UN Convention on the Rights of the Child, Article 19 [Protection from all forms of violence] and Article 34 [Protection from all forms of sexual exploitation and sexual abuse], but also Article 13 [Freedom of expression], Article 15 [Freedom of association], Article 17 [Access to information; mass media], Article 28 [Right to education] and Article 31 [Leisure, play and culture], we consider several aspects relevant for the development of an EU strategy for a more effective fight against child sexual abuse (for a detailed argumentation please refer to the attached document). We suggest that the EU strategy address the following issues:

- Further specification of the imagery (still and video) to be combated (any means of depiction or promotion of sexual abuse of children)
  o In addition to known classifications of imagery, research on otherwise innocuous everyday-life images with sexualised comments, and development of a criteria catalogue to classify this type of imagery as well
- Further development of technical instruments for the identification of all types of imagery, including material in the grey area, in order to support industry’s efforts to combat such imagery
  o Automated generation of relevant keywords and phrases
  o Training of a classifier for the automated identification of images, including in the grey area
- Reinforced efforts to delete imagery (still and video), including grey-area material, and to deprive such imagery of its platforms of distribution
  o Improvement of hotline procedures
  o Pro-active search for imagery by web crawler technologies
  o Optimisation of notice-and-takedown procedures for all types of unlawful imagery
  o Raising awareness amongst hotlines (INHOPE members and beyond) of the necessity to identify and combat grey-area material as well
  o Raising awareness amongst industry (platform providers and intermediaries) and encouraging them to search actively for all types of imagery, including grey-area material, on their platforms and in their services
- Development of a strategy of systematic counter-measures for a targeted approach to combating all types of unlawful imagery, including grey-area material
  o Identification of the top countries of dissemination, the top platforms of dissemination and network nodes
  o Analysis of the findings of hotlines and industry from their search and removal activities: relevance of keywords, successful deletion, best-practice procedures
- Improved efforts to address (and monitor) the communication channels and platforms used by perpetrators to exchange material, and their activities there
- Empowerment of law enforcement
  o Increased resources (staff and technology)
  o Extended powers, including for covert investigations
- In order to protect children and adolescents from being victimised, establishment of legal obligations for platform and service providers
  o to apply the principle of safety by design in the development of their products, applications and services
  o to adhere to the principle of duty of care by implementing precautionary measures such as, for example:
    - provision of a notification and redress procedure by which users can register complaints regarding the impairment or endangerment of children or adolescents;
    - provision of a reporting and redress procedure with a user-guidance system suitable for children and young people, through which underage users in particular can report impairments of their personal integrity;
    - provision of technical means for controlling and accompanying the use of the services by persons in charge of the care of children and youth;
    - establishment of default settings which limit the risks of use for children and youth, taking their age into account
- Establishment of competent authorities to review the implementation of the principle of safety by design as well as the implementation, concrete form and adequacy of such precautionary measures to be taken by service providers