Chapter III – Due diligence obligations for a transparent and safe online environment (Art. 11-48)

Art. 11 DSA - Points of contact for Member States’ authorities, the Commission and the Board

Art. 12 DSA - Points of contact for recipients of the service

Art. 13 DSA - Legal representatives

Art. 14 DSA - Terms and conditions

Art. 15 DSA - Transparency reporting obligations for providers of intermediary services

Art. 16 DSA - Notice and action mechanisms

Art. 17 DSA - Statement of reasons

Art. 18 DSA - Notification of suspicions of criminal offences

Art. 19 DSA - Exclusion for micro and small enterprises

Art. 20 DSA - Internal complaint-handling system

Art. 21 DSA - Out-of-court dispute settlement

Art. 22 DSA - Trusted flaggers

Art. 23 DSA - Measures and protection against misuse

Art. 24 DSA - Transparency reporting obligations for providers of online platforms

Art. 25 DSA - Online interface design and organisation

Art. 26 DSA - Advertising on online platforms

Art. 27 DSA - Recommender system transparency

Art. 28 DSA - Online protection of minors

Art. 29 DSA - Exclusion for micro and small enterprises

Art. 30 DSA - Traceability of traders

Art. 31 DSA - Compliance by design

Art. 32 DSA - Right to information

Art. 33 DSA - Very large online platforms and very large online search engines

Art. 34 DSA - Risk assessment

  1. Providers of very large online platforms and of very large online search engines shall diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services.

They shall carry out the risk assessments by the date of application referred to in Article 33(6), second subparagraph, and at least once every year thereafter, and in any event prior to deploying functionalities that are likely to have a critical impact on the risks identified pursuant to this Article. This risk assessment shall be specific to their services and proportionate to the systemic risks, taking into consideration their severity and probability, and shall include the following systemic risks:

    1. the dissemination of illegal content through their services;
    2. any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity enshrined in Article 1 of the Charter, to respect for private and family life enshrined in Article 7 of the Charter, to the protection of personal data enshrined in Article 8 of the Charter, to freedom of expression and information, including the freedom and pluralism of the media, enshrined in Article 11 of the Charter, to non-discrimination enshrined in Article 21 of the Charter, to respect for the rights of the child enshrined in Article 24 of the Charter and to a high level of consumer protection enshrined in Article 38 of the Charter;
    3. any actual or foreseeable negative effects on civic discourse and electoral processes, and public security;
    4. any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.
  2. When conducting risk assessments, providers of very large online platforms and of very large online search engines shall take into account, in particular, whether and how the following factors influence any of the systemic risks referred to in paragraph 1:
    1. the design of their recommender systems and any other relevant algorithmic system;
    2. their content moderation systems;
    3. the applicable terms and conditions and their enforcement;
    4. systems for selecting and presenting advertisements;
    5. data related practices of the provider.

The assessments shall also analyse whether and how the risks pursuant to paragraph 1 are influenced by intentional manipulation of their service, including by inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.

The assessment shall take into account specific regional or linguistic aspects, including when specific to a Member State.

  3. Providers of very large online platforms and of very large online search engines shall preserve the supporting documents of the risk assessments for at least three years after the performance of risk assessments, and shall, upon request, communicate them to the Commission and to the Digital Services Coordinator of establishment.
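
For readers approaching Article 34 from a compliance-engineering angle, the sketch below shows one possible way a provider's tooling might record an assessment internally: the four systemic-risk categories of paragraph 1, the influencing factors of paragraph 2, and the annual cadence and three-year document retention that follow from paragraphs 1 and 3. It is purely illustrative; the Regulation prescribes no data model, and every class, field and file name in the snippet is hypothetical.

```python
# Illustrative only: the DSA does not prescribe any data format for risk
# assessments. All class and field names below are hypothetical.
from dataclasses import dataclass, field
from datetime import date, timedelta
from enum import Enum


class SystemicRisk(Enum):
    """The four categories of systemic risk listed in Article 34(1)."""
    ILLEGAL_CONTENT = "dissemination of illegal content"
    FUNDAMENTAL_RIGHTS = "negative effects on the exercise of fundamental rights"
    CIVIC_DISCOURSE_ELECTIONS_SECURITY = "negative effects on civic discourse, electoral processes and public security"
    GBV_HEALTH_MINORS_WELLBEING = "gender-based violence, public health, minors, physical and mental well-being"


class InfluencingFactor(Enum):
    """Factors to be taken into account under Article 34(2)."""
    RECOMMENDER_SYSTEMS = "design of recommender systems and other relevant algorithmic systems"
    CONTENT_MODERATION = "content moderation systems"
    TERMS_AND_CONDITIONS = "applicable terms and conditions and their enforcement"
    ADVERTISING_SYSTEMS = "systems for selecting and presenting advertisements"
    DATA_PRACTICES = "data related practices of the provider"


@dataclass
class RiskAssessmentRecord:
    """One completed assessment, together with its supporting documents."""
    performed_on: date
    risks_assessed: dict[SystemicRisk, str]            # findings per risk category
    factors_considered: dict[InfluencingFactor, str]   # analysis per factor
    regional_linguistic_aspects: list[str] = field(default_factory=list)
    supporting_documents: list[str] = field(default_factory=list)  # paths or references

    def retention_deadline(self) -> date:
        # Article 34(3): supporting documents must be preserved for at least
        # three years after the assessment was performed.
        return self.performed_on + timedelta(days=3 * 365)

    def next_assessment_due(self) -> date:
        # Article 34(1): assessments at least once a year (and additionally
        # before deploying functionalities with a critical impact on the
        # identified risks, which a simple date check cannot capture).
        return self.performed_on + timedelta(days=365)


if __name__ == "__main__":
    # Minimal usage example with placeholder content.
    assessment = RiskAssessmentRecord(
        performed_on=date(2024, 8, 1),
        risks_assessed={SystemicRisk.ILLEGAL_CONTENT: "see annex A"},
        factors_considered={InfluencingFactor.RECOMMENDER_SYSTEMS: "see annex B"},
        regional_linguistic_aspects=["Member-State-specific findings go here"],
        supporting_documents=["risk-assessment-2024.pdf"],
    )
    print(assessment.next_assessment_due(), assessment.retention_deadline())
```
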
Related recitals

Recital 79

Very large online platforms and very large online search engines can be used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. Effective regulation and enforcement is necessary in order to effectively identify and mitigate the risks and the societal and economic harm that may arise. Under this Regulation, providers of very large online platforms and of very large online search engines should therefore assess the systemic risks stemming from the design, functioning and use of their services, as well as from potential misuses by the recipients of the service, and should take appropriate mitigating measures in observance of fundamental rights. In determining the significance of potential negative effects and impacts, providers should consider the severity of the potential impact and the probability of all such systemic risks. For example, they could assess whether the potential negative impact can affect a large number of persons, its potential irreversibility, or how difficult it is to remedy and restore the situation prevailing prior to the potential impact.

Recital 80

Four categories of systemic risks should be assessed in-depth by the providers of very large online platforms and of very large online search engines. A first category concerns the risks associated with the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech or other types of misuse of their services for criminal offences, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including dangerous or counterfeit products, or illegally-traded animals. For example, such dissemination or activities may constitute a significant systemic risk where access to illegal content may spread rapidly and widely through accounts with a particularly wide reach or other means of amplification. Providers of very large online platforms and of very large online search engines should assess the risk of dissemination of illegal content irrespective of whether or not the information is also incompatible with their terms and conditions. This assessment is without prejudice to the personal responsibility of the recipient of the service of very large online platforms or of the owners of websites indexed by very large online search engines for possible illegality of their activity under the applicable law.

Recital 81

A second category concerns the actual or foreseeable impact of the service on the exercise of fundamental rights, as protected by the Charter, including but not limited to human dignity, freedom of expression and of information, including media freedom and pluralism, the right to private life, data protection, the right to non-discrimination, the rights of the child and consumer protection. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or by the very large online search engine or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. When assessing risks to the rights of the child, providers of very large online platforms and of very large online search engines should consider for example how easy it is for minors to understand the design and functioning of the service, as well as how minors can be exposed through their service to content that may impair minors’ health, physical, mental and moral development. Such risks may arise, for example, in relation to the design of online interfaces which intentionally or unintentionally exploit the weaknesses and inexperience of minors or which may cause addictive behaviour.

Recital 82

A third category of risks concerns the actual or foreseeable negative effects on democratic processes, civic discourse and electoral processes, as well as public security.

Recital 83

A fourth category of risks stems from similar concerns relating to the design, functioning or use, including through manipulation, of very large online platforms and of very large online search engines with an actual or foreseeable negative effect on the protection of public health, minors and serious negative consequences to a person’s physical and mental well-being, or on gender-based violence. Such risks may also stem from coordinated disinformation campaigns related to public health, or from online interface design that may stimulate behavioural addictions of recipients of the service.

Recital 84

When assessing such systemic risks, providers of very large online platforms and of very large online search engines should focus on the systems or other elements that may contribute to the risks, including all the algorithmic systems that may be relevant, in particular their recommender systems and advertising systems, paying attention to the related data collection and use practices. They should also assess whether their terms and conditions and the enforcement thereof are appropriate, as well as their content moderation processes, technical tools and allocated resources. When assessing the systemic risks identified in this Regulation, those providers should also focus on the information which is not illegal, but contributes to the systemic risks identified in this Regulation. Such providers should therefore pay particular attention on how their services are used to disseminate or amplify misleading or deceptive content, including disinformation. Where the algorithmic amplification of information contributes to the systemic risks, those providers should duly reflect this in their risk assessments. Where risks are localised or there are linguistic differences, those providers should also account for this in their risk assessments. Providers of very large online platforms and of very large online search engines should, in particular, assess how the design and functioning of their service, as well as the intentional and, oftentimes, coordinated manipulation and use of their services, or the systemic infringement of their terms of service, contribute to such risks. Such risks may arise, for example, through the inauthentic use of the service, such as the creation of fake accounts, the use of bots or deceptive use of a service, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination to the public of information that is illegal content or incompatible with an online platform’s or online search engine’s terms and conditions and that contributes to disinformation campaigns.

Recital 85

In order to make it possible that subsequent risk assessments build on each other and show the evolution of the risks identified, as well as to facilitate investigations and enforcement actions, providers of very large online platforms and of very large online search engines should preserve all supporting documents relating to the risk assessments that they carried out, such as information regarding the preparation thereof, underlying data and data on the testing of their algorithmic systems.

Art. 35 DSA - Mitigation of risks

Art. 36 DSA - Crisis response mechanism

Art. 37 DSA - Independent audit

Art. 38 DSA - Recommender systems

Art. 39 DSA - Additional online advertising transparency

Art. 40 DSA - Data access and scrutiny

Art. 41 DSA - Compliance function

Art. 42 DSA - Transparency reporting obligations

Art. 43 DSA - Supervisory fee

Art. 44 DSA - Standards

Art. 45 DSA - Codes of conduct

Art. 46 DSA - Codes of conduct for online advertising

Art. 47 DSA - Codes of conduct for accessibility

Art. 48 DSA - Crisis protocols