Chapter III – Due diligence obligations for a transparent and safe online environment (Art. 11-48)

Art. 11 DSA - Points of contact for Member States’ authorities, the Commission and the Board

Art. 12 DSA - Points of contact for recipients of the service

Art. 13 DSA - Legal representatives

Art. 14 DSA - Terms and conditions

Art. 15 DSA - Transparency reporting obligations for providers of intermediary services

Art. 16 DSA - Notice and action mechanisms

Art. 17 DSA - Statement of reasons

Art. 18 DSA - Notification of suspicions of criminal offences

Art. 19 DSA - Exclusion for micro and small enterprises

Art. 20 DSA - Internal complaint-handling system

Art. 21 DSA - Out-of-court dispute settlement

Art. 22 DSA - Trusted flaggers

Art. 23 DSA - Measures and protection against misuse

Art. 24 DSA - Transparency reporting obligations for providers of online platforms

Art. 25 DSA - Online interface design and organisation

Art. 26 DSA - Advertising on online platforms

Art. 27 DSA - Recommender system transparency

Art. 28 DSA - Online protection of minors

Art. 29 DSA - Exclusion for micro and small enterprises

Art. 30 DSA - Traceability of traders

Art. 31 DSA - Compliance by design

Art. 32 DSA - Right to information

Art. 33 DSA - Very large online platforms and very large online search engines

Art. 34 DSA - Risk assessment

Art. 35 DSA - Mitigation of risks

  1. Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:
    (a) adapting the design, features or functioning of their services, including their online interfaces;
    (b) adapting their terms and conditions and their enforcement;
    (c) adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;
    (d) testing and adapting their algorithmic systems, including their recommender systems;
    (e) adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;
    (f) reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;
    (g) initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;
    (h) initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;
    (i) taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;
    (j) taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;
    (k) ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.
  2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:
    (a) identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;
    (b) best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

  3. The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Related recitals

Recital 86

Providers of very large online platforms and of very large online search engines should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessments, in observance of fundamental rights. Any measures adopted should respect the due diligence requirements of this Regulation and be reasonable and effective in mitigating the specific systemic risks identified. They should be proportionate in light of the economic capacity of the provider of the very large online platform or of the very large online search engine and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on those fundamental rights. Those providers should give particular consideration to the impact on freedom of expression.

Recital 87

Providers of very large online platforms and of very large online search engines should consider under such mitigating measures, for example, adapting any necessary design, feature or functioning of their service, such as the online interface design. They should adapt and apply their terms and conditions, as necessary, and in accordance with the rules of this Regulation on terms and conditions. Other appropriate measures could include adapting their content moderation systems and internal processes or adapting their decision-making processes and resources, including the content moderation personnel, their training and local expertise. This concerns in particular the speed and quality of processing of notices. In this regard, for example, the Code of conduct on countering illegal hate speech online of 2016 sets a benchmark to process valid notifications for removal of illegal hate speech in less than 24 hours. Providers of very large online platforms, in particular those primarily used for the dissemination to the public of pornographic content, should diligently meet all their obligations under this Regulation in respect of illegal content constituting cyber violence, including illegal pornographic content, especially with regard to ensuring that victims can effectively exercise their rights in relation to content representing non-consensual sharing of intimate or manipulated material through the rapid processing of notices and removal of such content without undue delay. Other types of illegal content may require longer or shorter timelines for processing of notices, which will depend on the facts, circumstances and types of illegal content at hand. Those providers may also initiate or increase cooperation with trusted flaggers and organise training sessions and exchanges with trusted flagger organisations.

Recital 88

Providers of very large online platforms and of very large online search engines should also be diligent in the measures they take to test and, where necessary, adapt their algorithmic systems, not least their recommender systems. They may need to mitigate the negative effects of personalised recommendations and correct the criteria used in their recommendations. The advertising systems used by providers of very large online platforms and of very large online search engines can also be a catalyser for the systemic risks. Those providers should consider corrective measures, such as discontinuing advertising revenue for specific information, or other actions, such as improving the visibility of authoritative information sources, or more structurally adapting their advertising systems. Providers of very large online platforms and of very large online search engines may need to reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks, and conduct more frequent or targeted risk assessments related to new functionalities. In particular, where risks are shared across different online platforms or online search engines, they should cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. They should also consider awareness-raising actions, in particular where risks relate to disinformation campaigns.

Recital 89

Providers of very large online platforms and of very large online search engines should take into account the best interests of minors in taking measures such as adapting the design of their service and their online interface, especially when their services are aimed at minors or predominantly used by them. They should ensure that their services are organised in a way that allows minors to access easily mechanisms provided for in this Regulation, where applicable, including notice and action and complaint mechanisms. They should also take measures to protect minors from content that may impair their physical, mental or moral development and provide tools that enable conditional access to such information. In selecting the appropriate mitigation measures, providers can consider, where appropriate, industry best practices, including as established through self-regulatory cooperation, such as codes of conduct, and should take into account the guidelines from the Commission.

Recital 90

Providers of very large online platforms and of very large online search engines should ensure that their approach to risk assessment and mitigation is based on the best available information and scientific insights and that they test their assumptions with the groups most impacted by the risks and the measures they take. To this end, they should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. They should seek to embed such consultations into their methodologies for assessing the risks and designing mitigation measures, including, as appropriate, surveys, focus groups, round tables, and other consultation and design methods. In the assessment on whether a measure is reasonable, proportionate and effective, special consideration should be given to the right to freedom of expression.

Art. 36 DSA - Crisis response mechanism

Art. 37 DSA - Independent audit

Art. 38 DSA - Recommender systems

Art. 39 DSA - Additional online advertising transparency

Art. 40 DSA - Data access and scrutiny

Art. 41 DSA - Compliance function

Art. 42 DSA - Transparency reporting obligations

Art. 43 DSA - Supervisory fee

Art. 44 DSA - Standards

Art. 45 DSA - Codes of conduct

Art. 46 DSA - Codes of conduct for online advertising

Art. 47 DSA - Codes of conduct for accessibility

Art. 48 DSA - Crisis protocols