
Chapter III – Due diligence obligations for a transparent and safe online environment (Art. 11-48)

Art. 11 DSA - Points of contact for Member States’ authorities, the Commission and the Board

Art. 12 DSA - Points of contact for recipients of the service

Art. 13 DSA - Legal representatives

Art. 14 DSA - Terms and conditions

Art. 15 DSA - Transparency reporting obligations for providers of intermediary services

Art. 16 DSA - Notice and action mechanisms

Art. 17 DSA - Statement of reasons

Art. 18 DSA - Notification of suspicions of criminal offences

Art. 19 DSA - Exclusion for micro and small enterprises

Art. 20 DSA - Internal complaint-handling system

Art. 21 DSA - Out-of-court dispute settlement

Art. 22 DSA - Trusted flaggers

  1. Providers of online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 16, are given priority and are processed and decided upon without undue delay.
  2. The status of ‘trusted flagger’ under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, to an applicant that has demonstrated that it meets all of the following conditions:
    (a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content;
    (b) it is independent from any provider of online platforms;
    (c) it carries out its activities for the purposes of submitting notices diligently, accurately and objectively.
  3. Trusted flaggers shall publish, at least once a year, easily comprehensible and detailed reports on notices submitted in accordance with Article 16 during the relevant period. The report shall list at least the number of notices categorised by:
    (a) the identity of the provider of hosting services,
    (b) the type of allegedly illegal content notified,
    (c) the action taken by the provider.

Those reports shall include an explanation of the procedures in place to ensure that the trusted flagger retains its independence.

Trusted flaggers shall send those reports to the awarding Digital Services Coordinator, and shall make them publicly available. The information in those reports shall not contain personal data.

  4. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and email addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2 or whose trusted flagger status they have suspended in accordance with paragraph 6 or revoked in accordance with paragraph 7.
  5. The Commission shall publish the information referred to in paragraph 4 in a publicly available database, in an easily accessible and machine-readable format, and shall keep the database up to date.
  6. Where a provider of online platforms has information indicating that a trusted flagger has submitted a significant number of insufficiently precise, inaccurate or inadequately substantiated notices through the mechanisms referred to in Article 16, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 20(4), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. Upon receiving the information from the provider of online platforms, and if the Digital Services Coordinator considers that there are legitimate reasons to open an investigation, the status of trusted flagger shall be suspended during the period of the investigation. That investigation shall be carried out without undue delay.
  7. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by a provider of online platforms pursuant to paragraph 6, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
  8. The Commission, after consulting the Board, shall, where necessary, issue guidelines to assist providers of online platforms and Digital Services Coordinators in the application of paragraphs 2, 6 and 7.
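
Paragraphs 1 and 3 describe mechanisms concrete enough to sketch in code. The Python fragment below is a minimal illustration, not anything prescribed by the Regulation: it shows one way a platform might give trusted-flagger notices priority in its processing queue (paragraph 1), and how a trusted flagger might aggregate its notices into the three reporting categories of paragraph 3. All class, field and function names are invented for the example.

```python
import heapq
import itertools
from collections import Counter
from dataclasses import dataclass


@dataclass(frozen=True)
class Notice:
    """A notice submitted through the Article 16 notice-and-action mechanism."""
    provider: str            # hosting service the notice concerns
    content_type: str        # e.g. "illegal hate speech", "counterfeit goods"
    from_trusted_flagger: bool
    action_taken: str = "pending"   # e.g. "removed", "rejected"


class NoticeQueue:
    """Orders pending notices so trusted-flagger notices are decided first (para. 1)."""

    def __init__(self) -> None:
        self._heap: list[tuple[int, int, Notice]] = []
        self._seq = itertools.count()  # tie-breaker: FIFO within each priority class

    def submit(self, notice: Notice) -> None:
        # Priority class 0 for trusted flaggers, 1 for all other notifiers.
        rank = 0 if notice.from_trusted_flagger else 1
        heapq.heappush(self._heap, (rank, next(self._seq), notice))

    def next_notice(self) -> Notice:
        return heapq.heappop(self._heap)[2]


def yearly_report(notices: list[Notice]) -> dict[str, Counter]:
    """Counts a trusted flagger's notices by the three para. 3 categories.

    Only aggregate counts appear in the output, so no personal data is included.
    """
    return {
        "by_provider": Counter(n.provider for n in notices),
        "by_content_type": Counter(n.content_type for n in notices),
        "by_action_taken": Counter(n.action_taken for n in notices),
    }
```

A production system would of course also need the Article 20 complaint-handling hooks, the paragraph 6 signals about low-quality notices, and an export of the paragraph 5 registry in a machine-readable format such as JSON; the sketch covers only the ordering and the reporting categories.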
Related recitals

Recital 61

Action against illegal content can be taken more quickly and reliably where providers of online platforms take the necessary measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and non-arbitrary manner. Such trusted flagger status should be awarded by the Digital Services Coordinator of the Member State in which the applicant is established and should be recognised by all providers of online platforms within the scope of this Regulation. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content and that they work in a diligent, accurate and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations and private or semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. To avoid diminishing the added value of such a mechanism, the overall number of trusted flaggers awarded in accordance with this Regulation should be limited. In particular, industry associations representing their members’ interests are encouraged to apply for the status of trusted flaggers, without prejudice to the right of private entities or individuals to enter into bilateral agreements with the providers of online platforms.

Recital 62

Trusted flaggers should publish easily comprehensible and detailed reports on notices submitted in accordance with this Regulation. Those reports should indicate information such as the number of notices categorised by the identity of the provider of hosting services, the type of allegedly illegal content notified, and the action taken by the provider. Given that trusted flaggers have demonstrated expertise and competence, the processing of notices submitted by trusted flaggers can be expected to be less burdensome and therefore faster compared to notices submitted by other recipients of the service. However, the average time taken to process notices may still vary depending on factors including the type of illegal content, the quality of notices, and the actual technical procedures put in place for the submission of such notices.

For example, while the Code of conduct on countering illegal hate speech online of 2016 sets a benchmark for the participating companies with respect to the time needed to process valid notifications for removal of illegal hate speech, other types of illegal content may require considerably different processing timelines, depending on the specific facts and circumstances and the types of illegal content at stake. In order to avoid abuses of the trusted flagger status, it should be possible to suspend such status when a Digital Services Coordinator of establishment has opened an investigation based on legitimate reasons. The rules of this Regulation on trusted flaggers should not be understood to prevent providers of online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council (1). The rules of this Regulation should not prevent the providers of online platforms from making use of such trusted flagger or similar mechanisms to take quick and reliable action against content that is incompatible with their terms and conditions, in particular against content that is harmful for vulnerable recipients of the service, such as minors.


(1) Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).

Art. 23 DSA - Measures and protection against misuse

Art. 24 DSA - Transparency reporting obligations for providers of online platforms

Art. 25 DSA - Online interface design and organisation

Art. 26 DSA - Advertising on online platforms

Art. 27 DSA - Recommender system transparency

Art. 28 DSA - Online protection of minors

Art. 29 DSA - Exclusion for micro and small enterprises

Art. 30 DSA - Traceability of traders

Art. 31 DSA - Compliance by design

Art. 32 DSA - Right to information

Art. 33 DSA - Very large online platforms and very large online search engines

Art. 34 DSA - Risk assessment

Art. 35 DSA - Mitigation of risks

Art. 36 DSA - Crisis response mechanism

Art. 37 DSA - Independent audit

Art. 38 DSA - Recommender systems

Art. 39 DSA - Additional online advertising transparency

Art. 40 DSA - Data access and scrutiny

Art. 41 DSA - Compliance function

Art. 42 DSA - Transparency reporting obligations

Art. 43 DSA - Supervisory fee

Art. 44 DSA - Standards

Art. 45 DSA - Codes of conduct

Art. 46 DSA - Codes of conduct for online advertising

Art. 47 DSA - Codes of conduct for accessibility

Art. 48 DSA - Crisis protocols