
Chapter III – Due diligence obligations for a transparent and safe online environment (Art. 11-48)

Art. 11 DSA - Points of contact for Member States’ authorities, the Commission and the Board

Art. 12 DSA - Points of contact for recipients of the service

Art. 13 DSA - Legal representatives

Art. 14 DSA - Terms and conditions

Art. 15 DSA - Transparency reporting obligations for providers of intermediary services

Art. 16 DSA - Notice and action mechanisms

Art. 17 DSA - Statement of reasons

Art. 18 DSA - Notification of suspicions of criminal offences

Art. 19 DSA - Exclusion for micro and small enterprises

Art. 20 DSA - Internal complaint-handling system

Art. 21 DSA - Out-of-court dispute settlement

Art. 22 DSA - Trusted flaggers

Art. 23 DSA - Measures and protection against misuse

Art. 24 DSA - Transparency reporting obligations for providers of online platforms

Art. 25 DSA - Online interface design and organisation

  1. Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.
  2. The prohibition in paragraph 1 shall not apply to practices covered by Directive 2005/29/EC or Regulation (EU) 2016/679.
  3. The Commission may issue guidelines on how paragraph 1 applies to specific practices, notably:
    (a) giving more prominence to certain choices when asking the recipient of the service for a decision;
    (b) repeatedly requesting that the recipient of the service make a choice where that choice has already been made, especially by presenting pop-ups that interfere with the user experience;
    (c) making the procedure for terminating a service more difficult than subscribing to it.

Recital 67

Dark patterns on online interfaces of online platforms are practices that materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions. Those practices can be used to persuade the recipients of the service to engage in unwanted behaviours or into undesired decisions which have negative consequences for them. Providers of online platforms should therefore be prohibited from deceiving or nudging recipients of the service and from distorting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof. This should include, but not be limited to, exploitative design choices to direct the recipient to actions that benefit the provider of online platforms, but which may not be in the recipients’ interests, presenting choices in a non-neutral manner, such as giving more prominence to certain choices through visual, auditory, or other components, when asking the recipient of the service for a decision.

It should also include repeatedly requesting a recipient of the service to make a choice where such a choice has already been made, making the procedure of cancelling a service significantly more cumbersome than signing up to it, or making certain choices more difficult or time-consuming than others, making it unreasonably difficult to discontinue purchases or to sign out from a given online platform allowing consumers to conclude distance contracts with traders, and deceiving the recipients of the service by nudging them into decisions on transactions, or by default settings that are very difficult to change, and so unreasonably bias the decision making of the recipient of the service, in a way that distorts and impairs their autonomy, decision-making and choice. However, rules preventing dark patterns should not be understood as preventing providers to interact directly with recipients of the service and to offer new or additional services to them. Legitimate practices, for example in advertising, that are in compliance with Union law should not in themselves be regarded as constituting dark patterns. Those rules on dark patterns should be interpreted as covering prohibited practices falling within the scope of this Regulation to the extent that those practices are not already covered under Directive 2005/29/EC or Regulation (EU) 2016/679.

Art. 26 DSA - Advertising on online platforms

Art. 27 DSA - Recommender system transparency

Art. 28 DSA - Online protection of minors

Art. 29 DSA - Exclusion for micro and small enterprises

Art. 30 DSA - Traceability of traders

Art. 31 DSA - Compliance by design

Art. 32 DSA - Right to information

Art. 33 DSA - Very large online platforms and very large online search engines

Art. 34 DSA - Risk assessment

Art. 35 DSA - Mitigation of risks

Art. 36 DSA - Crisis response mechanism

Art. 37 DSA - Independent audit

Art. 38 DSA - Recommender systems

Art. 39 DSA - Additional online advertising transparency

Art. 40 DSA - Data access and scrutiny

Art. 41 DSA - Compliance function

Art. 42 DSA - Transparency reporting obligations

Art. 43 DSA - Supervisory fee

Art. 44 DSA - Standards

Art. 45 DSA - Codes of conduct

Art. 46 DSA - Codes of conduct for online advertising

Art. 47 DSA - Codes of conduct for accessibility

Art. 48 DSA - Crisis protocols