The EU Digital Services Act (the DSA) is intended to modernise the existing e-Commerce Directive and to address illegal content, transparent advertising and disinformation. It aims to create a safer and more transparent online environment for users. It does this by placing obligations on all digital services that connect consumers to goods, services and content, including “intermediaries” that provide conduit, caching and hosting services. The UK Online Safety Act contains some similar provisions.

Companies need to consider if they provide services attracting these enhanced obligations and devise a strategy on how to comply.

Is the text of the law finalised yet? When will it apply?

Yes. The DSA is finalised. It entered into force on 16 November 2022 and will apply (for the most part) from 17 February 2024. However, providers of online platforms and online search engines were required to publish their average monthly active user numbers by 17 February 2023, and very large online platforms (VLOPs) and very large online search engines (VLOSEs) were required to comply within four months of their designation by the European Commission.

Who does it apply to?

The DSA applies in a funnel-like manner: certain baseline obligations apply to all intermediary service providers (ISPs), with further cumulative obligations arising depending on the specific intermediary services being provided and the number of recipients of those services. The DSA has extraterritorial application, meaning that it will apply to non-EU organisations that target the EU market; such organisations must designate a legal representative in an EU Member State where they offer their services.

The categories of services, platforms and providers referred to in the DSA are:

Intermediary service providers being providers of:

  • “mere conduit” services - these are services consisting of the transmission of information provided by a service recipient in a communication network, or the provision of access to a communication network (e.g. virtual private networks, domain name system services, internet exchange points, etc.)
  • “caching” services - these are services involving the automatic, intermediate and temporary storage of information provided by a service recipient for the purposes of making the onward transmission of that information to other recipients more efficient (e.g. content delivery networks)
  • “hosting” services - these are services that store information provided by a service recipient - (e.g. cloud services)

Online platforms - these are hosting services that, at the request of the recipient of the service, store and disseminate information to the public, e.g. online marketplaces, app stores, collaborative economy platforms and social media platforms. Depending on their user numbers, some online platforms will be designated “very large online platforms” and be subject to additional obligations.

Online search engines - these are services that allow users to input queries in order to perform searches of websites and which return results in a format in which information related to the requested content can be found. Depending on their user numbers, some online search engines will be designated “very large online search engines” and be subject to additional obligations. 

What are the key obligations?

Please note that the following lists are not exhaustive; they set out the key obligations that must be complied with.

All intermediary service providers must:

  • comply with orders from the relevant national authorities to act against illegal content. The DSA confirms that ISPs have no general obligation to monitor content and are generally exempt from liability even where they voluntarily take steps to detect, identify or remove illegal content or take measures necessary to comply with national or EU law. That said, ISPs must provide certain information to authorities about the actions they have taken in relation to illegal content;
  • appoint a single point of contact for Member State and European-wide authorities and a single point of contact for recipients of the services. Details of the single point of contact must be made public;
  • prepare publicly available reports on their content removal and moderation activities on an annual basis; and
  • include certain specified information in their terms and conditions. This includes information on any policies, procedures, measures and tools used for content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system. This information must be in clear and unambiguous language.

Hosting services (including providers of online platforms) must:

  • implement a mechanism to allow third parties to report alleged illegal content, and notify law enforcement or judicial authorities of certain suspected criminal offences they become aware of; and
  • provide recipients of the service with information as to why content has been removed.

Online platforms must:

  • provide an internal complaint handling system to allow users to challenge the platform’s decisions to remove content; and
  • comply with enhanced transparency obligations, including around online advertising, recommender systems and the removal or disabling of illegal, or otherwise prohibited, content.

The DSA expressly prohibits online platform providers from using “dark patterns” (i.e. deceptive interfaces that trick users into doing things, like agreeing to hidden costs, disguised ads, etc.).

Online platform providers are subject to further requirements where the online platform is accessible to minors and/or where the online platform allows consumers to conclude distance contracts with traders.

VLOPs and VLOSEs

These very large providers are subject to additional obligations, including requirements to:

  • conduct an annual risk assessment and adopt and document effective mitigation measures;
  • establish an independent compliance function comprised of one or more appropriately knowledgeable and qualified compliance officer(s), including a dedicated head of the compliance function who reports directly to the management body of the VLOP or VLOSE; and
  • provide additional information and user optionality in relation to the online advertising and recommender systems used on their platforms.

Does the UK have anything similar?

Yes. Some provisions of the UK Online Safety Act are similar to aspects of the DSA, including duties relating to restricting illegal content, updating terms and conditions and implementing reporting mechanisms.

What are some of the commercial impacts of the DSA?

Organisations will need to address the practical obligations of appointing legal representatives, points of contact and compliance officers. They will also need to consider carefully how to comply with enhanced transparency obligations and the requirements to publish certain user-facing and regulator-facing notices in relation to recommender systems and online advertising. It is currently unclear how the required information will be presented and what level of detail will be expected, but organisations should be prepared to make technical changes to their platforms to enable compliance.


