The Digital Services Act has been approved: targeted advertising will soon be restricted

07.11.2022 Reading time: 8 minutes

Over time, we have published several articles on this blog about the upcoming Digital Services Act and Digital Markets Act. The former was given the final green light by the European Council on 27 October. The DSA has thus been finally approved, and the Regulation will be fully applicable from early 2024.

The DSA brings with it a lot of new rules, especially for online platforms and marketplaces. For the online marketing business, the most eye-catching part is undoubtedly the set of strict rules limiting targeted advertising.

We will briefly summarize the main features of the DSA and consider the rules regarding targeted advertising in more detail.

What exactly is the DSA again?

At the end of 2020, the European Commission announced the first drafts of the Digital Services Act and the Digital Markets Act. Together, these two new European regulations form the Digital Services Package, which should ensure more and better regulation of online platforms within the European Union.

The intention is that both Regulations, in conjunction with the existing GDPR and the future AI Regulation and ePrivacy Regulation, will make European consumers and European companies more resilient to the power and economic weight of large internet multinationals such as Facebook, Google, Apple, Microsoft and Amazon.

The rather ambitious aim of the DMA and the DSA is to reform the digital space in the EU with a comprehensive set of new rules for all digital services, including social media, online marketplaces and other online platforms operating in the EU. This should provide better protection for European consumers as well as for European digital service providers.

The Digital Markets Act was already approved some time ago and will take effect in 2023. The second regulation, the Digital Services Act, has now also been approved and will apply from 2024.

Who does the DSA apply to?

With the DSA, the EU wants to create a clear and strict framework for social media platforms, marketplaces and search engines. The intention is to ensure more transparency, equal treatment and respect for the fundamental rights of EU citizens and companies on these very popular platforms, and to combat illegal content on them.

The regulation applies to every provider of intermediary services active in the EU. As with the GDPR, companies that do not themselves have a branch or registered office in the EU must appoint a representative in the EU to act as a point of contact for the European authorities.

What’s in the DSA?

Most of the focus is on transparency obligations, which must ensure that disinformation, hateful content and fake news are combated, especially on social media. In the context of the war in Ukraine, a kind of crisis stress test was even added to the text at the last minute, which should make it possible to analyze the impact of major online platforms and search engines on a particular crisis and to intervene where necessary to protect the fundamental rights of EU citizens.

In a second part of the Regulation, much attention is paid to transparency. Platform websites, for example, will have to provide clarity about the algorithms used to give certain content prominent attention while pushing other content to the background. In the same quest for more transparency online, there will be a ban on so-called “dark patterns”: subtle techniques, such as layout, the placement of certain content or the design of ordering flows, that subconsciously try to push visitors towards a certain decision, such as an online purchase.

Additional protection will also be provided for minors. Platforms accessible to minors must take additional safeguards to ensure their safety online, and targeted advertising aimed at minors on the basis of their profile data is prohibited.

Profiling and targeted advertising

The DSA builds on the comprehensive protection that the GDPR already offers European citizens and adds a new layer to it. In the future, online marketers will therefore have to take into account not only the GDPR and ePrivacy rules (cookies and anti-spam), but also the new restrictions imposed by the DSA, at least if their activity falls within its scope (as indicated above, this concerns intermediary services such as social media, marketplaces and search engines).

GDPR already contains extensive rules regarding the right to object to profiling (and targeted advertising based on profile data) and against automated individual decision-making based on profile data. Those rules remain unaffected.

In addition, online platforms must now ensure that users of social media such as Facebook, marketplaces such as Amazon or Bol.com, and search engines such as Google know and understand what data is collected about them and used to show them targeted advertisements.

Creating transparency about the use of consumer profiles for advertising purposes can only be welcomed, because as citizens and consumers we share a great many details about our private lives. This is not only the case on social media: marketplaces also collect a lot of data about the consumers who visit their platform, far beyond basic information about what you have purchased and how you paid for those goods. Their algorithms also process data about your surfing and clicking behaviour: which pages you have viewed, how long and how often you have looked at a certain product, at what time of day you did so, and so on. Moreover, you share address details, gender, place of residence and payment details each time you make a purchase.

That data indirectly says a lot about you as a person: hobbies, preferences, work, marital status, holiday destinations and income can all be derived from it. On this basis, a very accurate profile of you as a citizen and consumer can be created, and the Googles and Facebooks of this world earn big money selling that profile to advertisers.

Digital platforms will therefore have to be transparent about this profiling in the future. For each ad, users must be able to see that they are looking at a targeted advertisement, on the basis of which criteria it was targeted, on whose behalf it is shown and, if that is another party, who paid for it. For the aforementioned online platforms, this will undoubtedly become a significant logistical and technical challenge.
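To make that obligation a little more concrete, here is a minimal, purely illustrative sketch (in TypeScript) of the kind of metadata a platform could attach to each ad impression. The interface and field names are our own assumptions; the DSA prescribes which information must be shown to the user, not how it is modelled or stored.

```typescript
// Hypothetical sketch: per-ad transparency metadata a platform could
// attach to every ad impression to meet the DSA's disclosure duties.
// All names are illustrative; the DSA prescribes the information,
// not a data format.
interface AdTransparencyLabel {
  isAdvertisement: true;       // the user must see that this is an ad
  advertiser: string;          // on whose behalf the ad is shown
  payer?: string;              // who paid, if different from the advertiser
  targetingCriteria: string[]; // main parameters used to select this user
  optOutUrl: string;           // where the user can object to profiling
}

const exampleLabel: AdTransparencyLabel = {
  isAdvertisement: true,
  advertiser: "Example Shoe Shop BV",
  payer: "Example Media Agency",
  targetingCriteria: ["age range 25-34", "interest: running"],
  optOutUrl: "https://platform.example/ads/preferences",
};
```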

Users are also given the right to object to profiling based on their surfing behaviour. In many cases, such a right to object already follows from the GDPR today, but the DSA explicitly confirms it once again. Users must also be able to report, via a button, content that in their opinion is or contains advertising.

Ads targeting minors are absolutely prohibited, at least if the platform is “aware with reasonable certainty that the recipient is a minor”. The DSA explicitly states that platforms should not collect additional data to verify the age of their users, so we suspect that some platforms would prefer not to ask for their users’ age at all if it is not absolutely necessary…

Targeted advertising based on ‘sensitive data’ under the GDPR, such as religion, ethnicity, sexual orientation or political preference, is also absolutely prohibited.

Also important for online marketing and e-commerce

Online merchants and marketers should also take the following innovations in the Digital Services Act into account:

  • Online marketplaces are given a ‘Know Your Customer’ duty with regard to the merchants who offer their goods on those marketplaces, as well as an obligation to carry out (sample) checks on their sellers. This should give consumers more protection against abuse, counterfeiting and fraud online.
  • The criteria used to determine the order of recommended results (for example, a list of recommended sellers of a product on a marketplace) must be made transparent in the terms of use.
  • Manipulating users’ choices through ‘dark patterns’ is prohibited. Online platforms and marketplaces may not manipulate the free choice of visitors by, for example, pushing certain choices to the foreground, placing choice buttons prominently on the screen or giving them a more striking color or shape, using disruptive pop-ups to push people to make or change a choice, using unclear ordering flows, or requesting the same opt-in on every new visit until the consumer finally agrees.
  • Rapid removal of illegal content, products and services: platforms must have an efficient ‘notice and take down’ procedure that allows users to report illegal content (counterfeiting, unfair market practices, but also racist or hateful messages and fake news). They must also publish annual reports with figures on the number and nature of their interventions (a sketch of what such a notice record could look like follows after this list).
  • The terms of use of providers of intermediary services must contain information about the policies, procedures, measures and tools used for content moderation and automated decision-making, and about their internal complaints procedures. Any change to the terms of use must also be communicated to all users.
  • In the future, social media platforms must also provide transparency about the reasons why accounts are restricted, suspended or closed. Anyone who has ever had the unpleasant experience of a suddenly closed Facebook account knows that social media currently provide no transparency at all about the reasons for closing accounts.
  • Platforms must have an internal complaint-handling mechanism (and communicate about it clearly) and must give users access to alternative dispute resolution (arbitration, mediation, …) if they wish.
  • Compliance by design: marketplaces in particular must be constructed in such a way that sellers on those marketplaces can comply with their legal obligations (such as stating the seller’s identity details, providing mandatory product information and displaying quality labels), and the marketplaces must also conduct spot checks to verify the compliance of their sellers.
  • Additional rules apply to very large platforms and search engines (Google, Facebook, Amazon, etc.), such as mandatory prior risk assessments.
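As promised in the list above, here is a purely illustrative sketch of what an internal ‘notice and take down’ record and the associated annual reporting could look like. The data model and field names are our own assumptions; the DSA requires an efficient procedure and public annual figures, but does not prescribe any technical format.

```typescript
// Hypothetical sketch of a "notice and take down" record: the data a
// platform might keep per report in order to handle notices and publish
// annual figures. Field names are assumptions, not DSA text.
type NoticeCategory =
  | "counterfeit"
  | "unfair-market-practice"
  | "hate-speech"
  | "disinformation";

interface IllegalContentNotice {
  id: string;                    // internal reference for the report
  reportedItemUrl: string;       // the content, product or service reported
  category: NoticeCategory;      // what the reporter claims is illegal
  receivedAt: Date;              // when the notice came in
  decision?: "removed" | "kept"; // outcome of the moderation review
  decidedAt?: Date;              // when the platform intervened (if it did)
}

// Annual transparency reporting then largely reduces to aggregating
// these records, e.g. counting removals per category.
function countInterventions(
  notices: IllegalContentNotice[],
): Map<NoticeCategory, number> {
  const counts = new Map<NoticeCategory, number>();
  for (const n of notices) {
    if (n.decision === "removed") {
      counts.set(n.category, (counts.get(n.category) ?? 0) + 1);
    }
  }
  return counts;
}
```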

Need help?

Do you have questions about the Digital Services Act or about GDPR and e-commerce in a broader context? Feel free to schedule a free appointment with our team or, in the meantime, take a look at our legal webshop with many standard services that can help you out.
