Early on Saturday morning, a trilogue negotiation between EU member states, the European Parliament, and the European Commission ended with an agreement on the final terms of legislation called the Digital Services Act, or DSA. The DSA is sister legislation to the Digital Markets Act, or DMA, the terms of which were agreed upon late last month. Both the DMA and the DSA were proposed in December 2020 as part of a broader legislative initiative called Shaping Europe’s Digital Future. With the terms of the DSA proposal agreed upon, a vote in the European Parliament — which is widely seen as a formality following negotiations — will enshrine the DSA into law. A slogan used to capture the motivation behind the DSA is, “what is illegal offline must also be illegal online.”
Taken together, the DMA and the DSA represent consequential and sweeping regulation of the consumer technology landscape — as is evidenced by the substantial lobbying effort pursued by the consumer technology giants against both pieces of legislation. While the focus of the DMA is on the competitive power of the largest digital platforms, the primary focus of the DSA is on content: the legislation harmonizes various national laws related to online content moderation at the EU level and modernizes and expands the EU’s e-Commerce Directive, which established a framework for online services and was adopted in 2000.
The DSA is designed to impose legally binding content controls on digital platforms, specifically related to illegal content, transparent advertising, and disinformation. Among other requirements, the DSA imposes new content moderation and transparency obligations on digital platforms (especially related to illegal content and algorithmic curation), and it compels them to provide clarity to users around advertising targeting. The DSA identifies distinct classes of digital platforms, which it terms ‘Very Large Online Platforms’, or VLOPs, and ‘Very Large Online Search Engines’, or VLOSEs, and it institutes more demanding accountability and transparency requirements on these platforms for the purpose of managing ‘systemic risk’.
The current text of the DSA, subject to change after the agreement was reached in trilogue negotiations, can be found here; helpful summaries are also provided by Euractiv and Tech Policy Press. The DSA’s restrictions will go into force 15 months after being voted into law or on January 1st, 2024, whichever is later. The DSA was widely championed by politicians in both Europe and the United States: Barack Obama celebrated the potential legislation in a speech last week at Stanford University on the perils of online disinformation, and Hillary Clinton tweeted a similar sentiment.
The DSA’s impact on digital advertising
While the DSA is far-reaching and broad-based, the scope of this article is its impact on the digital advertising ecosystem. The word ‘advertising’ can be found 37 times in the original text of the DSA, and many of the amendments to the original text, negotiated in the trilogue, are directed at advertising protections. The final text of the DSA will not be released for some time, as it must be translated into each of the EU’s 24 official languages. It’s important to note that the DSA does not amend existing rules around consent and transparency with respect to advertising as imposed by the General Data Protection Regulation (GDPR), but rather its new restrictions are designed to complement them.
The new regulations on digital advertising introduced by the DSA, broadly, are:
A full ban on targeted advertising to minors. This is a common-sense, uncontroversial restriction introduced in the January amendments that is broadly popular across the advertising ecosystem. But the approach to implementation, and the degree to which this restriction is enforced, will depend on whether a platform can confirm that a user is not a minor. Heading into the most recent trilogue, it was unclear whether this restriction would be imposed in any case where a platform cannot confirm that a user is of legal age, which would expand its application far beyond the case of known minors (there are many more cases of “we don’t know what this user’s age is” than there are of “we know this user is a minor”). As I argue in this interview, such an interpretation would also likely have the consequence of favoring the largest platforms that collect the most data on users.
A press release on the successful negotiations from the European Council uses the phrasing, “in particular when they are aware that a user is a minor,” to describe the heightened obligations of digital platforms with respect to the treatment of minors’ data. The latest DSA text that I have seen refers to “reasonable certainty,” which seems to imply that ads targeting is only prohibited when a platform has reason to believe that a user is indeed a minor (and not when it simply doesn’t know). A restriction of this nature, applicable only to users believed by a platform to be minors, is neither surprising nor altogether disruptive given the various existing platform restrictions on the commercial use of minors’ data (see Apple, Google).
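The “reasonable certainty” reading described above can be expressed as a simple decision rule. This is purely an illustrative sketch — the names and structure are hypothetical and do not come from the DSA text — but it captures the distinction between “known minor” and “age unknown” that determines whether targeted ads would be permitted under this interpretation:

```python
from enum import Enum


class AgeStatus(Enum):
    """Hypothetical age-assessment states a platform might hold for a user."""
    KNOWN_MINOR = "known_minor"
    KNOWN_ADULT = "known_adult"
    UNKNOWN = "unknown"


def may_serve_targeted_ads(age_status: AgeStatus) -> bool:
    """Illustrative reading of the 'reasonable certainty' standard:
    targeted advertising is prohibited only when the platform has
    reason to believe the user is a minor — not merely because the
    user's age is unknown."""
    return age_status != AgeStatus.KNOWN_MINOR
```

Under the stricter interpretation that was floated before the trilogue, the return condition would instead be `age_status == AgeStatus.KNOWN_ADULT`, prohibiting targeting whenever age is unknown — a change of one comparison with dramatically broader commercial consequences.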
A ban on the use of sensitive data in advertising targeting. This is another restriction introduced in an amendment, and the specific text states that digital platforms “shall not present advertising to recipients of the service based on profiling within the meaning of Article 4(4) of Regulation 2016/679 using special categories of personal data” as defined by the GDPR, which includes sexual orientation, political opinions, racial or ethnic origin, etc. Neither the GDPR nor the revised DSA text that I have seen specifically cites gender or gender identity as a special category of personal data. Without more context or clarification, it does not seem that gender has been prohibited as a targeting characteristic, as some reporting on the DSA has stated. According to Corporate Europe Observatory, a wholesale ban on targeted advertising was initially proposed as part of the DSA but was ultimately abandoned.
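Operationally, a platform complying with this provision would need to filter the GDPR’s special categories out of any ad targeting request. The sketch below is hypothetical — the category keys are illustrative labels, not an official taxonomy — but it shows the shape of such a filter, and why gender targeting would survive it under the reading above:

```python
# GDPR Article 9 special categories of personal data,
# rendered here as illustrative keys (not an official taxonomy)
SPECIAL_CATEGORIES = {
    "racial_or_ethnic_origin",
    "political_opinions",
    "religious_or_philosophical_beliefs",
    "trade_union_membership",
    "health_data",
    "sexual_orientation",
}


def strip_special_categories(targeting_params: dict) -> dict:
    """Remove any targeting parameter keyed on a GDPR special category.
    Note that 'gender' is absent from SPECIAL_CATEGORIES, consistent
    with the reading that gender is not itself a special category."""
    return {
        key: value
        for key, value in targeting_params.items()
        if key not in SPECIAL_CATEGORIES
    }
```

For example, a request targeting on `political_opinions`, `gender`, and `age_band` would come out of this filter with only `gender` and `age_band` intact.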
A requirement to provide users with meaningful information about how their data will be monetized, along with an opt-out mechanic. Platforms must inform users about how the data they emit on the service will be used to target ads to them, and platforms must ensure that “refusing consent [for ads targeting using behavioral data] shall be no more difficult or time-consuming to the recipient than giving consent.”
A requirement to provide transparency around the sponsor of, and targeting parameters used in, exposed ads. The DSA compels digital platforms to clearly label ads (e.g., such that a user would not confuse an ad for native, organic content), to indicate on whose behalf the ad is being exposed, and to provide “meaningful information about the main parameters used” to target the users to whom the ad is exposed.
A requirement for VLOPs to maintain a repository of exposed ads, sponsor information, parameters used to target, and total exposures. Very Large Online Platforms must create repositories for all ads exposed on their platform for at least one year after each ad’s final exposure. For each ad exposed, data must be made available regarding the content of the ad, the sponsor of the ad, the period during which the ad was exposed, the targeting parameters used in serving the ad, and the total number of people to whom the ad was exposed, broken out by targeting group. All of this data must be made available via API.
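The repository requirement described above amounts to a data schema plus a query interface. The sketch below is a hypothetical illustration — the class names, fields, and query method are my own, not drawn from the DSA — of the kind of record a VLOP would need to retain and expose via API:

```python
from dataclasses import dataclass


@dataclass
class AdRecord:
    """One retained ad, per the DSA's repository requirement (illustrative fields)."""
    ad_content: str               # the creative that was exposed
    sponsor: str                  # on whose behalf the ad was exposed
    display_period: tuple         # (start_date, end_date) of exposure
    targeting_parameters: dict    # parameters used in serving the ad
    exposures_by_group: dict      # targeting group -> total exposures


class AdRepository:
    """Sketch of the repository a VLOP would maintain; records must
    remain available for at least one year after final exposure."""

    def __init__(self):
        self._records = []

    def add(self, record: AdRecord) -> None:
        self._records.append(record)

    def query_by_sponsor(self, sponsor: str) -> list:
        """One plausible API lookup: all retained ads for a given sponsor."""
        return [r for r in self._records if r.sponsor == sponsor]
```

A production system would additionally need retention-window enforcement and a public API surface; the point here is simply that every field the DSA enumerates — content, sponsor, display period, targeting parameters, and exposure counts by group — maps to a concrete, queryable record.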
Requirements around algorithmic transparency and data access. While these stipulations within the DSA don’t directly impact the targeting or serving of ads, they are puzzling and ambiguously worded to such a degree that they’ll almost certainly produce a degraded level of service in the EU. Specifically, the DSA compels VLOPs to “provide access to data to vetted researchers…for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks.” The DSA further clarifies that vetted researchers must be affiliated with an academic institution and be untethered from commercial interests, but these safeguards seem flimsy and short-term at best. It’s unclear to me how the authors of the DSA can expect the largest consumer technology platform operators to give unfettered access to their datasets to academics and researchers — this requirement is an invitation for data access abuse or the siphoning of trade secrets.
Enforcement and interpretation
Ultimately, the provisions of the DSA as they relate to digital advertising are not overly disruptive or divergent from current policy momentum. In fact, the DSA’s proposed regulations of digital advertising are more tempered than those of the Banning Surveillance Advertising Act, which was introduced to both the US House and Senate in January of this year.
The DSA imposes substantial penalties for non-compliance — up to 6% of global annual revenue. But enforcement of the DSA, especially around algorithmic transparency related to the operations of the VLOPs, will require an army of specialists versed in computer and data science, and the DSA only provides for the hiring of 230 additional regulatory staff. This seems insufficient for the investigatory nature of policing the world’s largest technology companies.