A deep dive on European data privacy law

My guest in this episode of the Mobile Dev Memo podcast is Mikołaj Barczentewicz, an expert on EU digital privacy law. Mikołaj is a law professor at, and the research director of, the Law and Technology Hub at the University of Surrey in the United Kingdom, and he has research affiliations with the Stanford Law School and the University of Oxford, from which he received his Ph.D.

I learned of Mikołaj after reading a piece he co-wrote titled GDPR Decision Against Meta Highlights that Privacy Regulators Don’t Understand ‘Necessity’. I invited Mikołaj onto the podcast to discuss the recent spate of decisions in the EU related to digital privacy, including:

  • The Irish DPC’s ruling against Meta over the company’s use of the contractual basis for processing user data related to personalized advertising;
  • The French CNIL’s recent sanctions of Apple and Voodoo Games;
  • The invalidation of the EU–US Privacy Shield.

Additionally, Mikołaj and I discuss a wide range of more abstract topics:

  • Consent as a mechanism for collecting and processing first-party data in the EU;
  • The difference between the necessity and legitimate interest bases under GDPR;
  • The dynamics between the European Data Protection Board (EDPB) and the data protection agencies within the various EU states;
  • The future of trans-Atlantic data flows.

A lightly-edited transcript of our conversation can be found below. As always, the Mobile Dev Memo podcast is available on:

Podcast Transcript

Eric Seufert:

Mikolaj, I am so happy to speak with you today. I appreciate you taking the time to chat with me. We are going to have, I’m sure, a very interesting discussion on the topic of European data privacy legislation, but before we do that, I would ask you to introduce yourself to the audience in your own words.

Mikołaj Barczentewicz:

Hi. I’m Mikolaj Barczentewicz. I’m an academic law professor at the University of Surrey. I’m also a senior scholar at the International Center for Law and Economics, and I work generally on tech law issues and particularly on EU privacy law.

Eric Seufert:

Great, and this is just out of curiosity, aside from living in Europe, being from Europe, what triggered your interest in this topic? What made you want to pursue that?

Mikołaj Barczentewicz:

In general, I’ve always been interested in tech, even longer than I’ve been interested in law. I used to be a coder. I worked briefly in this broader web industry as a web developer in the mid-2000s and then, I went to law school. So I kept that interest and now I get to teach the legal aspects of the same thing that I knew from a different side before.

Eric Seufert:

Fascinating, so you kind of have more hands-on tactical experience in the way these products are built, the way that users interact with them, and I think that’s probably a very rare path for people to take. Have you ever met anyone else that has pursued that path?

Mikołaj Barczentewicz:

I don’t think so. I mean, there are some people like that, but not too many. Usually, lawyers and law academics just have straight legal paths, at least in Europe. We are a bit different than America.

Eric Seufert:

From my experience, most lawyers kind of knew they wanted to be a lawyer from age eight or whatever and never really strayed from that path, but that’s really fascinating.

So the reason I reached out to you was that I very much enjoyed an article that you wrote and published last month titled GDPR Decision Against Meta Highlights That Privacy Regulators Don’t Understand Necessity. Now, the reason I wanted to have you on the podcast is that it’s not just regulators that don’t understand necessity. I don’t understand necessity, and I think a lot of people that work in the digital advertising space don’t understand necessity, and I thought after reading your very educational piece, I would invite you onto the podcast to explain to the digital advertising operator audience, what is necessity?

Mikołaj Barczentewicz:

Excellent. Thanks for having me on.

Eric Seufert:

Great. So I will start with the subject of your piece. I should have revisited it right before the podcast recording, but I’ll start with, in any case, the Irish DPC’s recent sanction [of Meta] related to the contractual basis. So that was related to the ads platform, not … [the Irish DPC] had another decision that was related to WhatsApp specifically, but the one I’m talking about here was that the Irish DPC fined Facebook. I think it was a record-breaking fine, something like 300-something million euros, 400-something million dollars, related to their use of the contractual basis for collecting and processing user data for the purposes of ads targeting. In this case, it was first-party data: the data that is generated from users’ interactions that happen on Facebook and on Instagram.

This was the subject of the sanction, and I thought this was a very interesting case because … I’ll ask you to elaborate because I have a layman’s understanding here, but it was a very interesting case from my perspective for two reasons. One, that the Irish DPC originally disagreed with the notion that Facebook was violating privacy law, and then, they pushed back on the EDPB, and the EDPB said, “No, you must impose this fine,” and I want to get into those dynamics too.

The second reason I found this fascinating was that this was related to first-party data. Now, everyone in the digital advertising space has become very familiar with App Tracking Transparency, Apple’s policy which draws a very bright line between first-party data and third-party data.

In this case, the sanction was over the use of first-party data, and I think that’s something that not a lot of digital advertising operators or even big tech companies understand. So, I’d love to just, first of all, maybe you could kind of provide a better background on this case than I just did.

Mikołaj Barczentewicz:

So here, it’s in a sense one case, but the Irish Data Protection Commission issued two decisions against Meta Ireland: one with respect to Facebook and one with respect to Instagram. These decisions are mostly identical, almost paragraph-for-paragraph, just with some details changed because of the different characteristics of the services. But yes, as you said, they did add up to a €400 million fine and also, perhaps, to a behavioral change on Meta’s part, which I’m not sure has happened. What Meta used to do since 2018, I think, and perhaps still does, was to say that the lawful basis on which they process a user’s personal data for personalized advertising purposes — that this lawful basis under the GDPR is contractual necessity.

So it stems from the contract between a user of Instagram or Facebook and Meta. This decision, which the Irish Data Protection Commission was forced to take, says that it is unlawful for Meta to rely on this lawful basis. It doesn’t say, strictly speaking, that they cannot do behavioral or personalized advertising at all; it’s just that they cannot rely on this particular basis for it.

Eric Seufert:

Right, and so for the listeners who aren’t totally aware of what the GDPR stipulates as being legal bases for collecting and processing data: there are six, and it feels like … and correct me if I’m wrong here, but it feels like you’d rather use any of the other five before you have to go to consent, right? Because once you go to consent, you are going to see some proportion, probably some large proportion, of users opt out. So you’d probably rather use any of the other five before you resort to consent. Can you just walk listeners through what those are? I imagine you probably know them off the top of your head. If you don’t, we can edit this out, don’t worry about it, but I imagine you probably do.

Mikołaj Barczentewicz:

That’s okay and I do have Article Six of the GDPR just in front of me. So as you said, we have letters A to F, and the first one, the first lawful basis, is consent, as you said. What we also have is necessity, contractual necessity. We have legitimate interests of the business that’s processing the data or some other third party, and I think that these two — contractual necessity and legitimate interests — are the ones that are most attractive for several reasons, and I’m sure we’ll talk about that in more detail. Then we have some other ones which are less likely to be applicable, at least in business cases because we have the necessity to protect vital interests of the data subject or another person, and performance of a task carried out in the public interest, and compliance with a legal obligation.

These cover some uses and for example, if you look at Meta’s privacy policy, they, I think, rely on all of them for various things, but for business purposes, you would usually want to rely on necessity or legitimate interests.

Eric Seufert:

Got it. So, the GDPR, I think especially from an American perspective, but also from the kind of litigation lens — and again, correct me if I’m wrong here — it feels inscrutable. You’re either an expert or nothing, right? What I’ve learned is that it’s very hard to dabble and get a high-level valid intellectual grasp of this. You either delve totally into it or your grasp is so superficial that it’s useless and potentially dangerous. So I think there are a couple of things that I would love to have you unpack here just to set the baseline, and I know even setting the baseline could probably take up the whole hour and a half, but with some kind of consideration paid to time. So why are we even talking about Ireland?

Well, because obviously Meta is based in California, that’s their headquarters, there’s a provision in the GDPR called the One Stop Shop, and it says, “Okay, what we did with GDPR was, we attempted to unify a bunch of privacy regulation across the EU member states, and we want to give companies the ability to just work through one point of contact, one privacy regulator.” Well, it so happens that most [non-EU companies] have the Irish DPC as their [EU] privacy regulator. Why? Because most foreign or non-European headquartered companies set up their EU headquarters in Ireland. Why? Because it’s very tax-friendly, but also just generally very business-friendly.

So that’s why you see Meta Ireland, Apple Ireland, Amazon Ireland, TikTok Ireland, and they all, when these issues come up, when the GDPR issues surface, they’re dealing with the Irish DPC. Can you just talk to me a little bit about the history there and why that’s important?

Mikołaj Barczentewicz:

So, everything you said is 100% correct. Maybe they also like the weather. I don’t know. I hear Dublin can be nice.

Eric Seufert:

Seems unlikely, but sure, maybe.

Mikołaj Barczentewicz:

Yes, so the GDPR replaced a previous data privacy directive, and one of the problems that the GDPR was meant to address was an earlier perception of complexity, legal uncertainty, and administrative costs associated with having separate systems of enforcement of privacy law in various countries. One of the main ideas behind the European Union is to promote the single European market, to be in some ways like the United States, where you can mostly operate across state borders. The GDPR was meant to bring us closer to that by, as you say, providing cross-border businesses with what the GDPR calls a single interlocutor: one authority you can talk to instead of having 27 or more authorities.

Eric Seufert:

Right. Ironically, the US seems to be … with respect to data privacy legislation, it seems to be abandoning that principle, because now we’ve got, I think, three new state-level data privacy laws that passed the legislative process this year or something. Maybe I’m wrong on that number. Anyway, I think we’re approaching double digits of states that have their own idiosyncratic [privacy laws] … I mean, they’re mostly the same. These state-level privacy laws are sort of modeled after GDPR, and for the most part, they’re the same. They tend to differ, if I understand correctly, in the sanctions that are imposed and the ability to form a class to sue somebody.

That’s my understanding. Okay, so the GDPR established the One Stop Shop: you only have to deal with one privacy regulator. If you’re headquartered in the EU, that’s wherever you’re headquartered. Companies tend to be headquartered in Ireland for a number of reasons. So that’s why, when you see these GDPR issues erupt, it’s usually the Irish DPC that’s at the center of it. So then talk to me about the EDPB, because if I’m totally honest, and maybe this is embarrassing, I had never heard of the EDPB until recently. It was not something that I was aware of, and I think it came to the fore with this case specifically. Can you talk to me about the dynamic between the EU state-level privacy regulators and the EDPB … and maybe about how a state-level privacy regulator gets to determine whether the GDPR was violated or not, and then, what kind of sanction to impose or not?

Mikołaj Barczentewicz:

Right, so the EDPB, that’s the European Data Protection Board, and you shouldn’t think of it as being the boss of the national Data Protection Authorities, or DPAs. It’s more like an organization with a role to coordinate the cooperation between domestic, national data protection authorities. All that we said about the One Stop Shop principle is true, but what we also should note is that there are exceptions to it. So for example, when you have Meta Ireland, domiciled in Ireland, doing business in various member states, there may be ways for national authorities in those states to take action with respect to what Meta is doing. There is a special urgency procedure in Article 66, but what should interest us here a bit more, and that’s how this Irish decision took shape, is the special cooperation mechanism in Article 60.

The way it worked here was that there was a complaint filed, I think in Austria and in Ireland, by Max Schrems’ organization, NOYB, and this was a complaint against Meta alleging that Meta was violating all sorts of provisions of the GDPR. I think the complaint was around 2018. It took the Irish DPC quite a while to investigate it, but once they finished their investigation, they prepared a draft decision. When this happens — when there is a draft decision, especially from the Irish DPC — other national regulators who also have users of Facebook and Instagram in their country, which I guess is all the national regulators, can object to the approach taken by the lead authority, in this case the Irish DPC. And if we have a draft decision to which there are objections, that triggers the cooperation mechanism.

Normally, the idea in the cooperation mechanism is that the lead authority reaches some sort of agreement with the objecting concerned authorities: either it takes all the objections into account or those authorities retract their objections. That way, a decision can be finalized without another mechanism, but in this case there was no such agreement. This meant that once the Irish DPC finished their draft, it went to the dispute resolution procedure, where, basically, within the EDPB mechanism, the national authorities get to vote on how the decision should be resolved. At first, they vote by a two-thirds majority, but if that doesn’t work, then after two months or so the level required goes down to a simple majority.

So in the end, a simple majority of those European authorities can overrule whatever approach was taken by the lead authority. So far, as far as I know, this has happened seven times, and each time the lead authority that received the binding decision was the Irish Data Protection Commission.

Eric Seufert:

Okay, let me see if I can play that back to you with respect to this particular case, because I think it’s really fascinating. So as far as I understand, NOYB launched the complaint the day that GDPR went into effect: we’re going to whatever day it was in 2018, GDPR is now the law of the land, we’re going to file the complaint. The Irish DPC took some time, a lot of time, five years … and they came to a draft conclusion that said, “No, we don’t think this is a violation. We think that Meta can use a contractual basis for this purpose, for advertising targeting.” Every European country has people within its borders that use Facebook. So they got a say, and a majority of them disagreed. Or was it a super-majority that disagreed, so 67% said no? Is that how that works, or-

Mikołaj Barczentewicz:

I’m not sure we know, but at least the majority.

Eric Seufert:

Right. Okay. So anyway, at least the majority disagreed. They went back and forth, at least a simple majority kept disagreeing, and we got to the point where the EDPB says, “Okay, well you can’t come to an agreement amongst yourselves. We are going to adopt this case and we are going to decide whether or not [Meta] violated GDPR and also what the sanction is.” Is that correct?

Mikołaj Barczentewicz:

Yes.

Eric Seufert:

Okay. So once that decision is made, it gets pushed back down to the relevant privacy regulator and they have to enforce it.

Mikołaj Barczentewicz:

Yes.

Eric Seufert:

I see. So that’s the journey that Meta went on, or the Irish DPC went on, through this process. The EDPB said, “No, we disagree with you Irish DPC. We’ve determined that Meta did in fact violate GDPR. Here’s the sanction that you must impose on them.” The Irish DPC received that. Obviously, Meta said they’re appealing. The Irish DPC is going to impose the sanction, and Meta is appealing. We’ll see what the outcome of that is. Then, Meta also said, “We think you overstepped, EDPB.” Can you talk to me about that?

Mikołaj Barczentewicz:

So the Irish DPC said-

Eric Seufert:

Sorry, the Irish DPC said, “We’re going to impose the sanction on Meta. We disagree with it.” I mean, this is all in the press release, “We disagree with the sanction, but we’re going to impose it but also EDPB, we disagree with your power here. We disagree with your interpretation of what your power is. We think you’ve overstepped the power granted to you by the GDPR and we might take some action.” Could you talk to me about how the Irish DPC reacted to this?

Mikołaj Barczentewicz:

So there were three issues from the Irish DPC’s perspective. There is a part of their decision about which there was no disagreement, and that was about certain violations of transparency requirements; the Irish DPC thought that Meta did violate those rules. That’s, in a sense, not the serious part of the decision.

Then there is the substantive disagreement between the Irish DPC and the EDPB over contractual necessity. Here, the Irish DPC, even in the press release, essentially says, “Yes, we’re fining Meta, but actually we kind of disagree with our own decision. We were forced to take it.” So that’s one issue, but there is a third aspect which is not strictly about contractual necessity: it’s about the EDPB telling the Irish DPC that they should start a new, broader investigation that the Irish DPC does not want to start.

This is what the Irish DPC says is an overreach: they say that the EDPB is not like a national court with general supervisory authority over the Irish DPC, that the EDPB can only coordinate once there is a draft decision, but cannot force a national authority to start an investigation. So, in a sense, the disagreement over which the Irish DPC said they are considering going to court is not strictly speaking about the core issue in that decision — that will probably be down to Meta to challenge — but over this procedural aspect of forcing the Irish DPC to start a new investigation.

Eric Seufert:

Got it. Okay. So let me read that back to see if I’m understanding correctly. So as part of this judgment that the EDPB handed to the Irish DPC to implement, they also said, “By the way, we think that you should undertake a much larger investigation around Meta’s business practices related to GDPR.”

Mikołaj Barczentewicz:

Yeah.

Eric Seufert:

And what the Irish DPC is pushing back on is not the judgment that the EDPB handed down to them in the case of this investigation, which they just sort of accept that they must impose, but it actually relates to the EDPB’s ability to tell any given privacy regulator that they must undertake an investigation. And that’s where the Irish DPC says that the EDPB is overstepping.

Mikołaj Barczentewicz:

Yes. Although who knows, maybe if they bring an action for annulment, which is the technical name of the legal measure they would use against an EDPB decision, maybe then they will broaden the scope of that action to include some substantive issues because they did tell us they disagree substantively with the EDPB. So maybe they will do it or maybe they will simply leave that to Meta to litigate.

Eric Seufert:

Got it. Okay. That’s probably as clear as it’s going to get for a layman like myself, but that’s very helpful.

There are a dizzying number of acronyms that must be used in this space. I don’t know how you manage to maintain your sanity here. I thought ad tech was bad with acronyms, but European privacy is even worse.

So, just thinking about the consequences of this: how should companies that utilize first-party data for ads personalization interpret this decision? How should they adjust their own practices? Because, to your point, this doesn’t prohibit the use. It just says, “Okay, well if you want to process data for that purpose, we don’t think you can use the contractual basis; you’ve got to use a different legal basis from the GDPR.”

So what Meta has said and what a lot of people are saying privately is that, “Well, we’ll just switch to legitimate interest and that’ll be fine,” right? Now my argument back — and again, I’m a layman, I don’t know that much about this and I would love to hear your thoughts on my argument back — is that, well, TikTok tried that. TikTok tried to alter its privacy policy such that they were using the legitimate interest basis and not the contractual basis to sidestep the consent mechanism. Because [TikTok has] a consent mechanism in Europe. If you open up TikTok [in the EU], there’s a consent popup that says, “Do you agree to have your data be used for ads personalization” or whatever, that exists, right? They wanted to stop doing that.

So what [TikTok] had proposed to do was to change its privacy policy such that it was using legitimate interest so they wouldn’t have to collect consent. And my understanding is the Italian DPA said, “No way. If you do that, we’re going to challenge it.” And then, TikTok consulted with the Irish DPC, which is their privacy regulator because they’re headquartered in Ireland, that conversation took place and then they said, “Okay, we’re not going to make this change. We’re going to stick with what we’ve got.” Can you talk to me about that? What is the legitimate interest basis and why might that not be the silver bullet here for any company that’s doing this exact same thing?

Mikołaj Barczentewicz:

So the legitimate interest basis may seem quite attractive because it means that it is lawful for you to process your users’ personal data in your own interest. Some of the examples that the GDPR gives for that are preventing fraud on your services, or even direct marketing; direct marketing is an example of legitimate interest used by the GDPR itself. So it does seem attractive.

The problem with this is that the same part of Article Six that introduces it also says that it’s allowed except where such interests, those legitimate interests, are overridden by the interests or fundamental rights and freedoms of the data subject. So then you have this difficult legal exercise of balancing whether my interest as a business to do direct marketing overrides, or is overridden by, the user’s interest in not having their privacy restricted. So that’s one problem.

The other problem is that you cannot use legitimate interests to process special category data under Article Nine of the GDPR. That’s data revealing racial or ethnic origin, political opinions, or religious or philosophical beliefs, and if you’re Facebook, you may have this problem that you collect so much data that it’s very hard for you to say that none of the data you collect reveals that kind of information. I guess TikTok is in a similar situation. So for those two reasons, the Italian authorities issued a formal warning to TikTok, and it does seem like TikTok shelved that idea. And that’s just the GDPR; we didn’t even mention the ePrivacy Directive, which doesn’t have the notion of legitimate interests, only consent. So TikTok did not go with that plan.

Eric Seufert:

Right, yeah. So I wanted to get to that next. One thing that kind of confused me when I began the process of trying to understand the space is that the Irish DPC is a DPA.

Mikołaj Barczentewicz:

Yes, Data Protection Authority.

Eric Seufert:

Right, and the Irish DPC is just the national name for that DPA: the Irish Data Protection Commission, right? Okay. Yeah, that was confusing to me, because people seem to use those two acronyms interchangeably, and they are interchangeable, I guess, but the Irish DPC is just that specific office.

So thank you for the segue, because I want to talk about the ePrivacy Directive, and I want to couch that discussion in a position proposal. Here’s my position proposal, and tell me if I’m way off or if I’m close to target. My sense is that, through the GDPR and ePrivacy Directive decisions — and we’ll get to what some of those have been, some recent examples, in a second — we are trending to a situation in which consent is the only viable mechanism for processing data.

We probably won’t see these other GDPR legal bases be approved for personalized advertising, for ads personalization, ads targeting, whatever you want to call it. These companies are probably going to have to resort to consent. What do you think of that? How would you respond to that?

Mikołaj Barczentewicz:

I think that it certainly does look like businesses are being pushed towards consent, at least for the kinds of data processing that authorities perceive as having significant privacy impact. And profiling or behavioral advertising is seen as having this significant impact, as it’s not just collecting a mailing list for direct marketing or direct emailing or snail mail. The problem with consent is that it’s really not easy, and one aspect you mentioned already is that consent has to be informed. So we have the issue of how to find the balance between providing too much technical detail, such that a user wouldn’t understand it because it’s too technical, and, on the other hand, being too general and simplifying too much, so that a user will not be adequately informed.

So informed consent is tricky, but I don’t even think that’s the biggest issue. I think that what could be the biggest issue, for example, for Meta is this problem of bundling consent. This comes from Article Seven of the GDPR, which says that there is a presumption that if a user has to give consent to data processing to access some service, and the service provider cannot rely on contractual necessity, then this consent is not freely given and is not valid. So here, the problem is this bundling issue: if you tell your users, “If you want to use our service, you have to consent.” Normally, if there is an issue of true necessity, then you wouldn’t need to ask the user for consent, because then you say, “Well, the processing will be based on the contract between us,” and that’s what Meta tried to do.

What Article 7(4) says is that, well, if you then cannot rely on contractual necessity, then there is at least a presumption that you cannot ask for consent if you make consent a condition of accessing the service. So if Facebook says, “You have to consent to this kind of data processing, or we will delete your account or you cannot open an account” …

Eric Seufert:

Then, in this specific context, when we’re talking about personalized ads, it is … well, “we need personalized ads to run the service, so if you don’t consent, we just can’t offer the service to you.” That’s what you’re saying: they can’t do that. They can’t bundle these things together such that — and you talked about this in your article — it is take-it-or-leave-it; you’ve got to give users an off-ramp for the specific feature for which they are consenting.

Mikołaj Barczentewicz:

Yeah, so I mean, we can try to look at it from Meta’s perspective. The EDPB tells them that they cannot rely on contractual necessity for data processing for personalized advertising because, at least according to the EDPB, it is theoretically possible to run a social media business without personalized advertising. By the way, they provide no evidence for that, so set that aside. There’s also the issue of legitimate interests. So in this case, we are pretty much left with consent: cannot use legitimate interests, cannot use contractual necessity. And here, Article 7(4) makes it very difficult, if not impossible, to use consent as a condition of access to a service. So the conclusion seems to be that Facebook may need to offer an identical service, without personalized advertising or perhaps without any personalization, to any user who would like to refuse to consent. Then, you can ask further questions.

So can Facebook charge users for access to that service? The tricky issue here, and we go back to consent, is that if users have to pay for an alternative, that could mean that they are forced to consent to the free service because they cannot afford to pay for the personalization-free option. So [Meta] may be between a rock and a hard place: if you tell users to pay, this may be forcing them to consent, and if [Meta] forces them to consent to the personalized version, then this consent is not valid and that’s illegal. Of course, we all know that Meta has always been opposed to the idea of having a paid subscription option, so forcing them to do it would really be tantamount to telling Meta that the regulators know better how to run their business, but that’s also an issue.

Eric Seufert:

Right, so there’s so much to unpack there. It’s funny because — having worked in the space for my whole career, mobile apps, digital ads — there are things that I just know to be true, and I don’t know these things through rigorous scientific interrogation. I know them just through osmosis, from having seen a lot and observed a lot. I know that you cannot charge for a social media app. That will not be successful. If you try to charge for it, it will not be successful. You can’t even make it meaningfully difficult to onboard. Look at Mastodon: it will not work. [Social media] has to be very simple to onboard and it has to be free.

Every half-second of friction that you add to the onboarding process, especially if that friction involves someone pulling out their wallet, dramatically reduces retention. So I know that to be true; I can’t prove it, but I just know it to be true. And that’s why they’ve never experimented with a paywalled product. Now, what you’re saying, if I’m reading this back correctly, is if [Meta] said, “Okay, well look, we’ll provide you with the option. We’ve got to pay for our servers. We’ve got to pay for our engineers. We’ve got to make some money here. If we go purely contextual, we’re not going to make any money.”

Mikołaj Barczentewicz:

Yeah.

Eric Seufert:

So we’ll give you the option, user. Either you go free product with ads personalization or you pay. You’re saying that that would be challenged. That would be saying, well, that’s consent under duress, that’s consent under some alternative where money is exchanged, and that’s not real consent. You’re saying that that would probably be challenged.

Mikołaj Barczentewicz:

So that’s not how I would interpret the GDPR, but I’m highly confident that the same authorities, the domestic authorities that objected to the Irish DPC’s approach, will also take that route and say, “Well, it’s not real consent, because here you’re not choosing between two free options. They are not equivalent choices for the user, because one option is free and one option is paid. So the consent for the free option is not freely given, and that’s invalid.”

Eric Seufert:

Right. Okay. Sorry, I don’t mean to put words in your mouth. I’ll try to refrain from it. I’ll just try to summarize and have you tell me whether I’m right or wrong. I know lawyers are very particular about semantics. Okay, so it sounds like to me my interpretation of what I’m hearing is that that would likely be challenged.

Mikołaj Barczentewicz:

Yes.

Eric Seufert:

Okay. Now, if you look at Europe as a share of Meta’s revenues and then, you try to back … because the UK is not in the EU. So, if you look at Meta’s share of what they call “Europe” revenue, and then you look at the IAB’s breakdown of revenue within geographic Europe, 50% of it is UK. That’s what the IAB says. So, the other 50% is continental Europe. What if Facebook said, “Look, the juice is not worth the squeeze here. We’re going to keep our app as it is in the UK and of that line item on our quarterly earnings that says Europe, we know that about half of that is UK, so that’s probably fine. The other half is continental Europe. We’re just going to only make a paid app in Europe and we know that our revenue there is going to decrease substantially, but maybe we’re just willing to take that risk or we’re willing to absorb that loss because hey, we sure are paying a lot of fines.”

Then, they offered the sort of free app with personalized advertising baked in, all over the rest of the world. Could that be challenged because you’re saying, “Well no, you’re not giving us the same app, you’re disadvantaging this geographic territory relative to the other geographic territories by giving us an inferior app just so you don’t have to comply with our privacy laws, we’re not going to allow you to do that.” Could that be challenged?

Mikołaj Barczentewicz:

Well, but I’m assuming that in this scenario they actually do comply with the laws in the sense that they remove personalization, behavioral advertising, and then, they try to recover it by subscription.

Eric Seufert:

You see, let’s say that they create Facebook EU and there’s literally no personalization whatsoever. Your feed is all chronological. There are no ads, but you have to subscribe.

Mikołaj Barczentewicz:

I’m not sure it would be a problem under privacy law. Some very creative regulators maybe would then think about antitrust issues and try to make a case that you’re price gouging, depending on what the price is set at. But yes, it could address the problem, just like going to contextual advertising. You could say, okay, we’ll have less revenue, but then you’ll try to do it just with contextual advertising.

Eric Seufert:

Yeah. Well, I think people look at contextual advertising as some kind of panacea to all sorts of privacy concerns, but contextual advertising only works when there’s context.

Mikołaj Barczentewicz:

Yes.

Eric Seufert:

What’s the context of my Facebook newsfeed, right? I mean, if I’m on espn.com, you can probably infer a few things about me that might be commercially actionable. If I’m on weather.com, you can’t. Then, if I’m on the Facebook newsfeed, you really can’t. So contextual advertising, my sense is it probably just doesn’t work for that product, or the revenue is not appealing.

Mikołaj Barczentewicz:

No one really knows for sure what any business should do, Meta included. But one way to look at the current situation is that at least some European authorities don’t care to find a way for personalized advertising, or even ad-supported business models, to be done lawfully. Their approach seems to be, “Here are all the conditions you need to satisfy to do advertising lawfully, and if it is impossible to satisfy them, tough luck. Find a different business model.”

Eric Seufert:

Right. I don’t want to speak for too wide of a group here, but I would say that the prevailing sentiment amongst digital advertising operators, or even just publishers, in the United States is that there’s no real compromise possible here. We’re probably not going to converge around some mutually agreeable solution.

Mikołaj Barczentewicz:

So as we noticed in this Irish case, there is no unanimity. Some authorities in some countries take a more maximalist interpretation of the GDPR. Some take a different view, so perhaps the political process can still lead to a different outcome, but the current momentum seems to be going in this direction, as we just said.

Eric Seufert:

Right. That’s kind of depressing, but we’ll move on.

I’ve written a couple of articles recently about the CNIL, and they’ve had a couple of really interesting decisions, and we can get to those. But can you talk to me about what the CNIL is and what its jurisdiction is, because that seems kind of fuzzy.

Mikołaj Barczentewicz:

So the CNIL is another DPA, like the Irish Data Protection Commission, but one reason why we are talking about the CNIL right now is that in several EU countries and former EU countries like the UK, the data protection authority was given, in a sense, dual authority: under the GDPR and under the domestic law which implements the older ePrivacy Directive. So authorities like the CNIL and the UK Information Commissioner’s Office (ICO) can wear two hats and enforce under those two different legal regimes.

Eric Seufert:

Right, talk to me about the jurisdictional difference between the ePrivacy Directive and the GDPR. So the GDPR is EU-level privacy legislation, and the ePrivacy Directive is a directive, right? It’s not legislation. It’s saying, “Hey, you EU bloc, you 27 EU member countries, you should implement this law, but you need to implement it in a way that makes sense for your own domestic sensibilities.” Can you talk to me about the difference between those two concepts, an EU-level law and a directive, and then also talk to me about the difference between the GDPR and the ePrivacy Directive?

Mikołaj Barczentewicz:

So the EU Law 101 on this is that regulations and directives are different. Regulations bind member states immediately; they are like statutes, ready-made once they are enacted. Whereas a directive sets almost like a goal in the distance that member states are bound to reach, but they do so by adopting their own national statutes. So with a regulation, you don’t get national laws separately from it; the regulation is directly applicable, as we say. With a directive, you get those separate national laws, and the thing about directives is that there is usually a bit of leeway for countries in how they will achieve those goals.

For example, one of those differences among EU member states is whether they have one authority for both the GDPR and ePrivacy or separate authorities. France and the UK went with the one-authority approach.

Eric Seufert:

Got it. So that’s helpful. Talk to me a little bit about that leeway though, because I think there’s probably some explanatory power there when you think about these specific cases that we’ll talk about in a second.

Mikołaj Barczentewicz:

So one problem with that leeway is that because the CNIL, the French authority, has its own domestic law, the way they use it is to say, “Oh, we have separate authority from the GDPR, which means that based on that authority, we can act completely outside of this One-Stop Shop process, which is required for the GDPR but not for the ePrivacy Directive.” So the ePrivacy Directive does not have this idea that because Meta Ireland is domiciled in Ireland, their sole interlocutor, the lead authority, is in Ireland. Under the ePrivacy Directive, you have 27 national laws and each of those laws is enforced by the domestic authority, and I think it may be fair to say that the French data protection authority decided to be very creative and aggressive in using that legal basis instead of the GDPR.

A cynical interpretation of it is that they are simply trying to sidestep that limitation, to recover some of the authority they lost to the One-Stop Shop principle and to the fact that most big tech companies are domiciled in Ireland.

Eric Seufert:

Right, okay and that’s great because that’s a great segue into the first case I want to talk about. So the CNIL sanctioned Apple.

Mikołaj Barczentewicz:

Yes.

Eric Seufert:

And it’s a little bit ironic because Apple rolled out ATT.

Mikołaj Barczentewicz:

Yes.

Eric Seufert:

And it rolled it out with iOS version 14.5, and [Apple] sort of slow-rolled it. So usually what Apple does is when they launch a big new feature release, they’ll make it available, and early adopters will, of their own volition, go and download it, update their iPhone, and use it. Then, Apple can test whether there are any breaking bugs that they didn’t catch. And after some usually moderate amount of time, like a week, they’ll say, “Okay, this looks pretty stable.” What they’ll do is they’ll send a push notification to everyone else’s phone and say, “Hey, this new version is ready for release.”

So what you can track is the upgrade graphs, basically, adoption graphs of people that have upgraded to the newest version, and usually it’s this kind of very low level of adoption growth. Then, there’s an inflection up and it’s this vertical graph that goes up because everyone gets the push notification. So what they did with 14.5 is they waited much longer and then, they made some changes and then, they launched 14.6 and let it ruminate and then, they pushed the notification and then, the adoption curve inflected, and the majority of people had it in very short order. That was right around WWDC 2021, so it was late June.

So what the CNIL said was … and this was not really made public until recently, but at the time, what the CNIL said was, “Hey look Apple, you are doing ads personalization and you are using a bunch of identifiers that these users have not consented to have accessed by you. And you’re using those identifiers to collect data from these users and to build profiles of them and target ads to them in your own ad platform, which is Apple Search Ads, which is the placements within the App Store and the Apple News and Apple Stocks apps.” And what Apple said was, “Hey, CNIL, mind your own business. We’re headquartered [in the EU] in Ireland, so you can’t intervene here. If this is a GDPR issue, have the Irish DPC investigate it. We have no responsibility to explain this to you, and this is not a French national issue. If it’s a GDPR issue, then we’ll go through the One-Stop Shop provision and we’ll talk to the Irish DPC.”

The CNIL said, “No, because you have two companies based in France. You’ve got Apple Retail and some other kind of domestic company.” Now, Apple Retail sells … they run Apple stores. You go in there and you buy an iPhone, but the other company was like the local branch that managed the ASA, the Apple Search Ads, support. And they said, “You’ve got a domestic entity here, and so therefore we can interrogate your practices here under the French Data Protection Act.” And they did, and they fined them. I think it was eight million euros, which to Apple is not a lot of money. But nonetheless, they were able to make the case that no, this isn’t a GDPR issue. We don’t have to go through the One-Stop Shop provision. We can litigate this under French domestic privacy law, and we did, and we found that you are in violation.

Can you just talk to me about that? Let me know if I’ve gotten anything wrong and then, what I find very, very interesting is that [the CNIL was] able to push back on this idea that no, this is a French domestic issue. Yes, we can investigate it here based on French law.

Mikołaj Barczentewicz:

Yes. So there are some circumstances under the GDPR when you can have a domestic non-Irish authority investigate, and that’s what happened with TikTok and the Italian authority. They used this urgency procedure, but that was under the GDPR. What the CNIL does, and that seems to be their M.O. recently, is they just don’t use the GDPR. They use their national data protection law, which implements the ePrivacy Directive, and there is this provision in it.

So, as many of your listeners will probably know, [the ePrivacy Directive is] the cookie law. So that’s the real reason why we have consent banners, not the GDPR, but really the ePrivacy Directive.

But there is one other feature of the ePrivacy Directive, in its Article 5(3), which says that whenever you want to store or gain access to any information on a user device, you need to inform the user and give them an opportunity to refuse. This basically means consent.

So that’s the basis that CNIL is using under the ePrivacy Directive as implemented in French law, not the GDPR.

Eric Seufert:

Right, and I think there’s something important to underscore there, but there are two things. So one is, I said at the outset that it was ironic because it was Apple. Apple has significantly disrupted the digital advertising space with ATT, and then the CNIL found, “Well, you’re not compliant with the ePrivacy Directive.” So I think the interesting outcome of that was Apple implemented the consent dialogue after this happened. My sense is — I would conclude from that — that they were never going to do that, had no one pushed back on it. And this is just my opinion, right? I’m not trying to sort of implicate you in this, but my sense is that Apple only decided to implement that consent popup, which by the way, says ads personalization.

It’s very different from the ATT popup, but it says ads personalization, and they get a lot more sort of real estate to use in making the case than you do with the ATT popup. But nonetheless, my belief is that they wouldn’t have done that had this not been pushed back on. But I think the other interesting thing here, or at least the other takeaway, is that … well, let me say there are two more. One is that being compliant with ATT doesn’t mean you’re compliant with the ePrivacy Directive.

Mikołaj Barczentewicz:

Correct.

Eric Seufert:

ATT is a platform policy, it’s not a law. The ePrivacy Directive is not a law either, it’s a directive, but it’s transposed into law across the EU and you might not … just being compliant with ATT is necessary but not sufficient for being compliant with the ePrivacy Directive.

But the other insight I would parse from this, which I think is important, but I would like to get your thoughts on it because I might be misinterpreting it, is that there is a liability inherent with complying with the ePrivacy Directive at the EU state level. And you might say, “Okay, well I might not be compliant with the ePrivacy Directive as transposed into French law because the French are being very litigious with this, but I might have less liability in Germany and I might have almost no liability or I have some liability but not very much liability in Spain.”

So I might make business decisions on that basis. I might say, “Okay, well I understand that what I’m doing is probably not compliant with the ePrivacy Directive as transposed into national laws across the EU, but only some of these countries are actually going to pursue that. So maybe I will launch a different version or not have a version available at all in some of these countries, depending on the risk.” Now for Apple, eight million euros, they probably don’t care that much, but Voodoo might, or some other company might, and that might actually convince a company to just simply … because it’s very easy with the App Store, I just click a box. I click a box and my app is available in your country.

That’s all I have to do, so if I think the liability is potentially too material to justify having the app live there, I just unclick the box. Is that something that could happen or would that violate some kind of EU-level accessibility law?

Mikołaj Barczentewicz:

I’m not sure. I think you may have enough freedom to choose which countries to operate in, and there is also a risk which may support that kind of decision. If you go along with the interpretation adopted by the French regulator of, for example, what consent under Article 5(3) of the ePrivacy Directive requires, and you do it for one country, then your position vis-à-vis all other regulators is weaker, because then they can say, “Well, you did that for the French market, why wouldn’t you do that here?” So that may be one more reason to go for that sort of semi-nuclear option. I’m not observing the market closely enough to say whether anyone did that.

Eric Seufert:

Country by country, I don’t think so, but in the sort of early days of the GDPR, you just had … especially a lot of local newspapers in the United States that said, if you’re in Europe, we’re not going to allow you to access [our product]. We can’t bear that liability. That would kill us. Whatever, The Little Rock Observer, probably the 10 readers they get from the EU every day are not worth the risk. Now obviously, the risk scales with the size of the potential sanction based on the size of the user base in Europe. I don’t think anybody is going to go after The Little Rock Observer, but nonetheless, I think you get my point.

Mikołaj Barczentewicz:

Going back to the rationale of the One-Stop Shop that we mentioned at the beginning of our conversation, I think this is a perfect example, and it is perceived as a bad situation in EU law because it does go against the idea that EU law is meant to provide harmonization, a single market where you can operate without regulatory borders between countries. So there are some ideas to change the situation when the ePrivacy Directive is finally replaced with its successor.

Eric Seufert:

Right, talk to me about that, talk to me about the ePrivacy Regulation. What will that change?

Mikołaj Barczentewicz:

So the ePrivacy Regulation was, in a sense, a sister idea to the GDPR, and I think the original idea was that they were going to be enacted at the same time. That didn’t happen. It’s really hard to tell what exactly the current state of the ePrivacy Regulation is. One of the ideas for it was to have enforcement mechanisms similar to the GDPR’s: going from the fragmented scheme we have now under the ePrivacy Directive, 27 countries with slightly different rules and ideas about enforcement, towards something like the One-Stop Shop in the GDPR. What I do know is that there is disagreement between the various legislators; the European Commission seems to be in favor of the GDPR approach for it.

But the governments of various member states adopted their own negotiating position, which would actually preserve the fragmented system of enforcement. So even if the ePrivacy Regulation is adopted, there is no guarantee that it will bring that improvement in terms of enforcement.

Eric Seufert:

If I would liken it to a situation in the United States, it’s kind of like California vis-a-vis federal privacy law. They want to maintain their agency with the CCPA and CPRA and they don’t want to give that up to the federal government. Is that roughly similar? Is that an okay comparison?

Mikołaj Barczentewicz:

I think so. I don’t know which countries are the strongest voices behind this sort of fragmented enforcement approach, but I would be very surprised if France is not among them.

Eric Seufert:

Right, so one question I have for you is, is there a double jeopardy clause here? Because that would be my concern, right? So, okay, France sanctioned me based on national French law, transposed from the ePrivacy Directive and by the way, so did Spain and so did Germany. Is there a double jeopardy protection here or could I just get fined by every EU state for an infraction?

Mikołaj Barczentewicz:

That’s a good question, but the problem is that technically, because it is national law, each state fines you for what you did in that state. It’s also not criminal law, so the general principle, double jeopardy, may not apply, but technically, these are separate infractions, right?

Eric Seufert:

Sure.

Mikołaj Barczentewicz:

So yes, I think you are at risk of being fined for the same thing many times over.

Eric Seufert:

Okay, got it. So I guess that raises the question, why is France the one at the vanguard of this? Why not Spain? Why not Germany? Why is it France? Why is it the CNIL that is this active? Because if you look at the number of cases that they’ve litigated, it’s quite a lot.

Mikołaj Barczentewicz:

That is true. There are some active regulators in Germany, but it does seem that the French national regulator, the CNIL, is particularly aggressive. I don’t want to speculate too much about French policy choices. It could partially be a cultural thing. There could also be, I suspect, a bit of an almost nationalist issue, or at least a European sovereigntist issue, because many of the companies being prosecuted by the CNIL are American companies. So I think this is not entirely without influence on the motivations behind it. But it’s also tempting to note that although the CNIL is an independent authority, the CNIL takes a very maximalist interpretation of privacy, whereas the French government is known for taking a very minimalist interpretation of the GDPR and EU privacy law generally when it comes to any restrictions on governmental data processing.

To the extent that they keep litigating and losing cases before the European courts about their law enforcement and intelligence data processing, which I think is a pretty interesting juxtaposition, right? The strongest at enforcing against tech companies, but also the most keen on large-scale data processing by the government, even crossing the boundaries of EU law according to the EU courts.

Eric Seufert:

Yeah, well, don’t get me started. I mean, what drives me up the wall is when you see so-called — always self-appointed — disinfo experts saying, we need to have access to all the data that Facebook has so we can prevent Cambridge Analytica from ever happening again. It’s like, “Well, how do you think Cambridge Analytica happened? It wasn’t some rogue hacker group in North Korea. It was a researcher.”

Mikołaj Barczentewicz:

Yes.

Eric Seufert:

Right. So I mean, I don’t buy that argument that researchers could never be motivated by money or whatever. I don’t know if you’re a Simpsons fan, any chance of that?

Mikołaj Barczentewicz:

I do watch The Simpsons from time to time. Yeah.

Eric Seufert:

One of my favorite scenes was … Sideshow Bob infamously tried to murder Bart a couple of times and he went to prison. And he had “Die Bart, Die” tattooed on his chest. The parole board says, “Well, we don’t believe you when you say that you’re not going to try to murder Bart again because you’ve got Die Bart, Die written on your chest.” He said, “Oh no, that’s German for The Bart, The,” and the parole board says, “Well, no one who speaks German can be an evil man.” And it’s like no one who’s an academic could have an ulterior motive with this data.

Mikołaj Barczentewicz:

So it’s not our main topic for today, but I think it’s worth mentioning the new DMA and DSA regulations, which also impose some data access requirements, for example, creating this new ad database, which is meant to be accessible through APIs for everyone. So it’s quite interesting how, at the same time, with one hand EU law demands more privacy protections, but with the other it looks like it may be undermining them, which is potentially a somewhat schizophrenic situation.

Eric Seufert:

Yeah, I mean, I’d love to have you back to talk about the DMA and the DSA. That’s obviously another hour and a half, but we’ve got our hands full here, because you made a point, which I think I would’ve been fully onboarded with until recently, which is that maybe there’s some — and I don’t want to put too fine a point on this because I don’t know whether this is a motivation or not — but maybe part of the motivation for pursuing these cases was that these are American companies and you could sort of interpret a number of reasons for why a national privacy regulator would want to very, very aggressively enforce their laws against American companies. I think up until whatever, six months ago, I would’ve said, yeah, that probably is explanatory here, but they also sanctioned Voodoo Games. Voodoo Games was sanctioned by the CNIL.

Voodoo Games is a French company and they fined them millions of euros. They said that … kind of similar to the Apple case, they said, “Look, you are collecting the IDFV.” I think most of the listeners are familiar with ATT, but what ATT does is it exposes this pop-up, this consent prompt to users when they open an app. It only does it one time, but it says, “Do you agree to have this app track your behavior within this app and across third-party apps and websites?” or something like that. The user says yes or no. If the user says no, then what happens is the policy is in effect and the policy covers more than just the IDFA. The policy covers any kind of identifier that could be used and transmitted to a third party for the purposes of ads tracking.

It says you can’t do that, but the sort of concrete result of the user opting out is the IDFA gets set to all zeros, right? So it’s effectively useless. The IDFA is set to zeros, and what Apple made available to developers was when the IDFA is zeroed out — because the user opted out of tracking, they said no to the ATT prompt — is the IDFV. So the ID for Vendors. The IDFA is the ID for Advertisers. The IDFV is ID for Vendors. What that is, it’s a publisher-specific device identifier. So it’s unique to that publisher for that device, but it would be different for a different publisher for that device. So Voodoo Games operates a number of games, they publish very many games and for any user that plays multiple of those games, their IDFV is the same, whenever they play the game.

So that way Voodoo could say, “Okay, well this user Eric is playing this game and this game because I’ve seen that IDFV in both games.” So they know that that’s me playing multiple games. Apple made that available in the case of ATT opt-out because they said, “Okay, well that’d be difficult to stitch together. If you sent that off to a third party, they can’t do a whole lot with it.” Now, it’s possible that they could, but it would take a really concerted effort and probably a lot of cooperation across third parties that are not likely to want to cooperate. So that’s probably privacy safe.
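To make the vendor-scoping concrete, here is a toy Python sketch of the behavior Eric describes. The hashing scheme is purely illustrative, an assumption for the example, and not how Apple actually generates these identifiers; it just shows why an IDFV links a user across one publisher’s apps but not across publishers, and why an ATT opt-out reduces the IDFA to an all-zeros value that carries no signal.

```python
import hashlib

def idfv(device_secret: str, vendor: str) -> str:
    """Toy model of a vendor-scoped device ID: stable across one
    publisher's apps, different for every other publisher."""
    return hashlib.sha256(f"{device_secret}:{vendor}".encode()).hexdigest()[:16]

def idfa(device_secret: str, att_authorized: bool) -> str:
    """Toy model of the advertising ID: zeroed out on ATT opt-out."""
    if not att_authorized:
        return "0" * 32  # all zeros once the user declines the ATT prompt
    return hashlib.sha256(f"{device_secret}:ads".encode()).hexdigest()[:32]

device = "eric-device-secret"  # hypothetical per-device value

# Same publisher, two games: the IDFV matches, so the sessions can be linked.
assert idfv(device, "voodoo") == idfv(device, "voodoo")
# A different publisher sees a different IDFV for the same device.
assert idfv(device, "other-publisher") != idfv(device, "voodoo")
# After ATT opt-out, the IDFA is all zeros.
assert idfa(device, att_authorized=False) == "0" * 32
```

The point of the sketch is the scoping: no single publisher’s identifier joins up across publishers, which is why the CNIL’s objection was about reading the identifier from the device without consent at all, not about its cross-publisher linkability.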

[The CNIL] said, “Well, we don’t care, ATT is platform policy … the French Data Protection Act says you may not read data from a terminal unless the user consented, and they didn’t consent. The IDFV, we don’t care that Apple says that’s privacy safe. We don’t say it’s safe. French law says you can’t read this data from the terminal without their consent, and you didn’t ask for consent, and so we’re going to fine you.” Talk to me about that, because I think that was interesting for a number of reasons. One, it’s a French company. I thought that was interesting. Maybe it’s less interesting than I think it is.

Mikołaj Barczentewicz:

Yeah.

Eric Seufert:

The other thing is the motive here. Voodoo said, “Look, we’re doing this to be protective of user privacy. There’s a bunch of other stuff we could be doing, but we’re not. We’re using the IDFV because that’s privacy-protective,” and the French regulator said, yeah, fine, but it’s not compliant with the law.

Mikołaj Barczentewicz:

We’re still talking about this Article 5(3) of the ePrivacy Directive, and it’s interesting how closely it tracks some of the GDPR discussions, for example in the Irish Meta cases, because this rule does say that if you want to store or access any information on the user device, you need to inform the user and give them the right to refuse. But there is an exception. The exception says, “This shall not prevent any technical storage or access for the sole purpose of carrying out or facilitating the transmission of a communication, or as strictly necessary in order to provide an information society service explicitly requested by the subscriber or user.” This last part really looks like the contractual-necessity consideration.

In a sense, yeah, I think you could have very similar legal debates about whether what Voodoo Games was doing was actually necessary for them to be able to provide that service, because they may be funded at least partially by advertising. So at least to me, it seems like a very similar conversation. But the problem is that even under the GDPR, it went the way … well, let’s wait for the courts, but the authorities said that no, you cannot use contractual necessity for this. Then perhaps the ePrivacy Directive could be interpreted in the same way. But it is at least possible that the courts could interpret it otherwise, and say, “No, advertising is part of the deal, it’s part of the contract, it’s necessary for provision of the service because it’s an economic necessity, not just a technical necessity.”

And for that reason, you may then not require this specific consent for storage or retrieval of information from the user’s device. So yes, I think that’s worth noting, but I’m not surprised that the French authority would take the same view on this as they take on contractual necessity and say, “No, you have to have specific consent.”
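The Article 5(3) rule and its strictly-necessary exception, as Mikołaj describes them, can be sketched as a simple decision function. This is a toy model under stated assumptions: the purpose labels and the set of “strictly necessary” purposes are hypothetical illustrations, and the real legal test is obviously not a lookup table.

```python
# Toy model of ePrivacy Directive Article 5(3): storing or accessing
# information on a user's device requires informed consent, unless the
# access is strictly necessary to provide a service the user requested.
# The purpose names below are illustrative assumptions, not legal categories.
STRICTLY_NECESSARY = {"session_cookie", "load_balancing", "security_token"}

def access_allowed(purpose: str, user_consented: bool) -> bool:
    if purpose in STRICTLY_NECESSARY:
        return True  # the exception: no consent needed
    return user_consented  # everything else (ad IDs, analytics) needs consent

# On the regulators' reading, an advertising identifier is not "strictly
# necessary", so reading it without consent is disallowed.
assert access_allowed("ads_identifier", user_consented=False) is False
assert access_allowed("ads_identifier", user_consented=True) is True
assert access_allowed("session_cookie", user_consented=False) is True
```

The whole debate Mikołaj outlines is about the contents of that set: on the maximalist reading, advertising purposes never count as strictly necessary; on the economic-necessity reading, an ad-funded service might argue that they do.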

Eric Seufert:

Yeah, and you made that same point in your article, right, about the economic necessity. That’s not really being considered when these are being litigated. Okay, I wanted to get to one more big topic. Trans-Atlantic data flows. I think this is something that probably most people are superficially aware of. You see the headlines every once in a while, but the details are staggering or there’s just a lot there. Can you just kind of briefly give a background of that tension, that issue, and then maybe give us your take on how you think it’ll be resolved?

Mikołaj Barczentewicz:

So under the GDPR, transferring personal data outside of the EU is only allowed if certain conditions are met, and broadly, those conditions deal with whether the jurisdiction to which the data is meant to be transferred protects privacy sufficiently. And some of the same persons are involved as in the Meta investigation: there is Max Schrems. He is probably most famous for litigating this issue twice now, and probably soon for a third time. So far, twice, he managed to convince the European courts, the Court of Justice, to say that the scheme under which the United States was recognized under EU law as a safe jurisdiction for data privacy was invalid. That first happened in Schrems I and more recently in Schrems II.

So now there are some ways around this, because those two decisions dealt with an overarching scheme, previously called the Privacy Shield. It was an agreement under which the European Commission and the US government agreed that the US would make some representations about protecting Europeans' data, and based on that, if I was an EU or US business transferring data between the two jurisdictions, I didn't need to make my own individual assessment of whether this was fine for me to do. Since the Schrems II decision invalidated the previous scheme, we now have some ways around the problem, the main of which is known as SCCs, the standard contractual clauses. The SCCs are a scheme where you as a business individually have to assess the legal risks to your users' data from that data being transferred to the US.

The SCCs are what Meta and all other major providers are relying upon for transferring user data. What happened in several Google Analytics cases and in a pending Meta data transfers case is that the domestic data protection authorities started to take a very hard line on how the theoretical possibility of personal data being accessed by US intelligence authorities undermines any use of those SCCs. Because it is possible that under US law, the personal data of a European will be accessed in the US by the authorities without, some argue, sufficient due process, this means that it would be unlawful for a company like Meta to transfer the data of Europeans to the US. And I think last year, Meta even made an announcement in their SEC filing saying that this pending case is such a risk that, if it goes one way, they may need to stop providing services in Europe.

So they disclosed that as a significant business risk in an SEC filing. That decision is expected in April, and everyone is waiting for a different decision, this time from the European Commission, creating a new so-called adequacy framework, Privacy Shield three, you could say, that will make this Meta decision, in a sense, moot. So even if the data protection authorities were to find that Meta is violating EU law by transferring personal data to the US, this will become moot once EU law again allows everyone to transfer personal data to the US. So we're very eagerly awaiting the European Commission formally adopting the draft adequacy decision that they already announced. It's expected around … hard to tell, June maybe.

Eric Seufert:

Right. So if I understand correctly, there was some news recently, two or three weeks ago, and there's an issue around the timing, because this is being handled from two different directions.

Mikołaj Barczentewicz:

Exactly.

Eric Seufert:

So there's the DPAs, the association of DPAs or whatever, and then the EC. And the DPAs, once they go into the process, have some fixed amount of time, but the EC might take longer. So the DPAs could say, "Well, no, this is not legal," before the EC proposes the third solution. And then, in that interim period, there would be a gray zone or something, where it's not just Meta, right?

Mikołaj Barczentewicz:

Black hole.

Eric Seufert:

Right, I mean, this would apply to every American company. Yeah.

Mikołaj Barczentewicz:

Google, Meta, everyone.

Eric Seufert:

Can you talk about the timing?

Mikołaj Barczentewicz:

I mean, the decision itself would only have legal effects with respect to one company, obviously, but then everyone would be at risk of identical or very similar enforcement proceedings.

Eric Seufert:

Right. Mikolaj, this was so informative. I’ve really been looking forward to this call and it absolutely lived up to expectations. I have so many more questions, but I’ve already sort of eaten up an hour and a half of your day. I will ask, how can people connect with you? How can they read your writing? How can they follow you? Where do you live on the internet?

Mikołaj Barczentewicz:

So I think it's best to follow me on Twitter, and it will probably be best if you just post a link to my handle in the show notes, because my handle is my initial and then my surname, which is very difficult. But yes, Twitter is best.

Eric Seufert:

Got it, and just kind of a last quick question. I feel like I have a superficial grasp on these topics, and I've probably invested, I don't know, 20 or 30 hours into this research. Most people don't have that kind of time to commit to this. So what do you do if you're working at an American company, you're based in the US, and you want to make sure you're compliant with the myriad EU privacy laws? Do you just hire a law firm, or do you hire a full-time person to manage this?

Mikołaj Barczentewicz:

I'm afraid … I mean, of course, it depends on how big your operation is, but it may be very difficult to avoid some expensive lawyers, unfortunately, which is one of the sad aspects of this situation, because those costs can add up very quickly. But I'm not sure there is any other responsible advice I could give than just getting a good lawyer.

Eric Seufert:

All right. Fair enough. That's wise advice for most situations. All right, Mikolaj, I very much appreciate your time. Thanks very much for walking through these topics with me and with the audience. Take care.

Mikołaj Barczentewicz:

Thank you.