A brief history of privacy legislation

In this episode of the Mobile Dev Memo podcast, I speak with Jessica Lee on the topic of digital privacy legislation. Jessica is a Partner and serves as the Chair of the Privacy, Security, and Data Innovations practice at Loeb & Loeb, a New York-based law firm. Jessica’s practice focuses on emerging media, technology, advertising and promotions, privacy, and intellectual property, and she has represented clients in a variety of fields, including Internet, film, music, sports, telecommunications, and consumer products.

Over the course of the podcast, Jessica and I discuss the history of digital privacy, the prospect of a federal privacy law in the US and how the recent midterm election results impact that, why Europe leads the United States in codifying privacy protections into law, and the efficacy of self-regulation, among other topics.

See a lightly edited transcript of our conversation below.


Transcript: a conversation between Eric Seufert and Jessica Lee

Eric Seufert:

Jessica, nice to see you. Thank you for joining me.

Jessica Lee:

Nice to see you, too. Thanks for having me.

Eric Seufert:

I appreciate your time. We were first acquainted when we both did a panel for, I believe it was The Drum. They had some kind of advertising week bonanza, peak COVID, so it was all on Zoom and you and I were on a privacy panel together. I was impressed by your insights and that is the reason that I have asked you to join me today, and you graciously accepted that invitation.

Jessica Lee:

Yeah, and I was excited because I was impressed with you as well. I started following your blog and Twitter. Lawyers have to stay plugged into what business teams are doing, so it always helps to keep me informed.

Eric Seufert:

I’m glad to hear you say that. You are a lawyer, you are a specialist in this area. You’re the chair of Privacy, Security, and Data Innovation at Loeb and Loeb, and I am just an opinionated person. Those are my bona fides there, and so I thought it would be great to get a real expert on the podcast to give essentially a survey of the most recent developments in privacy legislation and also maybe give us a little sneak peek as to what we can expect.

Jessica Lee:

Yep, sounds good.

Eric Seufert:

Excellent. I’ll start it off with a high-level premise, which is: why do you think a federal privacy law has yet to be enacted? What roadblocks have prevented one from being passed? Because I think anyone in this digital advertising space and just operating generally in digital keeps hearing about the fact that we need a federal privacy law. It’s going to come any moment now, it’s imminent, it’s inevitable. I guess the question is, why are we still talking about that? Why hasn’t one been passed?

Jessica Lee:

Sure. Probably a number of reasons. One, not surprising if we’re talking about a law, we’re talking about politics, and privacy is a bipartisan issue. I think there’s bipartisan support for the concept of federal privacy, but each side has its own agenda and motives in terms of what they’re looking for. I think that always makes things challenging.

If you go into a year like we’re going to go into where the House and Senate have divided parties, that generally makes legislation hard to pass and then we’re coming into two-plus years of COVID and potentially a recession, so there are also competing priorities. But more specifically, historically there’ve been two key areas where privacy legislation gets stuck. One is preemption, because for businesses to get behind a federal privacy bill, they want to have security that it will preempt other comprehensive state laws so that they’re not stuck in the patchwork where they are today.

That’s a benefit, and it’s one reason why you see a louder call right now for federal privacy. It’s because there are five states, and there likely will be more, with privacy laws. And those states have different definitions, they all have different contractual requirements, and they have slightly different obligations. Navigating that five times, ten times, or fifty times if you get to 50 states is a lot to manage. So, I think there is a desire from that perspective to get to a place where there’s one standard that companies have to deal with. There will be sector-specific things for health and financial information, but there’s a desire to have one standard that covers personal information in a comprehensive way.

Then the other piece is a private right of action. Privacy advocates don’t feel like privacy will be adequately protected unless there’s a private right of action, which means an individual has a right to bring a lawsuit in their own name or as part of a class action. Again, the business community is very opposed to a private right of action because if you look at other states or other statutes, I should say, like TCPA for example, which regulates text spam, you see these huge fines levied when one text got sent, or a text got sent to a wrong number or something.

In fairness, obviously, there are cases where there has been text spam that is a clear violation of the law, but it leads to these huge fines that are usually paying off the plaintiff’s lawyers and leading to something very small for the actual consumer. But preemption and private right of action are kind of the two sticky areas that are hard to get past.

Eric Seufert:

Right. I saw when the DSA was passed, I read a piece about it, or maybe it was a podcast. It was a podcast, and someone had said that with any privacy bill you might see, basically 75% of all privacy legislation relates to the core of what you’d expect in a privacy bill. And then that remaining 25% is really what differs from bill to bill. You see that especially in the penalties in the state-level bills, and then the private right of action, too. But I guess when I read those bills as just a layman, I’m like, “Okay, well they’re describing the penalty and there’s a penalty and the penalty is conceptually what happens if you violate it.”

And so that comports, that checks out. And I guess when a lawyer reads that, a privacy lawyer, someone who has a professional interest in this topic, that’s a substantive difference, or that could represent a substantive difference, versus a layman like myself reading it and saying, “Okay, well there’s a penalty component. That makes sense,” right? Is that-

Jessica Lee:

Yeah, I think that’s right. And obviously, I’m a lawyer, I represent companies for the most part. I feel like I’m a privacy advocate, but I still have sympathy for the industry position, which is: there’s a desire to comply. You need to have teeth, every law needs to have teeth. Even inside these corporations, in-house lawyers will tell you it’s actually helpful to have some fines at some point, because it helps them say, “This is a real thing and we need to get funding and support for it.”

So, the idea of fines, the idea of the penalties, you can’t have a workable legislative structure without it. But I think the private right of action in particular causes concern because it leads to some gotcha litigation. It’s not clear that there’s a real benefit for consumers and it does shift the risk analysis. When you talk about biometric data, I mean, all the BIPA lawsuits that come out of Illinois, we’re talking hundreds of millions of dollars.

It’s actually one of the statutes where the consumer does get a substantial amount of money when they’re part of these class actions. These are huge fines, and I think that just causes so much more angst than if we’re going to have a regulator. Those could also be huge fines, but regulators are looking to protect consumers. A private right of action is usually brought by the plaintiffs’ lawyers, who have an economic interest in seeing a particular outcome from a case.

Eric Seufert:

Right. And the economics there can be very much skewed in favor of those trial lawyers. I was talking to someone recently about this and I was kind of shocked to hear about the economics of those class action cases with respect to what the class actually gets versus the legal team.

Jessica Lee:

Right. And I appreciate that, again, you’ll hear from privacy advocates that we have to have this private right of action, that that’s the only way consumers get some redress. But if you look at the economics, in most cases outside of BIPA and the biometric data, you’ll get these involuntarily — I’m sure you’ve gotten these emails like, “Oh, you’re qualified for this class action. You can get a coupon to the store for $10.” Is that really helping you get redress for your rights? Probably not. I think, actually, it’s more effective when it’s enforced by regulators and then they’re fined. And then there are things that are even more painful for companies than fines, in some cases. If you deal with the FTC, you might be under a consent decree, a 20-year consent decree where you have to report to the FTC. Recently the FTC has started doing things like requesting companies delete the algorithms that are trained on data that was collected illegally. That, to me, has more teeth and more potential to protect consumers going forward than a class action does.

Eric Seufert:

Right. There was a ruling that demanded exactly that, recently. I don’t recall the case, but it was where some party was found guilty. The FTC said, “You have to delete this algorithm. It was trained on data that you had no legal right to own or to access, and so you have to just get rid of the algorithm completely.” [Editor’s note: the case was with WW, formerly known as Weight Watchers, for illegally collecting health data from minors.]

Jessica Lee:

Right. And the Everalbum case, and there might have been one or two since then. But yeah, that’s one of the remedies that the FTC is pursuing. Whether or not that would stand up in court is another question, because the recent FTC enforcement actions have usually been settlements. This is a penalty that’s been agreed to in a settlement; it hasn’t come in front of a court, so there’s a question about whether or not those remedies stand up in court. But putting that issue aside, those penalties have teeth.

I speak a lot, like you said, to The Drum and to other advertising conferences where it’s a non-lawyer audience and you say, “disgorge the data” and you say, “delete the algorithm,” and it’s a very different reaction than fines, which impact the corporation, obviously. But for the business people on the floor day to day, the idea that data gets deleted or algorithms get deleted, I think that sends a bigger signal.

Eric Seufert:

Right. I can absolutely imagine that. You’d prefer the fine in some cases.

Jessica Lee:

Yeah, exactly. Exactly. You might build the fine into your business plan. You can’t build for losing all of your data.

Eric Seufert:

What are the benefits of federal privacy legislation? What kind of clarity would a federal privacy law bring to the digital operating environment?

Jessica Lee:

Well, I mean, it goes back to that patchwork. Like I was saying earlier, right now you have five states… Well, starting in 2023, you’ll have five states that have comprehensive privacy laws. You have the concept of opting out of sale. Sale is defined differently in different states. You have the concept of opting out of sharing for cross-contextual advertising. You have the concept of opting out of targeted advertising, defined slightly differently than sharing for cross-contextual advertising.

You still have self-regulatory frameworks that talk about interest-based advertising. You just have all these different concepts swirling around, and it leads to inconsistency. I think that inconsistency negatively impacts businesses in terms of how they’re able to understand how to structure themselves. But I also think it disadvantages consumers: the average consumer doesn’t want to have to think about what is targeted advertising versus share or sale or whatever it is, having to parse through all of these different terms. And then, more broadly, each law comes with its own obligations to have contract terms in place. It leads to this flurry of contracting, and it’s just all this activity that I think takes away from the core function: obviously, you have a business to run, but if you care about privacy and data protection, you should actually be focusing on those things rather than having to parse through this very complex patchwork of laws.

I also think that patchwork means that there are holes: there are places where things can fall that aren’t completely covered, because it’s not a complete overlap in places. If you do have bad actors, I think it opens room for people to be kind of cute with the law. If you have federal privacy, I think you give businesses and consumers consistency with what’s required, and I just feel like that’s a better path forward than what we’re dealing with right now.

Eric Seufert:

And I think when GDPR, when the deadline was reached to sort of adhere to it, a lot of companies had to make the decision, do we just cut EU users off from our service? I remember the first time traveling abroad after GDPR went into effect, and some local newspaper’s website said, “You’re in Europe, we can’t service you. There’s no way for us to comply with GDPR.”

I guess you could come to that calculus as a firm and just say, well, okay, sorry, Illinois. Right? That’s obviously not a great outcome for consumers if that’s the case. Or like Nevada or any of them. Well, I mean California is probably a bigger loss just in terms of the number of people there, but you might have to just face that calculus. Whereas, well okay, we can’t shut our service off for the entirety of the United States. That’s our business.

Jessica Lee:

Right. I think for US companies in particular, part of that calculus for GDPR is how big is the EU footprint? If we’re just launching, we’re trying to enter into this market, but if you balance out the economic value of being in the market with the cost of complying with this new law, some companies made the calculus that it’s just not worth being there. I think that gets a lot harder in the US. And then California in particular, I haven’t talked to any company that said, we’re thinking of just cutting off California so we won’t deal with any California consumers, particularly in the digital space. It’s too big. It’s too big of a state. It’s too important from an economic perspective to say we’re going to cut it off and not deal with this. You have to.

Eric Seufert:

Well, right, and a lot of these products are built there. Right?

Jessica Lee:

Right. It’s the home of Silicon Valley, you’re not going to say… Yeah, exactly.

Eric Seufert:

So there’s no federal privacy law in the US. We’ve seen the DSA was passed — codified into law in the EU, the DMA codified into law in the EU, obviously GDPR. Why do you think the US has lagged the EU in passing privacy legislation? Because they’re lapping us now. There’s GDPR and then now the DSA. I mean, this has been in law in the EU for quite some time. So what conditions in the EU exist that don’t exist in the US?

Jessica Lee:

Well, historically, probably dictatorship and authoritarianism. If you go back to the history of privacy. Privacy in the EU is a fundamental human right. Right? It’s been recognized like that I think since like the fifties. And part of that is because of some of the previous regimes that existed in the EU. So I tell people that privacy is, in the EU, what the first amendment is in the US. It’s just a core value that they have. So I think if you look at it through that lens, that’s why they’ve been ahead of us to a certain extent. And for the US, I think that we’ve looked at privacy more as a consumer protection measure. And I think this was… I’ll go back and check my timing, but there’s this concept of fair information principles. The EU has basically taken those principles and turned that into the directive that preceded the GDPR, which was implemented on a member-state-by-member-state basis.

And now that has become the GDPR, the regulation that covers all EU member states in the same way. So they’ve been iterating and evolving on their approach to privacy since, we’ll call it the fifties, but for these directives since the nineties. And these are all…they’re not individual specific. So it’s not consumers versus employees versus business versus government. This is just how it applies to any individual, no matter in what situation you encounter them, no matter what type of person they are. In the US, we took those information principles and they applied to the US government and how the US government handles data, but not to how businesses handle data. And so for the US, the way privacy is developed, it’s been — at least in my perspective — more reactive and more sector-specific. So email is invented and then people start spamming you. So now you have CAN-SPAM.

So something happens and we say, oh, this is now a problem now, so we’re going to pass this law that addresses this specific issue, but we haven’t looked broadly to say, who are we from a privacy perspective, what do we think, broadly, about privacy? It’s more: we see an issue and so we address it. Text messaging, the iPhone gets invented, and so now we have a law that addresses how text spam happens. So it’s always kind of chasing these evolutions in technology as opposed to having a broad-based philosophy: here’s our view on privacy.

And so that’s why the EU has gotten a little bit ahead of us. And I think that’s what we’re trying to do now. Who are we as a country? How do we think about privacy more broadly? And then how do we actually start to pass laws? The challenge is, all of this technology and these business practices have developed in the meantime and have been designed based on this gap in our privacy rules. So some of the friction I think you see with US-based companies trying to comply is they weren’t built to deal with these laws. They’re not set up that way. That doesn’t mean they can’t get there; obviously, they can. But it’s a bigger lift than I think regulators understand because of how things have evolved in the US versus the EU.

Eric Seufert:

I feel like I go back and forth a lot on this notion. Sometimes I speak to people in Congress who are just trying to better understand the digital advertising space, or even regulatory agencies. And I go back and forth on this view of, look, the evolutionary cycles of consumer tech sort of necessarily get more complex and potentially even shorter by design. And that’s just the nature of, call it, technological progress. And you can get to a point where those cycles are so short and they’re so profound that it’s just an impossibility for a legislative body or a regulatory body to keep up. And it’s not because the people there are stupid, it’s just that they’re not specialists in those technologies or in those applications of technology. And the tech is just running away with these compounding complexities that even people within that technological field may not understand because they’re two or three cycles behind.

But then I think about if that is exclusively true about the kind of consumer tech that I care about, like digital advertising or identity or just personalization in general. And would that be less true of, whatever, the energy industry, which seems to have been — and correct me if I’m totally wrong here — but it seems like that’s been pretty effectively regulated, or at least there’s been regulation that has applied there that has kept up. And so that could be just a way of excusing either the consumer technology industry of not working proactively or not working productively with governments and just being very loath to open itself up to regulation and to collaborate on productive legislation. Maybe I’m overestimating that industry, or maybe it is really that dire. Where do you fall there?

Jessica Lee:

Maybe somewhere in the middle. I mean, I do think there are industries that have been more highly regulated, and that’s not just privacy regulation. If you’re talking about the energy sector, obviously there are other types of regulation that impact how technology evolves. So you see things moving more slowly because they have to. If you look at healthcare or the pharmaceutical industry, there’s only so fast you can move because there are approvals that need to happen there. There are other bodies that regulate, again, not just privacy, but largely speaking how your business operates.

But for consumer tech… And I hate when people say, oh, there was no regulation before. There’s been regulation, but it’s just been more broad-based. You’ve had the FTC regulating unfair or deceptive acts and practices, but there hasn’t been any digital-advertising-specific regulation, which creates pluses and minuses. Right? The plus has been that it’s allowed innovation to escalate at a rate that we wouldn’t have seen otherwise. So to your point, the pace of technology and the pace of improvement, all these cycles, it moves very quickly and it’s hard to keep up with now from a regulatory perspective, because now you’re kind of chasing a ball down a hill. But that’s allowed innovation to move forward.

On the flip side of that, it’s a complete cultural shift now to say, oh well wait, now we’re going to have… Because from my perspective, a lot of these laws… I think some of the other states, like Virginia and Colorado, for example, have language that is more GDPR-like and broader. But if you look at California, the way that’s written, and if you look at the people behind it and what they were focused on, it’s very specific to the digital advertising industry. And so an area that was able to move forward with regulation, call it squishy regulation, now has very specific prescriptive restrictions in place. And I think that kind of cultural shift has been very difficult.

Eric Seufert:

Do you think that the specific, prescriptive, industry-level or feature-level regulation is a function of well, it’s California, and those companies are based there and there’s just more of a general cognizance of these industries or these specific harms? Do you think that that played a role or is that just coincidence?

Jessica Lee:

It might be a coincidence. I mean, my understanding at least is Alastair Mactaggart, one of the main authors behind the CCPA, discovered and became very uncomfortable, unhappy, and upset at the idea of all the data collection that happens behind the scenes in digital advertising online. Right? That was the thing that put a bee in his bonnet and kind of gave him the motivation to go down the path of the CCPA. I think he could’ve been sitting in New York, for example, and we would’ve had a comprehensive privacy law in New York. I do think that historically California has always been at the forefront. And well, I take that back. We wouldn’t have had the legislative mechanism to pass a ballot initiative in New York. So there’s also, from a legislative perspective, you could put a ballot initiative up and get a popular vote on it in California in a way that’s not available in other places, so I do think that helps.

But I think it’s a general concern. And this concern isn’t new. Right? I think once we had OpenRTB and programmatic starting to take off, the FTC did look at data brokers, and there’s been a concern about data brokers and information sharing online. But I think for a large part, we were relying on self-regulatory frameworks to say, well, this is so complicated, it’s moving so quickly, we should let the industry regulate itself, and have some teeth so it can get escalated upwards when people don’t comply. But maybe we don’t have all the tools to regulate this space right now. And I think it’s become clear over the years that self-regulation, at least in the eyes of regulators and maybe the public, wasn’t sufficient. So that’s where you have someone like Alastair Mactaggart come in and say, no, we need to have a stronger hammer. We need to have tighter controls. And I think that’s where you see some of the more prescriptive aspects of California’s privacy law.

Eric Seufert:

Yeah, I mean, self-regulation, it reminds me of that story of the Soviet nuclear engineer [Editor’s note: Stanislav Petrov was a lieutenant colonel in the Soviet Air Defense Forces, not a nuclear engineer]. You have probably heard this story where the Soviet radar system falsely detected an incoming nuclear strike, an imminent nuke. And so he was told, okay, launch the missiles, here we go, this is it. And he just didn’t, and World War III was averted. Or I wouldn’t even say World War III, the destruction of humanity was averted as a result of this one person just defying this order. And I think he was imprisoned for it [Editor’s note: Petrov didn’t defy an order, he simply waited for more corroborating information, which never came. He also was not imprisoned]. And it’s just like, do we want to depend on that? I don’t feel like I have an ideological stake here, but it’s like, yeah, I don’t know. That bulwark, I think, always seems pretty flimsy, if we’re just like, well, there’s going to be one person who just defies orders.

What’s interesting about the havoc at Twitter is we really have gotten a pretty good look at the machinery of a big tech company, not mega tech; Twitter was never really that big in terms of DAU, in terms of market cap, whatever, but a good look at how a lot of these decisions are made. And there was a kind of story like that. It’s this person, I don’t remember their name, but they quit many years ago. And this person tweeted, well, because of all of the turmoil, I’ll tell the story. He was an engineer, and I guess a telco had come to Twitter and said, we will buy all of your users’ location data. We’ll pay you lavishly for all this location data, and we’ll set up a pipeline and you just deliver it to us in real time. And so this person was tasked with building the mechanism for making that transfer, and he said: “that’s an invasion of people’s privacy.”

And so he worked with the data science team and they applied differential privacy to the data, so that you had group-level data, but it was noised and you wouldn’t have been able to identify any individual user. And he presented that to the Twitter exec who tasked him with this, and he brought it to the telco. And the telco said no dice. We want the data, we want the raw data.
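[Editor’s note: for readers unfamiliar with the technique, here is a minimal sketch of the “group-level, noised data” approach described above, in Python. It is illustrative only, not Twitter’s actual implementation; the bucket names, the epsilon value, and the function name are made up for the example.]

```python
import numpy as np

def privatize_location_counts(true_counts, epsilon=1.0):
    """Add Laplace noise to group-level counts so that the presence or
    absence of any single user cannot be confidently inferred.

    true_counts: dict mapping a coarse location bucket (e.g. a city or
                 grid cell) to the number of users observed there.
    epsilon:     privacy budget; smaller values mean more noise and
                 stronger privacy.
    """
    # If each user contributes to at most one bucket, the sensitivity of
    # the histogram is 1, so Laplace noise with scale 1/epsilon suffices.
    scale = 1.0 / epsilon
    return {
        bucket: max(0, round(count + np.random.laplace(0.0, scale)))
        for bucket, count in true_counts.items()
    }

# Share noisy aggregates instead of raw, user-level location records.
raw_counts = {"Austin": 1204, "Brooklyn": 873, "Oakland": 412}
print(privatize_location_counts(raw_counts, epsilon=0.5))
```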

And so this individual just quit. And he was the only person I guess that knew that part of the tech stack. And so because he quit, that feature was never implemented. And I think on his way out, in the story anyway, it’s not corroborated to my knowledge, so it could be totally apocryphal, I guess. But in the story, he said he reached out to Jack. He had quit, he had resigned, he reached out to Jack and he said, look, this is what I was asked to do. I’m not going to do it. And Jack said, okay, that doesn’t sound right. Let me dig into it. And he dug into it and he said, okay, no, we shouldn’t do that. And so he canceled the project.

But it’s like, that self-regulation, I think a lot of times, there’s always going to be this tension. And this is just from an insider’s perspective, having seen these projects develop, and having been brought in to PM these types of projects in the past. The product team or the executive team or whatever, the management’s always going to want to maximize for commercial impact; and then there’s usually an in-house GC who’s doing God’s work. They’re going to want to minimize risk. And so it’s like, “No, we just won’t do that.” And then you get some data science person who’s stuck in between. It’s like, “Okay, how can we achieve both? How can we make both parties happy?”

That sort of self-regulation, you’d hope that people have some sort of generalized sense of propriety with user data, but who knows? You can’t always guarantee against that. I do feel like there are legal constraints that need to be applied. Because I’ve seen cases where, no, no, no, if you let people implement whatever they want, there would be the most rapacious, unrelenting ingestion and usage of data that you could imagine, where it would make most reasonable people very uncomfortable.

Jessica Lee:

Yeah, I think that’s right. Because I think that that’s what the motivation is, right? Everyone that you talked about has a different lens through which they’re looking at a project and a different motivation about what they’re trying to accomplish. So if you’re trying to get the most commercial value, no one wants to hear from the lawyer that maybe you don’t need all of that data. But when I talk to a lot of data scientists, a lot of business people, people who are formerly in the industry, it does seem like maybe the goal could be accomplished without vacuuming up all of that data. But putting that aside, I understand that that’s the lens through which that person is looking at things.

I think the challenge for self-regulation is probably a couple of things. Well, one is just enforcement. So if self-regulation requires you to, for example, say certain things in your privacy policy, or have the DAA opt-out icon, you can do that, but someone still has to figure out whether there are other things going on behind the scenes. I think finding out what’s going on behind the scenes has been one of the barriers to really getting good data governance internally.

I think one of the things that will be interesting to see coming out of California is this idea that the CPPA, the new agency, can audit you at any time, that they could get assessments from you to see what you’ve done to comply. I think that requires more internal governance and structure and thinking about, well, do we need to collect everything? How do we put structures in place? Also, so that this doesn’t become this huge point of friction in the company: you can still get things done, maybe just not as quickly. But it requires more structure, I think, internally. I think that ends up being helpful.

Eric Seufert:

Right. To that point, I was doing this panel last week on differential privacy, and I was referencing… Apple has a white paper on its website, it’s like, “how we apply differential privacy,” and a five-year-old could understand it, so that’s good. But then it’s not really going into useful detail at that point. It’s like, well okay, I get it. You add noise. I can understand why they wouldn’t really want to go into too much more detail, because this is proprietary technology they’ve developed. Especially with Apple, if they are communicating that privacy is a differentiator for their hardware, then yeah, they want to keep that stuff secret. Because it’s a trade secret, essentially. It’s a product, it’s proprietary IP. So I guess that’s the issue. It’s like, well, how much can you make public? Or how much can you make available for auditing or whatever, without actually giving up real trade secrets?

Jessica Lee:

Yeah, I think that’s right. And also, it’s what’s digestible for the public, right? I think it’s one thing to provide information to a regulatory body under the cover of confidentiality, arguably or hopefully; and it’s another thing to decide what you disclose to the public. I think there’s a trade-off between wanting to have transparency and wanting to avoid deception. So if we say we add noise into our data sets, we use differential privacy, and so your information will never be exposed, that’s probably not true, because something will happen. Or maybe it’s not applied in every single product, and then you’ve opened yourself up to potential deception. So I think what you have to disclose to regulators is one thing, but I think the thing that becomes harder to balance is what you say to the public so you can be transparent, make it easy to understand, but also not open yourself up to saying so much that you actually tell on yourself.

Eric Seufert:

Right. Connecting that back to my point earlier… So okay, let’s take Europe. They’ve got this demand that you need to open up these systems to external review. And let’s say, okay, we’ve built this fantastic system; it uses federated learning so that all the data stays on the device and we’re just sharing the model coefficients back with a centralized server, and they’re getting ingested and the model coefficients are being used to update the model. And it’s like, okay, well who can you show that to that’s going to understand it? Guess what? If someone exists, if someone graduates from a PhD program with a specialization in that, guess who’s going to be competing to hire them? It’s going to be the big tech companies. Because there aren’t that many of those people. It’s not like there’s this overabundance of people that can understand that and build those systems.
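[Editor’s note: a minimal sketch of the federated learning setup Eric describes, where raw data stays on each device and only model coefficients are sent back to a central server that averages them into an updated shared model. This is simplified, illustrative Python (federated averaging on a linear model), not any platform’s production system.]

```python
import numpy as np

def local_update(global_weights, X, y, lr=0.1, epochs=5):
    """One device's training pass: the raw data (X, y) never leaves the
    device; only the updated linear-model coefficients are returned."""
    w = global_weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_round(global_weights, clients):
    """Server step: average the coefficients sent back by each device
    (federated averaging), weighted by how much data each device holds."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(global_weights, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=sizes)

# Three simulated devices, each holding its own private data.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(3)]
weights = np.zeros(3)
for _ in range(10):  # ten federated rounds; only coefficients cross the network
    weights = federated_round(weights, clients)
print(weights)
```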

And then they are saying, well, we’ve got to hire a thousand technologists to help us enforce this. How are you going to hire them? How are you going to compete with the companies that want to hire them and are willing to offer very attractive compensation? Where I imagine the EU probably is offering compensation that’s at a much lower level. I think you can make the case where it’s like, well, you could say the same thing about some cigarette company. There’s a lot of people that probably are like, “You know what? I would never work for a cigarette company.” But I think if you went through a PhD program and specialized in this thing, you’re probably going to want to go work somewhere where you’re actually going to be able to apply what you’ve learned and developed, and see it live in the wild. It’s not really the same thing. I don’t think that sort of moral calculus plays in the same way, depending on how you evaluate big tech on that vector.

Jessica Lee:

I think that’s right. The legal profession has something called externships, and I’m sure they have something similar in other places. But basically, you’re at a big corporate law firm, you can go work for a nonprofit, you might work for a government agency, call it for three months. You make your law firm salary, but you’re doing this work for these places where the salary would be much lower, but it gives them access to a new infusion of talent.

Obviously, if you’re regulating them, there are potential conflict issues and that kind of thing… I said this on Twitter once, and people were like, “Oh, I don’t want anyone from Facebook anywhere near the government or regulations, because they’ll corrupt it. Because whoever would go to Facebook is obviously corrupt.” I think that’s a little too cynical because I do think there are people who go to these companies and want to help do the right thing. But I would love to see some way to allow people to go and help regulators understand technology in a way that doesn’t require them to make the choice between a government salary and a big tech salary.

I do think that there’s value in having multiple perspectives. If you talk to someone who’s been in the government who then comes and works for a big company, I think it’s a little eye-opening to see some of the challenges of complying with the law that they didn’t have visibility into before; and then I think, ideally, vice versa. So I think we would benefit from opening the lines of communication between the two, but there are obviously some conflicts and challenges, and then the moral considerations of, is this a person we want working in the government?

Eric Seufert:

Right. Yeah. That could cause some friction there, I guess.

Okay. So we’ve seen a spate of recent legislation proposed that takes aim at big tech. You’ve got the AICOA, the Banning Surveillance Advertising Act, the Open App Markets Act, the Competition and Transparency in Digital Advertising Act, the ADPPA. So I guess my question here is: why now? Why have all these things been proposed recently? I think in the cases of all these bills, this year… maybe not. But why now? What caused this flurry of bills related to competition, transparency, and data usage to be proposed in the last 24 months or so?

Jessica Lee:

Sure. Well, I think from the privacy perspective, this has been bubbling up for many years. So I think, well, that’s what we’re seeing. This is finally coming to a head, but this has been simmering under the surface for quite some time. And if you think back, we could go all the way back to right before GDPR went live in 2018, I think it was a month before, when the Cambridge Analytica scandal was revealed. And so that was one of the first big, let’s call it data or privacy scandals, that really got consumers’ attention. Because I think regulators had been focused on these issues, like I mentioned, the data broker report and the FTC looking at these things. But if you weren’t really in this industry, I think the average consumer, I don’t know if I would say they were fully paying attention to what was going on.

So you have Cambridge Analytica; you have GDPR, which led to the flurry of privacy notices getting dumped into people’s inboxes at the end of May. It just led to this snowball effect of additional public conversation around privacy and data, and EU versus US, and what advertisers are doing. Shortly thereafter, we have the CCPA and the campaign to get the CCPA passed, the ballot initiative.

So I feel like from a consumer, a public perspective, wanting to see privacy regulation, this has had a snowball effect, and maybe we could point back to early 2018 as one of the starting or kicking-off points where consumers got focused on this. Like I said, regulators had been focused; consumers were coming to the table and being concerned about this as well. And then you saw these reports. The New York Times a couple of years ago had a big report on location data and how companies were tracking the location data they could pull from your phone. And then they took two or three people and said, “With this information, not that these companies were doing this, but they could do this.” So you had all these reports about privacy scandals, how data was being used. So I think you’ve had this push or surge for additional privacy protection.

And then on a parallel path, as technology has been evolving, I think it’s become clearer that data and personal data in particular is a competitive advantage. The argument has been that some of these privacy regulations harm smaller companies and allow some of the big tech platforms to continue to thrive or absorb the fines and keep moving. There’s been a concern about, well, how does data and antitrust intersect?

So I feel like this is spiraling to this point where there’s a clearer need and understanding that consumers’ data is being used in ways that they probably didn’t understand before. And then it’s also offered a competitive advantage to some companies. I think regulators are now getting pressure. It’s also with the business uncertainty. I think businesses actually now are lobbying to have, well, “just tell us what we need to do, so we can move forward.” Because this place of uncertainty isn’t helpful for us either, so you’re getting calls on all sides to get something done.

Eric Seufert:

When a company says, “Regulate us, please. We don’t want to live with this pall of uncertainty cast over us all the time. Regulate us.” Do you believe that?

Jessica Lee:

Generally speaking, I think they want both, right? I think they do want certainty because… I talk to companies all the time about this. It’s a changing landscape. It was GDPR, then CCPA. It’s these five states, it’s different regulations. Different sectors have laws. There are evolving laws in the EU, when people in the US thought, I think, that you just had GDPR, you solved that, and you’d be done with it. No, now we have additional… There’s all this changing landscape, I think, and then the platforms are changing their policies, too. So you have ATT, you have the deprecation of cookies. It’s all of this uncertainty. And so I think, you can’t change what the platforms do necessarily, but I do think companies want to see some security in a law and they want to be regulated.

Now, with the caveat that they want to be regulated in a way that allows them to continue to move forward. So no one wants a regulation that’s going to turn all the faucets off for all the data. No one wants a regulation that’s going to have all these class actions coming at them. So yes, we want regulation, but what does that regulation look like? That’s where we get into some of the back and forth about how to actually hammer out a good privacy law.

Eric Seufert:

That’s a good point. And it’s one that I find frustrating when I look at the reality of some of the bills that are proposed or just a lot of the rhetoric that you see from the people that have real influence on how these things get structured. So talking about shutting the faucets off, well of course these companies don’t want that, but I would argue that consumers don’t want that, right? Consumers want their data to be utilized for their benefit. They don’t want all digital products to go back to 1998. They don’t want punch the monkey ads. Do you remember those annoying ads?

Jessica Lee:

I don’t remember punch the monkey.

Eric Seufert:

You don’t remember? So it was this monkey that — it was those banner ads — and a monkey moving back and forth and you had a big red-

Jessica Lee:

Oh my God, I have to look this up.

Eric Seufert:

It was totally obnoxious, and that’s why that ad was ubiquitous: it had the best click rates, because everyone was trying to punch the monkey and it was tricking people into clicking the ad. And this is a little bit hyperbolic, but I don’t think there’s a tractable sentiment within, just call it the general consumer body, which for digital products now, with smartphones, is basically everyone. I mean, it’s everyone. I don’t think there’s any sort of sentiment that we want to lose the functionality that we’ve gained, back to my earlier point that technological innovation has accelerated over time. And so people just want their data to be used, and if someone asked me at a high level, what does digital privacy mean, and I had to come up with a pithy one-liner: it’s that my data is used in ways that I would expect it to be used, or ways that I have been informed it’s going to be used, and I have made the decision to continue using that product.

I think genuinely that’s what people want. And so when you see some of the bills, for instance, and this is my personal belief, but the Banning Surveillance Advertising Act goes way too far. And I think the magnitude of that bill would be such that consumers ultimately would be unhappy with the consequences of it. Now, not all those consequences would be first order. A general consumer doesn’t understand anything about digital advertising, and why would they, so they would not recognize that as a consequence of it. But nonetheless it’d be a downstream, second-order effect. Sometimes I do worry about that: yes, you’re acting on behalf of the consumer, and there are these things that have happened that got consumers sort of invested in this, and therefore that’s when the legislative process should kick in.

But then we shouldn’t do things that are anti-consumer as a result of that. And so my sense is sometimes, and again, speaking to some legislators, speaking to some regulators, I feel like yes, you purport to be doing this on behalf of the consumer, but you don’t have an advocate for the consumer, I think, in this decision making process, which is articulating the value of these things to consumers. So let’s jettison the bad stuff and try to wrap our arms around as much of the good stuff as we can so that we can kind of strike this balance.

Jessica Lee:

Yeah, I mean, I completely agree. Well, first, I hate the term surveillance advertising. It’s very disappointing to me that we’re talking about the advertising industry and they weren’t able to get ahead of the marketing of their own activities. And so it’s gotten this label, and I think the label suggests that it’s all bad, that there’s no redeeming value, there’s no benefit to consumers from the activity, so why do we even need to do it? And I think, to your point, that’s not the lens through which we should look at this, because I don’t think consumers want to go back. I still don’t think consumers want to pay subscriptions for every product. There’s a lot of talk about subscriptions replacing ad-supported models. Personally, and I have a good salary, I still don’t want to pay even $5 each for 55 different platforms to get access to them.

That’s just not what I want to do, and I don’t want to have to manage all that. And I don’t think a lot of consumers want to do that. I think they’re probably willing to pay subscriptions for certain things. But the ad-supported model I think does have a place, and then the question just becomes: what are the harms from that, and how do we protect against the harms? Because if we look at where I was saying some of this may have started, in 2016, that’s two years, I think, into the Trump administration, and fake news and disinformation. And so I feel like some of this is also that consumers are looking at how data’s collected online and they’re thinking about the worst harms, which is that they get manipulated, they get put in these bubbles where they only hear what they want to hear, and misinformation gets amplified.

We hear a lot about that, but you don’t hear about the potential benefits. And I think it’d be good to look at this through the lens of: how do we structure this so that consumers can first acknowledge there’s a benefit to advertising? And also, I don’t want to see irrelevant ads. You’ll hear mixed viewpoints on that, or that people don’t care, they don’t look at the ads. But I have a dog, so I don’t want to see ads for cat food, I want to see ads for dog stuff. Some of the stuff I’ve bought that I like, it’s because someone served me an ad that was relevant to what I need. So I do believe there’s value in that, but, you know, I shouldn’t have to risk some of the other harms.

Or if we look at the changing political landscape, particularly post-Dobbs, what information… Now there are other harms that we need to think about from having your information exposed. So I think there’s a way to address the real harms that regulators and consumers are worried about without totally shutting the faucet off, so that you don’t lose the benefits of what advertising and ad-supported platforms provide.

Eric Seufert:

And that’s the needle that’s got to be threaded. So going back to this panel I was on recently about differential privacy: it was a bunch of academics and me, so I was by far the least qualified person to be speaking, but I expected them to be very hostile to me and they weren’t. I felt like they were much more open-minded and reasonable than some of the people I’ve spoken to on the legislative side of things. And they said, “Look, all this stuff is context dependent, and we can identify a harm in one context. Harm doesn’t mean they’re going to come to your house and arrest you, right?” But that could be a harm related to Dobbs, related to your location data, with these ridiculous bounties that you can get in Texas… I mean, I’m in Texas right now, I’m from Texas and I live here, and you could make money by getting someone arrested because they terminated a pregnancy.

And that’s putting aside my feelings on that, which I think is atrocious. But nonetheless, that’s a very real human harm. I mean, that’s not this theoretical thing; that’s very much a concrete harm that would be inflicted on someone. But there are other contexts where the harm is sort of theoretical, it’s not concrete or it’s not really meaningful, and those are not the same thing. And so you could have context-dependent definitions of privacy there, or at the very least a recognition that those harms are differentiated in meaningful ways, where one leads to someone going to jail and one leads to, I don’t know, I’m trying to think of some innocuous consequence, but some innocuous consequence. So if you say, look, no, we have to treat all sorts of use cases of data collection and data usage as if a privacy violation would result in someone getting hauled away in the middle of the night by the secret police, well then you’ve just brought us back to 1998 and punch the monkey ads.

Jessica Lee:

Yeah, I think that makes sense. I think the challenge probably is there’s a certain amount of information that gets collected online. And so you have that information, you have your innocuous use cases for it, but then there’s a risk: you get a warrant or subpoena from the government, you have a breach and the data gets leaked, an employee runs off with information, and then you’re at risk for the other harms. And maybe that’s only specific to, we can call it, certain categories of data that are at high risk for that activity, but it’s not like, internally, data gets siloed based off of use case, with the innocuous data here and the potentially harmful data there; it’s all together.

I think that’s where, for me, the conversation around privacy-enhancing technologies and differential privacy and synthetic data, all of that is interesting, because I feel like you get to a place of asking: what controls can you put in place so that the data that sits there can really just be used for those innocuous purposes? Even if those innocuous purposes are annoying to some people, people don’t want to see the ad following them around, but that’s not the real harm.

The real harm is someone knocking at your door, it’s law enforcement, it’s discrimination potentially. So I think it’s trying to figure out: how do we have internal controls so that we reduce the likelihood of those harms that we’re really trying to get at, but still allow for what are valuable business purposes?

Eric Seufert:

Right. How should operators within the tech space, the advertisers that use the large ad platforms or the developers that upload their apps to the app stores, especially the smaller companies that don’t have an in-house team of lawyers working on this stuff and combing over the latest developments in privacy legislation, stay abreast of legislative momentum related to privacy or competition or data usage? How does small tech prepare for the kinds of changes that are likely to directly impact the way big tech operates?

Jessica Lee:

Sure. I mean, I think that’s kind of a power-in-numbers question. So, trying to get involved with some of the industry organizations that can help keep you abreast of what’s happening. And some of them, to the extent they’re lobbying organizations, might be able to lobby on behalf of your interests. Because, as we’ve been talking about this whole time, there’s so much going on, there’s so much evolution from the laws to platform changes, and I don’t see how small companies stay on top of what that means for them. So I think if I were in small tech, I would want to set up internal data governance that was scaled to my size and the type of data that I have.

So I have generally basic practices in place. I have a business strategy for the data I need to get access to, thinking ahead to things like signal loss, for example: how am I going to protect myself from a competitive standpoint, and then what are the industry groups who will help me understand how the landscape is changing so we can evolve? And that’s probably the best roadmap that I can kind of lay out, because there are groups like the IAB, for example, or CalChamber if you’re in California; there are these various bodies that will keep you informed but also advocate for your interests. And I think if you’re a small company, it’s not likely you’re going to be able to do that on your own.

Eric Seufert:

Jessica, this was a really illuminating conversation. I think we could have spoken for another hour easily and not exhausted my list of questions here. I appreciate your time very much. How can people find you on the internet? How can they interact with you, engage with you?

Jessica Lee:

Sure. So I’m on LinkedIn. And then I’m still on Twitter. I haven’t been convinced to pop over to any of the new platforms, so I’m waiting to see how that evolves.

Eric Seufert:

I just got promoted from the wait list for Post. I’m excited to try that out.

Jessica Lee:

I’m on the waitlist.

Eric Seufert:

Well, maybe I’ll see you there. I’ve tried to sign up for Mastodon multiple times and every single server tells me they’re not taking new signups, so I haven’t succeeded there. But Jessica, thank you so much for your time. I appreciate your time, I appreciate your insight, and I wish you a great day.

Jessica Lee:

Thanks, you too.