Privacy Lawls with Donata

Ep.20 | Privacy in the age of advanced technologies & AI (Guest: Daniel Solove)

The last several decades have shown us that with great technology come great privacy risks. Now that AI is all the rage, it appears this trend isn’t slowing down anytime soon.

On today’s episode, author and GW Law School professor Daniel Solove discusses the tricky balance between technology and privacy.

Show Transcript

[00:00:00] Hello, and welcome to episode 20 of Privacy Lawls, where I, Donata Stroink-Skillrud, speak with amazing privacy professionals, and we have some laughs along the way as well. Today, I will be speaking with Professor Dan Solove about his upcoming book titled On Privacy and Technology. Dan is the Eugene L. and Barbara A.

Bernard Professor of Intellectual Property and Technology Law at the George Washington University Law School. He founded Teach Privacy, a company providing privacy and data security training. One of the world’s leading experts in privacy law, Dan is the author of more than 10 books and more than a hundred articles.

He’s the most cited legal scholar born after 1970. So Dan, thanks so much for joining me today. You’ve had an illustrious career in privacy and technology law. What sparked your interest in this field? Well, I was in law school in the mid-90s, and I knew I wanted [00:01:00] to be a law professor. I was really interested in academia.

And I really enjoyed philosophy and the humanities, and I was looking for which area of law I should focus on. I took an early course in cyber law, and I thought it was fascinating. I thought that the internet was here to stay. That was a wager I made that luckily turned out to be a good one.

And so I focused my thoughts and thinking and writing on cyber law. I thought I was going to be doing internet law broadly, but the first topic I thought I’d tackle was privacy, mainly because hardly anyone had written on it. And I thought it was really fascinating, with lots of interesting, difficult, nuanced issues, which is great for someone who enjoys the humanities.

So I thought I’d start there and then expand to all sorts of other issues dealing with internet law. [00:02:00] But it so happened that I entered a rabbit hole with privacy. I didn’t realize it was that big. I thought, okay, maybe it’s a few papers, but instead now it’s more than 100 papers and all these books. It really is such a big issue, much bigger than I ever imagined, and as I kept writing and thinking about it, the field kept growing and new issues kept emerging.

And that was just really exciting, to be there at the ground level, watch this take off, and enjoy the ride. It’s definitely a field, or a rabbit hole, that gets bigger or longer every year. I mean, I started with questions ranging from, can you use my image in ads, to, is Siri listening to me without my consent?

And as we see more new technologies emerge, we’re seeing a lot more privacy invasions occur too, which is bad, but it’s also interesting in a way. So, through your work [00:03:00] at George Washington University Law School and at Teach Privacy, you’re clearly very passionate about education. How do you talk about these complex topics of privacy and technology in a way that everybody can understand?

Well, I think the key is to start with the importance of understanding what privacy is, going back to the core, which is really a set of philosophical issues and difficult policy questions. For example, how much public exposure of information makes something not private? What exactly does privacy mean?

What does it entail? What should we protect against? What conception of privacy you have affects everything. And so one thing I tell my students is that when you look at every privacy law, every case, every policy decision, every time a company says we’re baking privacy into the design of something, [00:04:00] there is an underlying conception of privacy. There’s an underlying view about what should be protected and what should be left out or not protected. And I often find that that conception is limited. A lot of times things are left out. A conception of privacy might be overly narrow, lacking imagination, really lacking an understanding of the harms.

And so a lot of my work has been trying to flesh out an understanding of privacy that is robust, that captures all the issues it involves. Because I think privacy is many different things. It’s not just one thing. And so that’s where I recommend a student start: getting a really good grasp of privacy and understanding the effect that the way one thinks about privacy has on the ultimate policies. [00:05:00]

In my training and in the things I do to educate people about privacy, I think it’s important to focus on the core concepts. There are so many different laws, 160 different laws around the world, and there are new ones every year, so that number is growing. There are all these different laws in the states; 20 states in the United States have a consumer privacy law now, all passed in recent years.

There are subject-specific privacy laws on biometric information, children’s information, content moderation, health information, and more. So we’re talking about dozens and dozens, hundreds and hundreds of laws all around the world. There’s no way that a layperson is going to understand each and every law.

So the key is to look at the core principles of privacy, which underpin a [00:06:00] lot of these laws. At the end of the day, you can look at all the laws and they all address data minimization to some extent. They have different approaches; some are good approaches, some are weaker approaches. They have a view of consent.

They have scope. They address enforcement issues. They address what duties collectors and users of data should have, what rights people should have, and what the definition of personal data is. And if we look at these different pieces, we’ll start to see the common elements in the laws, and then the common principles about what should be done to protect privacy.

And I think that’s a way to deal with the chaos that we see legislatively and policy-wise: to really get to the core. Because when you look at it, there are a few approaches to consent, and there are a few approaches to defining personal data, and we can get a handle on those approaches and determine which ones work better than others. [00:07:00]

Absolutely. Yeah, it’s a great way to look at it, especially across different countries, because they view it differently in Europe than here, or across different industries. Absolutely. So, through teaching law students all these years, have you noticed more law students interested in privacy and technology courses?

Absolutely. It’s really staggering, actually. When I started teaching, I taught one course in privacy, and that was really the only course we had at the law school. We have roughly 450 to 500 students in a class, and we’re one of the bigger law schools. And I would get maybe 40 to 50 students in a year.

So during an academic year, 40 to 50 students would take privacy. That number gradually increased, and increased, and increased. Now we have an entire curriculum devoted to privacy and technology. I teach privacy [00:08:00] many times a year. I teach my course not just once a year but once a semester.

So twice a year, and then I teach a course focused on consumer privacy. My numbers are now 90 to 100 per class, for a total of 200 to 250 students in a given year, which I think is pretty remarkable, because that is almost half the class size, and I’m not the only one teaching privacy.

I have a colleague who teaches a privacy course; another teaches a privacy seminar. We have AI courses, which are related to privacy, and other privacy-adjacent topics. So all told, the demand has really gone through the roof, many times greater than it was 10 or 20 years ago.

That’s really cool. When I was in law school, we had zero courses about this, and I took business law classes and they [00:09:00] never discussed it. You just had your usual property law, torts, criminal law. But I wish we’d had this topic, because I think I would have been very interested in it back then. I didn’t even know this field existed while I was in law school.

Yeah, it’s still growing in the academy. I would say, unfortunately, fewer than half of the law schools have a privacy course. And of those that have a course, a lot are taught occasionally by an adjunct. So someone from practice comes along and says, hey, I’d like to teach a course. And they’ll teach a seminar, and it’ll be taught once, and then maybe twice, and then it’s gone.

They don’t have a stable course that’s in the curriculum. That’s something we have now: every semester there is a privacy course. Someone is teaching it, and it’s something that students can rely upon, semester in and semester out, to be available to them. And [00:10:00] I find it outrageous, actually, that law schools don’t all have a course in privacy, especially given where privacy is.

Just imagine a law school that doesn’t have an IP course. It’s absurd, right? But if you look at the law firms, the top 100, top 200 law firms, they all have a privacy law practice. Every single one of them. The IAPP, the organization for privacy professionals, has, I think, close to 100,000 members now.

The field has grown dramatically. Student interest is through the roof here. And I just think it’s unfortunate that law schools are so slow to react. I think it’s because professors’ view of practice is from when they were practicing, and that was 10, 20, 30 years ago. So [00:11:00] they really don’t know what the world looks like now.

And I also think another issue is that, as law schools, we don’t have a meeting every year where we say, okay, what is going on in the world? What’s going on in practice, and what courses should we have? What are we missing? So the way courses get added is that someone wants to teach one: okay, I want to teach this.

I go through the process to get a course approved, but it’s very much ad hoc, based on a particular individual’s interest. And so over time, law schools have been adding the courses mainly because they hire people who want to teach privacy, and that adds the course to their curriculum. But that’s how it’s getting added.

I wish there were more of an edict from on high that would just say, okay, we’ve got to have this. This is really important. Students really want to take this course. Let’s offer it. Let’s [00:12:00] hire people to teach it. Let’s really push this from the top down.

Rather than from the bottom up. But that’s not how things happen, and so that’s where we are. I think it’s ridiculous that we’re still in a situation where maybe 40 percent of the schools have a course; I don’t know the precise figure. But hopefully we’ll get to more soon, because I think a lot of students would really benefit from it, even if they’re not working on privacy.

I think just having a general knowledge of it is absolutely helpful. So let’s talk about your book. You have an upcoming book called On Privacy and Technology, released on February 14th of this year, so actually released in two days as we’re recording this. But for everybody listening to the recording, it was February 14th.

Sorry to correct you, but they bumped it to March 4th. That now is the official date at this point; it’s definitely March [00:13:00] 4th. That was the original date, but they bumped it. Okay, so March 4th, we’ve got that to look forward to. What sparked your idea for this book?

Well, I got the idea because I realized that it’s been about a quarter century, about 25 years to this day, that I’ve been thinking about privacy. I started teaching in 2000. And I really thought it was a good time to reflect: where are we in privacy? Where are we going?

So I thought it was just a chance to sit down and be reflective. And I wanted to write something that was short. I had recently read Timothy Snyder’s On Tyranny, and I really liked the fact that that book was so short and so succinct, and really was very powerful and helpful and accessible.

And so I thought, hey, [00:14:00] something like this about privacy could be very helpful: to summarize my thinking over the years, make it accessible to people, and also to think about the future, given where we are now, especially now that we’re living in the age of AI. I think we’ve been in the age of AI for a while, but at least now the media has woken up to the fact that here we are and this is what’s going on. Given all this, I thought it was a good chance to think about, okay, we’ve had the last 25 years of developments in privacy, all these rapidly changing technologies, and then all these laws that have been passed to try to address them. Are the laws working? What should be done in the future? Is privacy possible anymore?

Is it possible for law to keep up with changing technology, and what should we do about it? So that was what sparked this book, and [00:15:00] I really enjoyed the process of writing it, taking a step back and thinking about the forest rather than the particular trees I often focus on when I write a paper.

I’ll think, okay, I’m going to look at this topic and I’m going to get really detailed about it. The book was trying to look at everything together. That’s really cool. Yeah, I think those are definitely issues that all of us try to think about every day, and we get distracted by client work.

And, you know, then we have to sit down and truly take the time to ponder these things, but it’s very important that we do. Is this book for lawyers? Is it for regular people? What’s the audience? The audience, I hope, is everyone. I wanted to make the book understandable for a lay audience, so that someone doesn’t need to be a lawyer to understand it.

Someone doesn’t need to be steeped in privacy [00:16:00] to understand the book. But I also wanted it to be valuable for someone who does know something about privacy. And so I’m hoping that folks who are very steeped in privacy will find some useful insights, and will find the big-picture view of everything to be helpful.

Because I think it’s often the case that we don’t take the time to look at the big picture, because we’re dealing with one issue after the other. That’s one thing with privacy: the fire hose is gigantic, and it’s hard to drink from it and then take a step back, because there’s no breather.

So I thought this could be helpful in that way. But I also really want to reach an audience of folks who aren’t focused on this every day, who just want to understand what’s been going on. And so part of the book discusses thinking about privacy [00:17:00] over the last 25 years: my thinking about privacy, but also the works and thinking of others who have influenced me, which I think is very important to understand.

And I wanted to discuss that and explain it and summarize it and put it all together. So yeah, I’m hoping the book can reach a lot of different people. I’m sure it will, especially considering your writing career so far; every group talks about your books. In terms of the last 25 years, in this podcast we’ve discussed the history of privacy.

We know that privacy isn’t a new issue per se, but your book talks about new technologies and how those technologies affect privacy. What types of new technologies do you discuss in your book? Well, I talk about everything from the rise of the internet, to social media and how rapidly social media is evolving. I [00:18:00] talk about the increasing use of algorithms; AI is essentially a subset of algorithms, certain types of algorithms.

In fact, AI is actually a term that encompasses a lot of different things; there are many different types of AI. But I talk about the use of technology to make decisions about people, to make predictions about people, and in certain cases to make inferences about people; various methods of gathering data; tracking people and watching people; analyzing data about people; storing data; ways of accessing data; technologies of hacking; and also technologies for how data is shared, and various other uses of data. So the technologies are vast: everything from devices that people use, to home [00:19:00] assistant devices, to smart doorbells, to smart this and smart that, because increasingly everything is connected to the internet.

A car is now a computer, and so many things now have chips in them and are connected to the internet. I recently watched a TV show about a family that moved into a home from, I don’t know, the ’70s or something. It’s obviously fictionalized, but it was supposed to be the first smart home of its time.

And one of the kids turns on the smart home system, and it essentially ends up persecuting them, ruling over them and ruining their lives. I thought it was really interesting. We’re seeing more and more of those kinds of TV shows and movies come out that criticize or show you what can happen with these smart systems, which is why I’m so absolutely terrified of anything that says it’s [00:20:00] smart.

I just want a light bulb, or I just want a switch, or I want a regular washing machine. I don’t need any of those connections. Yeah, we had a smart fridge, or supposedly; a fridge that wasn’t that smart, but basically it got shorted, or something happened, in a storm.

So I guess there was a surge and lightning fried the chip. It took about three or four months to get a replacement for the control panel, and the fridge didn’t have a dial to keep it cold. So even though everything was fine, even though it was technically capable of cooling, it wouldn’t work, because it didn’t have the stupid chip.

And I go, can’t you set it so it just keeps cold, so we can use it? No. So it was unusable for three months until it got this stupid chip to operate the panel. The old fridge just had the dial, and you could manually turn it on and turn it off. So that’s kind of where we are these days.

So I’m not sure how great smart things actually are. I did a training program, a kind of little vignette, about a smart fridge that decides that someone needs a diet and won’t let him open the fridge and get food out, and tries to control the person’s eating habits.

But that’s, I think, where we are now: everything is connected to the internet, but there’s a lot that doesn’t really have to be. Not everything needs to be smart or chipped, and I do long for the old manual controls. At least leaving them on, so that not everything is forced to run through the internet, would be nice.

Yeah. I saw a woman who couldn’t make her Thanksgiving dinner because her oven needed to do a software update. [00:22:00] Why? Just let me cook my turkey and move on with my life. It’s kind of insane how things are going. Well, now it’s the AI assistant, right? You remember that stupid paperclip that would come up in Word: I want to help you.

And now it’s the AI assistant. It’s like another paperclip: I want to help you, I’m AI, let me do this or that. I can’t even open up PowerPoint anymore without it trying to tell me how to design my slides. So yeah, it’s now this constant badgering of people with various things that almost get in the way.

Sometimes they’re helpful, but a lot of times they’re not. Yeah, absolutely. So in your book, you discuss how current privacy laws put too much of the onus of managing privacy on the individual. And I think that’s correct, in the sense that as a [00:23:00] consumer, people might not have thousands of hours every year to read privacy policies, or to beg companies to delete their data, or to answer every spam call.

In most cases, we can’t do anything about companies mismanaging our data. Can you share some of your thoughts as to why the current system of consumers managing their own privacy is flawed? Yeah, I think it’s flawed for a number of reasons. First is that people just don’t have the expertise to really figure this all out.

People don’t read privacy notices. They don’t have the time; it’s too time-consuming. They’re not privacy experts, and I don’t think they can be. I’m a privacy expert because I’ve studied this for more than 25 years, and I still can’t make these decisions, because I don’t know enough.

I know that I lack the key [00:24:00] knowledge. Someone says the security is reasonable; what does that mean? I would want to know, for instance, what kind of encryption is used. Not all encryption is created equal. How well trained is their workforce? How good are their vendor agreements? Who are they sharing the data with?

I’d want to know what their level of security is. I’d want to talk to the security officer. Are they doing penetration testing? I’d want to know a lot of details, really. For the privacy side, I’d want to talk to the privacy officer. How good is their program? How good is their training?

How do they do data minimization? How long do they keep my data? There are a thousand questions I would be asking to really make a decision. So a policy that just says, we love your privacy, we care about you, blah, blah, blah, the typical corporate BS, is meaningless to me.

That doesn’t answer my questions. And I really need a lot of time to study how the company is [00:25:00] addressing privacy. And then it doesn’t scale: now I have to do this for thousands of different companies. And they say, hey, we’re going to change our privacy policies at any time, so now I have to look at them all the time.

And then am I supposed to manage it? A key thing is that there’s a right to delete data if it’s no longer necessary for the company to use. Well, how am I supposed to know that? How do I know when it’s no longer necessary? Do I have to be the child in the back of the car?

Like, are you done yet? Are you done yet? And I have to do that a thousand times, for a thousand companies, all the time. It’s just not practical; it’s absurd. And then for AI, how am I supposed to figure out what the risks are? I think when it comes down to it, the consumer really wants to know the following thing; this is really what it all boils down to, right? If I share my data, if I let you [00:26:00] use my data, if you collect it, use it, whatever you want to do with it: what are the benefits and what are the harms? What are the risks to me? Are the benefits better than the harms? If I say, yeah, I’m going to get these really cool benefits and there aren’t going to be many harms, go for it.

If I think the risks and harms are significant, then don’t do it. I need to make that decision, that risk decision, and I don’t think consumers can do that. They’re not given enough information to do it. With AI, it’s almost impossible, because it’s hard to really understand exactly what the implications are.

The creators of it don’t fully understand what the implications are, so how is a consumer supposed to? And that’s why I ultimately use an analogy: it’s like going to the supermarket and buying milk. You go to the supermarket, you buy milk, and you can trust that, hey, if I buy this milk, it’s not going to kill me.

And it wasn’t always like that. [00:27:00] Actually, it used to be that milk was deadly; it could kill you. Some of the manufacturers would mix it with formaldehyde, and people would start to wonder: why are hundreds of babies just dying all of a sudden? Well, it’s because the milk had formaldehyde in it. This was before we had good regulation. But now I know that I don’t have to go to the supermarket and become an expert on each farm. I don’t have to become a milk expert and learn techniques of pasteurization and do all this. All I need to do as I go in is look at what’s the cheapest price and what milk I like best, and someone who understands milk is looking out for it and, hopefully, doing a good job with that. Now, we’re about to lose that, it seems, in the States; it looks like Kennedy wants to take us back to unpasteurized milk or whatever. But let’s say, in a well-functioning system, they’ve got my [00:28:00] back.

I don’t have to worry about that. And if I’m harmed by it, if something is bad and something goes wrong, I can sue. The problem with technology and privacy these days is that it’s not like that. It’s buyer beware. I’m supposed to become the expert. I have to figure all this out. I have to learn what the potential harms are.

I’ve got to undertake the risk, and then if I’m harmed, there’s often very little recourse. And that means the companies don’t have to be accountable. They know: if we create harm, then tough luck, people are just going to suffer from it. If we don’t have to internalize those costs, why should we make our products safe?

It gets back to the days when milk had formaldehyde, back more than a hundred years ago. That’s kind of where we are with technology law today, and I think we need to modernize it, because what we learned is that this regulation works. It actually works.

We don’t have babies dying of formaldehyde in [00:29:00] milk anymore, and there’s a reason for that. Yeah, to me, the recourse part is the most frustrating. I get about 10 to 15 spam calls per day. If you ask them where they’re calling from, they tell you some random name of some company. If you ask them where they got your data, they won’t give you an answer. They’ll just hang up, or they’ll tell you something random. If you ask them to delete your phone number, they will continue calling you. If you put yourself on the do-not-call list, which I’ve been on for 10 years, that doesn’t matter. And then if you file a report with the FTC, they don’t care.

They don’t shut anything down. So it’s like, what’s the point? How do I stop this? It’s very, very hard to stop. I mean, at the end of the day, the phone companies have technologies that can restrict robocalls; they just don’t use them as much as they should.

But I think [00:30:00] the law could force certain requirements to block robocalls, if people want to block robocalls, and could do it more robustly. I think one reason policymakers don’t want to do it is that they like to make robocalls for their campaigns and everything else.

They always see how it helps them, and then, screw the consumer when it doesn’t really serve their own interests. There are some minimal protections against this, but when you have a lot of fraudsters who can readily circumvent those laws, it doesn’t help.

But there are methods; there are things that could be done. They just have to be done more rigorously, so that they’re less easily exploitable, and that’s one thing we don’t do very well in the law. The law is structurally very bad at stopping things, because what it really wants to do is allow companies largely free rein to do whatever they want, even though [00:31:00] it’s known that there’s a lot of fraud here.

There’s a lot of danger here, but those costs are left to the consumer, because it’s convenient for companies not to have these restrictions. When there are restrictions, it becomes a little harder for them, not impossible, but harder, and they don’t want it harder. They don’t want it to be more costly.

So that’s why we have a system that grants credit very, very easily. It’s very easy to get a credit card. The problem is, so can an identity thief. There are a lot of things that could be better if the law were to mandate them; it just doesn’t. For another example, take two-factor authentication, which now is actually starting to be offered on accounts.

For a long time, it wasn’t offered, even though the technology existed. And I said the law should require that it at least be offered to people. It’s available; it should be offered; it should be encouraged, because it is [00:32:00] much better than the password alone. It’s much more secure. But there wasn’t an appetite to do that kind of thing, because God forbid a company is told what to do.

Eventually they realized it’s actually a good thing, and now they’re doing it. We saw the same story with seatbelts and car safety: companies didn’t want to do it; the law had to push them to do it, and now they do it. But that’s what we get a lot of times.

Oh no, it’ll never work. People will never want it. They’re not interested in it. It’s never going to sell. It’s going to be too costly. It’s impossible. You can’t do it. It’s like a child throwing a tantrum, and that’s how they act, like clockwork, every time. They act like the child; they whine and complain; they don’t want it. And then ultimately we see it actually does work, and it’s good, and it doesn’t destroy what they’re trying to do.

It doesn’t kill [00:33:00] technology to innovate in ways that make it safer. It’s interesting, though, seeing those temper tantrums. With two-factor authentication, Facebook enabled it, and then they used the 2FA information to send marketing, almost like a punishment for trying to be more secure.

You’re going to get more marketing now. It’s just like, how far can this go? Yeah, that’s part of the problem: companies will try to do stuff and get away with stuff. It’s kind of in their nature. And I think one thing I say in the book is that it’s a little too easy just to demonize companies.

Because we structure things so that they act the way they act. We’ve built a structure that allows them to be unaccountable, that allows them [00:34:00] to do this and get away with it. And that’s what they do, because they’re designed to make a profit. A shark is a machine designed to eat. So it’s kind of like asking a shark: hey, please stop eating seals and just become a vegetarian.

It’s not realistic, right? Their nature is to eat. The nature of companies is to make profit. It is to do this. And we ask them, oh, please be ethical, please do all this stuff, be good citizens, whatever. And then we wonder why they’re always fighting regulation, why they’re always, you know, getting away with everything.

Why, the second the political winds blow in a different direction, they drop all the big principles that they tout, which we’re seeing right now, how they all change. Because that’s their nature. Their ethics are not deep in their core. They don’t have a deep-seated moral compass. They do what is good for the moment [00:35:00] to get themselves ahead. And I don’t think it’s bad.

It’s not immoral. It’s amoral. And they’re built this way. We structure a system where we incentivize this. The incentives are all set up for companies to act this way. I mean, we wonder why they act like psychopaths. Because we’ve created them as psychopaths, and the very DNA of a company is essentially a psychopath’s DNA.

It’s to go after profit, and that is the main aim. Ethics matters only to the extent that it serves the profit. So that’s the system we have. And until we change that, I think, you know, shame on us for expecting the shark to be a vegetarian. It’s foolish, right? We should realize, no, the only way you’re going to get them to do what we want them to do and act the way we want them to act is to force it, or create incentives so that if they act the way we want [00:36:00] them to act and they are responsible with technology and

avoid harm, the incentive has to be that by avoiding harm, they get more ahead than if they cause harm. And the problem is they can cause harm, and they know that if they cause harm, the penalty is going to be small, a slap on the wrist or something pretty minor. If they get ahead, they’re going to make gazillions of dollars.

Yeah. And so at the end of the day, it’s kind of like, it’s better to ask for forgiveness than to ask for permission. And that’s what they’ll do, because that’s the way the incentives are structured. It’s the wise economic move for them. How can we change our legislative process, or change the way things are structured now, to hold these companies more accountable?

Well, I think we need a lot more enforcement. I would take enforcement and multiply it by 10, probably multiply it by [00:37:00] 100. Enforcement is so weak, so under-resourced, so understaffed, it needs to increase. And this is not just the United States; it’s around the world. The enforcement just is not where it needs to be.

It needs to be much more muscular. We need penalties that are really hard-hitting. Fines, you know, certainly hurt a little bit, and they get good media attention, but at the end of the day, they don’t. Facebook got a $5 billion fine for the Cambridge Analytica scandal, and its stock went up that day.

I mean, it makes more than a hundred billion a year. So, you know, it’s not that big of a fine. What really would make them wake up is, you know, to slow them down. You slow them down and they freak out. They don’t like that. It’s kind of like putting a cat in a bath.

So [00:38:00] that’s one way to do it. You’ve got to make a sanction that they’re really going to loathe. I think the other thing is a private right of action. It’s not a popular thing in the law to do, but the great thing about a private right of action is that it frees you from the limitations of a limited enforcement power.

So if the only enforcement power is going to be, you know, the FTC, well, they’re limited. And now under this administration, they’re going to be further limited politically. They can’t bring a lot of cases. They’re just not going to have the kind of enforcement budget. They’re going to be very nervous about being aggressive under a presidential administration that will slap them down if they’re too aggressive.

So the private right of action is not limited in that way. It’s a much freer way to enforce, and it can be enforced by many more individuals, and it can be more robust by bringing these [00:39:00] suits. But, you know, our litigation system is broken too. It’s not very good.

I mean, that’s sort of the problem: our institutions are really clunky and out of date. They just don’t work very well. And that’s part of the problem. But I think a private right of action would be a start. It would get attention, and it would solve some of the enforcement problems.

A private right of action does have a huge effect, because the moment companies hear that they can be sued by an individual for a particular privacy harm, things change very, very quickly. And we’ve seen that a lot with California Invasion of Privacy Act lawsuits, where companies are really interested in how to avoid these lawsuits.

Because it is extremely expensive and time consuming, and the changes that they need to make to avoid those [00:40:00] lawsuits are pretty minor. So we have seen a lot of change due to that, which I think is really cool. Last question for you: where can people buy your book? Well, you can get it on Amazon, and it’s from Oxford University Press, so you can buy it on the Oxford University Press site or on Amazon.com.

I believe it’ll be on Barnes & Noble’s site as well, so you should be able to find it on those sites. I don’t know if it’s going to be the kind of book that makes it into bookstores. It’s very rare that a book will make it into a bookstore these days. Plus, bookstores are shrinking in number.

But I would say, you know, go online and look for it on Amazon or Oxford University Press. Awesome. Well, I’m really looking forward to March, when it’s released and we can all read it and see the types of things that you think about and the types of things that we need to think about. So Dan, thanks so much for [00:41:00] taking the time to speak with me today.

And to our listeners, make sure to subscribe to our podcast so that you don’t miss the next episode. Thanks so much for having me.
