Privacy Lawls with Donata
Ep. 4 | The History of Privacy — Part 3 (Guest: Debbie Reynolds)
How did the creation of the Internet impact privacy?
We chat with Debbie Reynolds aka the ‘Data Diva’ to dive deeper into this topic. Debbie is the founder & CEO of Debbie Reynolds Consulting LLC, on the IoT advisory board at the U.S. Department of Commerce, and executive founding member of the Digital Directors Network.
Show Transcript
Hello and welcome to the fourth episode of Privacy Lawls, which is our final installment of our history of privacy segment. I'm your host, Donata Stroink-Skillrud, and I'm very excited to talk to our guest today, Debbie Reynolds, about the history of privacy post-Internet.
Debbie Reynolds, best known as the Data Diva, is very well known to us in the privacy field. She's the founder, CEO, and chief data privacy officer of Debbie Reynolds Consulting LLC, an advisory board member on the IoT advisory board at the U.S. Department of Commerce, and an executive founding member of the Digital Directors Network.
So, Debbie, thank you so much for agreeing to be on the podcast today. It’s great to have you here. Can you please tell us a bit more about your career and what got you interested in privacy?
Sure. First of all, thank you for inviting me on the show. I'm really happy to be here. And I looked at some of your other episodes, and they're really great. You had a friend of mine on one of your episodes, Kimberly Pack from United. She's a nice lady. We're neighbors; we live on the same street.
But the thing that started me in privacy, first of all, was a personal interest. There was a book that my mother had around the mid-1990s called The Right to Privacy, and Caroline Kennedy was one of the authors of that book. And my mother, I think she had seen it on a TV show, so she went out and got the book. This is back in the day when there was no Amazon; you had to actually go to a bookstore and buy stuff, right? And so the concept of what was private and what wasn't was fascinating to her, and her interest got me interested in it. And I read the book, and it just hooked me. And I'm like, wait a minute, what are my rights? Because it was a lot about the laws and the gaps in the laws and legal theory and stuff like that. And I'm a technologist. And as technology started to grow, I started to see, wow, this is going to be, like, a huge issue, right? Because now we're collecting data that was never collected before, and we're moving into these digital areas where it's harder to keep your privacy, as opposed to the way that privacy had been thought about in previous laws, where it was like, okay, when you lock your door, your house is private, right? But if you have a phone that's listening to you, then is that private? So all those questions really got me interested. And then over the years of my career, I've been working with multinational corporations that were doing data moves, or they wanted to move data around the world. And you have to know what these laws are, right, because you just can't move it and then be like, okay, well, whatever. So you have to really know what these laws are. So people who knew me from the work that I was doing around digital transformation started calling me up, asking me, hey, these privacy laws are starting to brew. Can you talk to us about it?
And so one of the first companies that asked me to come speak on privacy, and one of the biggest, was McDonald's Corporation. They saw me speak at a conference and asked me to come speak to their corporate legal department. And this is around 2014, so this is before GDPR. And I was telling them, hey, GDPR is coming, this is what it's about. They had all these different questions. And then I talked a lot about GDPR because a lot of people in the US didn't really understand it, didn't understand the extraterritorial reach of it and why it was going to be a big deal. And then by the time that the GDPR really went into enforcement, PBS called me and asked me to speak on TV about it. And people still call me about that interview that I did on PBS. So that's kind of my privacy career start.
Very cool. I think the first book that I read that was about privacy was a book about Henrietta Lacks. And that was just an absolutely fascinating book. Can't recommend it enough. I think there might be a movie being made about that as well, which I will definitely watch because it's a fascinating story.
Yeah, there was a TV movie that was about her. I think Oprah Winfrey was the producer of that movie. And I heard in the news recently there's a lawsuit going on now around that, because her family was never compensated for the use of her cells, you know. So there are some privacy implications there as well.
Absolutely. So what types of clients do you work with at Debbie Reynolds Consulting? What types of problems do you help them solve?
The clients I work with are very broad-ranging, I would say. It's not a particular industry. I typically work with companies that are trying to implement or work on emerging technologies. So people working in AI, people working in identity, biometrics, children's privacy, ad tech, anything that's a high risk for humans. That's really what I work on. So from Fortune 500 companies down to smaller startups. So it's basically around the risk area of privacy. So I tend to work with people who are in those high-risk or those emerging technology areas. The types of problems I solve for clients: I definitely work with them to educate them on privacy. But a lot of times when I work with companies, they want to work on maturity, and they want to eliminate barriers to adoption. So they want to move into a new market. Maybe they're a new player in a market. Maybe they're an existing powerhouse in a different market, and they want to pivot and open up into a new market. So I try to help them move in those areas. And thankfully, I work a lot with companies at a design level, so I'm working with them a lot of the time before they're implementing, or as they're developing a tool, because they don't want it to become a legal issue. Right. So a lot of times I'm working with them before it gets to the legal department. And obviously I work with compliance, I work with legal people as well, but they realize that my work or my skill with them is very different, because I'm a technologist and I sort of bridge that gap between the two. I've done tons of trainings and tons of education with lawyers and also a lot of business units around privacy and how it relates to them. I was just on a call this morning with an automaker about data and AI. So it's pretty fun. I really enjoy the work that I do, and it's great that people reach out to me and ask me to help them out. I'm happy to do it.
It sounds really fun to work on the most cutting-edge legal issues and privacy issues and technology issues, as well as privacy by design. That sounds like a really fun job.
So let me ask you about another fun job that you have, which is working with the US Department of Commerce as an advisory board member. What is that like?
Yeah, well, first of all, I have to say a disclaimer: nothing I say is official for the advisory board, and I'm just one member. But first of all, it's really interesting. It is a panel of twelve people that were selected by the U.S. Department of Commerce. Only us twelve in the US. We're all from different backgrounds, and these are my favorite types of projects, where we're not getting together because we're the same, we're getting together because we're different. So everyone has a different lens that they look through. This board was developed as a result of a law that was passed to say, okay, you have to get outside experts, you have to come in and do these meetings. The meetings are open to the public. We're going to write a report. We're finishing a report now that's going to go to Congress, a federal working group, and all the government agencies. You know, I'm the head of the privacy part, right? That's my contribution to this group. But the cool thing I enjoy about this board is that privacy is such a horizontal issue that it cuts across all these different industries and different things. So where there are some people on the board who are very industry-specific in a certain vertical, I get to work across all these subjects. To me it's fun, it's great. I've been able to meet some really cool high-level people, and this is great because talk is cheap, right? You could talk all day, but this is a roll-up-your-sleeves thing. Hey, we need your advice. We need to know what we need to do at a federal level. What can we do to support innovation or remove barriers to adoption? One really cool thing I really love about it is that this is for the US, right? So for all people, not just corporations, not just business people. How does someone's grandma understand an IoT device? How do they understand how to use it, how do they understand what their rights are, stuff like that.
So it's something that will impact everyone in the US, and also internationally, because I think the U.S. has an opportunity to really lead on IoT. You know, a lot of the manufacturers here, aside from chips, right, are in the U.S., and they're really pushing this in the U.S. So I think that we can really be a voice internationally in terms of what we do on this board. Because I've heard people say, oh, IoT, that's an old thing. I'm like, no, it's actually new, because as we see all these new technologies come out, they're bundling all these new technologies with these devices, and these devices weren't doing that before, right? So before, think of a camera that maybe you see in a convenience store; back in the day, all it was doing was collecting video. Now it's collecting audio. There are laws for that. Now it may be using databases. There may be laws for that. Right? Now it's using AI. Now there are issues with that. So it becomes very complicated. So I enjoy it. I like tough problems. So this is right up my alley.
That's definitely a tough nut to crack, for sure. I mean, so many things that we used to use that were not IoT are today. I think you're totally right about that. I have a washing machine that can hook up to an app that tells me when it's done. And being a privacy lawyer, I just listen for the sound of it being done; I don't necessarily connect it to the app. But one of my favorite things that comes out during the holidays is an IoT list of the most popular devices coming out that year that kind of explains the privacy issues. And I think they have a creepiness scale from not creepy at all to creepy, which I find to be really fun. So that's very interesting.
Yeah, it is. And I think it's going to be incumbent upon users to get educated, right? So I would not advise you to go out, buy the washing machine, plug it into the Internet, and think, okay, we're all good. Because I feel like your washing machine, something you don't think about, could be a gateway to your other devices in your house, right? Because it's on the network. So when people are doing these things, for me, I want to make sure that they're aware and informed about what it can do. Because I think companies are very good at selling the benefits, but they don't really talk about the risk, and the risk gets passed to the consumer.
Yeah, absolutely. Most consumers don't understand the risk. They think, oh, well, if Samsung makes it, that means it must be safe, where that's not necessarily the case. But anyway. So let's get into the history of privacy. So the Internet was started in the 1960s as a way for government researchers to share information. And January 1, 1983, is considered the official birthday of the Internet, as that was the day that the Transmission Control Protocol/Internet Protocol (TCP/IP) was established, which allowed different kinds of computers on different networks to communicate with each other. So knowing that the Internet was initially established for government researchers to share information, I don't really think that anyone at the time was really anticipating how we would use the Internet today or how big it would become. Are there any issues that affect privacy that stem from the way the Internet was initially set up?
Absolutely. That's a great question. That's a perfect question, actually. So the Internet sprung up definitely in government and education, right. I think it started at Stanford, if I'm not mistaken. But the Internet was created to share information. It was not created to protect information. So a lot of the issues that we have now with, like, cybercrime and someone breached this or someone breached that; we're basically swimming upstream when we're trying to protect data on the Internet, because it was never created that way. Right. So I think the next version of the Internet, and I know you hear people talk about Web 3.0, has a lot of different features. One of those features will be trying to find a way to protect the Internet: building in cyber and building in privacy at that foundational level, in a way that it is not that way now. Right. I remember, maybe I'm dating myself, but when I got my first job, there were no emails. You wrote someone a memo, or you went to their office, or you talked to them, you know what I'm saying? Now not only does everyone have email, everyone has tons of emails, right? Tons of email addresses. It just was not that way. So, what was I wrong about? When the commercial Internet started, I did not think that companies would trust it enough to do financial stuff, important stuff. So I was totally wrong about that. Right. But we're seeing the result of that, because of all these breaches. Especially banks, they have to be so far ahead of everyone else. So when you're banking now, almost every couple of months there's a new change: hey, this is a new way we want you to authenticate, or whatever. So they have to stay ahead to try to minimize those threats. Because we're all dealing in a situation where the Internet is not created to protect data.
So these companies have to really invest money in it, or not put stuff on the Internet, right, as we're seeing. So I think the future of the Internet is going to be very different than it is now. Right now, it's like, okay, sign up and put your data in this huge bucket that's owned by Google or owned by Facebook or something, right? Whereas in the future it will be more decentralized, and it will be more private and secure by design, because you're only giving certain data. So your devices will be able to do stuff that before you only could do, like, on a major platform. So I think that's going to make it very different in the future.
I think we're all looking forward to a time where we all have more control over information. I mean, I know that some people in certain states or countries have rights, and I live in Illinois; I have no rights, I have no control over my personal information. So I think giving that back to the people is going to be a big thing in the future. But in 1991, the Telephone Consumer Protection Act was passed, and in 2003 the National Do Not Call Registry was established, to regulate telemarketing calls and automated telephone dialing. So how has the proliferation of the Internet affected these types of calls? And is this an issue that we're still dealing with today, over 30 years later?
That's a great question. Well, the Do Not Call list was very popular. It still is very popular. So basically, it's the list that you go in, you put your number on, and then it forces the telemarketing companies to check that list and make sure that you're not on it before they market to you. Right. Some companies don't care; they market to you anyway. I think the thing that has happened since that list came about was that robocalling was there, but it wasn't as sophisticated as it is now, and it wasn't as ubiquitous as it is now. So I think that's always a challenge. You're always getting calls, and now you're getting a call from AI, right? So it's not even a person recording a message to send to you. It's like them typing it in and choosing a voice, and they call you over and over, or different things. So one of the efforts, because people have seen regulation be more successful in the telecom industry, is that at a federal level, even at a state level, they're looking at the lessons or the successes that have happened in telecom. Stuff like you being able to take your number with you, and stuff like that. Or how those telecoms need to collaborate or coordinate with one another on a consumer basis. We don't yet have that in privacy. But I think telecom is probably a very good sector to look at for lessons learned and for successes in that area. So I think the foundational principle should be: hey, people should have control. Companies need to be transparent. And not only transparent, like, oh, here's my 80-page privacy policy, just click a button. Right? There needs to be more skin in the game for companies on that consent part, because, just like you were saying about the grandma with the IoT device, when she plugs in her IoT device, she may not understand what the risks are, right? But right now, it's no one's responsibility, right? It's no one's responsibility. So there has to be responsibility. There has to be accountability at that point.
And I feel like, for consumers, unless you're a geek like me, who studies this all the time, or like you, who's in the profession and is probably the privacy guru in your family, it's hard to get that message out.
It is. And it's hard to enforce these things, too. I mean, I've been on the Do Not Call list since 2013, I looked this up the other day, and I still get robocalls every single day, multiple times per day. And you try to block the number, and all of a sudden they're calling you from a new number. Maybe you report them to the FTC and then never hear back. So I think we still have a very long way to go. And it's interesting how we started with these phone books, these giant phone books that have the names and phone numbers of everybody who lived in that area. And now we're moving into data sets of hundreds of thousands or millions of people who are called day and night about stuff that they never wanted to be called about.
Yeah, we have an opt-in problem in the US, where it just doesn't exist, right? So a lot of companies, especially some very big ones, right, the major ones you could think of, they're combining services together, so they're getting you to say, hey, agree to this. And then when you agree to that one thing, that opts you into everything else that they do. Right. Stuff that you didn't care about. But because we don't have any law around opt-in, basically they can opt you in to almost anything, and then it's your responsibility as a consumer to fight your way out. And they don't make it easy, right? I tried to cancel something on Amazon once; someone had made a mistake. First of all, they make it super easy for you to order stuff, right? You click one button and something shows up at your house. But you try to cancel it, and you dig into these menus. I swear, it took me, like, 30 minutes to find this one thing that I wanted to cancel, and it should not be that way. Right? If I can order something in one click, I should be able to cancel it in one click, too. Right? Or at least two clicks. Or maybe five. I would take ten. Right. But not 30 minutes of trying to find stuff. So I think we have a long way to go on that. I do see, on a state level, they're at least trying to create more consent or opt-in around third-party data transfer, but not for other things. So you have whole services that can literally opt you in. Let's say you opted out of something, okay? They can opt you back in, and then you are responsible for opting yourself back out. So there's nothing illegal about them opting you in again, so that you have to opt yourself out again. That's a problem.
Very frustrating. I think I saw, on a Privacy by Design Hall of Shame, that if you want to opt out of LinkedIn emails, you have to go to 64 different places to fully opt out of emails. I mean, no person has the time or the ability to do that. So I think the system is set up incorrectly there. But anyway, getting back to the history of privacy: in 1995, the EU Data Protection Directive was adopted. So what does the Directive say about privacy? And how does the EU approach to privacy differ from the US approach to privacy at this time?
Yeah, that's an interesting question. Even though the Data Protection Directive is no longer in force, I highly recommend that people go back and read it. It's very similar to the GDPR. But the important thing to note is that between 1995 and the time the GDPR went into force in 2018, there were a lot of privacy laws passed internationally in different jurisdictions, and they were based on the Directive. A lot of those laws are still in force, like the Philippines'. The Philippines' privacy law, and they have a very good one, right, has parts of it from that Directive. And so if you understand that, I think you'll have a better handle on any laws passed in between, because they may have some parts of that Directive in them. So as long as you understand that, I think that's really important. What the EU was trying to do when they did this Directive: obviously, they wanted to codify their feeling about privacy being a fundamental right in the EU. But also they were seeing a divergence from what was happening with the US, right? So the divergence at that point was: okay, the US is creating all these digital systems. We see a problem, or we see a potential harm, where we can't really protect or control that data. So let's create this Directive so that we can talk about what we want as Europeans for our data and for our rights. Right. Also, what people don't know is that a parallel thing was happening with standard contractual clauses and stuff like that. So all those things were happening in parallel around the same time. So it's not a coincidence that we're still talking about standard contractual clauses, because those things preceded GDPR, and they started brewing, I would say, in the mid-to-late '80s, before this particular Directive came about. When people complain about it, I tell them: so what were you doing for those however many years before? Right.
So the GDPR brought with it that extraterritorial reach, and it brought in fines, and it put a fine point on things. So I would say the Directive is probably 80% of what the GDPR is, but it didn't have the teeth of the GDPR, so people really weren't concerned about it. They're like, oh, they're not going to file a case against me or whatever. So I think that Europe definitely got people's attention with the GDPR. But go back and read the Directive. It'll help you a lot.
It's interesting because, as privacy professionals, I think all of us still get emails from companies who say, we're ready to start complying with GDPR. And you're like, well, what have you been doing for the last couple of years? This has been around for a minute. You should be much further along than starting to comply, unless you just started working in the EU market, or you're thinking about going into the EU market, maybe. But still, I think it really reflects the fact that in the EU, they were viewing privacy a bit more broadly than here. Here we were trying to correct specific harms. And I think a great example is 1996, when the Health Insurance Portability and Accountability Act went into effect in the US. So what does HIPAA protect, and how did it come about?
Yeah, wow, these are fascinating questions. Okay, so HIPAA. Now, the reason why I know this so well is because I lived this; I was paying attention when all this was happening. So this is a great question for me. So, first of all, HIPAA is not a privacy law. HIPAA is a law about health data portability. It has a part in it about privacy. So when people talk about it, that was, up to that point, probably the best, most well-thought-out and articulated way to talk about privacy. And the reason why we even have HIPAA is because in the US we don't have universal healthcare. So you have to have portability, where places like Europe, they have universal healthcare, so they don't need portability laws, but we do. Right? So HIPAA covers patient-provider information, so that's like your treatment, but also how you pay for things, different things, so that you can move your data. Let's say you change doctors, you change jobs, you go to this other place. It governs how that data gets transferred from one place to the other, and different things like that, and tries to protect your privacy as a result of that. So that was really the whole point of HIPAA. It's taken on a life of its own, especially because this portability now is so different, right? It's more digital now, where before it was like, hey, I'll go to the doctor's office, they'll give me my paper printout of my stuff, I'll drive over to the next doctor's office, and I give it to them, and they're supposed to handle it a certain way. And so as we moved more to digital, it became a lot more complicated, because your data isn't like a widget in a box that moves from place to place; now it's flowing across the Internet. So it creates more problems for companies, especially if they don't understand the risk when they move into digital. And then, a little bit after HIPAA, there was an effort to move health more into digital. But a lot of stuff in medicine is still paper, right?
So if you have to get an order from someone, they're faxing stuff or they're mailing stuff. So it's going to take a while to get past that, I think. And some people, I know a couple of doctors, they just won't go onto electronic health records. They're still on paper, and no one's going to force them to do that. But I think what people don't understand is this: people think that anything about their health is private, and it's not. So if you go on the Internet and announce you have some type of ailment, that's not private. So you can't stop, for example, your employer saying, oh my goodness, this person had cancer, so we're not going to hire them. You can't stop that, right? Because that information is no longer protected; you sort of gave it away. And then also, when you're using fitness apps and Fitbits and stuff like that, those things are not protected by HIPAA. They're protected by consumer laws in whatever jurisdiction you're in. And those laws are much weaker than HIPAA, right? And they're a lot less prescriptive as well. That's another thing people need to understand: as long as companies are adhering to those consumer laws, if they exist, it's much weaker. So I think there is a risk there. We see companies creating risk profiles about people based on stuff that they infer from their app. So let's say you have a phone in your pocket and you go to the coffee shop, like, twice a week or something; who's to say what they would infer, right? Maybe they infer, well, of course, okay, Donata, she likes coffee, so we're going to advertise her coffee. Or they can see it takes her a really long time to walk from one place to the other. Maybe she has some type of health issue. Maybe I'm going to sell this to her insurance company or whatever.
So this is like the Wild West area that we’re in with data right now because basically there’s so much more data that we didn’t think anyone cared about, but someone cared about it, and they’re willing to pay for it, and they’re going to do stuff with it that maybe you don’t know about.
Yeah. And you think, okay, I go to the doctor, and the information that they tell me is protected. But how much information are we sharing with these apps? Like women's health apps, which were recently subject to a lot of enforcement actions because they were sharing data with advertising providers or selling data, and all of this kind of stuff that most people don't even realize could happen. So you definitely have to be careful about the data that you're sharing online with these new apps. And so in 1998, the Children's Online Privacy Protection Act became a federal law. So were parents worried about their children online during this time? And what risks do children face when they're in the online world?
Yeah, COPPA is very important, because it is probably the closest thing that we have to a federal privacy law. But it's currently for children under 13, right? So in my view, it's like, why don't we extend that to everyone? Why don't we have COPPA for everyone? Right? Because I think it would definitely be needed. So I think part of it was around parents having a concern about children. But I think what happened was, you know, Congress was concerned about these new technologies. Think about video games that had just burst onto the scene. In terms of digital things, it was very Wild West in terms of how people were handling data of children. And especially for regulators who have children, they were like, hey, this is an easy win here, right? Why wouldn't we want to protect children? So let's say, for people under 13, you just have certain things you have to do when you're working with their data. I think the thing that's happened now is that it's becoming a lot more complex. We have more children under 13, or under 18 for that matter, who are in digital systems. They have their own phones. It's not like us growing up; we didn't have the Internet when I was growing up. You know, kids, like, I literally saw a kid that was four go on a phone and go through a Netflix menu. I don't even know how to do that, right? But this is what we're dealing with, where there are so many children now that they're playing games, they're on their parents' phones, they're ordering stuff, they're pushing that one button, and something shows up at their house. So there is a need not only for education, but there is a need to protect the data of those people. And those companies get in a lot of trouble, because they do all this programmatic advertising, so they don't know; they press the button and let it do magic things, and they don't really check, oh, what's the age of this person?
Because before, it was like, okay, pinky swear that you're over 13 and click a box, and then away you go. So I think in the future that's not going to be sufficient. There are going to be things in place that require companies to know more about the age of a person. We've seen a couple of companies, like Weight Watchers; they got in trouble with the app that they had, because they said, okay, well, swear that you're over 13, say yes, and then they asked the person's birthday, and it's like, oh my God, this person's eight. I think companies need to really look at that, especially as we see a proposal, definitely in California. So California is raising the age; they call it super-COPPA, right? They're raising the age from 13 to, I don't know, 16 or 18; I forget, I'd have to look. But this is probably going to go through on a federal level too. And this is going to be very game-changing for companies. So before, it was like, click the box and say you're not under 13, but now it's like, wait, you need to know more about the person, which means you're collecting more data. Right. But companies up to this point, if they weren't specifically targeting children under 13, they weren't really concerned about children's privacy. But now this is going to come to the forefront, because, think about TikTok and Facebook, a lot of these kids are between 13 and 18. So this is going to be a maddening thing for companies that had not been accustomed to taking those steps. It's wild.
When I was a kid, I had a flip phone with one phone number in it, for my grandma, and that was the only number I could call. I had no Internet, no anything. And now kids go onto Facebook, TikTok, all these platforms, and share photos and videos. And that can attract predators, it can put them at risk. There are also things that can follow them into their life, right? Like, my husband made a stupid video when he was a kid of throwing a toilet off of a roof, right? And at the time, it was really fun. It’s what you did as a kid. But nowadays, let’s say he was trying to find a job and somebody found that video. Maybe now it’s embarrassing, or maybe it’s something that precludes you from getting certain types of jobs in the future, and kids don’t really realize that.
Oh, totally. And then people just overshare, right? It’s like, oh, I had a ham sandwich, you know. Beyond being annoying, what people don’t know is that when you’re doing these videos and stuff, it’s like tracking, you know. Unfortunately, people may have nefarious intent toward you, and kids don’t really understand that, I don’t think.

Absolutely. So speaking of California: in 2003, California was the first state to implement a data breach notification law, and we’ve seen most other states model their own breach notification laws after California’s. What do we see with these privacy laws? Why is it important for someone to be aware of a breach of their personal information?

Yeah, so you’re right about the breach notification. California is, I don’t know, the canary in the coal mine, in my view. They tend to lead on privacy. They’re very progressive there. They’ve done the work year after year trying to get this done, and people follow them a lot. So not only breach notification, right? Breach notification, like you said, started in 2003, and by 2019, all 50 states had data breach notification laws. A lot of them borrow liberally from California, but they’re all different, because I think everybody wants to be a special snowflake and wants their own little spin on everything. But I think it’s also important that California has the most comprehensive state-level privacy law, so they’re leading in that regard too. So I think we’re seeing basically the same thing that happened with data breach notification happening with these state-level privacy laws because of California. So they’re very important, for sure. But breach notification is really important for a lot of reasons. Well, first, I’ll tell you, it’s irritating, and it is important. Okay? Breach notification is irritating, in my view, because I see states that, let’s say, had a breach notification law and don’t really want to pass a privacy law, quote unquote.
So they’ll go back and revise the breach notification law and just throw some privacy stuff in it. So it looks like the law hasn’t changed, because the number hasn’t changed or whatever, but the law may have changed. It makes it that much more complicated to keep up with. But the importance of breach notification, and people don’t really realize this, is that you don’t want to wait until you have a breach to find out what you need to do about breach notification. Part of it is because states have different ways they define a breach. They have different definitions of what personal data is, what sensitive data is. They may not even call it sensitive data; New York, I think, calls it tier one, two, and three, something like that, around certain types of data. So if you don’t know that, you will have a hard time trying to put together a notice for a particular state. So I work a lot with companies that are operating in certain states. Like, I had a client that’s a national cable provider, so they have stuff in all types of states, right? They need to know what those laws are, because there are certain ways they need to define the data that they have. And because that definition is different from state to state, they need to know what scope or what data is captured by those laws, so that if they ever do have a breach, they know what they need to say and what they need to do.
Yeah, in California we saw the first data breach notification law, and they passed the California Online Privacy Protection Act of 2003 right about the same time. And then they passed the CCPA, and we see a lot of these state privacy laws being modeled after California’s, which I think is really interesting. But let’s go to the EU for a minute. So the GDPR passed and went into effect. How does the GDPR protect the privacy of individuals? And what does it tell us about how they view privacy in the EU versus how we view it in the US?
Yeah, well, okay, so the GDPR, I think, was just the next step up for the EU, you know. They built that strong foundation with the Data Protection Directive. Part of the reason why the GDPR became a regulation is because they wanted to make sure it was a law in all member states, as opposed to a directive, where each member state was responsible for transposing it into their own law, which meant there were some inconsistencies. So in a way, the way things were under the Data Protection Directive is the way we are now in the US. We’re about 25 years behind the EU in terms of regulation, in my view, because we’re not there yet. So if we ever got a federal privacy law, it’s never going to be as strong as the GDPR, but at least it would do what the GDPR has done for Europe, which is create more harmonization around those laws, at least in the definitions. Right? The definition of personal data in the EU is the same in all member states. That’s very helpful for someone who’s trying to do business in the EU, and it makes it easier for businesses. So in addition to creating that harmonization, I think, even though a lot of businesses gripe about it, it makes it easier for them to understand those laws in the EU. Member states can still pass laws that are over and above or different in some way, not to counteract the GDPR, but maybe to supplement it. So companies need to understand those member-state-specific things, especially in places like Germany and France, where they’re a little bit more of a stickler about stuff like that. It’s definitely an issue, but what the GDPR did was create a framework that has since been borrowed from liberally in other jurisdictions. So there are sprinklings of GDPR everywhere, and we see a lot of people using language in different laws like data subject, data controller, and data processor. All that language is from the GDPR, so we can definitely thank them for that.
So even though I have friends and clients in Europe who kind of gripe about the GDPR, I’m like, the GDPR has been massively influential around the world in terms of privacy. Maybe it was a big wake-up call for C-suite folks, where privacy just wasn’t a priority before. But because of the fines, I think it became more of a C-suite issue, and now we’re seeing consumers definitely wake up to that too. Did I answer your question?
Yeah, you did. I mean, it would be nice to have a federal privacy law similar to the EU’s, or at least something that harmonizes our existing situation, because it’s so difficult to deal with right now. You have privacy laws calling it personal information, personally identifiable information, personal data, and all of that is just the very basics of a privacy law. And here it seems like we’re again targeting very specific harms, like targeted advertising or automated decision making, but we’re not really thinking enough about these broad-based privacy foundations, I think. And that’s a really big issue.
Yeah, well, now I remember what I was going to say. So the big difference between the EU and the US is that the EU’s privacy laws are based on privacy being a fundamental human right, whereas privacy is not a fundamental human right in the US. A lot of our laws are based on consumerism; a lot of them come out of commerce. So their power and their scope and their mandate is around the consumer, right? If you’re not a consumer, then you can’t exercise your privacy rights in the US. So let’s say you took a picture of you and your grandma and put it on Facebook, and maybe she doesn’t want her picture on Facebook. What rights does she have in the US? She has no rights whatsoever, right? If she was in the EU, she would have rights, because she is a human, and that’s how those laws are based. So the gap that we have is that not every human is a consumer, and a lot of the problems in our laws come from that. But then also, I think there are three reasons why it’s hard for us to get privacy law on a federal level in the US. One is there’s a constant battle between federal and states’ rights, even though federal and state are different, and a state can do things the federal government can’t do and vice versa, so it’s not a one-for-one type of thing. I don’t know why we’re fighting about this. The second is the private right of action, where corporations definitely don’t want that and a lot of legislators do want it, and it seems to differ along party lines, I think. But then the third reason is, you know, the GDPR, and a lot of the ways Europe passes laws, in my view, tend not to be as prescriptive as ours. I think it is more narrowing to create prescriptive laws, and it also creates a lot of gaps, and it’s harder, I think, to pass non-prescriptive ones.
Like in the GDPR, they say, well, don’t collect more data than you need, only keep data for as long as it’s necessary, right? It doesn’t tell you how to do that. Whereas in the US, like the CCPA, they’re like, put a button on your website that says this. So I think that type of law is harder to pass than “just do this.” But businesses want more prescriptive laws, right? Tell us to do this and we’ll do it, we’ll punch in the numbers or whatever. Whereas the other kind is saying, you be thoughtful about it, do something reasonable for your organization, ask these questions. And that’s harder to do, right? Because you’re like, just tell me the damn steps, instead of making me come up with my own.
So do you have any hope for a federal privacy law in the US?
I think it may happen eventually. I’m really interested to see what’s going to happen in 2024, because, well, first of all, 2024 is an election year, so there will not be any federal privacy anything passed in 2024 because of that, right? The legislators are trying to get reelected, so they are not going to be bothered with this. But in 2024 in the US, there are going to be a lot of state laws going into effect. In 2023, I think there were five state laws that went into effect, and in 2024, I’d have to look at the numbers, I don’t know, 11, 13, I can’t remember the number, but it’s a massive number. Let’s just say double what went into effect on a state level in the US. So I think by the end, people will just be pulling their hair out, like, I don’t care what happens, we need a federal law. If we get a federal law, I don’t think it’s going to be very comprehensive. I think it’s going to be very thin. But I’m fine with thin; maybe thin is something we can build on. For me, I think we should forget about preemption and forget about the private right of action and just harmonize the language. Personal data means this in 50 states. Breach means this in 50 states. I think that would go a long way to helping companies figure out what their obligations are, and at least to harmonize and not spend so much money trying to deal with this patchwork thing.
The one thing that I truly vote for, for a federal privacy law, is that it’s not written in seven days. Take the time to sit down and actually talk to the people who know privacy and who know what they’re talking about. Talk to the consumer protection groups. Talk to the technologists who understand privacy and how the Internet works, and then go from there, instead of doing some kind of quick slapdash type of thing. Actually think it through. But who knows whether or not that’s actually possible.
Yeah, and one thing I’ll say about the BIPA law, you live in Illinois like I do, is that a lot of corporations don’t like that law, right? First of all, they get a lot of fines from it, even though I think it’s one of the simplest laws. I swear, I think it’s, what, four pages? It’s very simple, right? But I think the thing that BIPA has done that’s so different, and I’m pretty sure they didn’t do this intentionally, is that it aligns very much, to me, with the European way of thinking about the human as opposed to the consumer. That’s why companies do not like it. They’ve been trying to get it revised for a long time, and it’s not working, because they’re looking at it through a consumer lens when it’s more of a human law, right? It’s basically saying, hey, if you have data of a human, you need to let them know. You need to tell them how long you’re going to keep it, different things like that. So just like in my grandma example, if you took a picture of you and your grandma and put it on Facebook in Illinois, your grandma would have rights, because she is a human. That is the difference, where some people in legal look at it from a consumer lens and want to try to winnow it down to that, and it’s just not that type of law. And even in the most recent proposal for a federal privacy law, the ADPPA, it cannot preempt BIPA, because it’s not a consumer law. It is a human law.
It’s wild how many businesses are upset about it, when there are, like, three things that you need to do to comply.
It is. Four pages. But I mean, if you’re thinking in a consumer lens, if you never have to think about it through a human lens, I can see how you fall into it. I call it the BIPA ditch. Like, don’t fall into the BIPA ditch, right? Yeah. So I’m like, seriously: let the person know what you’re going to collect and why you’re going to collect it, tell them how long you’re going to keep it, and then just follow through. That’s it. That’s, like, extremely simple. It’s not that hard.
I think so, too. Well, let me ask you one final question. So the California Privacy Protection Agency said that they’re going to review the data privacy practices of connected vehicle manufacturers. What do you think about the CPPA taking a closer look at this?

Well, there are a couple of things. First of all, you know, connected vehicles, that’s IoT, right? Internet of Things, and it’s a good thing that they’re looking at that. The issue around IoT is that these devices don’t just do one thing. They do multiple things, right? They may be subject to multiple laws, and they may do all types of collection. A lot of collection happens in connected cars. You don’t think about it, but a connected car is like a computer on wheels, right? And a lot of times that data is not only collected, so your car has many hard drives and things it collects on, but it also gets transmitted wirelessly to who knows who, right? Some of that isn’t for the operation of the vehicle. Some of it isn’t for safety. Some of it is to sell, like, to insurance companies and different things like that. So I think it’s good that they’re definitely taking a look at that, and I think a lot of other states will probably follow suit; maybe they want to wait and see what happens with California on this. But one thing companies need to realize, and they haven’t really gotten the message yet because they’ve not seen it yet, is this: a lot of companies think they can hide in plain sight. They’re like, okay, if I’m not Google, if I’m not Facebook, they’re not going to come after me. But in the future, a lot of these agencies are leveraging more advanced technology. Think about your tax return. If you didn’t turn in your tax return, the IRS knows, right? They’re not going to be like, oh, we don’t know who didn’t file. They know who didn’t, right? So in the future, they’re going to know whose practices weren’t up to snuff or whatever.
And it’s not going to be hard for them to see who isn’t complying, because right now you have to put these things on websites. You’re mandated to put these policies together, and now we have technology that can see, okay, these people are out of compliance, send them an automatic letter, right? Before, it would be, like, a group of lawyers going through websites, looking at stuff. That’s just not the way it is anymore. All this stuff is going to be automated. It’s going to be put in a way where you cannot hide in plain sight anymore. You’re going to get a letter from somebody, and these states are coordinating with one another on how to do that, because part of it is revenue generating, right?

Yeah. Well, let’s just hope you and I don’t receive any of those letters.

Yes, exactly. So, Debbie, thank you so much for speaking with me today about the history of privacy. For anyone listening, make sure to subscribe so that you do not miss our next episode. Thank you, Debbie.

Thank you. This was wonderful. Wow, you’re a wealth of information, and I love the fact that you went back to the history. I’m a total nerd about stuff like this, so this is great. Thank you.