Privacy Lawls with Donata
Ep.11 | Getting Consent the Correct Way (Guests: Elizabeth Donkervoort & Lani Victoria Q. Vinas)
Several privacy laws require you to get consent before collecting the personal information of website users (emails, phone numbers, names, IP addresses, etc.).
It’s extremely common for websites to fail to get proper consent (even though they think they’re doing it correctly).
We chat with Elizabeth Donkervoort (the Program Director at the American Bar Association’s Center for Global Programs at the Rule of Law Initiative) and Lani Victoria Q. Vinas (the Legal Director at SHUSAKU·YAMAMOTO, named 2023 Japan IP Law Firm of the Year by Asian Legal Business) about the proper way to get consent.
Show Transcript
Hi and welcome to the 11th episode of Privacy Lawls, where I, Donata Stroink-Skillrud, talk with amazing privacy professionals and have some laughs along the way. This podcast is brought to you by Termageddon, an auto-updating Privacy Policy generator. Today we’ll be talking about an often misunderstood compliance requirement, and that is obtaining consent for the collection, use, sharing, and selling of personal data. We’ll talk about what consent is, why it’s important, and the consequences of failing to properly obtain consent. I have two guests with me today, Elizabeth Donkervoort and Lani Victoria Q. Vinas. Elizabeth is the Program Director at the American Bar Association’s Center for Global Programs. At the Rule of Law Initiative, Elizabeth oversees the strategic design and execution of programs focused on safeguarding online privacy rights, combating foreign malign influence and repression on a global scale, and enhancing the resilience of legal systems and rights advocates in East Asia. Elizabeth is also a member of the Global Task Force to Counter Digital Authoritarianism at the Montreal Institute for Genocide and Human Rights Studies and is a member of the ABA Cybersecurity Legal Task Force. Lani is the Legal Director at SHUSAKU·YAMAMOTO, named 2023 Japan IP Law Firm of the Year by Asian Legal Business. In her role, Lani manages a team of more than 60 legal professionals, technology specialists, and associates in managing the IP portfolio and technology assets of blue-chip corporations, universities, governments, and technology venture capital firms. She’s qualified in the Philippines, England and Wales, and the state of New York. Lani serves as one of the privacy advisors at ABA ROLI, a senior lecturer at the University of the Philippines Diliman College of Law, and will soon be joining Hitotsubashi University as an adjunct professor this spring. So thank you both for joining me today.
Both of you perform work at ABA ROLI. Can you tell us a bit more about what the purpose of ABA ROLI is? Sure, I’ll jump right in there as the staff member for the American Bar Association’s Rule of Law Initiative. Just quickly, I will be speaking in my personal capacity and not on behalf of the ABA or representing any sort of ABA policy. Everything I talk about will be from my own personal experience. So for those of you who aren’t aware, the American Bar Association is one of the world’s largest volunteer professional associations. We presently have around 250,000 active members and, I think, around 250 active sections and committees within that association. Within that larger association is the Center for Global Programs, which includes the Rule of Law Initiative as well as the Center for Human Rights. And our purpose is to help the American Bar Association achieve its fourth strategic goal, which is to promote the rule of law and access to justice both at home and abroad. ROLI and CHR are the abroad component of that, and what that involves is programming in, I think, close to 100 countries around the world. We have offices in around 38 countries right now. And really, we focus on a broad range of activities dedicated to helping individuals access justice in their jurisdictions, as well as helping promote the rule of law. And I know we’ll talk a little bit more about that going forward, especially related to my programs. Awesome. Can you tell us a little bit about the current initiatives at ABA ROLI? Oh, my goodness. We have a lot of initiatives. The one that I’m obviously most familiar with, as it relates to today’s topic, is the Defending Digital Privacy program. And this is the one that Lani, and you, Donata, as well, are really wonderful in helping us out with.
This is a pilot initiative that we have started in Southeast Asia under the idea that, you know, we have a lot of really great organizations out there that are conducting cybersecurity work and helping grassroots actors and marginalized communities improve their cybersecurity. Similarly, there are a lot of organizations working on the threats that we see, especially online, to freedom of expression and freedom of association. But really, taking a step back and looking at all this, these two pillars of Internet freedom, or really, freedom within the digital era, are not sustainable if we don’t have a baseline understanding of what privacy is in the digital realm, online and as it relates to technology, and then a very comprehensive regulatory framework to enforce that digital privacy. Right. So that’s where the Defending Digital Privacy program comes in. We’re working with lawyers and rights advocates right now from across Asia. We’ve already started expanding into East Asia to really start developing a common understanding of what privacy should be online, and then thinking through what existing regulations are out there, where there may be gaps, and how different jurisdictions can move forward to address those gaps, as well as utilize existing frameworks to help communities and individuals who are victims of privacy rights violations find redress. Right now it’s a monumental task, and so we’re really grateful for lawyers such as yourselves for helping us out pro bono. A couple of the other initiatives that I’m less familiar with, but that are nonetheless very interesting, are programs we have in multiple countries around the globe countering hate speech and disinformation, whether it’s around elections or political transitions, or working with social media companies or other community organizations to have pages shut down for, you know, not just hate speech, but bullying and harassment in a variety of different languages.
We’ve also worked with courts to help improve their efficiencies through e-courts mechanisms. In the Philippines, we’ve actually introduced an e-court automation system that was designed to help judges track and prioritize cases, and that actually helped reduce the average age of pending cases by 42% and increased disposed cases by 45%. We also have programs that help strengthen the knowledge of judges as it relates to freedom of expression online and cybercrime case law and legislation in distinct regions. And so it’s a growing portfolio of law and technology programs that we have, and we’re always looking for people interested in helping us develop them. I love the training that ROLI offers, and, I mean, I have to say, this is pretty selfish of me, but I do wish that some of the training had been offered to me when I was in law school, because so much of it is so useful. Well, thank you. Yes, and that reminds me of another area we’re looking to expand, with the digital privacy moot competition. So we’ll be starting off in select jurisdictions in Asia this school year, and then we’ll be expanding to more in the following school year. And then hopefully we’ll be moving into Africa in two or three years’ time. And I guess, to add to what Elizabeth said, the DDP is actually a great regional training program, because in Europe, you already have the GDPR. Lawyers are so familiar with it. It started almost a decade ago; I mean, obviously, it really did raise awareness sometime in 2017. But in Southeast Asia, there’s not the same level of awareness. And here in the regional training program, we’re not just approaching or training those communities to ensure sustainability, as Elizabeth mentioned before, but lawyers as well. Not all legal professionals are involved with digital privacy, and so it’s really very important to involve everyone in this, because, as we’ve seen, violations in relation to privacy are no longer confined to the physical space.
In the past, it was so easy to identify violations if someone entered your house or someone searched your person or even seized your correspondence. But in the digital realm, it’s quite different. And so training is really necessary, not just for the policy implementers, but also for lawyers, all legal professionals, NPOs, communities, and even individuals. And I think that’s one great thing that is being done by the DDP program. Absolutely. I love that. Thanks, Lani. Sorry. And I guess I’ll just add one more thing. You inspired me to remember a few other contours of the DDP program. You know, one of the things that we really found when we were first developing the program with stakeholders on the ground is that, well, there was a general awareness, but not really a sense of urgency around digital privacy. But you just speak with stakeholders for a few minutes about their lives, and they start realizing, as Lani mentioned, all of the manners in which their privacy can be violated and how that can be impacting them on a day-to-day basis. And that sense of urgency comes very quickly. So, you know, when we first designed the program, our hope had been to train around 200 lawyers and rights advocates in Southeast Asia. And to date, we’ve trained 1,300 or more lawyers and rights advocates, not just in Southeast Asia, but in other regions, just because the interest and the demand was there. But there’s still this recognition that, amongst the general public, a lot more work needs to be done in order for people to identify that they may have a case of privacy violations and then know that they can go to a lawyer, and not only for lawyers to know how they can try these cases of privacy violations. So it’s a big task. Absolutely. Speaking of big tasks, I don’t think I’ve ever seen a group have, like, this much output and this many people trained and this much documentation and this much help, like, anywhere.
And I wanted to ask you, you know, what made you interested in this type of work? What made you interested in helping people protect their online rights and freedoms? So before I joined the ABA’s Rule of Law Initiative, I was working for another democracy and governance organization, and I was based in Hong Kong, working with rights activists in closed and closing spaces across Asia and the globe, and especially in Asia. I really watched, unfortunately, in real time, this concept of digital authoritarianism unfold. So as technology started rapidly progressing, I watched it deployed against people I was working with who were trying to stand up for farmers or villagers whose water had been polluted, or individuals of the LGBTIQ community, or women. I mean, any marginalized community: if you were standing up for their rights or even defending them in court, you were having this technological apparatus turned against you. And even I experienced, to a limited extent, some of those repercussions as well. And, you know, just that experience really demonstrated to me, early on in this technological evolution, the dual nature of technology and the desperate need to be addressing it. But, I mean, we had the GDPR, as Lani mentioned, and still not a lot of traction coming out of that. I really would say it wasn’t until ChatGPT that people really started paying attention to what technology was doing. I mean, a little bit on hate speech, perhaps. Maybe I could be a little bit more generous and talk about Cambridge Analytica and the 2016 US election, where, you know, social media users’ data was used to send targeted ads for certain candidates that included misinformation. And people started becoming aware of it, but not the full scope and not the intersection with privacy. Yeah, no, that’s so true. I mean, we started.
I started working in privacy right before the GDPR, and when the GDPR went into effect, everybody started losing their minds. And, you know, at first people thought I was crazy, and I had people at networking events saying, oh, I don’t really care about privacy. And now, you know, people want to talk to you, and people are really interested in it, and they’re like, hey, can you help me change my ad settings? Or what does this mean? Or what does that mean? So it’s definitely become a lot more relevant to people. I used to be referred to as the captain of the tinfoil hat happy hour, membership of one. And now, I totally know what you mean, where people just didn’t care. I’d even have to talk to my mom about it. And I would say to her, you know, like, mom, you don’t walk down the street naked just because you don’t have something to hide. You don’t walk down the street naked because you have a sense of, like, human dignity and personal privacy. So why would you saunter about the Internet naked? Like, so true. So true. Lani, what made you interested in this work? Well, with me, it really started with my practice. So, like you, I think I started just before the GDPR was implemented, but it was really more of isolated matters. There would be a question in relation to digital privacy, and you’d answer that. But it wasn’t until the GDPR that there was really a huge increase, at least in my observation, a huge increase in matters involving digital privacy. So that’s where I started. I remember, because with the GDPR, everyone had to comply, including law firms. And I remember receiving, like, on a daily basis, maybe hundreds of emails just informing us, oh, you need to make sure you’re compliant in relation to this. So that’s when it started increasing. But with the ABA, I met Elizabeth at one of the LAWASIA annual conferences.
We were both speaking on, I think it was data localization and the free flow of data, and the issues in relation to digital privacy. You know, for me, personal data generally can be viewed two ways, and I think this is where the fragmentation really is. When I’m working for my firm, personal data is seen as a digital asset. Personal data is really seen as a digital asset for corporations, and they just need to make sure that they’re compliant. But on the other side of it, there are genuine individuals behind this information who want to be able to control the information they have. And I think maybe we can talk about it later, but because of this dual role of personal data, there are conflicting interests, I think, between companies and sometimes persons. And that’s the reason why it’s very important that the laws that we have protect personal information. You know, for most of us, our names are already out there, and so that’s not the issue anymore. Your name is out there. Your email address could be out there already. We have given that up. But it’s a question of how it’s connected to our other information, and I think that’s where this paradox happens. A lot of people talk about this privacy paradox, and I really think that it’s important to elevate the discussion now, talking about whether we should have a consent-centered approach in relation to data privacy and personal data, vis-à-vis approaching it by ensuring that corporations and companies remain transparent about how they process data, and remain responsible and accountable for the data that they already have, because there is some data that’s already out there. And it’s quite difficult, honestly. I mean, isn’t it quite difficult to withdraw your consent when you don’t even know which companies you’ve given consent to?
We started so long ago that you do not know which companies you’ve given consent to. And how do you withdraw it if you do not know? And with the data mining... I’m sorry, I feel a little bit passionate about looking at it differently in order to genuinely protect the human rights aspect of it, because there is a human rights aspect to this particular issue. So there. I mean, you asked me about how I started, but I started with ABA ROLI when I met Elizabeth, when speaking about the human rights and economic aspects of personal data. That’s the short answer. I think that’s the problem you get when you get a bunch of privacy nerds together who thought it was important before the GDPR. And one thing that I’m reflecting on, and this might take us off in the wrong direction, is, you know, I’ve heard people talk about, you know, battle scars from when the GDPR came into force, just like you were mentioning how much information you were getting about it and how laser-focused companies were on that. And then, you know, the PRC, what was it, two years ago? I’ve lost sense of time. Their Personal Information Protection Law came into effect one year after their Data Security Law. And the PRC is a major market. A lot of data flows in and out of that country as well. And no one seemed to be really paying attention to it. That was when we had started designing this program, under the idea of, well, how is the interplay of the GDPR and the PIPL going to play out in Asia? And I think it’s really critical that people start thinking about privacy now. But we haven’t seen that same excitement. And, Elizabeth, it’s the fact that our digital privacy laws are so fragmented, not just across jurisdictions. We’re talking about, even in the same country, you have multiple laws covering the same thing. It could be industry-specific. And so it’s really...
I’ve been talking about this for so long, and I’ve always received pushback from different lawyers in relation to having at least a unified agreement on the terms, an agreement on best practices. It’s really important, because, for example, let’s just focus on the cookie banner. You see a cookie banner and you agree. How many jurisdictions can you see that particular cookie banner from, on a particular website? How many jurisdictions does it cover? Would that single cookie banner comply with all of the different jurisdictions? At the end of the day, you know, we have companies that guarantee compliance with the digital privacy laws of different countries. But that’s a really tall order. I think it’s quite impossible to say, oh, our cookie banner is compliant in all jurisdictions. That’s really not realistic unless you’re doing, like, auditing for each of the countries and ensuring that you’re compliant in each country. And how many countries does the same website cover? So it’s also that, I think. And Donata, I’m so sorry. I think we are taking you in the wrong direction. But it’s just a thought that I had in relation to what Elizabeth said earlier. Well, cookie consent banners and consent, I mean, they do go together, so it is part of the same topic. We’re not veering off course here. I do want to note that, you know, if anybody ever promises you full compliance for any country, you should run in the other direction. Like, that’s not possible. And number two, with consent, you know, there are things that we consent to and things that we want. But I’ve had this experience living in the United States many times now where I buy something and it says, sign me up for your emails, and I deselect the box, meaning that I don’t want to sign up for emails. And then I still get, like, a thousand emails a week from these people.
So even if you are exercising consent choices, a lot of times those consent choices are just not respected. But before we get into consent, one more question about ABA ROLI. How can people get involved? How can they volunteer? Well, the easiest way, I think, would be to become a member. So if you just go to americanbar.org, you can join, you know, this 250,000-plus-strong professional association, and even if you’re not interested in privacy or consent, there’s probably a section of interest to you. But if you don’t want to join, you can always reach out to me. My email address is on the website, I believe. Especially in the Center for Global Programs, we’re always looking for different experts, especially in these emerging areas of law, to help us troubleshoot the way we should be framing our programs, help us design materials, or, as you both are experienced in doing, review the materials for accuracy and help lead trainings, as Lani has done, or speak. So there’s a lot of different ways that you can get involved with ABA ROLI. Awesome. Good. All right, so let’s get into consent. So can you all define consent and kind of explain just why it’s important? I’ll go first as the practicing attorney. All right. So with consent, in the context of privacy, I think what’s important is to talk about privacy first. And really, privacy traditionally is the right of individuals to determine and control for themselves the extent of information about them, how it’s going to be used, and who’s going to have it. So that’s the understanding of privacy. When talking about consent in the digital privacy realm, there are different aspects of it, and I think maybe we’re going to be talking about them later. But just as a brief overview of what those factors or aspects are: genuine consent is something that should be freely given, something that’s specific, and something that’s unambiguous.
These are the basic aspects of consent in the privacy field. If you look at the Philippine situation, at least, one of the important aspects is that it should be a genuine choice, and that really is what consent should be. And so if you provide consent and it’s not respected, then you’re actually not able to control your information. It’s not a genuine choice, because they did not really listen to whether or not you consented to how a particular piece of personal data is going to be processed. So I think that’s what consent is in the context of data privacy or digital privacy. A couple of things to just add on, related to what Lani was mentioning. For consent to be genuine, you have to be able to say no and still be able to continue about your life in a very reasonable way. And so much of this data hoovering by companies seems to be the standard business model; you can’t get many services without having to consent to something. And that’s only going to increase as we move to more automated decision-making. The next component is the ability to withdraw your consent, to be able to change your mind, whether you’ve just had a genuine change of mind or whether the terms of your consent have been changed unilaterally by the individual or entity that collected it from you. And then another component of consent, which relates to everything but is more about the nature of privacy online and digitally, is this idea of networked privacy, where it’s not just the data or information you put out about yourself, but the information that others hold about you.
So if you go somewhere and you take a photo with an individual and they post it online, that’s a part of you that’s out there. You surely didn’t turn to them after the photo was taken and say, okay, well, I consent to you uploading it to this site and this site, but not that site; you can do this, but you’re not going to do that. Because, you know, our social interactions would grind to a halt. But, as I referenced briefly before with Cambridge Analytica, even just having someone’s phone number or contact information in your phone can suddenly devolve into a much bigger issue that you really cannot control. And so I would even go a step further from what Lani was saying on that last little rabbit hole we went down: we can’t just be content to say, well, the business model of collecting data and then really doing our best to protect it is where we need to stop. We need to really take a step back and say, like, why are we saying that this data collection model should be the norm? We really need to start having companies assert themselves as disruptors and disrupt that business model, because it’s proven itself time and time again to be very destructive to society. Not just human rights in the grand sense of genocide, but just our daily human dignity. Lani, I see you want to jump in. Yeah. Although, Elizabeth, I think the data collection does help governments create data-driven policies. Data-driven policies are actually quite important as well. So, I mean, I understand the hesitation in relation to collecting the data, but I think there are genuinely positive impacts that data collection has given us. So, for instance, when it comes to health, how do you know that a particular disease is a cause for concern in a particular country? And so a pharmaceutical company would then focus its research on that particular disease.
If there’s still no cure, or there’s no effective cure, data does help us develop data-driven policies, and I think that is important. Can I interject really quickly? Yeah, yeah, sure, sure. And I’m sorry, Donata, I think we’re going to go down another rabbit hole, because I have to push back a little bit on this. This evidence-based programming and data-driven modeling has really seeped its way into the social science space. And I consider the work that I do to be social science and humanities, as opposed to the hard sciences, where that model was first developed. And I think it makes a lot of sense in the hard sciences, especially for the reasons that you mentioned. But in the social sciences, it’s wildly problematic for a couple of reasons. One, we can’t trust governments with the data. I mean, in every country, they’ve proven to be poor stewards. There was, what, the OPM hack, where 14 million US government employees had their data taken. And then it’s my understanding that with the pipeline incident that happened two years ago, the way they got in was using data that had been bought off the dark web from the OPM hack. So, one, bad stewards. Two, they don’t have any protections or forms of accountability, not just in their being stewards, but in how they use the data. And especially when we’re talking about using data, like you say, to do social services, we’ve seen that in, you know, sentencing or housing decisions, wildly discriminatory practices have been justified or implemented under this idea of data-driven policies. And then the final one is, when we’re talking about marginalized communities, they’re often not benefited by these data-driven models and programs, because their problems are not the standard or the norm.
And I found, you know, especially in working in closed societies, where it’s really difficult to get genuine data because people don’t want to provide it, that data-driven programming to help these communities isn’t possible. We can’t have the big numbers. Even though we trained 1,300 people in this DDP program, which is great, that’s an open program; you might only get to three or four people in a very closed society, but make a massive difference. But in a data-driven model, that would never show up. It’s next to impossible to actually measure quantitatively. And I fear we’ve been swayed by these hard science people into thinking that there’s always a right and a wrong answer, and data will give us the right one. And Donata, again, I’m sorry. I don’t think that’s so. I understand this issue in relation to bad actors, and that’s actually a reality that, as digital privacy professionals, we see in our regular work. But at the same time, even with the marginalized sector, when you start collecting data, the data that an entity collects from them may not be digital data. When you actually go to the smaller communities, I mean, I’m from the Philippines, and I’m from a province in the Philippines, I do remember when I was younger that they would actually go to the barangays, to the small LGUs, the local government units, and they would still collect data, but it wasn’t done digitally, before they started, for instance, giving out welfare. So data is important, irrespective of whether it is digital data or actual physical data. Data is important when you’re implementing certain policies, and even for corporations, for corporations to provide better services. I think it is a question of accountability, and I think it’s also a question of responsibility. Not just accountability of governments, but also accountability of the corporations who hold this data.
And it is really a difficult set of conflicting interests to balance, I think. But completely stopping data collection, number one, I think that’s going to be a bit impossible to do in this day and age, especially since so much has been collected already. And going back to this question of consent, the difficulty with giving this genuine choice is the fact that we don’t just have personal information, or even the networked information. There’s also person-related information, where they may collect your data, and when that data is collected, it’s technically considered not personal data. But because of all the available data out there, when you consolidate it with other information, it becomes personal data. Much worse, it becomes sensitive data. So how is the consent genuine? Yes, you are aware of the transfer, you are aware of the purpose, but you’re not even aware of what information you’ve given to that third party, the third-party company. Donata, I’m not sure, but do you actually keep track? I mean, I always say no now. But do you actually keep track of all the companies where you might have consented in relation to a particular piece of data? No. That would take me, like, the rest of my life to figure out, for sure. And it’s not just the companies that you’ve consented to and whose services you’ve actually used or whose products you’ve bought, but all the third parties that have gotten that information from those parties, like the sub-processors. And then it’s also everyone that has collected my information without me ever giving it to them. You know, they scraped different websites, and they got my information without me ever having to interact with them.
So that gets us, because I will bring us back to consent in a roundabout way, to this idea that it’s not just consenting to the collection and the transfer of the data, but to the analysis of the data. That’s what you’re getting at, Lani: when you aggregate all of this data, and it might not even be all of your data, it might be your zip code and then your age, they can, with all the other data from other people, determine whether or not you have diabetes. And that’s a good example, because we do want to use data to address health issues. And I’m not saying stop all data collection. My concern is that it’s, you know, not even a rebuttable presumption that companies are just entitled to collect our data to function and do well, without having any of this discussion. And I’m not saying not to collect the data, because there are legitimate purposes, but you have to show that. Like you were saying, you have to be accountable. You have to show you’re going to be a good steward; you have to show, you know, the collection, the transfer, and the analysis. And that’s not possible right now with some of this black-box technology. But one little kink I want to put in there to what you were saying, Lani, and Donata, I promise it gets us to consent: there are these conversations now about programs with great humanitarian objectives and great need for the data going into communities, collecting the data in an analog fashion or doing it digitally, and then this idea of, well, who owns that data? Who owns it in its individual nature, and who owns it in its aggregate nature, to be able to perform the analysis? And if you are taking someone’s data to do an aggregate analysis, whether it’s for a market-based reason or a health-based reason, why does that community not also gain access to that value? And that is a really interesting conversation.
Related to consent, I would say that they don't have a genuine choice, in that their data is given and they are not allowed to use the benefits of the aggregated data or the analysis. Even all of us with Facebook: Facebook doesn't put all of its data out there. They have very special hand-picked fellows that they put on specific tasks, who get very closed-off data sets to work with. It's interesting, because a part of me used to feel that if we present the individual with all the information that they need to know, here's the data that I'm collecting, here's how I'm using it, here's who I'm sharing it with, here's whether or not I'm selling it, here are the security measures being applied to it, and the individual has that information and says, yes, please collect my data, then that's fine. But in reality, that's not exactly what's happening. Because, number one, most individuals don't understand what's actually being disclosed. Two, you have this "oh, we may share information with third parties for marketing and customer support purposes," but nobody really knows what that means. And three, regardless of your consent settings, you're still being sent email marketing, you're still being tracked with the Facebook Pixel or whatever. So a lot of times your choices aren't even respected. And how do we get through that barrier of providing individuals with information that they can understand, and how do we make sure that that information is actually true? And so that's where I'm saying, like, we need to take a step back and look at this current business model, the model that we use to approach the Internet. And, you know, there's a growing body of work. It's very nascent right now, but it's looking at how, it's a pejorative term, but, like, traditional communities have addressed issues like, say, networked privacy, and how your privacy is related to your entire community, which is antithetical,
I think, to some Western liberal conceptions of privacy, which are very individualistic in nature. But, you know, Ubuntu ethics in the African region has been seen as one potential base from which to grow these new approaches to privacy, and then how we would move on to collection from there, which is what the DDP program is doing. It's trying to elevate those voices that haven't been heard in the discourse. Get out there, so stop listening to me, because I'm based in North America. But I think that's the beauty of the DDP program, actually. You don't just hear the opinions of the privacy experts, because honestly, if you've been in this space, I think you tend to just follow what the GDPR is saying, as if it's gospel truth. Actually, the time that I started rethinking the approach to how we should safeguard data was when I started talking to the communities through the DDP program. Because, as Elizabeth said, is it actually owned by the individual? Should it be owned by the community, if what you're talking about is data in relation to a specific group, a specific community? And so that's where I think the consent is still present, but it doesn't take the form of the individual's consent, but rather a community's consent. Because, you know, I think with Asia, it's more about the community rather than individual rights. At least traditionally it was. Obviously, now it's really moving towards individual consent because of what happened, because of the GDPR and the propagation of the GDPR in different regions. But it's a matter of considering that as well. But going back to what Donata mentioned earlier, I just wanted to add that even if there's proper processing of your data, because they followed, you know, it's not all consent-centric. You can actually process it for legitimate purposes.
The problem is, if it's processed in a certain jurisdiction under an exception that does not apply in the jurisdiction where you originally gave your consent, then what happens? This consent-based approach, I think it may work, or it would at least work better, if we had consistent regulations in relation to consent. Because if there's a gap in a specific country and the data is processed there, and then it's transferred, because they can transfer it anywhere, what happens then? What happens to the consent that you've given? And again, we don't have a list of all the consents that we've given. How can you withdraw them? So absolutely, I think it would be nice to have a consistent framework that can be applied. I think one of my pet peeves, living on the Western side of the world, is, you know, if a business goes to an attorney and says, hey, I'm doing business in all these different places, right, I'm doing business in the US, South Africa, the UK, the EU, Canada, Australia, Asia, whatever, you will most likely get a GDPR privacy policy that makes no mention of anything else going on anywhere else. And the assumption will be that if you're compliant with the GDPR, you're compliant with everything else. And that is not how it works. Like, I can tell you right now that if you're complying with the GDPR, you're not compliant with any of the US privacy laws. You're compliant with the GDPR. Yeah. You're not even compliant with the UK's privacy laws since Brexit. Right. And I don't think you would be compliant with the PIPL in the PRC either. No, you definitely would not. So the issue is that we have this kind of view: the GDPR is the best, it's the only standard that we follow, it's the only thing that matters. Our entire compliance program is going to fit the GDPR because it's the most stringent standard, so that means we're going to comply with everything else. Like, no, you're not.
I'm sorry, but you're not. And that's really where this whole problem, well, an aspect of the problem in relation to digital privacy, is coming from: this really fragmented regulation. Because of the nature of digital privacy, it's not jurisdiction-based, but we approach it per jurisdiction. And again, as you mentioned, Donata, if a single company claims "we can make you compliant in all jurisdictions," you should run the other way. I mean, it's true, but you do have to be compliant. Obviously, some jurisdictions have certain requirements on when their laws apply. But there are jurisdictions where, the moment you collect data on a single person from that country, even if that person is not in their country, you have to comply with their law. And this problem, this issue, will persist unless we have a unified framework to ensure that there are no gaps in between. From my voice to the US legislators' ears. So I have a couple of thoughts. The one thing, you know, Lani, on your point about needing to know: if you collect information on one person, even if they're not in their country, but they're a citizen of a country, that could be a violation of that third country's law. So that would then necessitate always asking someone, please tell me all of your citizenships, which is in and of itself a possible privacy violation. Because, you know, part of human dignity is obscurity, being able to go about your life in relative obscurity. So even though our faces are, quote unquote, public, it's not like everyone knows who I am when I walk down the street. On this idea of fragmented regulations, I was previously very aggravated by the fragmentation as well, on board totally with let's have a standardized approach to everything. But then I started talking with different groups and marginalized communities and examining.
Getting back to what you were saying, Lani, about how data collection in this one sphere, say health, is really important, or lower levels of privacy for parents over their children are necessary. And it started me down this path of thinking: perhaps there is some value to the fragmented system within the United States, where you have HIPAA dealing with health, and you have COPPA dealing with children. I'm sure I'm forgetting many others, dealing with occupational health and safety and all of these things, but that does help. And I was talking to a computer scientist based out of McGill University, Professor Benjamin Fung, and he was saying that there is a growing body of research in the area of privacy-enhancing technologies, where you can actually calibrate what degree of data you need to come to a certain decision while keeping it anonymized. I think we can't just have these conversations in isolation. Knowing what is actually possible, technologically speaking, is really important. But then, by the same token, as you may recall, at the beginning of the DDP we brought together a large number of lawyers from multiple different jurisdictions in Southeast Asia, and we were talking about this idea of having a standardized system, even just in Asia. And the initial response was: well, we have so many different languages, and we have so many different cultures, so many different histories. It would be too hard to come up with a common idea of what privacy is. But as we continued on in this three-day symposium, and we really drilled down into, like, well, what are the basics of privacy, it really did come down to human dignity, which was the same across the board. And this idea of obscurity, and, you know, maybe having to up the lever on the networked nature of privacy, but there was a lot more commonality. So, really, I didn't give you an answer at all.
I threw a bunch of contradictory facts out there, and I'll let your listeners take from it what they will. Well, I have to say, the cultural aspect of it is so interesting, because being a lawyer, you would not think that you would learn about other cultures by the nature of your job. Right. Well, one of the things that I do is work on the engineering behind our terms of service, which incorporates consumer protection laws. So I have to read the consumer protection laws for all these different countries that we cover. Great example: in South Africa, it is against the consumer protection law to imply that you have applied any, and I quote, "juju" onto a product that will garner certain results. No other country's consumer protection law that I've ever seen has ever included that. But that was something that they wanted to include, and I thought, man, that's really interesting. A very small portion of the law, right? But it does illustrate cultural differences, which I think is cool. But anyway, I know we've gotten in and out of different topics, and we're unfortunately running out of time, even though I'd love to talk to you both forever. But thank you so much for joining me today to talk about ABA ROLI, and to talk about privacy and consent as well. Thank you. Thank you so much, Donata. I really enjoyed it. Good. Me too. And for anyone listening, make sure to subscribe so that you don't miss our next episode.