This week our guest is Distinguished Professor in Law at the University of Virginia, Danielle Citron, who recently released her book The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age.
Danielle’s book, as well as this episode, explores the darker side of technology as it relates to the data of our intimate lives. This includes sensitive and challenging topics such as revenge porn and online abuse. We additionally explore the broader topics of online privacy, including nuances in section 230 (often considered the most important law in tech), as well as digital IDs, deepfakes, and even potential solutions to some of our current problems.
You can buy Danielle's book and find out more about her work at daniellecitron.com
Music by: Amine el Filali
Danielle Citron [00:00:01] So unless you strictly need intimate information to provide a particular product or service, you don't collect it. And this should be a standing rule: you cannot sell it, period. I'm sorry. You cannot sell it to the data brokers. You cannot sell it. You just can't. It's a commitment to civil rights.
Steven Parton [00:00:34] Hello everyone. My name is Steven Parton and you're listening to the Feedback Loop on Singularity Radio. This week our guest is Distinguished Professor in Law at the University of Virginia, Danielle Citron, who recently released her book The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age. Danielle's book, as well as this episode, explores the darker side of technology as it relates to the data of our intimate lives. This includes sensitive and challenging topics such as revenge porn and online abuse. However, we additionally explore the broader topics of online privacy, including the nuances of Section 230, which is often considered the most important law in tech, as well as digital IDs, deepfakes, and even potential solutions to some of the current problems discussed here. Now, considering the weight some of these topics carry, Danielle brought a ton of levity and playfulness to her insights and made this a truly fascinating conversation, and I hope that you'll feel the same. So without further ado, everyone, please welcome to the Feedback Loop, Danielle Citron. Can you maybe just set the groundwork for us then on what exactly the fight for privacy is and how it's changing?
Danielle Citron [00:02:00] Yeah. So the focus of my book is on the privacy that we afford our intimate lives. So, you know, information about and access to our bodies, our health, our innermost thoughts, which we document all day long, every second, with our phones: what we browse and read, what we share, like, and click. And it's our sexual orientation, our gender, our sexual activities. And it's our closest relationships, not just our lovers and partners, but also our friendships. And the privacy that we afford our intimate lives is eroding, in some sense, because our lives are being documented by services and tools in ways that are quite new. As I tease my class, my phone used to be just a phone, and my watch just a watch. Truly, my watch is still just my watch; there's no chance I'm wearing a networked watch. But everything that our tools and devices now do, we couldn't even have conceptualized. I mean, we've read a lot of, you know, futuristic work, so we could have guessed maybe we were going in this direction. But the way in which even our TVs are networked, and everywhere we go, our homes become our workplaces, it tells us so much about who we love, what we're interested in, our fantasies, our desires, our communications with loved ones. All the ways in which intimate life is being surveilled by individuals, by companies, and by governments. And so the fight for privacy is to reclaim the privacy around our intimate lives, in ways that I think shift the conversation from thinking about companies as just companies to being data guardians, and governments not just as governments but as the guardians and stewards of the information they persistently collect about us. And the same with individuals, you know, who invade our intimate privacy so seamlessly and almost without apology. And it's a global problem, right?
So my book wasn't just about the United States. I interviewed 60 people from across the globe: the U.K., Australia, India, Iceland, the United States, Israel. And I work really closely with the South Korean government. There are lots of ways in which the story that I tell in the book is very much a global story. I wish the invasion of intimate privacy were sui generis to the U.S. Then we could just tackle that problem and be done, you know. But, as is often true, misogyny is like a through line, culture to culture. And I make the case, I think we can make the case, that culture to culture, the privacy around our bodies, our love lives, our most intimate thoughts and selves actually looms pretty large. I'm not making the empirical claim that it's all the same culture to culture, but it resonated so often when I talked to people and worked with governments: the Safety Commissioner in Australia, the digital sex crime folks and the information commissioner in South Korea, the Law Commission in Singapore. There are some interesting, resonant themes that show us that the struggles in the U.S., and we may be the worst at it, which is just true, in our lack of protection for intimate privacy, are very much recurring themes that I saw in my work with companies and in my work with individuals and advocacy.
Steven Parton [00:05:43] Yeah. Maybe it's an obvious question, but I want to ask it anyway. What is the cost of this information being in the hands of these guardians, in the hands of the companies and the government? Why is it such a bad thing that this information is not private?
Danielle Citron [00:05:59] Right. So let me take that in parts, if it's okay, because I talk about three different kinds of actors who invade intimate privacy: individuals, companies, and governments. And sometimes individuals are the handmaidens of companies, which are the handmaidens of governments, so they're not really separate in so many respects. The harms share themes, but it's worth noting that when individuals invade intimate privacy, it of course impacts how employers and companies make decisions about us, and in turn governments as well. But I thought I'd start, if it's okay, with individuals, and the harm. You ask, what does that harm look like? Why should we be so worried? Intimate privacy is ours, so what's the big deal? It's, unfortunately, a growing field of privacy invasion: the ways in which we nonconsensually tape people, videotape and record them; we take that recording and extort more nude images from them, so sextortion; we take upskirt and down-blouse photos of individuals without their permission, whether it's with a camera in your shoe or, frankly, just your cell phone that you slide under someone's dressing room door; the nonconsensual posting of people's nude images; deepfake sex videos. And that's not an insignificant problem: there are like 60,000 or more deepfakes posted online, basically all of them are deepfake sex videos, and nearly all of them are women's faces morphed into porn. And it's true for all of these different types of intimate privacy violations that the victims are more often women, nonwhite people, sexual and gender minorities, and often on an intersectional basis, right?
So it's Black women, it's an Indian Muslim woman like Rana Ayyub, people who have multiple vulnerable identities. And the cost, you know, you ask, what's the big deal? When someone's nude images are taken without their permission, when they're posted online, when they're shared with other people like their employers and their friends, the costs are manifold. To begin with, the integrity of the self, how you see yourself and how others see you, is undermined. We move seamlessly through the world thinking that we're seen as whole individuals. My hair may not be great, you know, but whatever, we're a fully integrated person. But when your nude images are posted online, and they come up in a search of your name, or people find them and share them, as victims have explained to me, and this is a resonant theme, they're just the vagina; they're just a naked body on the toilet. They see themselves as a fragment. And crucially, other people see them that way, too.
Steven Parton [00:09:07] It's very dehumanizing.
Danielle Citron [00:09:08] Exactly: you're seen as object rather than as subject, and that's a dignity denial. So it's an autonomy denial and a dignity denial, and it's pretty devastating. You talk to victims, and one thing that kept coming up again and again is that folks felt like it was an incurable disease: when was the next photo going to come up, and how could I get it taken down? And because so many sites are hosted either in the United States or Russia, it's very difficult, almost impossible, to force sites to take it down. And they have no incentive to; that's their business model, as is often the case. Yeah. And so it's undermining of one's sense of self. It's undermining of love and free expression, because it's really hard to stay online when you think people are using your online personas and profiles and, you know, social media hubs to undermine you and get more information about you. And so most of the victims I interviewed basically shut down all of their LinkedIn, Facebook, Instagram, whatever, all the ways in which they communicated with others. And you feel really cut off. As one victim explained to me: Facebook is how I communicated with my high school friends. I'm a practicing lawyer now, but I've cut off a whole part of my life, and I just can't do it, because I know this person emailed these nude photos to those friends; I just can't bear it. And they go off LinkedIn because that's a way in which abusers often get more information about their networks. So it's free-speech inhibiting, it's dignity denying, and it could cost you your job. We know that most employers use search basically as part of a background check. It seems so obvious now, but.
But it's only obvious now because we've got lots of studies. Employers say, hey, like 90% of employers use search as a part, not the whole thing, but a part, of the job interview process, especially through third-party hiring services collecting intelligence on you; the Google search is part of it. Mm hmm. And if nude photos appear in searches of your name, we know from studies that that's often why you don't get the job: one of the reasons listed is inappropriate photos. That's from a 2006 Microsoft study. Wow.
Steven Parton [00:11:32] 2006, even.
Danielle Citron [00:11:33] Right. Right. And you can imagine it's only escalating from there. And so, you know, there's this idea that it's no big deal. So often law enforcement will say, and this is across cultures, get over yourself, boys will be boys, ignore it, it's going to go away. You can't ignore it.
Steven Parton [00:11:56] Yeah. On this individual level, I'm interested: is this something that is more pernicious for young people, or is it kind of an across-the-generations thing? Because when I think of this, I'm fortunate enough, I'm 35 now, and when I was growing up, I didn't really have access to these technologies. But I know that if I was 13, 14, 15, up to, you know, 25, 26, and really had these tools, it would be a very natural way to be intimate, I would think, with the people you're dating, and you're not really cognitively aware of these consequences.
Danielle Citron [00:12:32] Yeah. So we know that with sextortion, which is using people's nude images to extort more nude images, or videos of you masturbating, the victims are predominantly children, so young boys and girls, as well as women. We also know that the nonconsensual taking and sharing of nude photos is acutely true of young women and girls, so into their twenties; the twenties are a really vulnerable time. So young adults. And it's not to say that men don't have their nude photos shared, but if you go to the roughly 9,500 sites whose raison d'etre is intimate-image abuse, more than 90% of the victims are women. But there are sites devoted to gay, bi, and trans men, like a gay revenge porn site. They're disgusting, equally disgusting as the ones devoted to women and girls: they list people's Grindr handles, there will be contests about where they live, who they work for, I want more nudes of this guy. It's very familiar. You go to that site and you think, oh, I've seen this before, except the ones I'd seen were much more about women, and women of color, et cetera. And it is, as you noted, that we now bring our phones and our laptops everywhere we go, bathrooms, bedrooms, and it's normal. I am so not that person; people say, you're such a prude, and I'm like, no, I'm not. I have two daughters in their twenties. Mm hmm. And there is nothing wrong with sexual expression that involves imagery; there's nothing wrong with, you know, sexy texts at all. My generation, I'm 53, we did it a little differently: whereas I made my husband mixtapes, which sounds so lame, but that was really revealing, right, and the letters we wrote to each other. Every generation has a different medium, and sometimes they cross generations.
Right? Sure. But, you know, young people, their phones are with them all the time, and it is a mode of expression, intimate expression. And we want people to be able to use whatever mode works for them as they love, as they connect. But we have to have trust, confidentiality, and discretion so we don't destroy people's lives.
Steven Parton [00:15:08] Yeah. As you're talking about this, it makes me think, though: some of this feels like the expression of the human condition through whatever medium is available. So then my thought goes to, how do we control the medium? You know what I mean? I don't want to control the person and their expression by shaping the medium too much, but it does seem like there is some onus on the medium to take these things more seriously. And as you noted, it does seem like, in terms of civil rights, the authorities tend to care less about the digital side of things than the physical. So I guess my question is, how do we shape our response to maybe be more mature about these technologies, so that there is a better response and we are building incentives and frameworks within them that aren't constricting, but also help us solve what seems to be a very real problem?
Danielle Citron [00:16:03] Right. So let me try to answer this first by saying technology alone, or companies alone, won't solve this problem, sadly. I'm just going to borrow a case from South Korea. Around ten years ago, the upskirt phenomenon, the nonconsensual taking of photographs, begins, and parts of society were like, okay, companies, fix this. And the response in South Korea was that you couldn't use a phone that didn't make a shutter sound. Now, of course, we have apps that work around this. The technological solve was to put you on notice that someone was taking a photo, but (a) we looped around that really quickly and (b) it didn't do much, right? So just changing the technology won't be enough. That's not to say it's impossible to have technology help us minimize damage; absolutely, we can do that. We can talk a bit about hash technologies and how companies have used them to prevent the re-posting of non-consensual intimate imagery. But we need law to come in, too. Because there are people who say, the market! Well, the market hasn't fixed it; we have a broken-market problem here. And you're right, humanity is the bug in the code. And that's why law is so helpful. Law is a teacher. Law is not going to solve everything, of course, but law incentivizes better behavior by the enablers of destructive intimate privacy violations, and that's the platforms. Law can help provide incentives to the companies that are over-collecting, over-sharing, and over-selling our intimate information, which can be used by governments and by individuals. Go to a data broker and get all this intimate information about me, and then sextort: you're an extortionist extraordinaire, right?
If there's something in there that an individual doesn't want released to the public. So we need law; law is a part of it, but it can't be the only part of it. We need all of us as well. Individuals have such an important role to play in educating each other and having those bystander moments where we're like, this is not okay. There's so much that we as a society, individuals, groups, advocates, can do. There's a lot companies can do on their own, but that's not enough, because that's why we're at where we're at. We need law to step in to make clear that intimate privacy is a fundamental right and a civil right that each and every one of us enjoys, and that transforms our view from "it's your intimate data, I can collect it and make a profit from it" to "I have to actually justify why I need that data; I need a really good reason, and once I have it, I need to be the steward, the guardian, of that data." And that approach, shifting to the caretaker model, is how we think about civil rights in schools and workplaces: employers are the caretakers of those workplaces, and when there's a racially hostile environment, let's just take workplaces, or a sexually hostile environment, we've interpreted state and federal anti-discrimination law to say, hey, you step in, because you're the guardian of that workplace; you can't allow that to go unaddressed. And the same is true of public transportation: you've got to shape the design of public transportation to ensure that everyone can use it, that the disabled have opportunities to get on the bus. We shape our environment around commitments to equality. And so that's the shift I want us to make.
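The "hash technologies" Danielle mentions can be sketched roughly as follows. This is a minimal illustrative sketch, not the implementation of any real program: deployed systems typically use perceptual hashes (e.g., PhotoDNA-style fingerprints) that still match after resizing or re-encoding, whereas the plain cryptographic hash used here only matches byte-identical files. The class and function names are hypothetical.

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest identifying the image bytes.

    Real systems would compute a perceptual hash here so that
    trivially altered copies of the image still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()


class HashBlocklist:
    """Registry of hashes of known non-consensual intimate images."""

    def __init__(self) -> None:
        self._blocked: set[str] = set()

    def register(self, image_bytes: bytes) -> None:
        # A victim (or platform) submits the image once; only the
        # hash is stored, never the image itself.
        self._blocked.add(image_hash(image_bytes))

    def should_block(self, upload_bytes: bytes) -> bool:
        # At upload time, the platform compares the new content's
        # hash against the registry before anything is published.
        return image_hash(upload_bytes) in self._blocked


registry = HashBlocklist()
registry.register(b"known-abusive-image-bytes")
print(registry.should_block(b"known-abusive-image-bytes"))  # True
print(registry.should_block(b"unrelated-photo-bytes"))      # False
```

The design choice worth noting is that the registry never stores or shares the images themselves, only their fingerprints, which is what lets platforms cooperate on blocking re-posts without further circulating the material.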
Steven Parton [00:20:02] And it seems like, potentially, from what you've been saying so far, a very difficult shift to make, because I think of things like Pirate Bay and certain websites that allow you to, you know, stream things. And there are a lot of different legal codes, and it feels like in a global environment, it's very easy to push this content to servers that are outside the jurisdiction.
Danielle Citron [00:20:25] Yes, that's very true.
Steven Parton [00:20:27] So, I mean, to really solve this, is this going to have to be something that comes from, like, literally a global effort, all of the geopolitical...
Danielle Citron [00:20:37] All hands on deck.
Steven Parton [00:20:38] Like because it feels like without that, there's always going to be dark recesses for these things to exist. Is that true?
Danielle Citron [00:20:44] I think that is true. But I have a colleague, Nick Nugent, who's been writing about the way in which all the layers of the Internet work together. And as he was explaining to me: look, you set up a server in Russia, it's slower, and people don't like to wait. Nonconsensual porn that streams from the U.S., let's say from San Francisco and Chicago, which is where one of the most famous revenge porn sites was hosted (it actually hopped between those two cities as well as Russia), is going to stream better and be more easily accessible; the user experience is so much better than if it's elsewhere. And so, as he was explaining, the more countries you get on board with your civil rights agenda, the better, because wherever it's streaming from, people are impatient; God bless them, they're not going to wait for revenge porn to load. So the more jurisdictions we can convince to be partners with us on a civil rights agenda, and there are some jurisdictions that are even ahead of us, frankly, Australia and South Korea come to mind, then we can get at this together. That said, I don't think you can eliminate that destruction.
Steven Parton [00:22:05] Right.
Danielle Citron [00:22:05] Because, you know, Hobbes would say we come together to create a civilization and rules to prevent us from bonking each other on the head. Yeah, right. So it's not a one-way ratchet of destruction all the time, but we can't totally eliminate it. We can make it harder and less profitable. Right now, it's all profit; there's no downside, especially thanks to Section 230 of the Communications Decency Act of 1996 in the U.S. At the content-platform layer, those 9,500 sites whose whole thing is intimate-image abuse get to just make money. They have subscribers at $29.99 a month.
Steven Parton [00:22:56] Could you expand on 230 for people who might not be familiar?
Danielle Citron [00:22:58] I feel like my students would look at me and go, there she goes again, you know.
Steven Parton [00:23:04] We've talked about it before, but I want to make sure everyone listening to the podcast is familiar.
Danielle Citron [00:23:08] And it's really important, because I think unless we all understand what we've committed to for the last 25 years, we can't change it. In 1996, there are these two congressmen, Chris Cox and Ron Wyden; Wyden is now in the Senate, Cox has retired. In 1995, there was a decision called Stratton Oakmont v. Prodigy. Prodigy was one of the early message boards and early Internet service providers; think of how you went to AOL and it was everything you could see on the Internet, right, a closed garden, and same with Prodigy. And Prodigy had used filtering software so that there were no dirty words, because they wanted to be a family-friendly site. And there was a post on a money-management forum that was allegedly defamatory about Jordan Belfort, which makes this really ironic, because the Wolf of Wall Street was ripping everybody off; at the time he just wasn't yet convicted and thrown in jail. Someone said he's running a boiler room, the guy's a fraudster, he's a scam. And what do liars do? They say, you're lying about me. And so Belfort's firm sues not the poster, because of course there's no deep pocket there, but the publisher: Prodigy. And Prodigy's like, listen, we didn't know about this defamation; we're not strictly liable as a publisher under defamation law. And a New York trial court says: because you tried to filter dirty words, you're an editor, so you're strictly responsible for defamation that flows through your networks. And Chris Cox and Ron Wyden heard about this decision from this tiny little state court in New York that no one had really heard of. But they cared about what the Internet might be, and they thought, okay, we want to incentivize companies to help clean up the Internet themselves.
They knew the federal agencies couldn't do it on their own. So they passed it as a part of the Communications Decency Act, which was an endeavor to get rid of porn on the Internet, and you look back and go, holy cow, what an unconstitutional law. The law was basically totally struck down except for this one provision, which has two important parts. The first talks about not treating users or providers of interactive computer services as the publisher of something someone else said; that's the part we're going to focus on. The second part, and this is in the title of the statute, "Good Samaritan" blocking and screening of offensive material, their words, says that you're not civilly liable for filtering or removing speech if you do it in good faith. The whole discussion is not about that second part, because what are you going to sue over? You took down my speech; what's the cause of action? It's rare; sometimes antitrust claims come up. Where we've seen all the action is when revenge porn sites say: you sue me, go ahead, good luck, because we're immune from responsibility. And why is that? Because the first provision I talked about, that we won't treat you as a publisher or speaker of user-generated content, has been so broadly interpreted as to lose any meaning and connection to the notion of a Good Samaritan. So even if sites, and this is what they do, solicit, encourage, or deliberately keep up intimate privacy violations, they get to enjoy immunity from liability, because they say it's user-generated content: I know we called for it, I know we incited it, I know we're making money off of it, but too bad, so sad. And people have tried to sue them, and they've lost under Section 230 of the Decency Act.
Steven Parton [00:26:51] So you might agree with Section 230, maybe, in cases like social media in general, but think it should be amended, and that for these kinds of cases we should add some stricter language that says, hey, when it comes to porn sites and things like this, we need to talk more seriously about this not being allowed. Is that fair? Because it does seem like it has some value in some places, just maybe not in these.
Danielle Citron [00:27:18] Right. So I think it's had great value in incentivizing self-monitoring. I work with Twitter; I used to, I mean, I think I still do, I don't know if the council still exists, we'll see in two weeks when we're supposed to meet. For years I was working with them, since 2009, and with Facebook, and I work closely with Spotify and now TikTok and Bumble. I work very closely with companies on trust and safety. And they were immunized, of course, for making those decisions as Good Samaritans to take down speech that destroys people's lives. You know, you can commit crimes with words. So they were very cognizant of the fact that speech can silence speech, that speech acts can be really destructive and can be criminal. So we've got to keep that; no question, (c)(2) needs to be untouched. But the part of the statute that acts like a license, a free pass for destructiveness, needs to be amended. I have a new article coming out in the Boston University Law Review called "How to Fix Section 230," and I'm trying to be as careful as I can be, because I do work a lot with staff on the Hill, and I want to be very careful about the kinds of incentives that we put before companies, so that we don't sweep too broadly. If you think about illegality writ large, well, now reproductive health care is illegal in lots of states, so I wanted to be as careful as I could be. And so in my new paper I take some notions from the book, but now I've been even more specific about these sort of bad Samaritans. Some content platforms should never enjoy any immunity.
And so what I say is that for sites that deliberately solicit, encourage, or keep up intimate privacy violations and cyberstalking content, you just can't invoke Section 230. And then I say, for everybody else, you've got a duty of care to take reasonable steps to address intimate privacy violations and cyberstalking. And I recommend a statute and say: listen, Congress shouldn't leave this notion of what a reasonable step is to the courts. We know right now what reasonable steps are. Having worked in trust and safety with folks for 12 years, I prescribe six steps that Congress can and should put in a statute: this is what reasonable steps to address intimate privacy violations and cyberstalking look like right now. And I say, listen, let the FTC update the law with rulemaking as needed, because we don't want the statute to get outmoded; reasonable steps today may mean something additive later. But I want to make sure that we have an agenda and a path forward, that we learn from where we've been. You know, Elon Musk is saying things like, we need a Trust and Safety Council, and I'm like, dude, we've had one since, honestly, 2015; I've been on it. What do you mean you need one? You have one, and it's full of people with really diverse views about speech; civil rights and civil liberties groups were on the council. I'm not really sure what he thinks he's creating; we're already there. And we haven't been disbanded yet; we're supposedly meeting December 15th. Do you know what I mean? There is a way in which we can learn from the 12 years or more that we've been working on trust and safety. We're not writing this amendment to Section 230 in a vacuum.
Steven Parton [00:30:59] Mm hmm.
Danielle Citron [00:31:00] And I can tell you the ways in which we have reasonable steps.
Steven Parton [00:31:03] How do you feel like this type of stuff is being received? I mean, obviously the Elon Musk and Twitter situation is a very unique one; it's a bit chaotic and capricious because it's really one person, and it doesn't really involve so much the system and the broader zeitgeist. But in general, how do you feel like this push is being received? Do you think people are actually paying
Danielle Citron [00:31:29] Attention to like, is this remotely realistic? Right. I mean, the interesting thing is just to step back is that there's a lot of agreement that we need Section two thirds reform, but there's disagreement about what the problem is.
Steven Parton [00:31:39] Mm hmm.
Danielle Citron [00:31:40] So on the one hand, you've got Democrats who think the problem is it's a license to ignore and leave up destruction, and they're not wrong. And on the other hand, there are folks who think and that's especially true of conservative speakers and representatives who say the problem is that they're taking down companies, that companies are taking down too much of my speech. Speech I like you're censoring conservative viewpoints. And as it turns out, having worked with these companies, that's so untrue. And they sort of bend over backwards. But, you know, look, if you're someone like Alex Jones who engages in hate speech, incites mobs to ruin people's lives, like you breaking the law, you run into tort. You know, I'm saying like kicking off Alex Jones is is a welcome response that's not censoring viewpoints like leave up all the you know you can leave up a whole lot but when it comes to things like especially in Europe, hate speech, right? Incitement of violence, like that's a bridge too far. Right. So given the fact that we have disagreement about what the problem is, it doesn't mean we can't come up with solutions. I work with both sides of the aisle on Section 30 reform proposals. And do I think a targeted approach that would like carve out the worst of the worst, where the cost of free speech are obvious and empirically shown? We know the cost to consumer privacy violations is just both economic. It's speech, it's dignity, it's autonomy. The costs are apparent and they're studied. And there's empirical proof. Right. Do I think that we have any sort of way in which we can convince people, you know, I work with Blumenthal's office. I work with Schatz, his office. I work with a lot of offices. I work with Graham. You know, like there's ways in which I think if you narrow it even further to children. Absolutely. But that's really not the problem. Like, as I say to offices like see, sound federal criminal law, it falls outside of 230. 
So if prosecutors want to go after anyone peddling child sexual exploitation material, they can. And they do. My goodness. Right.
Steven Parton [00:33:52] This is a bit of a sidetrack, but it also makes me think about content moderation and communities in general. I've run a lot of communities, and currently still do at Singularity and in other areas, with tens of thousands of people. And one thing that's interesting is there's always the discretion to just get rid of people who are bad actors, regardless of any legality. Right. And what's interesting here is these platforms are so big that we deem them public utilities. But at the end of the day, I still feel like they have the prerogative to simply say, you don't...
Danielle Citron [00:34:29] Make it. You're not playing by the rules. That's right. You're making it unhealthy.
Steven Parton [00:34:33] So how do we reconcile that, I guess, in your mind in terms of like the public utility versus the private company aspect?
Danielle Citron [00:34:42] Right. I think that argument is unpersuasive. These are private companies. When you think of the mail and the telephone, where you have limited bandwidth, you've got to use the U.S. Postal Service, or think about the telephone: you had to build wires, it was expensive, and there weren't alternatives, especially post the breakup of the Baby Bells, and certainly pre-internet. Public utilities exist because there's scarcity and powerful control. We thought, all right, you've got to let everybody in. If you want to send a letter, I'm not going to say you can't because you're a Nazi. You can use the mail. And the same is true of the telephone system. We actually protect telephone and communications privacy quite robustly. Same with the mail. We're not going to judge you for using the phone. We're not going to say you can't have a phone because you are inciting violence. We may punish you for having incited violence, yes, but we're not going to say you can't use this utility. These platforms, though, are here today and gone tomorrow. I feel like Julia Angwin wrote this brilliant book about MySpace, which was the biggest thing, and then it wasn't. And Twitter was really important, and soon probably it isn't.
Steven Parton [00:36:05] There are probably people listening right now who don't know what MySpace is.
Danielle Citron [00:36:08] Probably, yeah, exactly. Like, y'all, hello, it was the biggest thing. You know what I mean? And then Fox buys it and it just dwindles to a shell of itself. I'm not even sure it still exists, but it was the biggest thing. So, you know, here today, gone tomorrow. And these are private companies. What's really interesting, and something I've really learned working with them, is that if there's bandwidth and if they can scale education, then so often, instead of just kicking people out and never letting them back in, you get timeouts and suspensions. I don't know if you listen to Sudhir Breaks the Internet. It's a podcast by Sudhir Venkatesh, a sociologist who worked at Facebook for a couple of years, and he and his colleagues focused on content safety. It turns out I never worked with them, because I was busy working on really bad harms; we never met. But he has this amazing podcast where he talks about how he and his partner, who left Facebook because they got nowhere, had the experience of teaching people about the rules. They did some experiments where they found people who violated terms of service and community guidelines and then taught them: we'll put you in a timeout, but let me teach you why, and why it's harmful. What is hate speech? What does it mean? Why is it harmful? Or take another example: stalking, harassment, threats, nonconsensual imagery. And they took the case to Zuckerberg that when they taught people about the rules and suspended them, they came back and didn't re-offend. They were healthy participants, you know what I mean? They didn't make the platform worse. And so they made the case: we should educate.
And Zuckerberg's response, apparently, and this is in the podcast where Sudhir explains it, was like, people don't change, there's nothing we can do. Which is just a damn cop-out. That is such a cop-out. It's pathetic. That's why it's clear he didn't have a liberal arts education. He needs to go back and read a book, because he didn't stay in school long enough to understand that of course we cultivate knowledge, we teach one another. That's what parenting is. I feel like, golly, his poor children, you can't teach them anything, you know? And we can teach people who are older; it's a project for all ages. Especially older people: what do they know about harms online? They have issues, and that makes sense, but they could appreciate it and learn it. We've got to get everybody on board here.
Steven Parton [00:38:43] This makes me wonder a little bit about, I guess, people changing versus the product changing. One conversation I often find myself in when we're discussing these types of things is the idea of a digital ID, or something that attaches to the individual so that there's more accountability for their online behavior. Because I think one of the issues here is that the social norms we tend to use to keep out bad actors...
Danielle Citron [00:39:12] It's like.
Steven Parton [00:39:12] Absolutely, they disappear once you go online. Would this be something that you support, or does it just open the door for more privacy issues, because now you potentially have authoritarian...
Danielle Citron [00:39:23] You so knew where I was going. Right. Yeah. You know, I think that's right. What worries me... I think we have seen through studies that even when people are totally identifiable, they engage in the same kind of mischief.
Steven Parton [00:39:37] Mm hmm.
Danielle Citron [00:39:38] The verifiable IDs don't actually change behavior. And then you have a honeypot of IDs that can be tapped into by companies and governments. So that's my worry about governments' misuse of real IDs. And who really needs anonymity and pseudonymity? Domestic violence victims. Political dissenters. I want to protect their pseudonymity, because that's what enables them to speak and engage online: they are protected, in some small measure. We live in a completely pervasive surveillance age. As Shoshana Zuboff would say, surveillance capitalism is alive and well. And the idea that we're not traceable is just not true, unless you have a really smart perpetrator, and they're not all so smart. But it takes resources to figure out who someone is. And then you need somebody you can legally sue.
Steven Parton [00:40:38] Right.
Danielle Citron [00:40:39] And companies and individuals are really protected and insulated from liability in the United States: A, individuals, thanks to Section 230, and B, companies, because we have a consumer protection model which is basically a free pass. Collect all you want and sell it; if you don't lie to the public about it, it's all good. In many ways, we're the scofflaw in the United States versus elsewhere. I'm not going to say GDPR is heaven, not at all; it comes with weak-sauce procedural commitments. It's better than nothing. They have data minimization as a commitment, right, but there are a lot of loopholes, like being able to collect data for a "legitimate use," which you could drive a truck through. So it's not that there's a perfect solve everywhere else. But, just to be sure I'm being responsive to you, Stephen: real ID. Those solutions have been something we've wrestled with for a really long time in these conversations, certainly about hate speech, having worked on the issue with companies for like 12 years. And so often people say verifiable identities are going to solve it. But look what we saw in my town in 2017, Charlottesville. People were on the streets with no masks, walking down the street with tiki torches (tiki torches, thank you; I'm, like, traumatized saying it), chanting "Jews will not replace us." And they were fully out in the open. Post-2015, we have mainstreamed anti-Semitism and racism in a way that these people felt like they could walk down the street in Charlottesville and feel protected. And they were. We did have prosecutions afterwards, and we had some amazing, successful civil suits here in Charlottesville. But they felt emboldened, and there was a president emboldening them. Right.
And so we've got a lot of work to do. I wish there was an easy technical solution. I'm always so game, you know. My friend Hany Farid, an amazing technologist, is on our board at the Cyber Civil Rights Initiative, and I'm always like, Hany, how are we going to solve the deepfake problem? He's like, not yet, you know what I mean? They're working on interesting authenticity approaches. And I have a friend at USC, Wael Abd-Almageed, who works on the detection as well as the authentication side of imagery. They're not giving up, but they're not bullish on a solution right now. Technology isn't going to save us in the moment. And I'd like us to have a whole solution. I'm such a lawyer: I'll take any tool. Give me a tool, I'll use it. Right. And that's companies, that's the market and industry norms, that's law.
Steven Parton [00:43:34] Yeah.
Danielle Citron [00:43:35] That's us. That's parents, teachers. Right. It's like we need a whole solution, right? No one thing can fix it.
Steven Parton [00:43:42] As we near 100 episodes of this show, one thing I feel like I've learned at this point is that we're probably in for a perpetual arms race, so to speak. There's always going to be that better deepfake, and the security is going to be lagging behind. The question is just how quickly we can close the gap.
Danielle Citron [00:44:01] That's right. When we get to that place where, let's say, deep fakery is undetectable from the technical perspective, we always have journalism. Like, I wasn't there; I wasn't doing or saying what the video says I did or said. But that verification takes time, and it's not always foolproof. And that time is at odds with our eyeballs and the way we want to just quickly click share on our phones. That's where the technical and real human behavior don't interact well. Does that make sense, even?
Steven Parton [00:44:41] Yeah, but I'm grinning because I'm just thinking, as we're saying this, that you basically said, in some ways, the answer to these privacy issues is less privacy. Not really, but in my mind I'm hearing: if we have more ways to know where people are, then we can say that people weren't where they said...
Danielle Citron [00:44:59] Right. No, no. Bobby Chesney and I wrote about this. He's a national security expert; he's now the dean of the University of Texas Law School. He's amazing. In 2018, we teamed up to write a series of law review articles about deepfakes and their challenges for national security, democracy, and privacy. And we considered the possibility of life logging, logging everything, especially for really important players like politicians and CEOs, where deep fakery could move the market, could undermine elections. In the end we were like, ah, it's too big of a privacy problem, right? But I guess what I mean by journalism is that people with public identities, who go out and travel, are out in public rather than in the privacy of their homes, or they're having meetings that are often publicly documented, so you can say, hey, I was at work from this hour to this hour, and yet the video shows me having sex with, you know, my sister. I'm making this up, you know what I mean? Whatever thing they want to discredit me with.
Steven Parton [00:46:05] Right. Do you see any technological solutions on the horizon? I'm obviously thinking of something like blockchain when you mention that, and if it was like a private blockchain...
Danielle Citron [00:46:15] Did you really have to say blockchain?
Steven Parton [00:46:17] I mean, I feel like I'm.
Danielle Citron [00:46:18] I'm screeching. I'm the Munch, you know, the Munch painting, The Scream. I'm The Scream, right? Yeah. But don't do it. Don't do...
Steven Parton [00:46:23] It. Paves paves a it paves a road towards a direction we could go. Something that is encrypted, something that is right. Maybe less authoritarian, you know, like do you do you see that as a potential horizon worth exploring?
Danielle Citron [00:46:39] I mean, show me a good proposal that doesn't destroy the environment and I'm down to hear it. Right. But one thing I do think we need to do, and we've had some success with on Facebook: I was on Facebook's nonconsensual intimate imagery task force. We came to Facebook in 2015, myself, Mary Anne Franks from the Cyber Civil Rights Initiative, the National Network to End Domestic Violence. There were a number of us on this task force, including the eSafety Commissioner of Australia, Julie Inman Grant. We met with Antigone Davis, who we work really closely with, Karuna Nain, and Monika Bickert. We got together and said: victims are coming to us, and they're telling us that people, usually ex-lovers, are threatening to post their nude photos, and they want to be able to prevent that from happening on various sites. Would you consider making hashes of images to prevent them from being posted on your platform? They were amenable, because they were already doing something like what we recommended: they had a hash database of images that they had determined, not just on a bare complaint but through a little review process, to be nonconsensual intimate imagery, and they were preventing reposting on their platforms. And this is just Facebook's property, so really Instagram and Facebook. We wanted to widen the aperture a little bit, and they said, let's do it, let's talk. Alex Stamos, who was then chief security officer, made it work, and we created a pilot program. That pilot program is now universal. Now, the problem is that people don't trust Facebook. So when I'd say to victims who had been threatened, do you want to go to them and give them the hashes, they'd say, I hate Facebook.
So I think it's possible. One thing we can borrow from the notion of a hash database: make it a cooperative effort, much the way the Internet Watch Foundation and NCMEC deal with child sexual exploitation material. Have shared hash databases that companies can go to, scan against, and then prevent the posting of NCII, not just on Facebook and Instagram but on all the sites that participate. A sort of shared hash database of images. That's not anti-porn; I'm not anti-porn. Right. These are nonconsensually shared images that someone does not want shared. And the interesting thing, if you go out on the nonconsensual intimate imagery sites, which I have to spend a lot of time on so you don't have to (they're disgusting), is that usually the photos are, in some sense, not remotely as sexy as porn. These are everyday people, right, who took a snap and sent it to a lover, in all their glory, you know what I'm saying? It's not nearly as graphic as porn, not beautiful, necessarily. The thrill is that the person said they didn't want it shared.
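The workflow Citron describes (hash an image once it has been verified as nonconsensual, then have every participating platform check uploads against the shared registry) can be sketched in a few lines. This is only an illustrative sketch: the class and function names are invented for this example, and production systems such as PhotoDNA or Meta's open-source PDQ use perceptual hashes that survive resizing and re-encoding, whereas this toy uses SHA-256 and so only catches byte-identical copies.

```python
import hashlib

class NCIIHashDatabase:
    """Shared registry of hashes of images verified as nonconsensual.

    Hypothetical sketch: real deployments use perceptual hashing
    (e.g. PhotoDNA, PDQ), not a cryptographic hash over raw bytes.
    """

    def __init__(self):
        self._hashes = set()

    @staticmethod
    def fingerprint(image_bytes: bytes) -> str:
        # Stand-in for a perceptual hash.
        return hashlib.sha256(image_bytes).hexdigest()

    def register(self, image_bytes: bytes) -> None:
        # Called only after human review confirms the image is
        # nonconsensual intimate imagery, not on a bare complaint.
        self._hashes.add(self.fingerprint(image_bytes))

    def is_blocked(self, image_bytes: bytes) -> bool:
        return self.fingerprint(image_bytes) in self._hashes


def screen_upload(db: NCIIHashDatabase, image_bytes: bytes) -> str:
    """Each participating platform runs this check on every upload."""
    return "rejected" if db.is_blocked(image_bytes) else "accepted"


db = NCIIHashDatabase()
db.register(b"<verified nonconsensual image bytes>")

print(screen_upload(db, b"<verified nonconsensual image bytes>"))  # rejected
print(screen_upload(db, b"<some other image>"))                    # accepted
```

The design point in the conversation is exactly the one the sketch makes visible: the database stores only fingerprints, never the images themselves, so platforms can cooperate on blocking without redistributing the material.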
Steven Parton [00:50:01] Yeah.
Danielle Citron [00:50:02] Like, it's very... does that make sense? If you spend enough time on these sites, you see that what makes it salacious for people to look is that it's not consensual. Power. Status.
Steven Parton [00:50:13] These guys are.
Danielle Citron [00:50:13] Totally. They're showing off, whatever, to their friends.
Steven Parton [00:50:17] We're coming up pretty close on time, so I want to turn our attention toward the future a little bit, a bit solutions-oriented, I guess. What are some of the most promising avenues in your mind to address these things? If you could guide people right now toward a way to improve this circumstance, whether at the individual or the company or the governmental level, where would you point them?
Danielle Citron [00:50:46] Okay. So we need a whole approach, but let's just take law, because that's what I'm good at. The book, of course, explores industry norms and what we could do as people and teachers and parents. But first things first: the sites that host this content should not be free of responsibility. Right now the incentive is to just host and encourage abuse. We need to fix that, because that's a key part of it. We also need a comprehensive approach to intimate privacy violations, where victims can feel seen and heard and can sue their perpetrators as well as the sites hosting the abuse. And for companies, we need data minimization. If you don't collect it, you can't sell it, you can't exploit it, you can't look at it, and it can't be hacked. So unless you strictly need intimate information to provide a particular product or service, you don't collect it. And this should be a standing rule: you cannot sell it, period. I'm sorry. You cannot sell what data brokers buy. You cannot sell it. You just can't. It's a commitment to civil rights, and just because it's in the name of profits, I don't care. And companies should have commitments of loyalty and care and nondiscrimination. Then governments should follow the same playbook, the playbook of the data steward. Government right now conducts a tremendous amount of surveillance of pregnant women on Medicaid. State rules require that they tell social workers an enormous amount of personal, intimate information that has nothing to do with a healthy pregnancy. Have they ever had an abortion? Have they ever been raped? Have they ever suffered domestic abuse? Have they ever been expelled from school? Why do you need to know that for a healthy pregnancy? You do not need to know that, right? It's sort of power and shaming, and I have no idea what other reason.
Governments don't strictly need that information for the administration of Medicaid, so they shouldn't get it. Insurance companies and doctors don't ask it of wealthy mothers, who face the same risks of a healthy or unhealthy pregnancy and parenting. And so government should be guided by the same first principles of anti-collection and anti-sale.
Steven Parton [00:53:11] Fair enough. Well, I want to give you the last few minutes here, Danielle, to give us any closing sentiments, thoughts. Talk about your book, if you'd like, in any way, shape or form. Just a few minutes for you to have the floor.
Danielle Citron [00:53:25] So the takeaway: intimate privacy is indispensable to a life of flourishing. The idea that our intimate life is ours, that we can experiment and play and figure out who we are, and be seen by others with integrity. I don't want people just to be seen as their genitals. I want to protect you from having your insurance premiums raised because you have bad cramps at the end of the month. I want to ensure that we can use all these tools and services that say they make our lives better. Okay, great; I'm not going to knock a dating app or a period-tracking app. But do you need to sell my data to marketers, and in turn to data brokers? You don't, and you shouldn't. Intimate privacy is too important. And what do they say? Pave paradise and put up a parking lot, as in Joni Mitchell's fabulous song. I don't want us to completely give it all away and then just say, oh, I'm sorry, we gave it away. That's just not how it has to be. We control what we build. And "we," meaning the proverbial companies, governments, individuals: we're in charge of this project. I don't want it to get so far out ahead of us that people can say, I have no privacy. That's what then-Sun Microsystems CEO Scott McNealy said in 1998, I think: you have zero privacy, get over it. Absolutely not true. That serves your bottom line, but not mine. So I think we need to get on top of this now, so that people get the intimate privacy that they want and expect, the intimate privacy that they deserve.
Steven Parton [00:55:08] Well said. Well, Danielle, thank you so much for your time and for this very well-articulated conversation.
Danielle Citron [00:55:16] Thank you so much for having me. That was a lot of fun.