This week our guest is research scientist and adjunct professor, Justin Hendrix, who is the CEO and editor of Tech Policy Press (https://techpolicy.press/), a non-profit media community dedicated to exploring the intersection of technology and democracy.
In this episode, Justin and I enjoy a wide tour through a variety of topics, including political polarization, social media as a public square, social media as a public utility, the economic possibilities including Universal Basic Income, and much more.
Follow Justin at twitter.com/justinhendrix.
You can also find the paper Justin cited on Collective Stewardship at https://www.pnas.org/doi/full/10.1073/pnas.2025764118
Learn more about Singularity: su.org
Music by: Amine el Filali
Justin Hendrix [00:00:00] You've got all these politicians that are essentially in this Skinner box where they're responding to the incentives of social media. Clearly, there's something wrong with the idea that privately held platforms can become so publicly important, and yet there are no fail safes.
Steven Parton [00:00:34] Hello, my name is Steven Parton and you're listening to the Feedback Loop on Singularity Radio. This week, our guest is research scientist and adjunct professor Justin Hendrix, who is the CEO and editor of Tech Policy Press, a nonprofit media community dedicated to exploring the intersection of technology and democracy. In this episode, Justin and I enjoy a wide tour through a variety of topics, including political polarization, social media as a public square, social media as a public utility, the economic possibilities of universal basic income, and a whole lot more. Justin brings a rich collection of knowledge that he put forth in this conversation with a calm clarity that I truly appreciated, and I hope you'll feel the same. So everyone, please welcome to the Feedback Loop, Justin Hendrix. Well, I think the best place, the most obvious place, probably, to start with you is with Tech Policy Press, obviously something that you spend a lot of your energy on. So maybe to start, you could just tell us what the inspiration and motivation were behind starting this nonprofit.
Justin Hendrix [00:01:47] Sure. Tech Policy Press started out as kind of a side hustle. My co-founder, Bryan Jones, and I were running a blog under the name Protego Press, this is now a few years ago, looking at issues similar to Tech Policy Press's current editorial remit. And in the midst of the pandemic in 2020, I decided I would like to take a go at trying to make this side hustle my, you know, particular focus. And so we essentially set up Tech Policy Press as a 501(c)(3) and set out to kind of create a prototype of a media and community site that would gather ideas from across a wide range of disciplines at the intersection of concerns around tech and democracy, tech and society. And so I've been up to that now for two years.
Steven Parton [00:02:42] Nice. And when I think of that intersection that you're talking about there between technology and democracy, I can't help but kind of think of people like the early tech optimists of the nineties, like Douglas Rushkoff, and their kind of hopes that this new technology would unlock empathy and free expression, and we'd see an empowerment of our government and all of these things. And then I think a lot of people would say around 2012, 2016, this maybe went the completely opposite direction. Where do you stand now in terms of how you feel this technology slash democracy relationship has unfolded?
Justin Hendrix [00:03:22] It's interesting you bring up Doug Rushkoff. I've actually been teaching with Doug Rushkoff for the last few years in a course called Tech, Media and Democracy that brings together faculty from across multiple campuses in New York, at NYU, Cornell Tech, the New School, Columbia, as well as Queens College, where Doug is a professor. And of course he's, you know, taught me a lot, and I've learned a lot from working with him and from reading his work. And that class, which was really at the root of the inspiration for me in getting involved in these issues and launching Tech Policy Press, was meant to kind of look at this relationship almost from the perspective of, you know, is this a crisis? Is this a crisis moment? Are we seeing, you know, externalities and problems emerging that may truly pose a kind of existential threat to democracy? And I mean, I think it's fair to say that part of the reason for that were the types of concerns that came along at the outset of the Trump administration, many of the concerns around social media and disinformation, which of course became very prominent in the moments thereafter, things like the Cambridge Analytica scandal, and then, of course, the ramp-up of regulation in Europe in particular, and lots of discussion about potential regulation perhaps here in the United States and elsewhere. So as far as where we're at on that continuum between perhaps optimism and pessimism, I think we're still in a slightly difficult place, where some of the maybe more market-driven aspects of that optimism are still very much at play. And we, at least in this country, do not yet have any sufficient kind of plan for what we would like tech policy to look like. We don't have things like comprehensive privacy regulation. We don't have anything, of course, as comprehensive around social media as the Digital Services Act in the EU.
We don't have, you know, a kind of comprehensive policy, although there has been some movement in that direction. So there's still a long way to go. And on some level, we're still operating with the same rules as we had in that nineties or early aughts period, even though we've learned, you know, quite a lot since then.
Steven Parton [00:06:12] Well, speaking of maybe running by outdated rules, you know, is the American government, or any modern government that's kind of founded on a several-hundred- or thousand-year-old constitution, ready to adapt to the changes that technology is bringing down the line? Is there something maybe unique about the American government that has caused it to lag behind these other countries? Or is every country just kind of in the same boat, because they're not really made for this kind of paradigm?
Justin Hendrix [00:06:46] You know, we could probably spend our entire hour on this question, and I'd probably veer into some areas that I don't know enough about. But I do think that what we're seeing right now in the United States, at least, is a kind of urgency from both parties around different issues. They both have big tech in their crosshairs, but for different reasons. I think a lot of people were very disappointed that what appeared to be a kind of bipartisan consensus on antitrust and, you know, issues around competition didn't produce much more than a, you know, merger filing fee modernization out of the last Congress. And there's little hope that a lot is going to come out of this Congress, even on topics where it appears there's some consensus across both parties, you know, maybe around things like child safety, or even privacy, for that matter, where the American Data Privacy and Protection Act, you know, of course, passed with bipartisan support out of committee. I think there's something going on that, as some folks have suggested, really has to do with just the kind of breakdown between the two parties, the inability to find consensus, and the fact that, at least on some level, it's good business for politicians to say they want to do something about tech, and to continue to fundraise on it, and continue to kind of, you know, call out the injustices, whatever those injustices might be from their perspective, and perhaps less good business to necessarily solve anything.
Steven Parton [00:08:32] Yeah, well, you touched on something there that I think is really important, which is the polarization. I feel like a central feature of pretty much any social institution, especially as it relates to government or politics, is a degree of trust in some common narrative or common purpose. And we often talk about how much the parties are divided into their own little subjective reality tunnels. You know, we're fragmented into our own little worlds on social media as the masses who are voting. But I can't help but think, so are the politicians, right? They're also in their own fragmented tribe of subjectivity. How do you think, I guess, social media is impacting our ability to really push forward in the government? Whether from the masses not changing the zeitgeist because they're too busy fighting each other, or the politicians not actually pushing policy forward because they're too busy fighting each other.
Justin Hendrix [00:09:34] I helped to write a report last year on social media and political polarization with Paul Barrett at the NYU Center for Business and Human Rights. And we talked about some of these issues in that report, in particular looking at polarization's effect on, you know, the legislative process in the U.S. And I think there are a lot of questions here about the fact that you've got all these politicians that are essentially in the Skinner box now. You know, they're in this game where they're responding to the incentives of social media, and other players in the game are also responding to the incentives of social media. So, you know, whether it's media and journalists, people in civil society, you know, voters themselves, everybody is kind of vying for attention in this mechanism. And, you know, the research that we have on this, the computational social science research as well as, you know, internal documents we've seen from places like Facebook, it appears that those incentives are very powerful, and they're affecting the behavior not only of politicians, but also, you know, clearly of media entities and the like, and driving them, of course, to more extreme poles, essentially. And so, you know, one of the things I think is most concerning about the Facebook leaks from Frances Haugen was essentially that politicians were embracing and pursuing more extreme agendas, essentially to be certain that they would better reach across social media, or across Facebook in particular. And so, you know, just think about that dynamic kind of reflexively operating for years and years and years, and that's where we're at, right? I'm not saying, nor did the report say, that social media is the sole cause of polarization in this country or any other, or the sole reason that legislators aren't getting along terribly well. There are other reasons, but it does appear to be making things worse.
Steven Parton [00:11:47] Yeah, I can't say I've ever thought about this before, but it kind of strikes me as profound as you say it. Now that, you know, these politicians are equally as maybe addicted to these tools as the average person is, and maybe therefore don't have the incentive to actually change them. You know, it's hard to cut an addict off of their drug. And if you're a politician, why would you cut your supply?
Justin Hendrix [00:12:11] I can't remember the exact hearing, but there's this one particular moment where a camera caught Texas Senator Ted Cruz, you know, asking a fiery set of questions and then immediately going on Twitter to see what the response is in real time, right? So, you know, that to me was a sort of poignant little moment that shows exactly where we're at: that people are not only very aware of how they're performing in front of the social media, you know, audience, as it were, but they're quite literally looking for that feedback and looking for those signals instantly.
Steven Parton [00:12:55] Yeah, I mean, this makes me think of the idea of the public square, you know, the historic public square where we might come together, where we might have those fiery revolutions take place, and we socialize and we decide to change the zeitgeist of a generation. And that's now all mediated by a screen and by algorithms that are shaping what people see. How do you think that's impacting the conversation? You know, we're not really out doing that in-person dynamic anymore. Is that changing the way our democracy is really functioning at this point?
Justin Hendrix [00:13:31] You know, one of the books I always think about when I encounter a question like this one is from José Marichal, who's a professor out at California Lutheran who wrote about Facebook and democracy all the way back in 2012. So he was kind of, you know, perhaps early to this conversation. And, you know, he makes the point that you've literally got these mechanisms that are somewhat replacing or standing in for democratic functions, or certainly the democratic investment of time and effort that individuals used to put into, you know, the other mechanisms. So, you know, it's common now, for instance, for neighbors to fight over some civic issue in a Facebook group, but never to attend a community board meeting, right? And that, by the way, is a reality at various levels, not just at the local level. So there are a lot of questions there about, you know, where is the legitimate democratic discourse? Is it on Twitter? You know, is it somehow on TikTok? Is it all of those places? Is it at the community board meeting? And that's part of what's going on here: there's almost a kind of challenge to government. You know, there's this other place now where it seems like these things are going to be debated, where the sort of weight of the masses' conclusion on some matter should matter. And the system's not really set up that way, right?
Steven Parton [00:15:18] Well, not to compel you to opine too much, after our previous discussion about giving too many opinions prior to the draft. But do you think we're engaging in some golden age thinking here, thinking that it should remain an in-person experience and that we should be attending these community meetings, when maybe we need to get on board with the technological version? Or is there something truly lost, that in order for the human animal to function well, we need to stay in that in-person dynamic?
Justin Hendrix [00:15:52] You know, I think this is a really good question, and pretty much the golden question, you know, to some extent: do our democratic systems very much need to evolve and be updated for the digital age? I think most people would say yes, there does appear to be something wrong with the way that government operates in this age, something sort of out of step with the way that most people are engaging with one another, and with political information, for that matter. And yet, you know, what is the thing that should happen? I think there's not a good discourse around that. That's one of the things I hope to talk about on Tech Policy Press in the years ahead. You know, I think it is a multi-year or maybe even multi-decade conversation we've got to have. I remember some comments from Gideon Lichfield, who's now the editor in chief of Wired, along these lines, where he said something to the effect, and I don't want to put words in his mouth, so forgive me, Gideon, if you listen to this, I apologize, but, you know, essentially that we may need to think about the technology of democracy. Which, if you think about it, that's what democracy is: it's a kind of technology for coming to consensus. It was a technology that we developed perhaps in the age of horses and, you know, carrier pigeons or what have you, and very early ways that information would spread, where we would, of course, need agents that we would essentially empower to go and represent us in certain contexts. And all that does to some extent seem, you know, perhaps a bit old-fashioned at this point. And yet, you know, for the most part, that's what we've got, right? That is the legal structure that is in place in most of the world's democracies, and we've got a rule of law that demands that things function that way. So how do we square those things? You know, some people think we can't, right?
That we need to figure out ways to suborn these major platforms more effectively to democracy. And there are others who would much rather we move to some model that looks more like the digital realm, that perhaps empowers people in a more direct way on digital platforms.
Steven Parton [00:18:23] Yeah, well, speaking of platforms, you mentioned Musk and, well, Twitter, I guess, more specifically, earlier, and it's got me thinking. You know, while Twitter and Musk is an interesting topic, I think a more interesting discussion that you're alluding to here is the idea that maybe these platforms could or should in some way be brought in as public utilities, or have some increased regulation brought into them. Is that something that you agree with? Do you think that that's something that we should do? Is it still just a big question mark?
Justin Hendrix [00:18:56] I think it's a big question mark. You know, clearly there's something wrong with the idea that privately held platforms can become so publicly important, and yet there are no fail-safes, right? You think about Twitter, and you think about all of the uses that governments have for it to inform citizens, everything from, you know, earthquake alerts to whether alternate-side parking is on or off here in New York City. You know, these are things that I might get via Twitter, and yet I'm totally at the whim now of Elon Musk, and, you know, whether the thing's going to operate from one day to the next. And you might look at that and say, well, you know, government should never have relied on these private platforms. And yet there's something, I think most people would agree, practical about the idea that, whether it's Twitter or other platforms, that's where people are, for the most part. And so if you want to reach them with information, to some extent you have to use these platforms at this point. And I don't think we've got that solved.
Steven Parton [00:20:06] Yeah, you mentioned something else a minute ago that kind of sparked an idea for me. You mentioned the carrier pigeons and the horse, and the limitations that that provided. And it makes me think that as technology has advanced, we've certainly become a more connected globe. You know, the geopolitical landscape has changed pretty drastically. But what is interesting to me in some ways is that as we've gotten more connected, it feels like an immune response has taken place for a lot of these countries, where maybe too much cultural influence came in too quickly through the pipes. And rather than increase cooperation, it seems that a lot of countries are retracting. Is this something that you are seeing as well, or am I just kind of making this up as a thing that's happening?
Justin Hendrix [00:20:56] You know, one of the groups that I try to keep close tabs on is the folks at Freedom House, who put out the Freedom on the Net report every year. And they look at this question across the globe, really: you know, how are states permitting or restricting free expression? What are the types of surveillance mechanisms that states are employing against their citizens? How is technology affecting the ability of individuals to express themselves, organize, etc.? And unfortunately, when you look at that question from a methodologically sound place, which Freedom House does every year, things are getting worse, right? Every year, fewer people are in a place where free expression is, you know, well preserved. Every year, more people are living in circumstances where individuals may be arrested or imprisoned for things that they say or do online. Every year, you know, the surveillance apparatus grows in large parts of the world. And, you know, this is one of my, again, key concerns. When we talk about tech and democracy, we can't escape the reality that democracy is backsliding across the globe, year on year. This has been true now for a decade, or almost two decades, depending on which measure you look at. So, you know, back to that question about optimism versus pessimism, or whatever you want to call it: things are not moving in the right direction on this. And that's one of the things I think we have to get hold of. We have to figure out how to create a pro-democratic tech movement, you know, technologies, platforms, ideas, tech policy, regulations that are pro-democratic.
Steven Parton [00:22:59] This is obviously a bold question that, if you could answer correctly, you would get the Nobel Prize for. But is there anything that you think could stop that backslide? Is there maybe, like, a straw for the camel's back, so to speak? If we could just get this one thing tweaked in the right way, we might start going in a better direction.
Justin Hendrix [00:23:22] I do think that the United States has to figure out how to incentivize its technology firms in a way that produces better democratic outcomes, if we're to be hopeful that, you know, similar gains can be made elsewhere in the world. Now, the EU has to some extent, you know, leapfrogged us in terms of innovative tech regulation. A lot of folks listening to this would probably say, well, actually, no, they're far too paternal and, you know, killing innovation, etc. I don't tend to think that. You know, I read their documents, I look at the amount of diligence that goes into the thinking around their legislation, the amount of expert consideration and consultation, the care that they put into things like the Digital Services Act. And will there be unintended consequences? Absolutely. But, you know, in a functioning legislative system, if there are unintended consequences that you regard as negative or counter to what you set out to do with a particular law or regulation, you can reverse it, or you can alter it, or you can update it. But where we're at in the United States, unfortunately, is everybody kind of has this idea that if we pass anything, we'll have to live with it for 40 years. So we might as well not, because we have no idea what will go wrong.
Steven Parton [00:24:48] Yeah. Do you think those who would claim that's too paternalistic, and that government needs to stay out of everything, basically, are kind of being naive in terms of the scope and scale of what is happening here? Because, you know, I always love bringing things back to an evolutionary environment, and I think we're probably meant to be in groups of about 150 people as humans. And now we have countries of hundreds of millions or billions. It feels like to say that government shouldn't be involved in that in some way, that you're not going to have issues that government needs to step in and handle, is, I don't know, again, I think, maybe kind of naive.
Justin Hendrix [00:25:29] You know, a paper that I will share with you after we talk, that I think about a lot, is one by a friend of mine, a guy called Joe Bak-Coleman, who's now at Columbia, formerly at the University of Washington, along with, I think, about two dozen other authors from a variety of disciplines, and the paper is called "Stewardship of Global Collective Behavior." And it's essentially an argument that what you say is true, right? There's something kind of fundamental going on here: we're fiddling with the species in a way that we never have, with digital media and communications technologies, with social media. And there's a gap in our knowledge, right, about how these technologies will affect our ability to progress as a species, whether that is our ability to, you know, produce new information, disseminate that information, or how our democracies function, how we respond to crises like pandemics, and the environmental crisis, which we're going to increasingly find is perhaps the most important crisis that the species will face. And there's this question about, like, how do we get to a point where we can steward our collective behavior, steward our social systems, in a direction that results in, you know, good outcomes for the species? And, you know, that seems to me to be the open question at this point. And does government have a role to play in that? Absolutely. What form of government? What is that role? What should it be? That's the open question. And listen, people on both sides of the argument have really legitimate arguments. You know, some people are rightly concerned about too much government and authoritarian outcomes, etc. Others are concerned about too little government, right, and capitalism run amok, and big tech platforms that are unconstrained by state power. And, you know, both have legitimate grievances, and unfortunately, we're going to have to find some balance.
Steven Parton [00:27:54] You know, I'm wondering if that political polarization leading to a stalemate that we talked about at the national level is going to impact that stewardship at the global level. Because I can't help but wonder, are we going to get into an East-versus-West type dynamic here? You know, with China and its approach to technology, which is kind of like run full steam ahead, do some experimentation in social things that maybe the Western world doesn't agree with, and the West has its own individualistic approach. Do you feel like we're going to maybe get in a similar bottleneck there, where that stewardship is really hard to bring forward?
Justin Hendrix [00:28:35] So, you know, when we talk about China, this is one of those areas where I want to be very careful to not overstate what I do and do not know. I've been to China once in my life; I have visited two cities in China. I have, of course, read and studied aspects of Chinese politics and culture, but I do not in any way consider myself an expert. But with that caveat, I think one of the interesting things that's going on, at least when you do zoom back, perhaps to the moon, and look down, is that China on some level is far more restrictive in terms of what it is allowing its tech firms to do in a domestic context. And it's very interesting: it's got, you know, certainly more rigid privacy rules, rules about the amount of time that people can spend with certain applications, rules against things like generative AI deepfakes, that kind of thing, and of course a much heavier hand when it comes to censorship and monitoring of public discourse on social media. Whereas in the U.S., you know, we have a much more laissez-faire attitude on all these things, and yet we're permitting companies to do things that, you know, are truly worrisome: collecting massive amounts of information, developing technologies and tools that may well destabilize the information ecosystem if they run amok. And, you know, there's a real question here about which direction is the right one. And I personally believe that, you know, free expression and liberty and the sorts of things that we value in democracies are the better way to go, and will ultimately result in a species that thrives more so than under the Chinese model. And yet I do think we're going to see attempts to kind of challenge that idea, or challenge, you know, my certainty on that, over the next decades as we face increasingly massive problems to which authoritarianism may well produce more, I guess I don't know if the word is urgent or more immediate, answers. Those answers may not be the right ones. Look at COVID. The Chinese attempt to control COVID for a couple of years looked extraordinary, looked like a world-beating, you know, success. But then eventually the kind of, you know, hubris of that approach ended up falling apart, right? And so, you know, it may be that in some cases that apparent success is not real, and it ultimately will be revealed to be a weaker model. But there may be other things where it is more successful. I keep thinking about environmental crises, where maybe a government has to say to a whole group of people, or an entire city, you know, it's time to move away from the coast, or we need to completely change the form of energy that you're using. Whereas in the United States, it may take us years, decades, to come to those conclusions, and we may carry on, you know, in certain directions without the ability to take action, because our democracy is slow and fiddles its way towards things. You know, we may look at those situations and say, oh, maybe there's some benefit in this authoritarian approach. But I don't know. I kind of hold out hope that at the end of the day, or for the most part, democracy will prove to be the better system for managing information and crises.
Steven Parton [00:32:54] Do you see a situation where that impetus, that pressure that comes from maybe issues like the environmental crisis that we're facing, or any other issue, is going to cause us to actually upgrade? Like, will we change the system in some way, our governments at least, in, like, America? Do you foresee that being a case where it will become something faster and more streamlined than kind of the 40-year turnaround that you're alluding to?
Justin Hendrix [00:33:23] I think that depends on the people to some extent. You know, in the United States, clearly things aren't going terribly well from a political point of view. You know, we've got a lot of problems and a lot of discord, and our legislative process doesn't seem to be, you know, terribly productive. And a lot of people, at different levels and from different political perspectives, are concerned that there are some fundamental aspects of the system that are broken. And I do think that, you know, they could be right, and for different reasons. At some point, people may demand some more substantial change, right? Could you imagine a constitutional convention where we might address some of these things? Could you imagine major reforms to the way that the Senate operates, or an expansion of the court, or an expansion of Congress itself, or some rebalancing of power between the branches of government, or, you know, some new constitutional order altogether? It's possible that within the next few decades we'll see that here. My hope is that somehow we can arrive at that without bloodshed. But the reality is that most of the time, that's not how it goes.
Steven Parton [00:34:43] Yeah. How do you think economics plays into this? We haven't really touched on that too much. Maybe I'll leave it open and broad as a starting point. What is the role of the economy in this relationship?
Justin Hendrix [00:34:55] Again, this is such a big question, right? I think that economic inequality, the kind of hyper-capitalism that we're observing at the moment in much of the West and certainly the United States, the degree to which the economy appears to be really fracturing, even in places like the United States, into just increasingly looking like a kind of caste system. You know, I saw this thing the other day, just to kind of illustrate this. There's a spot in Brooklyn where I park sometimes, where a lot of the delivery drivers have found that they can park. And for them, it's a good spot; it's right between a bunch of things that they service, like restaurants that they pick up deliveries from, grocery stores, etc. These are DoorDash delivery workers and food delivery workers, etc., often, you know, with these cheap e-bikes. And it's hard to look at this kind of scenario, you know, these individuals who, in New York, we know have had to organize just to get the right to use the bathroom in the businesses that they serve. It's hard to look at their experience, and the woes that they express as they try to organize for more recognition and the rest, and not to see it as a kind of caste-type scenario. There's been a lot of great research and writing about individuals who are now kind of suborned to algorithms, or managed by algorithms on these platforms, in these very precarious work situations, with very precarious income and employment situations. And, I don't know, all of that seems badly out of whack to me, and at some point it will need to be addressed. And the workers can only work within the system, of course, if the system decides to work with them. And in fact, if it doesn't, you know, then we can worry about more populist resentment and anger, and we'll see what that results in.
Steven Parton [00:37:19] Yeah, well, you know, there were attempts to maybe upend that caste system in some ways with things like cryptocurrency and whatnot. But I feel like it also became its own kind of caste system. And now we're entering a phase where it seems like crypto is struggling pretty dramatically, I would say, but maybe there's still room for, you know, digital solutions to this. Do you see kind of a cryptocurrency solution, or maybe something like a basic income, maybe a digital basic income, that is going to be needed or that could help us navigate these waters?
Justin Hendrix [00:38:00] I am not an expert on crypto or blockchain either, but I would regard myself as an armchair skeptic that cryptocurrency is the answer to any of these fundamental problems, either political or economic. I realize there are a lot of folks who are invested in that idea and who, you know, in their ventures believe that that's what they're doing, that they're solving for large political questions. And some of them are, you know, leftist utopians, and some of them are libertarian idealists, and some are folks right in the middle who just want to do practical stuff. But I think the jury's still very much out on whether cryptocurrency is part of the problem or part of the solution. So we'll just have to see. And I think in the meantime, we've been given, over the last couple of years, ample evidence that we should remain exceedingly skeptical, and that it is perfectly appropriate for people like me to maintain an adversarial point of view on whether this is the particular path or particular set of technologies that may ultimately lead us to, you know, utopia. Yeah. When it comes to things like UBI, I mean, again, you could have, I'm sure, much more expert economists come on and talk about these things. I sometimes worry, and this is maybe not the most informed point of view, but I sometimes worry that UBI is a kind of get-out-of-jail-free card for people who are concerned that, you know, AI is going to disrupt too many people's jobs. The idea being: we'll make sure we look after them well enough that they don't starve, but we're not going to do anything else, right, to fix the economy or society to their advantage. We're just going to give them a bit of cash and hope things work out, and mostly carry on as we have.
We're not going to constrain companies or, you know, put in place new regulation, or increase the power of labor, or pay people more for their services. Rather, we're just going to kind of throw them a bone and let AI get on with it.
Steven Parton [00:40:28] So I'm going to take a bit of a jump here, because I want to get your thoughts on this before we wrap up. What about data and privacy? You alluded to it a bit earlier, I think when we were talking about China. But what do you think about the current paradigm? You know, the Facebook data that's being collected, the TikTok data that's being collected, how we're basically letting all of the information about us be sold pretty much without thinking too much about it. What do you think about this paradigm?
Justin Hendrix [00:40:59] You know, I was at an event last week, or maybe two weeks ago now, that was looking at the intersection of law and extended reality. So looking at XR, AR, VR, those types of technologies, and thinking about the implications for the law. And there was a good amount of conversation about privacy and the fact that, you know, forget social media: when it comes to XR and virtual environments and the types of devices that we imagine people are going to have on their heads and other parts of their bodies to enable them to experience those types of immersive environments, the amount of data that's going to come off us is extraordinary, this sort of radiating biometric information and, you know, other signals from our nervous system and, you know, emotion recognition, that kind of thing. And there are just a lot of questions about how that data will be used, where it will be stored, how it will be minimized or destroyed, or whether it will stay on the device or leave the device, etc. And I remember at the meeting there was this woman, Susan Aaronson, who you might have had on your podcast in the past, who said, you know, the future of tech is not about any particular technology. It's about how we are going to pool and utilize data in a way that, you know, creates value for the species and doesn't, and I'm probably putting words slightly in her mouth here, but doesn't, you know, give too much power to big tech corporations, doesn't end up hurting people's interests, doesn't end up hurting democracy, etc. And, you know, we've got to think about that, right? Because there's incredible power that will come off of exactly the type of data collection that I just mentioned: all kinds of diagnostics and, you know, assistive technologies and techniques, and learning and education applications, and various other forms of value that will be created by the collection and expression of that data.
And yet, on the other hand, do we want all of that information, you know, in Elon Musk's hands, where he may choose to hand it out however he might see fit? Do we want Mark Zuckerberg sitting on top of that trove with no rules in place? Really? It seems crazy to me. You know, we've got to get this under control, or I suspect that, you know, as the technology advances, as AI advances, as various other communications technologies advance, we're going to introduce a true threat not just to democracy, but to humanity.
Steven Parton [00:44:01] Yeah, well, you know, it's natural, I think, as humans to engage our negativity bias and look at the things worth being concerned about. But I'd be remiss if I didn't, you know, offer up an opportunity to have you discuss some of the wins, some of the things that you think we've done well, some of the ways we've met that balance that I think you quoted there from Susan Aaronson. Are there ways we have been able to benefit humanity without giving too much power to these corporations and institutions, that you can point to and say: this was the thing we did well, this was a correct implementation of technology and democracy?
Justin Hendrix [00:44:42] It is undeniable that social media has given voice to people who did not have voice in the media and information ecosystem of just twenty years ago, or ten years ago. There are movements, there are concerns, there are expressions of, you know, problems and hopes and dreams, and various other forms of expression that we would not know about, or would not, I suppose, you know, have such ready access to or hear in such volume, were it not for social media. And I do think, like, ten, twenty, thirty years from now, hopefully we'll have figured out how to take advantage of that extraordinary thing, this ability to empower individual voices to rise up and be heard in a way that improves our ability to arrive at a more just, a more plural, a more successful and, you know, economically and environmentally sustainable society. We're just not there yet. And I do think eventually we'll look back and regard digital communications technology, social media, as a kind of miracle. But we're going to have to figure out exactly that thing that you stated earlier, which is, you know, how does it challenge some of these innate evolutionary mechanisms that we're fiddling with and don't really understand?
Steven Parton [00:46:21] Yeah, absolutely. Justin, I got one last question for you. Maybe the hardest of all for a lot of people. Is there anything at all that you would like to leave people with, any closing remarks, any closing thoughts, anything you would just like to put out to the people to tell them about what you're working on? Anything at all?
Justin Hendrix [00:46:40] Well, I certainly would encourage folks to check out techpolicy.press and sign up for the newsletter. We have a podcast, and we'll be running some events in 2023. I appreciate the opportunity to talk about it here. And I guess one of the things, as I'm about to start the semester again, you know, teaching this course on tech, media and democracy, is that I'm always looking for, well, what have we learned in the last year? What is the sort of new consensus, or what is the sort of new baseline from which to talk about these things? And I do think we've made an enormous amount of progress in just the last half decade or decade or so at understanding the relationship between tech and social cohesion, the relationship between tech and democracy. And I just encourage anybody that might be listening to this that works in tech to think about: how can I be part of a pro-democratic movement in my work? You know, that's not going to be easy for everybody. If you're, you know, writing code or something like that, and you've got a time constraint on you and you've got to get that work done every day, you know, maybe you can't look up to think about this broader question. But over the course of your career, I suspect there are opportunities for you, in the conversations you have, in the places where you volunteer or donate your time or expertise, in the mentoring you do for people who are coming up behind you, to think about these questions. And they're not just technical questions, but they're also ethical questions. How can I be part of creating a technology ecosystem, and a kind of fabric of technology in our lives, that supports the democratic, pluralist, you know, more equitable and just society that we'd like to see?
Steven Parton [00:48:52] And I love that. Well said. Justin, again, man, thank you so much for your time. I really enjoyed this conversation and I really do appreciate you taking the time.
Justin Hendrix [00:49:01] Thank you, sir.