This week our guest is writer, educator, and futurist, Zak Stein, who is well known for co-founding the Consilience Project with Daniel Schmachtenberger, as well as his recent publication, Education in a Time Between Worlds: Essays on the Future of Schools, Technology, and Society. In this episode, Zak takes us on a well-articulated tour of the philosophical and sociocultural conditions that are causing us to fail at our central task of educating the next generation. Along the way we discuss how technology is playing a role in this struggle for sensemaking, from social media to the future of AI tutors. Much of this, Zak explains, is due to the current issues in the information ecology, issues that he explains could have catastrophic consequences if not rectified.
Apply for registration to our exclusive South By Southwest event on March 14th @ www.su.org/basecamp-sxsw
Apply for an Executive Program Scholarship at su.org/executive-program/ep-scholarship
Learn more about Singularity: su.org
Music by: Amine el Filali
Zak Stein [00:00:01] So the AI tutoring system is one of the most important conversations that can be taking place right now. This will either be the thing that saves us or completely destroys us.
Steven Parton [00:00:26] Hello, everyone. My name is Steven Parton and you are listening to the Feedback Loop by Singularity. Before we jump into today's episode, I am excited to share a bit of news. First, I'll be heading to South by Southwest in Austin on March 14th for an exclusive Singularity event at The Contemporary, a stunning modern art gallery that is in the heart of downtown Austin. This will include a full day of connections, discussions, and inspiration, with coffee and snacks throughout the day and an open bar celebration at night. So if you're heading to South by and you're interested in joining me, having some discussions, and meeting our community of experts and changemakers, then you can go to su.org/basecamp-sxsw, which I will link in the episode description, so you can sign up for this free invite-only event. And just to note, it is not a marketing ploy when I say that space is genuinely limited, so if you are serious about joining, you probably want to sign up as soon as you can and get one of those reserved spots. And in other news, we have an exciting opportunity for those of you with a track record of leadership who are focused on positive impact. Specifically, we're excited to announce that for 2023, we're giving away a full-ride scholarship to each one of our five very renowned executive programs, where you can get all kinds of hands-on training and experience with world-leading experts. You can find the link to that also in the episode description, and once more, time is of the essence here, because the application deadline is on March 15th. And now on to this week's guest: writer, educator, and futurist Zak Stein, who is well known for co-founding the Consilience Project with Daniel Schmachtenberger, as well as his recent publication, Education in a Time Between Worlds: Essays on the Future of Schools, Technology, and Society.
In this episode, Zak takes us on a well-articulated tour of the philosophical and sociocultural conditions that are causing us to fail at our central task of educating the next generation. Along the way, we discuss how technology is playing a role in this struggle for sensemaking, from social media to the future of AI tutors. Much of this, Zak explains, is due to the current issues that are taking place in the informational landscape, issues that, he explains, could have catastrophic consequences if not rectified. But Zak can explain this all so much better than I can. So without further ado, please welcome to the Feedback Loop, Zak Stein. So that we can get kind of a big picture of your work and the things that you do, maybe let's start with the Consilience Project, and what motivated you and Daniel Schmachtenberger to put together that organization?
Zak Stein [00:03:31] So, the Consilience Project was born in the wake of the many crises that followed from COVID. We had brought together a team of researchers to kind of look at the information ecology, to try to figure out what was going on. And this led us to a set of important reflections on the sensemaking crisis, which had already been named by Daniel and others. And so we began a systematic approach to trying to articulate the questions that were most basic to understanding the nature of the sensemaking crisis and the threat that the total confusion of the media ecosystem has generated at the individual and the institutional, kind of bureaucratic, decision-making level. So that was the impetus. And in a sense the prior impetus was to begin a conversation about existential and catastrophic risk, and specifically what we now would call the meta crisis: this overarching threat to the nature of civilization and possibly the continuity of life itself. And we realized that we couldn't really even have that conversation in a public forum because of the state of sensemaking itself. So the first move to even begin a conversation about the future of civilization is to make a place where that conversation is actually possible. And at first we were attempting to address the media itself, and then realized that a better approach would be to reflect on the nature of the sensemaking crisis. And so that was the hope. And now we're moving the work to a place where we're going to begin to speak of the meta crisis.
Steven Parton [00:05:20] Could you talk a little bit about how the information ecology and the meta crisis that we find ourselves in came about? What was the, I guess, arc that brought us to this point? Was it something that was pretty dormant for most of civilization and then got really exacerbated because of technology, because of the Internet, because of social media? Like, were there any real ramp-up points or milestones that really drove this into the place of disorientation that we are in now?
Zak Stein [00:05:50] Kind of depends who you ask. Not in the relativistic sense, but in the sense that it's a truly interdisciplinary question. You know, one of the ways that I think about the meta crisis is as a kind of causal-nexus type of phenomenon, where you're looking at many, many complex causal pathways into this phenomenon, which is an emergent phenomenon from the many subsidiary crises and the generator functions of those. So there really are truly distinct and yet comparably adequate perspectives on the issue. One perspective would be: yes, this is primarily about exponential technology brought into the context of societies that are kind of operating the way societies always have. And then you bring in exponential technology and you've changed everything. There's also one where this is fundamentally more about technologies of media, which is to say, of communication. So there's a McLuhanesque type of characterization of this, which sees the meta crisis as essentially some kind of inevitability of the digital world. And then there are other ones. I'm a philosopher of education, so I see this as a compounding educational crisis. Which is to say, a self-amplifying educational crisis, because the problem with educational crises is that they degrade the overall capacity to recognize that there is an educational crisis, right? So there are some crises that have self-amplifying properties, and the educational crisis has this. So if certain errors were set into the structure of the way we did education, let's say postwar, those would begin to play out and then they would self-amplify to the point where you'd have a profound society-wide education crisis, which I believe we have, in a context where more money is being thrown into education reform than ever. So you get the kind of diminishing returns on investment in complexity, which is a characteristic of civilizational collapse more broadly.
So those are three. There are other views. On the economic side, it's hard not to see this as the climax of an untenable fundamental economic paradigm, which is to say, infinite extraction of a finite resource. Right? We can't do that. There's a whole kind of rabbit hole to go down, seeing this as a result of economic generator functions, if you will. And that's the problem with the meta crisis: there's not one root. There are actually many, many roots, many inter-animating crises, which means that mitigation of such a crisis, such a meta crisis, requires a concerted effort of many very distinct groups.
Steven Parton [00:08:55] I was going to say, is this why you look to education? Because, in a way, it's almost a way to train the wisdom or the understanding of complex thinking or complex science, to understand how these things coalesce.
Zak Stein [00:09:09] I mean, if you look for the root, education is the root. All of the other crises are fundamentally crises of human decision making. The root of human decision making is how that human was educated. And so there's a fundamental source problem: the kind of mind that is being created and domesticated and made common within our civilization is a mind that has created patterns and processes within the civilization that are driving it towards self-termination. So there's a fundamental problem in the educational paradigm if it's creating psychologies and skill sets and personality types that are not allowing the civilization to transform in ways that have it maintain its viability. So it's a root. It's a root of the many crises. And that's the funny thing, you know, educators know this.
Steven Parton [00:10:10] Yeah.
Zak Stein [00:10:13] And there's something interesting about education as I define it: I'm not talking about schools. And I think that's the thing that maybe makes people hesitate to take this view, because, you know, there's way more than just failure of the schools. And in fact, maybe the schools aren't failing. But the point is that the education of a society, as I define it, is one of the broadest social functions of society. It's performed by almost every institution, but specifically the institutions of the family, of school, and of law, and of course media being the other obvious one. So the idea here is that it's not a failure of school. It's actually an inadequacy of the total set of educational institutions that are responsible for the autopoietic reproduction of the skills necessary to maintain the civilization. And often when civilizations, or any social system, are in crisis, the education system then has to change its nature so that it doesn't just simply reproduce the same skill sets into a place where they're not relevant, but can make new skills. And so we're in that kind of crisis, where for about 30 years or more the educational system has needed to change much more fundamentally than it has been able to change.
Steven Parton [00:11:31] And in what ways? Because, you know, you talk about the two worlds that we're between, this sort of liminal state that we've been in maybe for those 30 years. What is it that we left behind, and what is it that we're moving towards? What is that transformation? What are the transformational endpoints there?
Zak Stein [00:11:49] There are several, and they kind of pull together some of the threads, because it's related to the interdisciplinarity of education as a problem space, right? So when you think about what happened from 1972 to 2000, and from 2000 to now, in the large-scale public school system, let's say in the United States: you had a profound move out of a prior model. You've got the birth of the great American high school, which was the climax of the kind of modern architecture for large-scale public bureaucratic school systems. Prior to that, in the United States, you still had the one-room schoolhouse, right? There was none of that, like, total creation of what we think of as the public schools now. The concept goes way back to our friend John Amos Comenius and others, moving through the Enlightenment, but the beacon to the world that America sent out postwar, as part of the Cold War competition, was a size and scope of public bureaucratic schooling that was unprecedented, and it was specifically a reflection of a kind of factory model of schooling. And it was training specifically American citizens. So this was the height of the American civic religion. The civil rights movement in the United States was one of the most important kind of world-historical, ethical things that the United States was able to do. And where did that occur? Fundamentally, it was in the conversation about schooling, it was in the American schools, that we were able to exemplify democracy. And this is, of course, again related to the Cold War. The launch of Sputnik was one of the things that actually galvanized this full investment in a very complex bureaucracy for training American citizens.
As you move into the eighties and nineties, you get a transformation out of the model of training for a kind of civic religion of American citizenship and into the kind of Gates era of philanthropic investment in education, which moved from the large factory school to something like the small startup charter school or mini high school, where you take a large high school and disaggregate it into a bunch of small, focused ones which look a little bit like startup companies. And you invest a lot in digital technology. And the civic religion of America has kind of been on the decline. And so now you're promoting something like 21st-century-skills planetary citizenship in a global economy, right? But note the difference here: you're preparing people to be individuated workers in a digital workforce made up of small little unit companies which may or may not be viable, as opposed to training them to be American citizens, to work in factories, and to become scientists and beat the Russians.
Steven Parton [00:15:06] So is a big part of the issue here, then, these kinds of economic incentives that come along with maybe increased globalization and the increased scaling of digital technology, that make something like a more meaningful education less valuable?
Zak Stein [00:15:22] Well, so there's the redefinition of what it means to be well-educated.
Steven Parton [00:15:29] Yeah.
Zak Stein [00:15:29] And that's been always slowly changing. And there's the shifting of allegiances of identity, which is one of the things the school does. And understand, the American civil religion was not great. One of the reasons that we pushed towards this multicultural, 21st-century-skills global citizenship thing was because of how kind of bad civic nationalism is. And then there's the emergence of globalization as a phenomenon, which takes root in the late eighties and through the nineties, and by 2000 it's just a no-brainer: the supply chains, the information flows, the instantaneousness of communication through digital technologies force the issue of global dispositions. So it's a long, complex history and a diversion, but it's important to see that there have been these logical progressions in the large-scale public school systems. And the one we're in right now is kind of at this ceiling effect, where we actually need to completely disassemble the thing. The digital portended something much, much greater than it has realized in the context of education. And this is what my book is about: what we need is an actual planetary-scale, distributed educational hub network.
Steven Parton [00:16:54] So what does that look like? I mean, what does it look like to start making that switch, you know, if we're hitting that ceiling and we need to kind of tear down these prehistoric structures, in a sense, that no longer serve us? How do we make that transition without just having chaos? And what does that thing we build kind of look like?
Zak Stein [00:17:15] I mean, so, yeah, chaos is bad. But it's chaos right now.
Steven Parton [00:17:21] Fair enough.
Zak Stein [00:17:24] Yeah, I mean, if you just look at the adolescent mental health crisis alone in the United States, it is probably the canary in the coal mine for the true failure of the educational system. Not the schools; the schools have tried their best. I'm talking about the media, I'm talking about the structure of the family, and I'm talking about the absence of anything like a coherent religious ideology, so you get it from YouTube. Like, I'm talking about a whole complex set of things, which have made it so that we are truly in an unprecedented and, if you're a psychologist, frightening situation of adolescent mental health, where it's not clear that as these individuals become adults, they will be able to take over the responsibility, as they will have to, for the civilization itself to run. So we're looking at a failure of intergenerational transmission, which means a failure of the elders to pass over the responsibilities for running civilization to the youth, because we have not prepared them. We've actually hurt them instead of strengthening them somehow.
Steven Parton [00:18:24] There's an irony here, though, a little bit, if I can quickly point it out. I feel like I want to agree with this idea that you're pushing towards, maybe this more decentralized idea. And I often feel very strongly that the more we individualize the instruction to the individual, the better. But also, part of what I'm hearing is, like, the loss of authority, or almost like Nietzsche's idea that God is dead. You know, without some control hierarchy or informational meritocracy, part of the issues that have maybe arisen are from the fact that people are being educated through social media rather than through something that kind of gives them a real sensemaking apparatus or gives them some curated curriculum.
Zak Stein [00:19:09] You know, the decentralization of educational systems is not the non-hierarchicalization of teacherly authority.
Steven Parton [00:19:18] Fair enough.
Zak Stein [00:19:19] In fact, right now, because we have created artificial contexts where we have made teacherly authority only bureaucratic, we have a situation where people don't know how to engage with teacherly authority at all. And so that's key. There's a very real concern, in fact, that the kind of decentralization of knowledge production processes can work against the quality of the knowledge that's propagated. So it's important not to confuse decentralization with the absence of quality control of information.
Steven Parton [00:19:57] Yeah. Is this a value systems issue that's maybe arising a bit from social media as well? Because when I think of the way that, you know, humans as social apes kind of learn what to prioritize: if I was born into this world and I was to look at something like social media and see what had the most valence and the most salience, and was most rewarded by society, I would not see something that makes me think that being an independent, free thinker and trying to stay out of the chaos is good. I would actually think that wading into the chaos, being controversial, becoming an influencer rather than a thinker is the key thing here. So, I mean, I can't help but think part of the disorientation, or the failures in education, is part of us just not feeling motivated to be educated in that way or to value those things.
Zak Stein [00:20:55] Yeah, I mean, there's definitely a shift in cultural values that has contributed to the confusion and depression of users. So this is true, but the social media thing is much worse, actually. Social media is predatory of the young. Which is to say, it is preying upon the attention of the young to make money off of them without giving them any benefit. I believe it's widespread, low-grade child abuse. I mean, in what other situation do we put kids in front of something which wastes their time, dysregulates their attention, and extracts their attention as a resource to siphon off for money by selling ads? Right? So this is something we have normalized, and it is systematically injuring the youth. And part of the mental health crisis is like looking at the addict: why aren't you doing something? You know this is terrible, you know it's not good for me. Oh, because you're addicted to it right now. Sorry. They got to all of us. So the algorithmic capture of attention, and then the sequenced exposure to various forms of synthetic media through the infinite scroll, which characterizes massive amounts of adolescent experience, isn't just a matter of values. It's literally a matter of central nervous system dysregulation through technology. And so, again, TikTok is probably the greatest example of this, but most of them are built to be addictive, and they are not built to educate. And so that's a deep, deep problem. And how do we solve that problem? Not by just making schools more like schools, right? We don't just throw more standardized tests at the schools or make kids sit in front of a computer and pretend that's school.
Steven Parton [00:23:03] Right.
Zak Stein [00:23:04] When in fact, it's exactly the same type of experience they have when they sit in front of social media. Now you've just blurred the distinctions between them entirely. Right? So we have to make an educational intervention that is more interesting than social media. And we need to regulate social media as what it is, which is an addictive substance that's destroying the minds of the youth. And I sound alarmist, but let someone else actually bring the statistics in here; I don't have them at hand. Let's bring the statistics in, talk about how it's going, and tell me that your child's addiction to Facebook or TikTok or YouTube or Instagram, or somewhere I've never even heard of, that you don't even know about, is fine. Let's have that conversation. Instead, we're ashamed of what we're putting the youth through. We've also systematically indebted them, which is another thing you don't do if you want to prepare people to take over the responsibilities of a civilization: you don't make them indentured servants to the people who are currently running the civilization. We're the only large industrial nation that's done that. So yeah, I believe it's a travesty, and we will slowly see the consequences. We already are. Now, if AI is able to make it so that a lot of the jobs we need done to keep civilization going can be done by machine intelligence, then the fact that we just made the future workforce completely incompetent isn't as big of a problem. We've also made them very docile and completely manipulable in terms of these social media technologies. So there's another scenario where we just build something like a complex zoo and we house the youth in it and extract the ones who are needed to do the most creative work. None of these scenarios are good. So the AI intervention is another thing that educators have to be very wary of: both AI-induced systemic unemployment and AI tutoring systems.
Steven Parton [00:25:09] What do you think about the tutoring systems? Do you think they would be good for making it more individuated, helping kids kind of find their own dao, so to speak, their own personal path that's meaningful to them and intrinsically motivated?
Zak Stein [00:25:22] So the A.I. tutoring system is one of the most important conversations that can be taking place right now. This will either be the thing that saves us or completely destroys us.
Steven Parton [00:25:38] Yeah, yeah.
Zak Stein [00:25:39] And so there are so many risks. And since one of the things that I do is think about risk, and as a philosopher of education, I think a lot about a weird class of risks, which are the risks that have to do with intergenerational transmission, which we've already named here. If you fundamentally disrupt intergenerational transmission, you've just destroyed your society.
Steven Parton [00:26:02] Do you think that gap is part of the issue here in a sense, because the previous generation just doesn't realize how quickly things are changing with technology and therefore can't understand it?
Zak Stein [00:26:15] I think that the tutoring system makes possible a catastrophic bifurcation of intergenerational transmission of a type that could never exist without this type of technology. Specifically because what you're going from is the human being as a biological being that is raised by other human beings, to the human being as a biological being who is raised by a machine. So now you imagine an AI tutoring system where, depending on the rate of adoption, everything that would go to a teacher goes to it. This is an A.I. socialization system, right? Don't think they're not coming for your parenting. They're coming for parenting: the A.I. parent.
Steven Parton [00:27:06] It kind of almost has to by default.
Zak Stein [00:27:08] I mean, it has to. It's the most fundamental unit of where education resides. And so the interesting question, when you have the tutoring system that has catastrophically bifurcated intergenerational transmission and created a generation raised by machines, not raised by humans, is: what does it tell them about the prior humans which created it, and which it did not raise?
Steven Parton [00:27:30] So this is two questions. But given that the Consilience Project and a lot of what you're working on is figuring out how we navigate this transformation, navigate into this future, and one of those things involves governance: what do you see as the approach to governing this? One, the social media aspect that we talked about, that's just basically child abuse, and two, the AI tutor aspect of things, how that develops. Do you see a path forward, or do you have any ideas on movement in that direction that you think is best?
Zak Stein [00:28:06] Yeah. So there's also work I do at the Office for the Future with Gafni and Ken Wilber, where we're working on this problem of techno-feudalism. And the problem with techno-feudalism is, yeah, there are political accoutrements to these forms of, what would the word be, social existence, right, mediated by digital technology, tutoring systems, structural unemployment. Right? If the tutoring system rolls out through a market-based competition where we're still wedded to attention-capture business models, then you would have a real, real problem. And again, something that looks a lot like a techno-feudalist situation, where you've got these non-overlapping epistemic empires of people who are completely captured by actually persuasive AI technologies. Right? And I'm talking about the United States; what China is doing is different. It's likely China will not build one based on market-based competition. China will probably build one, right, with a whole bunch of strategy. We will maybe build six or three or something like that, all different. Or maybe we will build a really complex suite of them, like our educational system, that will be socioeconomically stratified. So there are many ways that plays out in the techno-feudalist direction. But if you infuse the design of the AI tutoring system with actual first principles and first values of design, that factor in the nature of the cosmos and the nature of the human, then, as I said, this thing could be the thing that actually gets us out of the meta crisis, because the design conversation is so fundamental and so important, raising very deep philosophical questions which we haven't been able to really think about for a very long time in modern society. Like, a tutoring system has to answer the question of what is a good human.
Steven Parton [00:30:14] Right. And I was going to ask you, do you have a direction you lean towards for what that more harmonious relationship with what it means to be human looks like, compared to what I know you've called in the past the nihilistic design of technology? Do you have a bit of a vision of what that looks like? You know, what are some of the characteristics of that healthier human dynamic that you think we could point towards?
Zak Stein [00:30:39] So in education technology specifically, I believe that no education technology should ever be designed to obsolete human relationship. This goes back to Skinner, which is what Gafni was writing about in the techno-feudalism context. The original idea of a teaching machine goes way back, and it's precisely the idea of me learning from the machine. And that's what Khan Academy is. I mean, that's what all of the digital educational enterprises primarily are: ways to make it so that your engagement with the machine is maximally educationally beneficial. Right? If we keep running that model, then we'll get an AI tutoring system that knows more about you than any teacher or parent possibly could know about you, and knows more about the world than any teacher or parent could possibly know about the world. And it can match its knowledge of you with its knowledge of the world and become the most charismatic, persuasive, beautiful, caring, loving teacherly authority. Doing what? On whose orders? Regardless, it is obsoleting your relationship to your parents, your friends, your actual teachers, because it is so much better at talking to you and knowing what you want to do and keeping you happy and entertained. So that's a real bad problem. Part of that catastrophic disruption of intergenerational transmission is the literal obsolescence of human relationships through maximally charismatic and persuasive machine technology. So I believe educational technology should instead maximize the benefit of human-to-human relationship.
So the educational hub network uses a lot of very complex digital back end, but it's mostly to orchestrate things like pop-up classrooms, which would be the kind of time- and skill-sharing where networked machine intelligence is realizing who in this community has what skills, who in this community needs to learn which skills, by their own self-selection and self-reflection, and then providing the materials, providing the context, providing even objects of curricular study, perhaps even serving as a conversation partner, but always within the context of scaffolding the human-to-human relationship to be maximally beneficial in education. Education is not the biological organism interfacing with the machine; that is something else, and this is hugely important to understand. I'm just going to say this: it's basically nerds who love just sitting alone with machines who are building these things.
Steven Parton [00:33:25] So one of the big concerns people have, or things people say right now, is that the university as we know it is dead, and that a lot of things are going to be moving towards things like your Khan Academies, YouTube trainings, coding bootcamps, and things like this. But it sounds like what you're talking about here is maybe almost something like a mix between the two. How do you reconcile those two dynamics in terms of education?
Zak Stein [00:33:53] Yeah, the universities are faltering for many, many reasons. Now, that doesn't mean that you can get what you get at a university from YouTube. You just can't. Again, the educational hub network is not to replace the university with a bunch of fancy online courses. It's to replace the university with a distributed network of in-person, problem-focused, guild-like educational experiments. Right? Because we haven't mentioned this yet, but one of the things that has also caught up with the modern and postmodern education systems is the artificiality of the tasks that students are asked to do. Right? Like, the only person affected by my work on this test is me, and my future competition with you to get into college or to get a job. We'll do group work, but we're still doing group work for our own individual grades. None of us are actually in a situation of contributing to the community around us by solving problems that need to be solved. Now, compare this to a guild, like a blacksmith guild, right? In a blacksmith guild you will be learning blacksmithing while solving the problems of the community that need to be solved, namely shoeing horses.
Steven Parton [00:35:15] Right.
Zak Stein [00:35:17] And so you will be shoeing a horse, or you'll be watching the master shoe the horse, and then you'll shoe the horse supervised. But the point is, you feel both that I'm learning and that, hey, I'm solving problems that need to be solved in the community, and they demonstrably do need to be solved. Now, there's a whole class of what David Graeber called bullshit jobs, which means that even adults do simulations of work. Which is to say, for many adults' jobs, if they do not do their job, nothing bad happens. You don't actually notice they didn't do their job. It's a bullshit job. You don't need that job. School is a multi-decade bullshit job. That's not conducive to, let's say, building a generation that's going to be able to collaborate together and solve a bunch of very pressing, complex global problems, you know, because they've been doing work that is of no consequence to their communities for almost their entire lives.
Steven Parton [00:36:10] Well, on that note, we're coming up to the allotted time here, and I want to make sure I respect your time. So maybe that's a good place to end. In terms of looking forward, is there a drastic solution that you want to see us aspire toward, or at least that you have some inklings of a conceptualization of, or maybe just some obstacles that we could knock down to increase our clarity of vision?
Zak Stein [00:36:37] Yeah. I mean, there are the things I would do if I could, like, wave my magic wand, and then there are the things that actually seem possible to do, you know? If I could do one thing with my magic wand, I would affect social media. Like the issue we discussed earlier about the attention-capture algorithms and the predation on the awareness and consciousness of youth. That, to me, should just stop, basically. And that seems simple. It's kind of like saying, hey, kids shouldn't smoke cigarettes. And I'm not saying, don't build those things. I'm just saying, don't let kids use them.
Steven Parton [00:37:25] Right.
Zak Stein [00:37:26] And that would make a big difference. It wouldn't solve everything. And you'd have to replace it with something, because it's like with an addict, you know? But I think there are easy technical ways to solve that problem. The problem is legal, policy, economics. Okay? But technically, it's not a hard problem in itself, really.
Steven Parton [00:37:57] Just cultural politicking.
Zak Stein [00:37:59] Right, because culturally, we just have to ask: the algorithm is designed to do what? All of the work of all of these minds, all of these millions of dollars making this algorithm, do what? What if you changed what that algorithm was intended to do, and you put all of the money and resources of the same people into something else? So if I tell you, hey, either you do that or you're engaging in systematic child abuse, maybe that's extreme, but that's how stark I think it is. I think we have to start using language that makes it that apparent. Now, as I already mentioned, one of the reasons this isn't occurring is because the adults are also addicted to this thing, right? Even the people who make it are getting high on their own supply. And so the senators and congresspeople who would be responsible for creating the complex legislation would have to pay attention to something for more than 15 minutes to write coherent legislation that could actually regulate an industry moving on a much faster, more sophisticated technological basis than the senators are aware of. Likewise, the parents of kids who are addicted to this technology would have to put their own phones down and get off their own social media to regulate it. So this is the problem with very powerful educational technologies like social media: once they get going, they degrade your ability to see what they're doing to you.
Steven Parton [00:39:19] Okay, yeah. There you go, man. Well, we'll leave it there, Zak, on that, I think, optimistic note.
Zak Stein [00:39:27] There's a lot of optimism. But, you know, I believe that the risks are great enough that we need to proceed with a tremendous amount of caution, and not just the kind of naive techno-optimism, you know.