This week our guest is author and podcaster, Azeem Azhar, who has a robust background as an investor, founder, and regulator in the tech space, including several years working with the World Economic Forum on the Global Future Council on Digital Economy and Society.
Azeem spends much of his time these days creating content for ExponentialView.co, where he provides weekly assessments of the dynamics at play in humanity’s exponential transformation. The key ideas he’s uncovered on this journey can be found in his latest book, The Exponential Age: How Accelerating Technology is Transforming Business, Politics and Society.
In this episode, we review his book and the wisdom he’s gained over the years, with an emphasis on the impacts of the growing divide taking place between technology’s advancements and society’s ability to keep up with it.
Music by: Amine el Filali
Azeem Azhar [00:00:00] But technology isn't neutral. It's always shaped and designed and fashioned by people.
Steven Parton [00:00:23] Hello everyone. My name is Steven Parton and you are listening to the Feedback Loop on Singularity Radio. This week our guest is author and podcaster Azeem Azhar, who has a robust background as an investor, a founder, and a regulator in the tech space, including several years working with the World Economic Forum on the Global Future Council on Digital Economy and Society. Azeem spends much of his time these days creating content for his blog Exponential View, where he provides weekly assessments of the dynamics at play in humanity's exponential transformation. The key ideas he's uncovered on this journey can be found in his latest book, The Exponential Age: How Accelerating Technology is Transforming Business, Politics and Society. In this episode, we review his book and the wisdom he's gained over the years, with a particular emphasis on the impacts of the growing divide between technology's advancements and society's ability to keep up with those advancements. And with that being said, let's jump into it. Everyone, please welcome to the Feedback Loop, Azeem Azhar. So about a year ago you released The Exponential Age: How Accelerating Technology is Transforming Business, Politics and Society. That seems like a pretty natural place to start, so I'd love it if you could tell us a bit about the motivation for that book.
Azeem Azhar [00:01:57] Yeah, well, thank you. It's great to be on the show, and with an audience that understands accelerating technology, I'm going to guess. You know, I think the real motivation was that as I started to spend time, having exited my last startup about seven years ago, there was definitely a distinct shift in the nature of the technologies that we were seeing. So on the one hand, we were in the midst of the start of the AI boom, which kicked off about 2011, 2012, with these various breakthroughs in deep neural networks. But you were also starting to see within clean tech that things like solar power and battery technology were getting cheaper and cheaper and cheaper. And, you know, what makes a technology exponential, when we talk about it as economists, is actually this idea that it's getting cheaper and cheaper and cheaper. So when you draw that graph, that exponential curve, what's fundamentally embedded in it is that you're getting more power for the same price, right? Your price performance is going up. And that's essentially what's happened in semiconductors. When you bought a transistor, which is the fundamental unit of the traditional computer chip, in 1958, 1960, it would cost on the order of $100 in 2020 terms. And transistors today cost fractions of a hundredth of a millionth of a dollar. It's so cheap, it's actually really hard to say. So exponential technologies are not actually just about power increasing; it's about price performance improving dramatically. And I started to see that in a few areas: computing, as demonstrated through AI; clean energy; and then also within the biological domain and in additive manufacturing. We were starting to see a whole host of these exponentials taking place.
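The price-performance point can be made concrete with a toy calculation. This is an editor's sketch, not a figure from the episode: the starting cost and halving period are invented assumptions, chosen only to show how a steady halving time compounds into the kind of collapse in transistor cost described above.

```python
def cost_after(initial_cost: float, years: float, halving_years: float) -> float:
    """Cost of a fixed unit of capability after `years`, assuming the cost
    halves every `halving_years` (a simple Moore's-Law-style assumption)."""
    return initial_cost * 0.5 ** (years / halving_years)

# Illustrative: a $100 component whose cost halves every two years.
for year in (0, 20, 40, 60):
    print(year, cost_after(100.0, year, 2.0))
```

After 60 years of two-year halvings, the cost lands below a ten-millionth of a dollar, which is the "fractions of a hundredth of a millionth of a dollar" scale of the transistor story.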
And I wanted to make some sense of why it was happening, and then, critically, what it actually means. I mentioned that I was an economist by training, and economists often think in terms of institutions; I certainly do, for a bunch of family reasons that I touch on in the book. And I started to think that a lot of the issues we were facing in that period, 2016, 2017 onwards, seemed to relate to the institutions with which we surround ourselves: defamation laws or monopoly laws were struggling to contend with the questions being asked by these technologies. And so that's why I wanted to put out a book that combined my explanation for why we are actually in the exponential age, which is a big claim to make about anything, and then start to say, well, what does this actually mean? How does that help us explain the level of friction and the possibility for transformation that these technologies will deliver? So that was, in a sense, the motivation.
Steven Parton [00:05:48] Yeah. And that friction you talk about there. Would it be fair to say that that's the exponential gap that you talk about in the book, that distance between the advancing technology and society's capacity to keep up with it?
Azeem Azhar [00:06:01] Yeah, that is the friction, right? The friction is essentially that the technology advances construct new ways of behaving, new types of behavior, new ways of manifesting itself, but the institutions that we use to guide our everyday lives are based on other assumptions. And that's what I call the exponential gap. The institutions are linear. Podcast listeners can't see this, but I'm holding my hand up like a straight line at 20 degrees to the horizon. The technology is exponential, and I'm now making a terrible exponential curve with my hand. And that's the exponential gap. It was just a way of analyzing and making sense of the fact that we seem to have all these issues around what's going to be the future of work, the future of competitive markets, the future of trade, the future of political discourse. And, you know, I've been around for a while. I just turned 50, and I don't remember a time when so many things were up in the air. I mean, there were things that were existential that were up in the air, like we were worried about global nuclear war when I was ten years old. That's existential, but it's only one thing. Whereas today we're like, well, there are all these things. And it all seemed to point back somehow to Moore's Law or something similar, and I was trying to make sense of all of that.
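The hand-waving picture of the exponential gap can be put in numbers. A minimal sketch with invented rates: a technology improving 30% a year (compounding) against an institution adapting by the same fixed step each year (linear).

```python
# Exponential (compounding) vs linear growth from the same starting point.
for year in range(0, 21, 5):
    tech = 1.30 ** year            # compounds: 30% better every year
    institution = 1 + 0.30 * year  # linear: same first-year step, no compounding
    print(f"year {year:2d}: tech {tech:8.1f}  institution {institution:4.1f}  "
          f"gap {tech - institution:8.1f}")
```

For the first few years the two lines are barely distinguishable; by year 20 the compounding curve is more than 25 times ahead, which is the gap being gestured at with the two hands.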
Steven Parton [00:07:17] So as you wade into that chaos, what kind of mindset do you go into it with? How do you separate yourself from the stress or the vertigo of such a disorienting environment and ground yourself in a way that allows you to see clearly and start understanding what's actually taking place?
Azeem Azhar [00:07:37] You know, I think that I'm really lucky in my moment in time, because I got a computer when I was nine years old, my second computer when I was 12, and my third when I was 14. And my mum had a computer; she was the first person in the family to have one. In that way, I saw its capabilities improve constantly, and I was exposed to the idea of Moore's Law probably by the time I was 12 or 13. So in a sense, the idea that computers were going to get faster and cheaper and performance was going to improve was really, really embedded in me, even if I couldn't make it explicit. And funnily enough, when I did my university finals, in one of my economics papers I had a complete mind blank, this was in 1994, on everything I'd ever studied. There was one question, I forget what it was, where I thought I could just about squeeze in talking about Intel and the 80286 chip and the 80386 and the 80486, because that was the latest chip at the time. And I did, and I kind of made it up, and actually the examiners loved it and I did quite well in that paper. So I think part of the advantage that I and other people who grew up in that period have, when computing wasn't hidden by black Gorilla Glass, which it now is, is that we tinkered and we learned and we understood these processes. Being that close to it, I think, gives you a sense of what's going on. When I then get into my career and I'm constantly working on the Internet, I've embedded this idea that stuff will get faster and cheaper, even if it's a bit shonky right now. Like, I used to listen to Seattle and Portland electronica radio stations streamed over the Internet at 16 kilobits in 1995. You knew it was going to get better and the quality would get better and better. And so that, I think, sets you up to understand this.
And you understand Moore's Law. I tracked Yahoo's user growth between '94 and '97, '98 for my job, and it looked like a hockey stick. And then Ray Kurzweil comes out with a couple of his books and his essay, which I think has really stood the test of time, "The Law of Accelerating Returns," which he came out with in 2001, where he explains this. And I think at that point it's really, really embedded in me. So that's a long-winded way of saying that I started with this deeply lived experience around this trend. I learned a little bit about Moore's Law from Byte magazine, which is no longer published; I recommend listeners go off and look at the archives on the Internet, which are phenomenal from the eighties and early nineties on this, because it was discussed all the time. And I only really started to understand that it was a separate thing that you could think about in terms of planning by the mid-to-late nineties, and then of course the Kurzweil essay. And then I started to realize, when I would do business modeling, that the models I built were often non-linear, right? Because you knew they were non-linear; that's how these systems grow. So it was embedded in me for a while. What took a long time was to unpick why this is actually happening. And I don't think Ray explains it in that essay. The work that I did around why this exponential actually happens, you know, I spent a long time thinking about that.
Steven Parton [00:11:48] Yeah. I mean, could you maybe offer up an answer to that in a sense? Why is it happening? Like, I know that's a big question, but can you can you kind of hit some of the key points?
Azeem Azhar [00:12:00] So I love the why, because the why for me is amazingly human in all of this, right? Just as a simple example: lockdowns happened, people are bored at home, we've all watched to the end of Netflix and we've read to the end of Reddit. So what do we do? We make sourdough. And the first loaf we make, it takes ages, tastes like crap, if I can say crap on the podcast, there's mess all over the kitchen, you've wasted half of it, no one eats it. With the second loaf, there's less waste. By the eighth loaf, there's not a crumb of flour wasted, the kitchen is immaculate, and it tastes delicious. We learned by doing. And so what we discover is that in complex, engineered, multi-step production processes, there are lots of things that you end up being able to improve, and that gives rise to a learning rate, a rate at which you learn how to do this better. And what determines your learning rate is what your mum used to tell you: practice makes perfect. It's doing more of it. And so this relationship of the learning curve was identified and sort of formalized, well, by mothers for thousands of years, but by Theodore Wright, an aeronautical engineer attached to the US military. In the mid-thirties he realized that there was this learning curve: for every doubling of cumulative production of an airframe, the unit cost was 15% cheaper, because the engineers figured out how to do things better. And so that learning is really the relationship that actually underpins Moore's Law. Moore's Law is actually a sort of social construct that the semiconductor industry has agreed to adhere to as best they can. But Wright's Law, which is about learning rates, tells us about these complex products.
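Wright's relationship is simple enough to state in code. A minimal sketch, assuming the 15%-per-doubling airframe figure mentioned above; the function name and the $100 starting cost are the editor's, purely for illustration.

```python
import math

def wright_unit_cost(first_unit_cost: float, cumulative_units: float,
                     learning_rate: float = 0.15) -> float:
    """Unit cost once `cumulative_units` have been produced, if each
    doubling of cumulative production cuts cost by `learning_rate`."""
    doublings = math.log2(cumulative_units)
    return first_unit_cost * (1.0 - learning_rate) ** doublings

print(wright_unit_cost(100.0, 1))   # first unit: 100.0
print(wright_unit_cost(100.0, 2))   # one doubling: ~85.0
print(wright_unit_cost(100.0, 4))   # two doublings: ~72.25
```

Note that time appears nowhere in the formula; only cumulative production does, which is why expanding demand matters so much.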
So then the question is, when you've got a learning rate, what sorts of things might accelerate it? And there are a number of things. One is if you can expand the market faster, right? Because the learning is a function of cumulative demand, not of time. So if demand doubles every ten years, learning will be much slower than if demand doubles every 10 minutes. And so things that you can do to extend and expand demand help a lot. The architectural choices that were made around technology to make it more modular, so that things were not these big monolithic stacks like the first mainframes but kits of components, meant that these components are simpler, optimizations are more apparent as to where they need to be made, and a modular component like a silicon chip, which can work in everything from the computer in a car through to the RAM in a camera, expands the market, so your demand can increase really, really rapidly. The availability of those types of technical combinations means that new use cases emerge, which drives novel knowledge foraging and discovery that might help people better improve the product. And then having networks of information and trade, and the finance that supports them, gives you global access to markets, which increases demand, but it also speeds up the rate of learning. So what you end up with is this kind of amazing bit of classical economics at work in semiconductors, where the lasers are built by the people who build the very, very best lasers, which is ASML in the Netherlands, but they're operated by the people who operate those lasers better than anyone else, which is TSMC in Taiwan. So what happens is that the clock speed takes off, but it's not just about something inherent in the technology.
Yes, the engineers are doing the hard work at the front end. But a global market, this kind of architectural modularity, all of these things are helping by driving that demand, and with that, accelerating the time-based aspect of the learning.
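Because learning depends on cumulative production rather than on the calendar, the speed of demand growth sets the speed of cost decline. A toy comparison with invented numbers: a 15% learning rate, 20 years, and two demand-doubling speeds.

```python
# Same learning rate, same 20 years; only the demand-doubling time differs.
for label, doubling_time in (("demand doubles every 10 years", 10),
                             ("demand doubles every 2 years", 2)):
    doublings = 20 / doubling_time            # doublings of cumulative output
    cost = 100.0 * (1 - 0.15) ** doublings    # Wright's-Law cost decline
    print(f"{label}: {doublings:.0f} doublings, unit cost ~{cost:.1f}")
```

The slow trajectory gets only two doublings of learning in 20 years; the fast one gets ten, and its unit cost falls roughly five times lower in the same span.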
Steven Parton [00:16:52] Yeah. And do you think things are only going to continue to streamline in that sense? Because I think the EU just passed a law, or is trying to pass a law, to standardize chargers or ports on phones; I just recently saw a thing about it. But this is one of those cases where you have proprietary technologies between different companies that are making things less efficient, making it more difficult to do this innovation and keep things efficient. And it feels like maybe the government and the market are stepping in and saying, hey, let's honor Wright's Law, let's honor this efficiency thing, and then there's just this market pressure to streamline and make things more efficient so there are fewer hurdles in the way.
Azeem Azhar [00:17:35] Yeah, it's a really interesting question. I mean, the charging thing is fascinating, because I think it also reflects something about the political power that emerges in these markets. I think Apple was in a really weird place, because actually the USB standard was running away from them anyway, and it was getting better than the Lightning standard in the iPhones. So it's not clear to me that they frankly lose very much from it, even if they don't gain. And of all the hills Apple could die on, the Lightning plug format is not the one for them; they lost. And I think there is a question about whether this sort of fragmentation of the global economy and agreements might reduce the size of demand and might slow down learning. I think that is not going to happen. I don't really talk about it in the book, but the reason I don't think it's going to happen is because there are new ways for us to learn, and some of these new ways to learn are coming through the technologies we've developed, right? So virtual twinning constructs new ways to learn. What virtual twinning is: you basically simulate the products that you want, and you do the learning by building, like, a million digital replicas and seeing how they perform in your virtual environment. And so you can actually speed up that optimization process to get to a better product. I give an example of that in the book to do with a yacht, a yacht's hydrofoil. The other thing is that AI is turning out to be a real accelerator, because think about what that process of learning that we're going through is.
It's about having these products, putting them into actual use, seeing how they get used, and figuring out from experience what paths you could take that would be more efficient. Now, the thing that you can do with machine learning or high-performance computing is explore many of those paths, almost in a sense virtually, without building. And I think one of the things that we've seen with AI is that the rate of acceleration, at least measured by the complexity of AI models, has been much, much faster over the last six years than the Moore's Law curve. And I think you're going to be able to use AI to enhance the rate at which learning can occur. It's early days to see whether it really will have that impact, but there's a sense for me that it might be able to do that. I think the second thing is just that these markets are getting really, really big, and even in a fragmenting economy, these fragmented markets might be large. I mean, just think about what's happening with solar panels. And beyond that, even in a fragmenting economy, trade will continue to take place. That trade will not take place in the unalloyed, unconstrained way that was imagined from the 1980s onwards, but it will still take place, and there will still be massive global markets for the underlying components that go into the technology, even if the sharp end in certain areas can only address certain countries because of sanctions or restrictions. So I don't think, from the geoeconomics of this, that the learning rates will slow down. I think Ray Kurzweil's argument in the 2001 essay is that this idea of layered curves actually makes the exponential.
It's quite a compelling argument, and I actually didn't find any better explanation than that. The idea is simply that when you break down an exponential curve, what it is is a series of particular product enhancements in manufacturing. Each one has the shape of an S-curve: it's a bit rubbish at first, then it gets really, really useful, and then you flog it to death and it doesn't deliver anything more. But by the time you get there, you've found another optimization, a different S-curve, that allows you to keep climbing. He talks about it as a kind of empirical set of observations, and we can argue about whether that's a robust enough theory, but I didn't find a better explanation than that, so I thought it was a useful one to use.
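The layered-S-curve idea can be sketched numerically: each technology generation is a logistic (S-shaped) curve, and summing successive generations, each with a ten-times-higher ceiling, yields roughly exponential overall progress. All parameters here are invented by the editor for the illustration.

```python
import math

def logistic(t: float, midpoint: float, ceiling: float) -> float:
    """One S-curve: slow start, rapid middle, flat top at `ceiling`."""
    return ceiling / (1.0 + math.exp(-(t - midpoint)))

def layered_performance(t: float) -> float:
    """Three stacked generations, each peaking 10x higher than the last."""
    return logistic(t, 5, 10) + logistic(t, 15, 100) + logistic(t, 25, 1000)

for t in range(0, 31, 5):
    print(t, round(layered_performance(t), 1))
```

The total keeps climbing across the whole range even though every individual curve flattens out, which is the empirical pattern the essay describes.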
Steven Parton [00:23:04] Yeah. And you touched on the politics there, which I think is really important. One of the things that you've talked about in the book that this growing gap affects is political polarization. Can you maybe talk a bit about some of the specifics, like how this growing divide is affecting our political landscape?
Azeem Azhar [00:23:27] Yeah, absolutely. I mean, there are two ways into this, so I'm going to be like a choose-your-own-adventure, like an old-school adventure. I'm going to give you two choices, Steven, and then you can decide which direction you want to go, right? One is the direction of how these changes in general can start to construct division. And the second would be: how do the platforms built on top of these technologies manifest themselves in ways that may be divisive? So there's your choice. Do you want to go left into the dark woods, or right into the cave with only a sliver of light and the rumble of an angry monster?
Steven Parton [00:24:16] I have to I want to go for the angry monster, I think.
Azeem Azhar [00:24:18] I don't know which of those two is the angry monster. Okay, so you tell me. Yeah, let's.
Steven Parton [00:24:23] Let's talk about the the the companies in the structure, because I think that might actually include some of the human nature and the inevitable change aspects to it as well. So let's, let's lean towards the latter.
Azeem Azhar [00:24:36] Yeah, of course. So one of the things that this exponential shift has done is put computing everywhere. And so we all have a digital persona that lives on social networks that we access 15, 16 hours a day, and on the other hand, you've got computing that can actually manage all of that for us. And so we find ourselves mediated by a platform, and as every product developer knows, you end up making choices. In the book I talk about how rapidly discourse appeared online and then became the dominant place where conversations about the world happened and where people learned about the world; it was extremely fast. To think that 15 years ago, not a single one of us discussed politics on Facebook. Not even Mark Zuckerberg, right? Nobody did. Well, maybe not 15 years ago; 17 years ago. It's a blink of an eye. And so we are then mediated on someone else's platform, and the platform owner has their own, ultimately commercial, agenda about the choices that they do and don't make. And those choices are not transparent. What's happened in a really short period of time is that they have constructed a new sort of common space, which we all think of as a common space, but actually it's owned and controlled by them. And so we've seen over the years Facebook and Twitter and so on change the way that they handle bad behavior. They've changed the kind of information they show us. Facebook used to just show a raw feed of reverse-chronological information, and then that became too spammy and they started to rank algorithmically. But we didn't know what they were ranking for.
Of course, it's turned out, and has come out more explicitly since I wrote the book, that a lot of the ranking they did was designed to get people to engage, and the best way to do that was to make them outraged. So there was that dynamic. But there's also another dynamic, which is that all the spaces we think of as private were starting to avail themselves to these corporations. They were the ones who were starting to mediate, see, and access the relationships that we have, and that's really, really powerful. My previous startup was called PeerIndex, and initially what we wanted to do was build a kind of common layer across all social networks, so you could manage your identity across all of them. And when we would do our experiments and look at the academic research: when you can put the entire graph of relationships between people into a modern data structure, the inferences you can make can be incredibly strong. One really simple and trivial example is this idea of homophily. People hang out with people who are like them and who are interested in the things they're interested in. It's really hard if all of your friends support Arsenal, this is a soccer example, I'm in the UK, and you support Chelsea; life on a Saturday afternoon with your mates isn't that fun. And so we naturally move towards a kind of realm of homophily because it's quite easy. Cognitively and emotionally, it's really difficult to spend a lot of time doing things that you're not interested in and don't like. I mean, if you don't like young-adult detective fiction, go off and read 60 of those books in a row and nothing else, and then tell me how you feel after that. Right?
Steven Parton [00:29:14] Great.
Azeem Azhar [00:29:15] No, great. So so, you know, homophily is is again, another sort of human human trait that is incredibly so it seems to be sort of deeply embedded in us psychologically. And it's a sociological function that was is so much easier to do that sorting is much easier to do on a in a digital Facebook platform than it is in the real world now. It does happen in the real world, right through mechanisms of, you know, sort of where the people of different races and socioeconomic backgrounds kind of end up living in cities and and how does that get triggered? And, you know, even before you have all sorts of explicit racist policies, this filtering starts to to happen. So we do it anyway. But it's just that it's it's done at a much, much faster and less explicit rate on these platforms. So so I think that that would have contributed to some degree of political polarization. I say contributed because. We need to see what the data actually says and we need to see what is anecdotal versus what is actually statistical. And we need to see what is was was what emerged thus is what was really done actively. So, for example, I think we know and Facebook admits that when you look at the Rohingyas and the way in which the Myanmar authorities kind of constructed sort of sort of hate towards them or you look at various instances of racial hate or that spread in India through WhatsApp, you can see that connection. I think you can see that, you know, the whole January the sixth thing, you know, relied quite a lot on these sort of digital networks and their ability to sort of connect people. So so it sort of it looks pretty unhealthy there. There's a mix of evidence that shows it's degrees of unhealthiness that may be systemic, maybe not, but it is driven by fundamentally this issue that the technology got really, really powerful. And we didn't know what kind of capabilities we know we need internally to manage it as individuals. 
And we didn't know what we needed at an institutional level, which is to say, regulatory.
Steven Parton [00:31:43] Yeah, it feels like what you're saying here is that in a lot of ways, as the exponential curve took off, it carried with it the extreme aspects of human behavior and amplified those in a very powerful way. So while you could argue that technology is actually getting in there and making things worse, it's really just allowing the human condition to be expressed.
Azeem Azhar [00:32:12] Well, but. But that may be making things worse.
Steven Parton [00:32:15] Yeah, totally. Totally.
Azeem Azhar [00:32:16] That literally might be the way in which the thing gets worse. Well, you know, I think one of the things that was helpful is that it made these types of relationships and assumptions explicit. One of the technologies I unfortunately didn't talk about, because I cut the chapter from the book for reasons of length, was some work done with some of these early language models, the things that pre-dated transformer models like BERT and then GPT, which were called word vectors. What you were able to do, starting in about 2014, 2015, because computation had become cheap enough, was look at the whole human corpus of natural-language English text and look at the relationships established within it. And from those relationships you could go off and say, you know, "man is to woman as...", and it would say doctor is to nurse, or strong is to decorous. It would construct these relationships and make them explicit, these sort of fundamental biases. The great example is that if you asked an AI system to picture a president, an American president, it's only ever going to picture, essentially, a white guy, even though the Constitution doesn't say that. So what these technologists did was actually render explicit a lot of things that were implicit. That's definitely something that happened. But there's another thing that happened with the recommendation algorithms on places like YouTube and Facebook, which I think is different from this idea that it just showed up the human condition. Zeynep Tufekci, a sociologist attached to one of the storied Ivy League universities, talks about this quite well: the extremification of the content that you see, which drags people into, you know, the flat-Earth movement.
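The analogy arithmetic behind word vectors can be shown with a toy example. These 2-D vectors are invented by the editor purely for illustration; real models such as word2vec learn hundreds of dimensions from large text corpora, which is exactly how they also absorb biased pairings like "doctor is to nurse".

```python
import math

VECTORS = {                       # invented toy embedding: (gender, royalty)
    "man":   (1.0, 0.0),
    "woman": (-1.0, 0.0),
    "king":  (1.0, 1.0),
    "queen": (-1.0, 1.0),
}

def analogy(a: str, b: str, c: str) -> str:
    """Word nearest to vec(b) - vec(a) + vec(c): 'a is to b as c is to ?'."""
    ax, ay = VECTORS[a]
    bx, by = VECTORS[b]
    cx, cy = VECTORS[c]
    target = (bx - ax + cx, by - ay + cy)
    return min(VECTORS, key=lambda w: math.dist(VECTORS[w], target))

print(analogy("man", "woman", "king"))   # -> queen
```

In a model trained on real text, the same arithmetic surfaces whatever associations the corpus contains, flattering or not, which is the sense in which these systems rendered implicit biases explicit.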
And I had a really good example of that when I had some back issues early on in writing the book, and I went on YouTube to look for some yoga for backs. And, you know, I couldn't settle for the first one I was given; I was like, I am not going to do this yoga until I find the perfect yoga, which was the most un-yogic thing to do. But of course, as you might imagine, not only did I sail straight past yoga, I went straight into extreme back strengthening. So actually that's not unveiling the human condition. What that's doing is actually conditioning me, in a really, really negative way, out to the extremes. And, you know, a lot of people have talked about and identified this, and there have been whistleblowers from these firms saying, yeah, we knew this thing was actually doing that. So I think there was explicit design in these systems, and there was explicit awareness of what the risks would be, but it was too much hassle to have the conversation with civil society early on. And that's what happened, I think.
Steven Parton [00:35:51] So rather than just revealing our demons so that we could work on them, it fed our demons so that they became stronger, in a sense. Does that make sense?
Azeem Azhar [00:35:59] I don't think anyone is born a Flat-Earther, right? I don't think anyone is born believing that COVID is zapped in via 5G and nanobots that Bill Gates controls. I mean, this stuff is completely insane. And, you know, we know about media manipulation and narrative manipulation and the ability to get people to believe absurd or extreme things. It's been the key strength of many a totalitarian for a really, really long period of time. But now it can be done at this big, big scale. And I think what was a bit unseemly in the process was the extent to which the denials came from the platforms that this was actually happening, and the notion that a little fix could just be bolted on the side, like community health. And you'd see it, because they've had an advertising platform for several years. I used to advertise on Facebook, and if you put something in and it passed that automated test, human review would come in and tell you it was unacceptable within seconds, because the Procter & Gamble and Toyota accounts were too important to spoil. Right? But that diligence was never applied, forget the content itself, to the design of the systems by which we put content out. And I think that's where I slightly disagree with someone brilliant like Eric Schmidt, who was the chairman of Google for many years. He says technology is fundamentally neutral. I don't think it's fundamentally neutral. There are certain patterns in technology, like the learning pattern that we talked about at the beginning, that happen almost independently. But technology isn't neutral. It's always shaped and designed and fashioned by people.
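[Editor's note: the word-vector analogies Azeem describes, "man is to woman as doctor is to nurse," can be sketched in a few lines of Python. These are hand-made toy vectors for illustration only; real embeddings such as word2vec or GloVe are learned from large corpora and have hundreds of dimensions, but the offset arithmetic that surfaces these associations is the same.]

```python
import numpy as np

# Toy word vectors along two illustrative axes (gender, role). Real
# embeddings are learned from text; these stand-ins just demonstrate
# how vector offsets render implicit associations explicit.
vectors = {
    "man":      np.array([ 1.0, 0.0]),
    "woman":    np.array([-1.0, 0.0]),
    "doctor":   np.array([ 1.0, 1.0]),
    "nurse":    np.array([-1.0, 1.0]),
    "engineer": np.array([ 1.0, 0.9]),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def analogy(a, b, c, vocab):
    """Solve 'a is to b as c is to ?' via the offset b - a + c,
    returning the nearest remaining word by cosine similarity."""
    target = vocab[b] - vocab[a] + vocab[c]
    candidates = {w: v for w, v in vocab.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "woman", "doctor", vectors))  # prints "nurse"
```

With learned embeddings the same query famously returns "nurse," which is exactly the kind of bias, absorbed from the training text, that this technique made explicit.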
Steven Parton [00:38:20] So that brings us, I feel like, to the inevitable question of policies and regulation. How do you approach this very challenging topic of controlling and, I guess, you know, reining in this beast that seems to be running amok?
Azeem Azhar [00:38:38] Well, you know, I went through a journey in researching and writing the book, starting from a point of thinking that it's sort of on the technologists a little bit. And the fact that they're not necessarily, you know, uniformly charming characters doesn't help. Some are certainly more charming than others; I think Sundar Pichai and Satya Nadella, Tim Cook, these are characters with a certain amount of charm. But as I went through the work, I started to think that it's actually as much about that slower, linear line asking the right questions and engaging appropriately, and starting to understand what's deliberate, what's designed, and what emerges from the nature of the technologies themselves. One good example of this is the notion of planned obsolescence that goes around, that companies deliberately build their products to reach end of life so that you buy an upgrade. And there are two things about that. Number one, especially with products with lithium-ion batteries: they only have a certain number of charge cycles in them, and that's the chemistry of it. And below the chemistry, the mother of all sciences, is the physics of it. So that's just an absurdity. And the second is, as a kid who still has, I'm going to hold up my first computer, the ZX81 from Sinclair, one kilobyte of RAM. This thing still works, but I can't do anything with it. And it's not because the developers are evil. It's just that our expectations and demands change, and the technology doesn't stand still. So, just to look at those two issues, there's nothing that prevents anyone complaining about planned obsolescence from going off and inventing a battery that lasts a billion charge cycles. Go off and do it.
I mean, nothing is going to stop you doing it. And the second is that it's really hard to say to people, you should freeze your technology at a certain point, and to say that to an ecosystem of free individuals and companies that have rights to behave in particular ways. And, you know, when you think about the people who sometimes say that, you look at them and you say, have you ever discovered a shortcut on your way to work? Do you take that shortcut? Well, you're not going to un-learn that, right? It's this inherently human thing. So I changed my view. And that's not to say I like the Silicon Valley libertarian style, that there should be no regulation and everything should just do what it does and, you know, we'll all be better for it. It's to say that actually there needs to be this really, really sensible dialog. And that involves, to be honest, both sides leveling up. But the institutional side, which actually has the force of the state behind it, needs to level up faster to have better conversations. Now, it's not helped when you have really ideological, politically skewed people often running these companies, who've read a couple of CliffsNotes on Ayn Rand and Milton Friedman, you know, and are willing to opine thereafter. And even those who've read a lot of it and are experts and scholars in their own right, for them not to recognize that it's an ideology which has to compete with others, and that we should actually have that dialog, I think is unhelpful. So, you know, my sense now is you're either going to be a maker or a taker in this space. If you want to be a maker, then you need to level up in both directions.
But my emphasis started to change when I realized that it's really as much about, you know, regulators and business leaders and ordinary folk outside of the tech industry starting to understand how products come to be, how learning happens, how the second- and third-order effects happen. But it's a really, really difficult issue for which, you know, I don't have any easy answer.
Steven Parton [00:43:21] Yeah, well, in that sense, as we come to the end of our time here, I just want to get the future-oriented perspective from you. Are you optimistic about the direction we're going? Are you pessimistic? How do you think things are playing out in terms of assuaging or reconciling this chasm, this divide that's forming between technology and society? Do you feel like we might be able to actually close the gap, or do you just see an ever-widening future?
Azeem Azhar [00:43:53] You know, I think it's a complex pattern which plays out differently in different parts of the world. India, for example, has had this really worrying turn away from democracy and towards a sort of real thuggish, racially motivated, you know, populism. And yet, there, smartphones are getting more and more powerful. So I am optimistic about the fact that the technologies will continue to deliver capabilities that we can apply wisely or less wisely. And, you know, certainly the ability to have clean, non-carbon energy is just an incredible bonus. But we will have some institutional choices about the way in which we want these things to manifest themselves. And that's really about power, power being the ability to get people to do things they don't know they want to do, or don't want to do, explicitly. And that is where I think the politics starts to play a role. So a large part of the Web3 movement is this idea that there should be new gatekeepers to digital platforms. It shouldn't just be, you know, a handful of very large companies; it should be a lot of pseudo user-owned services. But in reality, they just have their own internal power structure. And, you know, that's a discussion. There's a discussion about what should be provided as a public good or a commons versus things that are provided purely for profit. In other words, should there be many, many other things that look like Wikipedia, or should everything look like, I'm trying to think of an example, Facebook, or the old CompuServe, where you would be charged by the minute? And I think those are really, really important questions of choice that speak to how we then close the exponential gap. And I think it's really hard to know, because it's a very, very complex environment. So, you know, for all of the sort of angst that the right feels about government involvement in the economy, or in the world at all, the right wing, both in the U.S. and in the U.K., has embraced government intervention in the supply of critical minerals for exponential technologies, like molybdenum and rare earths and cobalt and so on. Right? Because they've realized that the market on its own is not going to deliver that. And so there is also this sense that there actually is a bit of a consensus, even if people's ideologies don't necessarily agree, around things that must be done. And I think it's important to start with points of agreement as we move forward. I'm not a politics person, so I don't have to get into that messy, you know, messy fight. But, you know, the technology creates opportunities. And I hope that, partly through generational shift, partly through books like mine, more people will have the tools to ask the right questions and, you know, shape things in ways that are more broadly beneficial.
Steven Parton [00:47:36] Yeah, I'll always support using wisdom as the guiding focus through life. Azeem, we're coming up on time, so I want to respect yours, but I want to give you a chance to lay out anything you'd like to talk about, anything you'd like to let people know that you're working on. I believe the book comes out in paperback in March of next year.
Azeem Azhar [00:47:59] It's out in paperback in March in the US, and out in paperback in other parts of the world now. I mean, I think the finishing thoughts really are that we're on the second half of that curve. We're past that inflection point now; we went through it in that period, I think, from about 2013 to 2017. But we're really, really distinctly past it. And actually, a lot of the choices that we have as individuals now live in that space, because, you know, you can go off and buy an electric vehicle now. You can go off and do it. There's no question that it's going to be the platform of the future. So I would just encourage people to take this chance to go back and revisit your priors, because as exponential processes go, they start off very, very boring, and, as the king learned on the chessboard, there's a turning point where suddenly things get out of control. And I think we're about there now.
Steven Parton [00:49:10] Fair enough, man. Well, I want to thank you so much for your time, Azeem, and thank you for all your wisdom.
Azeem Azhar [00:49:15] Thanks so much. Yes.