“There are people in the world who habitually say yes. And there are people who habitually say no. The people who say yes are rewarded by the adventures they get to go on, and the people who say no are rewarded by the safety they attain.”—Keith Johnstone courtesy of SU Faculty Dan Klein, “Getting Comfortable with Ambiguity”
I am most definitely a “yes” person. So recently, thanks to sponsorship from my employer Westpac, I attended the Singularity University Executive Program. It was nothing like I anticipated and better than I ever could have imagined.
But to try to recreate the experience for this audience in a blog post would do it a disservice. There is no way to replicate the impeccably curated, fully immersive experience of six days on the NASA campus in Sunnyvale, CA with the world’s pre-eminent experts in their respective emerging fields: areas as diverse as genetic modification, exponential economics, augmented and virtual reality, artificial intelligence, entrepreneurialism, climate change, robotics… So instead I will try to do it justice by exploring seven of the key themes.
The programme is predicated on the concept of exponential growth and the fact that technological advancements of recent years have led to exponential outcomes across almost every part of our lives. As human beings, we experience this as disruptive and destabilising. But, if we can adapt, the opportunities for our own lives, the future of the institutions we work in and indeed the world, could be extraordinary.
“Humans are not equipped to process exponential growth. Our intuition is to use our assessment of how much change we’ve seen in the past to predict how much change we’ll see going forward. We tend to assume a constant rate of change. Exponential growth is both deceptive and astonishing, as the doubling of small numbers will inevitably lead to numbers that outpace our brain’s intuitive sense. This is meaningful because we as humans tend to overestimate what can be achieved in the short term, but vastly underestimate what can be achieved in the long term.”—Introduction to Exponentials, V2.1 September 2016, Singularity University
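The "deceptive and astonishing" nature of doubling is easy to see with a little arithmetic. This is a minimal illustrative sketch (my own, not from the programme materials) comparing thirty steps of constant, linear change with thirty doublings:

```python
# Compare 30 steps of linear growth (+1 per step, our intuitive
# "constant rate of change") with 30 steps of doubling.
linear = 1
exponential = 1
for step in range(30):
    linear += 1        # constant rate of change
    exponential *= 2   # doubling each step

print(f"After 30 steps, linear growth reaches: {linear}")
print(f"After 30 steps, doubling reaches: {exponential:,}")
```

Thirty doublings of 1 exceed a billion (2^30 = 1,073,741,824), while the linear path has barely moved, which is exactly why our intuition underestimates the long term.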
The sheer array of subject areas we covered in a very short space of time was powerful. On the first two days alone we explored AR, VR, AI, robotics, synths, digital manufacturing, climate change, and futurism. I could start to see a clear correlation between the exponential pace of change and the convergence of these emerging technologies. Combine atoms with bits, artificial GENERAL intelligence with a humanoid body (to form a synth) and you suddenly see how bringing these seemingly different fields together might further accelerate returns.
“The technology you use today is the worst it will ever be. Technology is moving faster than you think and will never move this slowly again.”—Rob Nail, CEO Singularity University, “Leading in the Age of Disruption”
However, exploring the convergence of technology and humans, in particular, raises fundamental ethical questions. These questions are far from resolved by either the expert community or humanity at large.

Take artificial intelligence. The current focus on narrow AI depends on significant recent advances in deep learning, which has been a major accelerator of exponential growth. At a basic level, it looks like enabling chatbots, virtual assistants, and the automation of vast swathes of manual human activity, which is controversial in itself when you consider the impact on the workforce. But deep learning has become so advanced that once we have given the algorithms their inputs and they have started learning, the learning process is so complex, and has so many variables, that how the AI is making its decisions becomes incomprehensible to the human brain. It is, in essence, a black box.

Take the example from the United States of AI being used to decide sentencing when a person is found guilty of a crime. Given that the data inputs are things like demographics and where the felon lives, how can we trust that the algorithm is not biased by design, taking on all our inherent assumptions and perpetuating them through learned data?

The Theranos scandal is perhaps the most startling recent example of technology being a (literal) black box, although admittedly, in that case, there was clear deception on a massive scale by the CEO. But as technology converges with the fundamentals of what it is to be a human being, our health, and indeed the very makeup of our DNA, we have to prioritise ethical and human issues alongside the functional workings of the technology.
In an increasingly exponential world, there needs to be a distinct difference between our approach to tackling complex challenges as opposed to complicated problems.

Complicated problems are characterised by predictability: there is a “right” answer. They’re not necessarily easy to solve, but good practice and specialist expertise can usually find the answer, e.g., rebuilding Notre Dame, putting a man on the moon, or brain surgery.

Complex challenges are typically unpredictable. There’s no one solution, but instead a direction. Patterns exist, but cause and effect are only seen in hindsight. There are many moving parts, and emergent practice is required to explore the challenge, e.g., culture or behaviour change, or climate change.
Complicated problems require problem-solving, but this isn’t enough for complex challenges. The trouble is, starting by asking, “What is the problem we are trying to solve?” limits us to the current paradigm in which the problem already exists. Navigating complex challenges requires curiosity, experimentation, and a willingness to learn. You need to be willing to walk in others’ shoes to understand the challenge better, and be able to stand back and see the system rather than the individual parts. Above all else, you need to be OK with not having THE answer, and instead explore a potential new paradigm that enables options to emerge.
Did you know that when asked to vote anonymously, i.e., without colleagues knowing the result of the vote, 76% of people say they prefer working alone to working in a team? Why? Because with collaboration comes a lot of friction: socialising concepts, navigating politics, achieving buy-in. In an exponential world, the need for speed will surpass the human desire for collaboration. The benefits of pace will supersede the benefits of everyone having an opinion and a say. That is not to say that we should discount input from customers and colleagues. But the approach will be more consistent with coordination than with collaboration.
“The next 20 years will be more transformative than the last 2,000 in terms of technological advances.”—Ray Kurzweil, Co-Founder and Chancellor, Singularity University, “The Future Is Better Than You Think”
The challenge is that we, as human beings, are not transforming anywhere near as fast as the technologies around us. And businesses, governments, and institutions are failing to transform: indeed, 80–90% of all organisational transformations fail. This is because we treat transformation as a rational (technological) process when in reality it is a fundamentally political (behavioural) process. By proactively tackling the human process as a core part of your transformation programme, you will deliver exponential outcomes. Behavioural science and neuroscience provide a valuable scientific basis for behavioural transformation strategies.
“The future is notoriously difficult to predict. Our shortcomings in seeing and extrapolating the exponential trends that will shape the coming century sets the stage for us to experience perpetual futureshock.”—Ray Kurzweil, Co-Founder and Chancellor, Singularity University, “The Future Is Better Than You Think”
For most of the planet, all of the above may sound like the realms of science fiction and a million miles away from their reality today. Indeed one could interpret the current political divergence across the planet as stark evidence of the uneven distribution of understanding and access to the future. It was clear in presentation after presentation that the next is already in the now. Part of our role as alumni from over 20 countries is to start to distribute an understanding of an exponential world and help enable democratisation of access to the benefits.
The good news? Whilst the sheer pace and complexity of technological and human advancement is mind-blowing, we already have everything we need to solve the world’s problems: critical, complex challenges like climate change, poverty, and distributed prosperity. For me, I left the Executive Program with an invigorated sense of purpose and a vast toolkit and network to tap into to help me make a dent in the world. I’m going to work on making Australia one of the world’s great innovation nations and, in doing so, enable innovative advances that will positively impact the whole of humanity.

Photo by frank mckenna on Unsplash

Kate Cooper attended the Singularity University Executive Program in June 2019. We regularly feature posts from alumni and community members who share their perspectives and experiences.