Lessons From a Computational Social Scientist

This week my guest is computational social scientist and professor at Columbia University, Sandra Matz, who recently published her book, The Psychology of Technology: Social Science Research in the Age of Big Data.

In this episode, we explore many different ways in which technology and psychology are influencing one another in the modern era. This includes, but isn't limited to, the influence of big data on psychological research, the battle between exploitation and exploration as fundamental dynamics in our digital lives, the ways in which algorithms shape our views of the world, and a whole lot more. Sandra delivers her expertise with candor and humor, making for a discussion I hope you'll all enjoy as much as I did.

Find out more about Sandra and purchase her book at sandramatz.com


Host: Steven Parton - LinkedIn / Twitter

Music by: Amine el Filali

Technology has found its way into every aspect of our lives. While we often hear about the not-so-subtle outward changes, myriad changes are also taking place behind the scenes that impact human behavior. To understand these subtle alterations to the human condition, we often turn to the social sciences, whose unique lens on these issues is currently undergoing a major upgrade thanks to advanced computing. In episode 85 of the Feedback Loop podcast, "Lessons From a Computational Social Scientist," computational social scientist and author Sandra Matz explores the different ways technology and psychology are influencing our modern world, including the impact of big data on psychological research, the dynamic between exploitation and exploration in our digital lives, and the role algorithms play in shaping our views of the world and ourselves.

Before we dive into the recap of this episode, it's best to answer a key foundational question… What is a computational social scientist?

Matz, a veteran when it comes to answering this question, offers a simple definition: someone who studies human behavior at scale using computational tools such as modeling, simulation, and big data analysis. The data can include your social media activity or the credit card spending records collected through your smartphone, which are then analyzed with technology, be it artificial intelligence or machine learning, to answer the question of what the human experience looks like.
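
To make that concrete, here is a minimal, hypothetical sketch of the kind of analysis Matz describes: predicting a psychological trait from digital-footprint features. The data, feature names, and model choice are all invented for illustration; real studies rely on consented, ethics-approved datasets.

```python
# Hypothetical sketch: predicting a trait (e.g., extraversion) from
# digital-footprint features. All data here is synthetic and illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Toy features: weekly posts, nights-out spending, group-chat messages
X = rng.normal(size=(1000, 3))
# Toy label: "high extraversion," loosely driven by those behaviors plus noise
y = (X @ np.array([0.8, 0.5, 0.6]) + rng.normal(scale=0.5, size=1000)) > 0

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```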

In her book, "The Psychology of Technology," Matz dives deeper into how technology is shifting the definition and practice of the social sciences, and what that shift means for our psychology, our well-being, and how we interact with one another. Her goal? Understanding what technology can teach us about the human experience.

Below we've summarized six of the many lessons we're taking away from our conversation with Sandra Matz.

Lesson #1: Our data can paint a far more vivid picture of the human experience than traditional anthropological methods.

Think of technology as a tool for social science. We live in a predominantly digital world, so naturally we leave a data trace behind with every step we take online. This digital footprint accumulates whether or not we are aware of the breadcrumbs we leave behind. Matz explains that psychology is a behavioral science that researchers once captured by simply asking people about their behavior, or by observing how they behave in a relatively controlled lab setting. Over time, however, it became clear that these methods weren't necessarily capturing people's daily experiences and what they do as they naturally go about their day. Instead, she proposes that "technology offers us a more unobstructed way of obtaining very granular insight on the true lived experience on a massive scale."

Lesson #2: The solution to navigating data and privacy challenges could lie in informed consent.

While mining data might be the new field journal, there is a valid privacy concern with this approach. Big Data has conditioned us to hand over our data in exchange for access to platforms that play a key role in how we experience and express our digital identities, but thinking is shifting: our data may be worth more than we've been led to believe. An interesting difference, and lesson, lies in the way social scientists collect this otherwise invisible data. The difference? Informed consent. When data is used for science, the approach is more open and transparent: subjects can easily understand what information is being collected and what it will be used for. Before researchers like Matz can begin their work, they have to obtain ethics approval confirming that "the way that we're using data is aligned with these core values that we have as scientists, and it's protecting the rights of the subjects that we study." Subjects can also revoke the right to use their data at any time, retaining nearly full control over their information. Some data, however, is public and accessible to anyone, which worries people who want to keep their data private. Our digital footprints also make it possible to learn a great deal about almost anyone, raising the question of how much people today actually care about their information being public.

Lesson #3: Your data will last much longer than current leadership will.

Our relationships with our digital identities and what we share online depend on our personal experiences. Some people are indifferent to public personal information because they feel they have "nothing to hide." As Matz highlights, however, that mindset is a privilege. Our digital footprints can shape our future, often in ways we cannot predict. Though it may be daunting to consider that data created during our lifetime could leave us vulnerable to biased practices and discrimination, the truth is this: while data remains permanent over time, leadership does not. Decisions about how (and why) technology uses information today are rarely set in stone; what counts as acceptable use of personal online information will likely change as those with a temporary decision-making role come and go within organizations worldwide.

"I think it's coming from this false dichotomy where the narrative that we have, and I think it's driven predominantly by Silicon Valley, for whom this narrative is very convenient, says you can either have privacy, self-determination and all of that good stuff, or you can have this amazing service and convenience that we offer in our products. But you can't really have both for us to be able to offer you these amazing services. You just have to give up your sense of privacy and self-determination, and you just have to live with that." - Sandra Matz

For most people, self-determination is a fundamental value. But when you give away your information, you give away power over the choices and decisions you make in your life. According to Matz, it is easier to get people to care about these issues at the level of choice than at the level of privacy.

Lesson #4: Balancing exploration with exploitation in our existing systems could shift the human experience in a more positive direction.

It is very difficult to break out of an algorithm once you have been targeted and put on a trajectory of exploitation. Algorithms limit what we see, closing us off from a world we never get a chance to access unless we actively explore it ourselves. Exploration and exploitation are two modes of learning about the world, and we can benefit from both. Imagine a future where you could ask Google what it was like to live in Montana in 2020, and instead of articles summarizing observations filtered through a few select moments, you could observe the real lived experience through different digital footprints. Maybe this approach would allow us to better understand not just the world around us but each other. Unfortunately, exploration is not as profitable as exploitation, because exploitation keeps people's attention on a platform longer. Exploration, on the other hand, offers an enriched environment, curiosity, and novelty, which are great for learning, well-being, and social development. Don't be afraid to ask yourself: what is it that you don't know, but should?
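
Exploration versus exploitation is a classic trade-off in the algorithms behind these feeds, often framed as a multi-armed bandit problem. Below is a minimal epsilon-greedy sketch of that trade-off; the topic names and engagement rates are invented assumptions, not anything from the episode.

```python
# Epsilon-greedy sketch of the exploration/exploitation trade-off a
# recommender faces. Topics and their "true" engagement rates are invented.
import random

random.seed(0)
topics = {"politics": 0.6, "gardening": 0.3, "astronomy": 0.4}
counts = {t: 0 for t in topics}
rewards = {t: 0.0 for t in topics}
epsilon = 0.1  # fraction of the time we explore instead of exploit

def mean_reward(t: str) -> float:
    return rewards[t] / counts[t] if counts[t] else 0.0

for _ in range(10_000):
    if random.random() < epsilon:
        choice = random.choice(list(topics))   # explore: surface something new
    else:
        choice = max(topics, key=mean_reward)  # exploit: show the proven winner
    counts[choice] += 1
    rewards[choice] += 1.0 if random.random() < topics[choice] else 0.0

for t in topics:
    print(f"{t}: shown {counts[t]} times, mean engagement {mean_reward(t):.2f}")
```

With epsilon set to zero, the system locks onto whatever looked best first and never discovers the rest; raising it trades a little short-term engagement for a broader picture of what you might actually value.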

Lesson #5: Shaping positive and more sustainable digital social behaviors requires platform and user participation.

What you often see on your explore page, discover tab, or newsfeed is content that can evoke an emotional response. These platforms judge your behavior through macro interactions (liking, commenting, sharing, or saving) and micro interactions (hovering, number of replays, app quits, etc.). In the episode, Matz acknowledges that the opportunity lies in finding a social feedback mechanism that allows more diversity to reach our feeds. While she isn't sure of the answer, she suggests one potential solution: capping how many eyes see a particular post. Past that cap, a post gets pushed down in the algorithm, making space for fresh content you otherwise would never have come across. This could counteract the emotional tendencies algorithms reinforce and help slow certain content from spreading and potentially getting into the wrong hands. It's important to remember that these social networks exist to serve you; as a single node on the network you might not have much power, but if users align behind change, it's more likely to happen.
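
Matz doesn't spell out an implementation, but here is a hedged sketch of what reach capping could look like inside a ranking function, assuming a simple engagement score built from the macro and micro signals above. The signal weights, field names, and cap threshold are all invented.

```python
# Hypothetical sketch of reach-capped feed ranking. Weights and the cap
# are invented assumptions, not any platform's actual system.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    likes: int = 0        # macro signals
    shares: int = 0
    hovers: int = 0       # micro signals
    replays: int = 0
    impressions: int = 0  # how many eyes have already seen it

REACH_CAP = 10_000  # past this many impressions, a post is demoted

def engagement_score(p: Post) -> float:
    # Weighted blend of macro and micro interactions (weights illustrative).
    return 2.0 * p.likes + 3.0 * p.shares + 0.1 * p.hovers + 0.5 * p.replays

def ranked_feed(posts: list[Post]) -> list[Post]:
    # Capped posts sort behind fresh content, whatever their engagement.
    return sorted(posts, key=lambda p: (p.impressions >= REACH_CAP,
                                        -engagement_score(p)))

feed = ranked_feed([
    Post("viral", likes=900, shares=400, impressions=50_000),
    Post("fresh", likes=12, shares=3, impressions=800),
])
print([p.post_id for p in feed])  # ['fresh', 'viral']
```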

Lesson #6: We need to start thinking about our data as one of our most valuable assets.

When someone's values and personality can be understood through their public and private data, or their digital footprint, technology gains the power to influence their choices, positively or negatively, without their awareness. You might be familiar with the idea that "data is the new oil"; as owners of this precious and personal commodity, users should have the right to use it, sell it, rent it, or anonymize it as they navigate the digital world. Matz says it would be impossible to manage every little thing we do online, and equally impossible to fully understand what companies could do with our data. Instead, we need to shift our approach and find allies to help us maneuver the digital space. Finding or building expert communities, policies, or governing bodies that we can trust to look after our data can benefit everyone. With a small circle of trust, we can once again learn to just be explorers.

To create a better, brighter future, we need to be aware of how technology, and how we use it, can impact the human experience. Listen to the full episode here and tune into new episodes of the Feedback Loop podcast every Monday, available wherever you listen.

*Singularity Radio's podcast, "The Feedback Loop," hosted by Steven Parton, keeps you updated on the latest technological trends and how they impact the transformation of consciousness and culture, from the individual to society at large.*

Singularity

Singularity's internal thought leadership team develops resources, articles, and insights about our core areas of expertise, programs, and global community.
