
Data Privacy in the Digital Age

Can our current legal frameworks keep up with the fast pace of technological advancement? It's a pertinent question, especially in light of the challenges presented by digital privacy. Over the past two decades, we've watched legal systems struggle to adapt to new technologies, with largely lackluster results. But is the solution to change the law itself, or rather the way we approach it?

In episode 89 of the Feedback Loop Podcast, "How Data Distracts Us From Human Rights," we speak with Elizabeth Renieris: lawyer, author, and senior research associate at the Institute for Ethics in AI at the University of Oxford. In this episode, we explore Elizabeth's new book, "Beyond Data: Reclaiming Human Rights at the Dawn of the Metaverse," which takes us on a tour of the ways our obsession with data and data policy has failed and distracted us. In essence, there is very little need to separate the digital from the physical. Elizabeth argues that thinking otherwise allows governments to become outdated and corporations to get away with plenty of bad behavior as they fall through the legal cracks. Below, we've shared three reasons why we must figure out how to handle data and privacy to ensure a better future.

#1: You can't really opt out in an increasingly digital world.

Nowadays, most people portray themselves online in ways that leave little room for privacy: vlogging, documenting every second of their lives on social platforms, constantly uploading photos, and posting status updates. But the question remains: can you even opt out when your data is the key to your digital identity?

"I think people still share the same feelings and the same sort of intuitive sense that there's something very perverse about this behavior; I think what's changed is that it's becoming increasingly hard to opt out of it." - Elizabeth Renieris

According to Renieris, the digital infrastructure of our society has been gamified to incentivize people to share and engage in everything from professional networks to social media, fintech, and beyond. In finance, as she notes in her book, we see this very prominently, with many fintech companies building exactly these kinds of incentive structures and behavioral patterns. This constant pressure to participate in the digital world carries an emotional and cognitive cost, making opting out increasingly difficult, if not impossible, for some. Feelings of digital lethargy and exhaustion are growing, yet the toll of living in this interconnected world keeps them from the forefront of our minds. Our obsession with data has blinded us to the key issues at stake as we focus on collecting and using personal information. This blind spot is dangerous, because it neglects the importance of protecting our privacy and fundamental human rights in the digital age.

#2: The current approach to privacy and data security leaves the legal system playing catch-up.

Our obsession with data is an unhealthy distraction, blinding us to important developments such as data protection laws and shifts in legislation.

First recorded in 2006 by British mathematician Clive Humby, the infamous phrase "data is the new oil" still rings true today, but now we've begun to ask to whom that data belongs. Is it the users, or is it the companies mining it? While Humby was mainly talking about the immense business opportunity at stake, he nailed the analogy by touching on the massive political, economic, and societal impact that the supply and use of data would have. As a lasting side effect of the dot-com boom, we've created a culture obsessed with metrics, numbers, and algorithms, with little regard or foresight for their impact on the individuals attached to the data.

While the United Nations Universal Declaration of Human Rights recognizes the right to privacy, as of 2023 legal protection is tied to one's geographical location and often applies only to one's physical sphere. Even in established democracies like the United States, a right to privacy is not explicit in the Constitution but rather implied and protected through subsequent cases and precedents. Here are four significant cases and laws since 2008 that aim to establish a basis for digital privacy but, as we continue to see, leave major gaps:

  • Facebook-Cambridge Analytica scandal: In 2018, it was revealed that the political consulting firm Cambridge Analytica had obtained personal data of millions of Facebook users without their consent. The incident led to multiple investigations and lawsuits against Facebook for data privacy violations.
  • Schrems II: In 2020, the Court of Justice of the European Union (CJEU) ruled that the EU-US Privacy Shield, a framework for transatlantic data transfers, was invalid due to concerns over US surveillance practices.
  • Carpenter v. United States: In 2018, the US Supreme Court held that law enforcement agencies must obtain a warrant to access historical cell phone location data. The decision recognized that individuals have a reasonable expectation of privacy in their location data and that warrantless access by the government violates the Fourth Amendment.
  • Illinois Biometric Information Privacy Act (BIPA) cases: BIPA, enacted in 2008, requires companies to obtain consent before collecting and storing biometric data such as fingerprints and facial recognition data, and has given rise to an ongoing wave of lawsuits.

"Rather than introduce new laws and regulations every time we have these technological advancements or developments, we need to look to the frameworks that have withstood the test of time, which are typically found in constitutions in human rights law and civil rights law. And in that way are sort of agnostic to what happens in terms of technological development and have a much better shot at being sustainable and future proof." -Elizabeth Renieris

With increased urgency to create new laws and regulations around AI and machine learning, the first reaction is to build new frameworks to police the technology and the data it will create, rather than leveraging existing privacy and human rights frameworks to protect the physical and digital concepts of self that will be affected. This approach to updating and protecting our privacy rights is slow, expensive, and incremental, and it leaves prosecutors to prove that wrongdoing occurred. Given the exponential rate at which technology changes, the lawyers who try these cases have the added disadvantage of explaining technical, invisible crimes to a judge and jury who may lack a deep understanding of how these technologies work, which diverts the conversation into a technocratic one. We must take the time to understand the evolution of data privacy and ensure that our current laws and regulations align with the broader principles of constitutional human rights.

#3: The current technocratic approach to data and privacy limits how we approach personal data protection.

The proliferation of laws and regulations around data has created a fixation on the notion of data, resulting in a narrow and technocratic framing of privacy. This approach may have been appropriate in the past when we had clear distinctions between digital and non-digital realms. However, in today's post-digital, cyber-physical world, we cannot settle for a black-and-white lens on data protection. Instead, we must prioritize fundamental human rights and constitutional laws that align with human-centered technology.

Data is often viewed as an objective, detached substance, separate from the human experience; in reality, it is a deeply integrated and intrinsic part of our lives. The danger of this narrow perspective is that we focus on only two human rights out of more than thirty, privacy and free expression, and fail to consider the broader principles of constitutional human rights. Renieris highlights the tension between privacy and free expression in technology governance, where free expression often takes center stage. This framing, however, oversimplifies the complex reality of our post-digital, cyber-physical world. Private industry has benefited from the proliferation of laws and regulations around data, but as we advance in AI and machine learning, the limitations of this technocratic framing have become more apparent.

Data collection and usage go beyond our right to privacy and can lead to discrimination, harassment, and inequitable treatment. This growing digital divide raises concerns around economic, social, and cultural rights, which are often left out of the conversation. Renieris reminds us that we must reassess our approach to data protection and consider its impact on individuals at a personal level. We should prioritize the principles of constitutional human rights to create a just and safe digital space where human rights are protected and companies and governments alike can benefit.

Listen to the full episode of The Feedback Loop for answers to the following questions:

  • How is legislation being handled around data and human rights?
  • What are the solutions regarding data and human rights laws?
  • How well do current laws handle the nuances of the technical space?
  • What can the average person, entrepreneur, or executive do to push toward a more human-centric direction, with a focus on human rights rather than data?

Interested in learning how you can help create a better, brighter future where our relationship with technology is a positive one? Join us for an upcoming Singularity Executive Program in Silicon Valley.

Venus Ranieri

Venus is Director of Marketing at Singularity.
