
Mei Dennison
First-year Samuel Radcliffe outlines how, despite privacy policies and user agreements, big technology companies continue to use, buy and sell the data generated by our everyday internet use.
Privacy is an intrinsic part of human life. It gives us security, allows us to build trust and helps us manage our relationships. The United Nations even formally recognized privacy as a fundamental human right in 1948. However, our personal information has become a commodity. As digital technologies continue to evolve, our personal privacy comes increasingly under threat, sometimes without our knowledge. We are often dazzled by new technologies, and in racing to embrace the benefits, we ignore the drawbacks.
Every day when we elect to use technology, we leave behind a digital footprint. Since data is a by-product of computing, nearly every action we take online leaves a trail. Yet these technologies are so deeply embedded in our daily lives that we often fail to notice that our privacy is being compromised.
There is an important distinction between understanding how to work something and how it works. For example, most of us can drive a car, as in we know how to work it. Still, many people cannot explain everything that goes on under the hood — the intricate mechanics of the engine, fuel system and transmission: what really makes it work.
Technology works in a similar way. We know how to use our smartphones, apps and the internet. But most of us do not understand the complex systems that collect, store and share our personal data. Technology truly is the kryptonite of personal privacy. We are already living in a state of digital surveillance because digital spaces allow for the collection of vast amounts of personal information, from identifiers to financial data to biometrics.
For example, according to a Gallup survey conducted in 2019, more than one in four Americans regularly use mobile health apps or fitness trackers. A similar survey by the cybersecurity company Surfshark revealed that about 80% of top fitness apps, such as Strava, Fitbit and Runna, share user data with third parties. The types of data often collected include fitness metrics, but can extend beyond this to include personal information such as identifiers, device data and location data. Now consider that most modern cell phones contain a GPS receiver, allowing software on the phone to provide real-time information about the movements of an individual, resulting in a state of constant surveillance. So where does all this data go?
In most cases, our data is collected for monetization, not for our benefit. Third-party companies use the information shared with them for profiling, risk assessment and targeted advertising, all in the service of generating a profit. Digital tech companies pursue profits on behalf of their shareholders, and they are responsible for some of the most valuable stocks in the U.S. market. According to Project Censored, a nonprofit media literacy organization, these stocks are collectively worth about $2.9 trillion. To these corporations, you are the product. It is not about hacking your bank account anymore, it is about hacking your mind.
When we click “agree” to app agreements, website cookies or privacy policies, we are signing away our right to privacy, often consenting to resolve any disputes through private arbitration rather than the courts. Terms of service are legal agreements that frequently contain clauses allowing companies to collect, store and share our data. Often, there is nothing “private” about privacy policies. They do not protect our privacy, they merely detail the ways in which it is breached. In this way, we allow our own data to be used against us. Now, the corporations decide on our rights, not a judge or a jury.
Agreeing to these terms and conditions is voluntary, but do we really have a choice? Digital technology is ingrained so deeply into society that in order to write ourselves out of surveillance, we almost have to write ourselves out of society itself. Moreover, sites and applications usually do not tell you whether they are collecting data, or what kind. They are not up front because it is harder to opt out of something you do not know you are a part of.
As consumers of technology and as participants in the digital landscape, it is important for us to ask questions before we adopt a technology, not after. Who controls it? For what purpose? To what end? The problem is not just that our data is collected, but that it is often collected without our knowledge. We need to demand more transparency from the companies we trust with our data in order to safeguard our information.