
Illustration by Anastasiia Zubareva

Why Should We Care About Our Privacy In The Digital Age?

It becomes difficult to elucidate the real consequences of leaving a digital footprint.

Dec 3, 2016

“But I have nothing to hide,” is the go-to response of many technology users when confronted with statistics about the vast amounts of data collected about them. Facebook collects 2.7 billion likes a day, Google retains everything you have ever searched and Microsoft Windows 10 has access to every file in your file system and all of your offline activity. The data that is collected blurs together. Articles warning that tech giants are collecting more data than we can imagine come out constantly, and yet nobody cares.
I didn’t care about all of this data either, until recently. Yes, I thought data collection on a grand scale was terrifying. But when I thought about the mundane activity log on my computer, it was hard to see the concrete consequences of someone hacking into my New York Times account or reading my emails to professors. Recently, however, I have taken an interest in reading more about privacy in our world today, and I have started to understand how much our privacy matters, and why nobody cares.
A part of the problem is that the leak of information is so gradual. It becomes difficult to elucidate the real consequences of leaving a digital footprint. It is hard to understand how an email address here and some basic demographic information there can lead to harm or unintended results. However, it is not one single data point that poses a threat to our privacy, but rather our web of personal data that exists on the internet. Individually, these data points may seem irrelevant. Yet, aggregated, they create an accurate snapshot of who you are.
Why is it troublesome that companies collect these profiles? We know companies sell collected information to advertisers and other businesses. While this is not ideal, many argue it is the price you pay for a free product. However, more people than you might think can end up with access to your data. Governments can request it, have done so en masse in the United States and, by law, companies must comply without informing the user. Can we trust that our governments won’t turn against us? Apple famously refused to unlock the San Bernardino iPhone for the FBI, afraid that doing so would set a precedent for governments beyond that one case.
However, even in countries commonly perceived as having free democratic processes, public opinion can turn against those who do not fit the mainstream image. What if the Trump-led CIA uses the collected data disproportionately against minorities? What if a law-abiding Muslim citizen is caught in the wrong place at the wrong time and falls victim to this data collection? We perceive data points as facts, but these bits of information can be taken out of context and molded to tell a false story.
In this excerpt from Citizenfour, Jacob Appelbaum, a computer security researcher and hacker involved with the Tor Project and other movements against surveillance, gives security training to Occupy Wall Street protesters and outlines how a completely peaceful and lawful protester’s data can be used against them to tell a story that is factual in its details yet not true. Due to increased fear of terrorism in the United States, regulations on data collection have relaxed, and as The New York Times states, “Even minor offenses like trespassing can be enough to trigger surveillance of political groups.” These instances extend beyond the United States, as many governments are moving to gain increased control over personal data.
Countries that we assume uphold Western ideals of freedom and privacy have laws that limit citizens' freedom. The United Kingdom recently passed a bill that allows “extreme surveillance” arguably more invasive than that of some dictatorships. Similarly, Germany is drafting measures to expand facial recognition and video surveillance. And these are three of the strongest powers in the world, powers that claim to value freedom, privacy and individual rights. With this in mind, the Orwellian picture of mass surveillance becomes much more tangible.
The web of information also allows companies to manipulate their users. The metadata the internet holds means tech giants could know you better than your closest friends and family do. This means Facebook can affect your emotions, relationships and thoughts without the investment a friend would make, and without the same consequences. For example, Facebook recently conducted a study and found that algorithms showing a more positive news feed lead people to post more positive content, and thus make them happier, while a more negative news feed has the opposite effect. Given the intimate role of technology in our lives, the data we put out there is not just being commoditized. It is changing the way we live and interact with the world, without its users having any say, or even being aware of it.
The Facebook news feed has also faced significant criticism in light of the recent U.S. election, as many claim Facebook was able to manipulate political opinions and possibly the course of the election. While the election has forced people to confront this reality, the Facebook news feed affects people worldwide: it shapes their perception of their own country, their political beliefs and their friends. Every time you post to Facebook, you give the company more data and a better sense of who you are. And by posting, you relinquish control of that data, letting Facebook use it as it will.
While Facebook is not necessarily manipulating users intentionally, there are no inherent forces guaranteeing that Facebook and other Silicon Valley companies put the well-being and interests of their users first. Facebook has recently been under attack for allowing advertisers to use ethnic affinity to narrow the reach of employment, housing and credit-related ads. While there are valid use cases for this option, it ends up significantly disadvantaging certain populations that Facebook is not particularly motivated to look after. At the end of the day, Facebook does not profit from the social good it achieves, so should we trust it with our most personal data?
So what can one do to be more thoughtful about the technology one consumes?
Pay for software: if you are not paying for it, you and your data become the product being monetized. Be conscious of the data you release on social media, and always ask whether sharing it is necessary. Encrypt your data: if your data is encrypted, only you have access to it, and therefore you are the one in control. And use strong, unique passwords.
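For readers curious what those last two suggestions can look like in practice, here is a minimal sketch in Python. It assumes the third-party cryptography package is installed, the file name is hypothetical, and real-world key management takes more care than shown here; it is an illustration of the principle, not a recommendation of a particular tool.

    # A minimal sketch of encrypting a file with the third-party
    # "cryptography" package (install with: pip install cryptography).
    # The file name is hypothetical; key handling is deliberately simple.
    import secrets
    from cryptography.fernet import Fernet

    # A strong password can be generated rather than invented.
    print("Example strong password:", secrets.token_urlsafe(16))

    # Generate a secret key. Whoever holds the key controls the data,
    # so store it separately from the encrypted file.
    key = Fernet.generate_key()
    fernet = Fernet(key)

    # Encrypt the contents of a (hypothetical) personal file.
    with open("diary.txt", "rb") as f:
        ciphertext = fernet.encrypt(f.read())

    with open("diary.txt.encrypted", "wb") as f:
        f.write(ciphertext)

    # Only someone holding the key can recover the original text.
    original_text = fernet.decrypt(ciphertext)

The particular library matters less than the principle: when you hold the key, you, and not a company's servers, decide who can read your data.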
Brooke Hopkins is a contributing writer. Email her at feedback@thegazelle.org.