The Illusion of Free


Our data is out of our control. We might (wisely or unwisely) choose to publicly share our statuses, personal information, media and locations, or we might choose to only share this data with our friends. But it’s just an illusion of choice—however we share, we’re exposing ourselves to a wide audience. We have so much more to worry about than future employers seeing photos of us when we’ve had too much to drink.


Corporations hold a lot of information about us. They store the stuff we share on their sites and apps, and provide us with data storage for our emails, files, and much more. When we or our friends share stuff on their services, either publicly or privately, clever algorithms can derive a lot of detailed knowledge from a small amount of information. Did you know that you’re pregnant? Did you know that you’re not considered intelligent? Did you know that your relationship is about to end? The algorithms know us better than our families, and only need ten of our Facebook Likes before they know us better than our average work colleague.

A combination of analytics and big data can be used in a huge variety of ways. Many sites use our data just to ensure a web page is in the language we speak. Recommendation engines are used by companies like Netflix to deliver fantastic personalized experiences. Google creates profiles of us to understand what makes us tick and to sell us the right products. 23andMe analyzes our DNA for genetic risk factors and sells the data to pharmaceutical companies. Ecommerce sites like Amazon know how to appeal to you as an individual: whether you’re more persuaded by social proof, when your friends also buy a product, or by authority, when an expert recommends it. Facebook can predict the likelihood that you drink alcohol or do drugs, or determine whether you’re physically and mentally healthy. It also experiments on us and influences our emotions. What can be done with all this data varies wildly, from the incredibly convenient and useful to the downright terrifying.

This data has a huge value to people who may not have your best interests at heart. What if this information is sold to your boss? Your insurance company? Your potential partner?

As Tim Cook said, “Some companies are not transparent that the connection of these data points produces five other things that you didn’t know that you gave up. It becomes a gigantic trove of data.” The data is so valuable that cognitive scientists are giddy with excitement at the size of studies they can conduct using Facebook. For neuroscience studies, a sample of twenty white undergraduates used to be considered sufficient to say something general about how brains work. Now Facebook works with scientists on sample sizes of hundreds of thousands to millions. The difference between more traditional scientific studies and Facebook’s studies is that Facebook’s users don’t know that they’re probably taking part in ten “experiments” at any given time. (Of course, you give your consent when you agree to the terms and conditions. But very few people ever read the terms and conditions, or privacy policies. They’re not designed to be read or understood.)

There is the potential for big data to be collected and used for good. Apple’s ResearchKit is an open source framework that makes it easy for researchers and developers to create apps that collect iPhone users’ health data on a huge scale. Apple says they’ve designed ResearchKit with people’s privacy values in mind: “You choose what studies you want to join, you are in control of what information you provide to which apps, and you can see the data you’re sharing.”

But the allure of capturing huge, valuable amounts of data may encourage developers to design without ethics. An app may pressure users to sign the consent form quickly when they first open it, before they’ve considered the consequences, in the same way we’re encouraged to quickly hit “Agree” when we’re presented with terms and conditions, or told that we need to allow an app constant access to our location so that it can, supposedly, provide us with the best experience.

The intent of the developers, their bosses, and the corporations as a whole, is key. They didn’t just decide to utilize this data because they could. They can’t afford to provide free services for nothing, and that was never their intention. It’s a lucrative business. The business model of these companies is to exploit our data, to be our corporate surveillers. It’s their good fortune that we share it like—as Zuckerberg said—dumb fucks.

To say that this is a privacy issue is to give it a loaded term. The word “privacy” has been hijacked to suggest that you’re hiding things you’re ashamed about. That’s why Google’s Eric Schmidt said “if you’ve got something to hide, you shouldn’t be doing it in the first place.” (That line is immortalized in the fantastic song, Sergey Says.) But privacy is our right to choose what we do and don’t share. It’s enshrined in the Universal Declaration of Human Rights.

So when we’re deciding which cool new tools and services to use, how are we supposed to make the right decision? Those of us who vaguely understand the technology live in a tech bubble where we value convenience and a good user experience so highly that we’re willing to trade our information, privacy, and future security for them. It’s the same argument I hear again and again from people who choose to use Gmail. But will the tracking and algorithmic analysis of our data give us a good user experience? We just don’t know enough about what the companies are doing with our data to judge whether it’s a worthwhile risk. What we do know is horrifying enough. And whatever corporations are doing with our data now, who knows how they’ll use it in the future?

And what about people outside the bubble, who aren’t as well informed about the consequences of using services that exploit our data? The everyday consumer will choose a product because it’s free and has a fantastic user experience. They don’t know about the cost of running such businesses, or the data required to sustain them.

We need to be aware that our choice of communication tools, such as Gmail or Facebook, doesn’t just affect us, but also those who want to communicate with us.

We need tools and services that enable us to own our own data, and give us the option to share it however we like, without conditions attached. I’m not an Apple fangirl, but Tim Cook is at least talking about privacy in the right way:

None of us should accept that the government or a company or anybody should have access to all of our private information. This is a basic human right. We all have a right to privacy. We shouldn’t give it up.

“Apple has a very straightforward business model,” he said. “We make money if you buy one of these [pointing at an iPhone]. That’s our product. You [the consumer] are not our product. We design our products such that we keep a very minimal level of information on our customers.”

But Apple is only one potential alternative to corporate surveillance. Their services may have some security benefits if our data is encrypted and can’t be read by Apple, but our data is still locked into their proprietary system. We need more genuine alternatives.

What can we do?

It’s a big scary issue. And that’s why I think people don’t talk about it. When you don’t know the solution, you don’t want to talk about the problem. We’re so entrenched in using Google’s tools, communicating via Facebook, and benefiting from a multitude of other services that feed on our data that it feels wildly out of our control. When we feel like we’ve lost control, we don’t want to admit it was our mistake. We’re naturally defensive of the choices of our past selves.

The first step is understanding and acknowledging that there’s a problem. There’s a lot of research, articles, and information out there if you want to learn how to regain control.

The second step is questioning the corporations and their motives. Speak up and ask these companies to be transparent about the data they collect, and how they use it. Encourage government oversight and regulation to protect our data. Have the heart to stand up against a model you think is toxic to our privacy and human rights.

The third, and hardest, step is doing something about it. We need to take control of our data, and begin an exodus from the services and tools that don’t respect our human rights. We need to demand, find and fund alternatives where we can be together without being an algorithm’s cash crop. It’s the only way we can prove we care about our data, and create a viable environment for the alternatives to exist.
