Why We Should End the Data Economy

The data economy depends on violating our right to privacy on a massive scale, collecting as much personal data as possible for profit.

Illustration by Carlo Giambarresi

I know that you know you’re being watched. But I’m not sure you realize the extent of it, or its implications. Hundreds of corporations that you’ve never heard of, as well as many governments around the world, are surveilling you.

They know who you sleep with because both you and the person you share your bed with keep your phones nearby. They know whether you sleep soundly at night or whether your troubles are keeping you up. They track whether you pick up your phone in the middle of the night and search for things like “loan repayment.” They infer your IQ based on the pages you “like” on Facebook and the friends you have. They track your restaurant visits and shopping habits. They know how fast you drive, even if you don’t have a smart car, because your phone contains an accelerometer. 

They can calculate your life expectancy based on how fast you walk, as measured by your phone. They can infer whether you suffer from depression by how you slide your finger across your phone’s screen. They know if your spouse is considering leaving you because she’s been searching online for a divorce lawyer. If they identify you as someone with a gambling problem, they may use that knowledge to lure you back into gambling. They know your weaknesses.

Many of these companies call themselves “data brokers.” I call them data vultures. They generate profits by compiling a profile of you from your data trail and then selling it to the highest bidder — banks, insurers, prospective employers, and many others. They can sell individual data profiles as well as lists of people. Some of the categories these companies use to identify and classify individuals include rape victims, erectile dysfunction sufferers, alcoholics, and people who have AIDS or HIV. It was recently revealed that Facebook allows advertisers to target children as young as 13 who have been profiled as being interested in smoking, alcohol, online dating, extreme weight loss, and gambling.

The internet is largely funded through the exploitation of personal data, which is widely repackaged and sold to support targeted, personalized advertising. This is the basis of what’s known as the data economy.

You may think to yourself: “I’ve done nothing wrong, so I have nothing to worry about, nothing to fear.” You’re wrong. Does everyone who can access your data have your best interests at heart? Of course not. That’s why you keep your credit card number to yourself, and why you have a lock on the entrance to your home. If you don’t go around giving your email account password to strangers, then you shouldn’t go around giving away your personal data.

Privacy is important because it protects you from the influence of others. The more companies know about you, the more power they have over you. If they know you are desperate for money, they will take advantage of your situation and show you ads for abusive payday loans. If they know your race, they may not show you ads for certain exclusive places or services, and you would never know that you were discriminated against. If they know what tempts you, they will design products to keep you hooked, even if that can damage your health, hurt your work, or take time away from your family or from basic needs like sleep. If they know what your fears are, they will use them to lie to you about politics and manipulate you into voting for their preferred candidate. Foreign countries use data about our personalities to polarize us in an effort to undermine public trust and cooperation. The list goes on and on.

Companies that accumulate data about you can also end up determining what counts as knowledge about you. They get to categorize and define you, and then treat you accordingly.

Algorithms are making decisions about your life right now, as you read this article, based on your data, without your knowledge and without your consent. These algorithms have often not been thoroughly tested, let alone periodically audited. Maybe you're one of the 3.5 million Black Americans whom Cambridge Analytica targeted in an attempt to deter from voting in the 2016 elections. Perhaps you have been denied a loan, a job, or an apartment recently. Your data almost certainly had something to do with it. That data might be inaccurate, but you can't correct it because you don't have access to it. You could have been fired on account of a faulty algorithm.

The data economy undermines equality and fairness. You and your neighbor are no longer treated as equal citizens. You aren’t given an equal opportunity because you are treated differently on the basis of your data. The ads and content you have access to, the prices you pay for the same services, and even how long you wait when you call customer service depend on your data.

We are much better at collecting personal data than we are at keeping it safe. Personal data in the wrong hands is a serious threat, and if we are incapable of keeping it safe, we shouldn't be collecting it in the first place. Using smartphone location data acquired from a data broker, reporters from The New York Times were able to track military officials with security clearances, powerful lawyers and their guests, and even the president of the United States (through the phone of someone believed to be a Secret Service agent).

Our current data economy is based on collecting as much personal data as possible, storing it indefinitely, and selling it to the highest bidder. Having so much sensitive data circulating freely is reckless. By designing our economy around surveillance, we are building a dangerous structure for social control that is at odds with freedom. In the surveillance society we are constructing, there is no such thing as under the radar. It shouldn’t be up to us to constantly opt out of data collection. The default matters, and the default should be no data collection.

It's time to end the data economy. We don't allow the buying and selling of votes because it undermines democracy. We shouldn't allow the commercialization of personal data either. We shouldn't allow personalized ads. If ads were transparent about what advertisers know about us, as the Signal ads that Facebook rejected attempted to be, perhaps we wouldn't be so indifferent to how we are targeted. The advantages of personalized ads for users are minor at best, and they can be achieved through contextual advertising, such as showing sports gear ads when users search for sports gear.

We need to be sure that the algorithms that affect our lives are trustworthy. We need to be able to know how algorithms are judging us, and on the basis of what data. We should implement data fiduciary duties: anyone who wants to collect or manage your personal data has to vow to use it only for your benefit and never against you. Those who manage the personal data of others have a responsibility to them. We have to greatly improve our cybersecurity standards, by law. And we have to periodically delete data that we don’t need anymore.

In the meantime, choose privacy-friendly products. For example, instead of using Google search, use DuckDuckGo; instead of using WhatsApp, use Signal; instead of using Gmail, use ProtonMail. These simple choices can have a significant impact. We need to teach companies that we care about privacy.

Ending the data economy may seem like a radical proposition, but it’s even more extreme to have a business model whose existence depends on violating our right to privacy on a massive scale. In fact, it’s unacceptable.
