
Data. Mine.

Americans Are Very Concerned About Data

By Jennifer (Jennie) Latson

When Facebook CEO Mark Zuckerberg testified before a Senate committee in 2018, he was supposed to clarify how his social media site uses — and maybe abuses — the personal data of its 2 billion monthly users. 

But what the hearing revealed instead, as many analysts have pointed out, was that few senators seemed to have a firm grasp on what Facebook does, how the internet works, or what they meant by the word “data.”

Take this exchange between Zuckerberg and Senator Deb Fischer, R-Nebraska:

FISCHER: So how many data categories do you store, does Facebook store, on the categories that you collect?

ZUCKERBERG: Senator, can you clarify what you mean by “data categories”?

FISCHER: Well, there's — there's some past reports that have been out there that indicate that it — that Facebook collects about 96 data categories for those 2 billion active users. That's 192 billion data points that are being generated, I think, at any time from consumers globally. So how many do — does Facebook store out of that? Do you store any?

ZUCKERBERG: Senator, I'm not actually sure what that is referring to.

The discussion went on awkwardly for a while. Ultimately, however, Fischer revealed that her chief concern was not the number of data points Facebook stored, but something more primal. “Is Facebook — is Facebook being safe?” she asked.

Senators aren’t the only Americans who have a generalized dread about bad things happening to our data — but no clear idea what exactly we mean by “data,” or why it’s so important to protect.

Many of us are worried by revelations that the consulting company Cambridge Analytica, hired by the Trump campaign during the 2016 election, accessed the private data of 87 million Facebook users and used it to create detailed personality profiles and highly specific marketing pitches.

This is, of course, far from the first widespread data breach we’ve weathered. The 2017 Equifax breach also compromised the personal data of millions of Americans, including our Social Security numbers, birth dates, and credit card numbers. But the data Facebook shared is far more personal, and that may be why the breach feels like even more of a violation. Cambridge Analytica is unlikely to open a line of credit in our names, but it’s somehow creepier to think about all the things the firm has found out about us: which hotel we stayed at on vacation, when we potty-trained our kid, what Disney princess we would be.

What makes it creepy is that they can, and did, use this intimate information in an attempt to manipulate how we think and act. That’s what most of us are really talking about when we talk about protecting personal data: the fear that people will use knowledge of our habits and preferences to serve their own purposes.

Facebook wasn’t the first to use (or allow others to use) personal information to influence our behavior. As long as “data” has existed, it’s been scrutinized by people who want to sell us products, get our votes, or change our opinions.

Nearly a century ago, advertisers were already working to harness the power of behavioral psychology to develop “subliminal advertising” that would motivate us to buy what they were selling without realizing we’d been influenced, Vanderbilt history professor Sarah Igo explains. “Those probes into consumers’ personalities and desires foreshadowed Cambridge Analytica’s pitch to commercial and political clients – using data, as its website proudly proclaims, ‘to change audience behavior,’ ” says Igo, the author of a forthcoming book, “The Known Citizen: A History of Privacy in Modern America.”

Many people recoiled then, as they’re doing now. Especially in the U.S., where independence and self-reliance are highly prized, the possibility that our psyches might be shaped by outside forces — without our awareness or consent — is extremely threatening.

“We have a lot invested in the idea that we’re autonomous individuals, that we’re in charge of ourselves,” Igo explains. “The idea that there’s this psychological probing going on all the time threatens that image of who we are at some deep level.”

But there are benefits to sharing our personal data, and that’s why so many of us do — not just on Facebook, but with Google, Apple, Amazon, Netflix, Fitbit, E-ZPass and countless others.

“We aren’t giving away our personal details only on Facebook. We are giving away our personal details every time we go online for any purpose,” says Utpal Dholakia, a marketing professor at Rice University’s Jones Graduate School of Business. “Coming to Facebook, we all have some basic social needs, such as to be acknowledged, respected, valued, praised, etc. — which is why we give away our information. I will give just one example: our birthday. Most people like to be greeted on their birthday, so they post their real birth date. The risks are obvious. The smartest thing to do is to share the minimum amount of personal information on social media.”

Some people, however, argue that the benefits outweigh the risks, and that we should embrace the convenience and efficiency of a fully connected, privacy-free future.

“The bottom line is that we can’t say, ‘I want my technology personalized to me, but I don’t want companies to know too much about me,’” argues Ben LeDonni, CEO of the digital strategy agency CreativeMMS. “The technology needs to know as much as it can about you to help personalize your experience and make it richer, better and easier for us all.”

That’s a hard sell for those of us who want to retain some control over how our data is used and who’s using it. And while we recognized that Facebook was using the intel it gathered for the mildly disturbing purpose of showing us targeted ads, few of us realized just how much was actually at stake.

“We all knew that Facebook was harvesting our data, but we were seduced by its free ‘gifts,’ and unaware of the significant cumulative value of the totality of our privacy exposures,” said Moshe Vardi, director of the Ken Kennedy Institute for Information Technology at Rice. “The Cambridge Analytica affair revealed two things. First, the cumulative value of our minor exposures is quite significant, perhaps enough to tilt the result of the 2016 election. Second: It was not just Facebook that was collecting our data. It was also done by third parties we have never heard about, and on a rather massive scale.”

The fact that this time our data was used for political propaganda, and not just the commercial propaganda we’ve come to expect, is part of what makes this scandal so troubling. But political propaganda has been around forever, too. Cambridge Analytica’s tactics aren’t so different from the “push polls” pioneered by the Nixon campaign, in which pollsters used loaded questions to push people toward a particular candidate, Igo points out.

“There’s a continuum here — it’s not brand new, by any means. The techniques are just more sophisticated and less visible, and there’s the added layer of outside parties tampering with the very information that people are getting in the first place,” she says.

And while in the past you’d have to choose to take part in a poll or a focus group, social media has created a constant stream of opportunities for us to provide our personal information to propagandists without thinking twice about it. We don’t even have to post it, since so much of our online activity is being tracked. The Chronicle’s Dwight Silverman calls this information “a kind of ‘dark matter’ data.”

“Facebook uses both user posts and this dark-matter info to build a set of interests associated with you,” he writes. “If you routinely click on country music videos … the site may add ‘country music’ to your list of interests. If you share a lot of posts from progressive political pages, it may tag you as a liberal.”
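To make Silverman’s description concrete, here is a deliberately simplified sketch, in Python, of how interest tagging from behavioral signals could work in principle. The topics, threshold, and rules are invented for illustration; this is a toy model, not Facebook’s actual system.

    # Toy illustration only: the topics, threshold, and rules here are invented
    # and do not describe Facebook's real systems.
    from collections import Counter

    def infer_interests(activity_log, threshold=3):
        """activity_log is a list of (action, topic) pairs, e.g. ("click", "country music")."""
        counts = Counter(topic for _action, topic in activity_log)
        # Tag any topic the user interacts with repeatedly as an inferred interest.
        return {topic for topic, count in counts.items() if count >= threshold}

    log = [("click", "country music")] * 4 + [("share", "progressive politics")] * 3
    print(infer_interests(log))  # {'country music', 'progressive politics'}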

So how can we reclaim our personal details? An obvious step would be to stop taking online quizzes. The source of the Cambridge Analytica leak was, after all, a personality quiz app called “This Is Your Digital Life.”

But are we idiots if we downloaded the app? No, says Igo. Technology is changing too quickly for the average social media user to be expected to recognize all the risks at any given moment.

“That’s an impossible task,” she says. “My point is not that people should take on that burden themselves, but that, as citizens, we should have some say on how data is used in our country and beyond. It has to be a political solution, not a behavioral solution.”

Whether or not we should change our online behaviors, we’re unlikely to, says Dholakia.

“A much more serious milestone was the data breach at Target, when the financial information of 41 million customers was stolen,” he says. “That didn't change consumer behavior in any significant way, so I don't think this will.”

But a political solution could be on the horizon. The European Union has adopted a sweeping data protection law, the General Data Protection Regulation, that will go into effect next month — and the U.S. Congress is discussing ways to follow suit. Sooner or later, we will need some ground rules governing the use of personal data, Vardi argues.

“We keep generating data points, and there are other third parties out there that are hungry for data,” he says. “So the discussion is very much alive.”


Jennifer Latson is a staff writer and editor at Rice University's Jones Graduate School of Business and the author of “The Boy Who Loved Too Much.”

This article originally ran in the Houston Chronicle’s Gray Matters: https://www.houstonchronicle.com/local/gray-matters/article/facebook-privacy-data-protection-debate-12842032.ph
