


Facebook has access to extensive data about its millions of users across the world, but what exactly can they learn from that information?

Hosted by: Hank Green
Support SciShow by becoming a patron on Patreon:
Dooblydoo thanks go to the following Patreon supporters:
Lazarus G, Sam Lutfi, Nicholas Smith, D.A. Noe, alexander wadsworth, سلطا الخليفي, Piya Shedden, KatieMarie Magnone, Scott Satovsky Jr, Charles Southerland, Bader AlGhamdi, James Harshaw, Patrick D. Ashmore, Candy, Tim Curwick, charles george, Saul, Mark Terrio-Cameron, Viraansh Bhanushali, Kevin Bealer, Philippe von Bergen, Chris Peters, Justin Lentz
Looking for SciShow elsewhere on the internet?
[INTRO ♪].

If you have a Facebook account, you've probably started wondering, how much does Mark Zuckerberg really know about you… and the rest of the site's 2 billion users? One company in particular, Cambridge Analytica, was recently in the news because they took data from millions of Facebook profiles and used them in political campaigns, with questionable or no consent from the users.

But what does knowing what memes you liked tell companies about you? And does it make you an easy target for marketing and manipulation? Turns out, you can potentially glean a lot about personality from Facebook data.

But while these data can help companies fine-tune some ads, we're not sure if it's enough power to do something like manipulate your voting behavior. Let's start back in 2013, when researchers at the University of Cambridge Psychometrics Centre got permission from over 50,000 Facebook users to see what they clicked “like” on. These users also took a long survey on things like their Big Five personality traits, intelligence tests, life satisfaction, political and religious views, and even whether their parents had been divorced or whether they'd used various drugs.

Then the researchers designed an algorithm that basically determined which Facebook likes were correlated to the private information from the surveys. Overall, there were over 9 million possible likes, which meant the computer found some surprising relationships. Like, you might guess that someone with a high score on an intelligence test might like "science" on Facebook, but "curly fries" and "Morgan Freeman's voice" were also closely associated.
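The matching step the researchers ran can be sketched as a correlation pass over a like matrix. The following is a toy reconstruction with invented page names and synthetic survey scores, not the study's actual pipeline: each column of 0/1 like indicators is correlated against a numeric survey score, and the strongest associations pop out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 1,000 users and a handful of pages
# (page names invented for illustration); 1 = user liked the page.
pages = ["science", "curly fries", "dancing", "reality tv", "jazz"]
likes = rng.integers(0, 2, size=(1000, len(pages)))

# Hypothetical survey score (say, an intelligence-test result),
# nudged so that liking the first page tracks a higher score.
score = rng.normal(100, 15, size=1000) + 5 * likes[:, 0]

# Correlate each like column with the score; Pearson on a 0/1
# column is the point-biserial correlation.
rs = {}
for page, column in zip(pages, likes.T):
    rs[page] = np.corrcoef(column, score)[0, 1]
    print(f"{page:12s} r = {rs[page]:+.2f}")
```

At the real study's scale, with millions of possible likes, this same kind of association score is what surfaces unexpected pairings like curly fries and test performance.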

And it's not a huge stretch that the people who scored most extroverted clicked on "dancing" or "socializing." But they were also fans of Michael Jordan and Chris Tucker. The researchers then tried to flip the algorithm: how well could their computer program guess your private information, knowing only your likes? Turns out, it did the best at guessing a person's gender and race, with over 90% accuracy in each.
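Flipping the algorithm amounts to fitting a classifier that maps likes back to a trait. Here's a minimal sketch on synthetic data, where a made-up binary trait weakly shifts like rates across 50 invented pages; a small logistic-regression model trained by gradient descent then guesses the trait from likes alone. This is a stand-in for the idea, not a reproduction of the study's model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 2,000 users, 50 pages, and a binary trait
# (say, introvert vs. extravert) that weakly drives some likes.
n_users, n_pages = 2000, 50
trait = rng.integers(0, 2, size=n_users)       # 0 or 1 per user
base = rng.random(n_pages)                     # baseline like rates
shift = rng.normal(0, 0.15, size=n_pages)      # trait-linked shifts
p_like = np.clip(base + np.outer(trait, shift), 0.05, 0.95)
likes = (rng.random((n_users, n_pages)) < p_like).astype(float)

# "Flip the algorithm": train logistic regression by gradient
# descent to predict the trait from likes alone.
train, test = slice(0, 1500), slice(1500, None)
w, b = np.zeros(n_pages), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(likes[train] @ w + b)))   # predicted P(trait=1)
    grad = p - trait[train]                          # logistic-loss gradient
    w -= 0.01 * likes[train].T @ grad / 1500
    b -= 0.01 * grad.mean()

# Evaluate on held-out users.
pred = (likes[test] @ w + b) > 0
accuracy = (pred == trait[test]).mean()
print(f"held-out accuracy: {accuracy:.0%}")
```

Accuracy here depends on how strongly the trait shifts like behavior, which mirrors the study's finding that some traits (like gender) were far easier to recover than others (like parental divorce).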

But the accuracy varied depending on the category. One of the hardest things to guess was whether a person's parents had divorced before they were 21, which was only 60% accurate, just a little better than tossing a coin. And a percentage on its own makes it hard to tell whether that accuracy is actually any good.

Like, could this algorithm know you as well as a close friend, or your family? So some of the same researchers designed a follow-up study to look at that in 2015. They ran the whole process again, but this time they got a few thousand people and their friends.

The participants gave their Facebook data and took a personality test. And on top of that, a good friend or two took a short test, describing the participant's personality. And pretty much across the board, the like-based computer algorithm did about as well as the users' friends did.

In a few cases, the computer did even better— like predicting self-reported alcohol and drug use. But in the category of life satisfaction, the human judges had the upper hand. It's worth noting that even the algorithm's more accurate predictions weren't perfect.

Because these are just correlations, these studies could only test for so much, and every human—as you may have noticed— is complicated. And not everyone shares the same amount with Facebook. The algorithm was only good if it had enough “likes” for each participant.

The researchers compared their data to past research on how good people are at judging each other's personalities. And they estimated that the computer needed 10 likes to match how well a coworker knows you, 70 to match a friend, and 300 to match a spouse.

With over 2 billion active users and a whole lot of likes, Facebook probably has the most data to play with, far more than most other social media sites. But this kind of psychological profile research has been done with other sites too, like blogs or Twitter. For example, in 2011, researchers gathered 2,000 tweets and gave a personality test to a small sample of 279 people.

And they found some correlations, like extroversion with people who used more words in their tweets, used more question marks, and used more social identifiers like "daughter," "husband," or "friend." So who's to say what we might find with a bigger study? Now, for decades before the internet, companies had been targeting ads in TV and magazines based on age, gender, and other demographics. But adding traits like agreeableness or extroversion to the list, and being able to fine-tune arguments for specific people, changes the psychology of persuasion.

Armed with their computer algorithms that could sleuth out your personality traits from Facebook likes, some of the same researchers did a study in 2017 and tried to see if they could make ads more persuasive. They picked two of the Big Five personality traits that the algorithm identified best: extraversion and openness to experience. And then they made ads for things like makeup and gaming apps.

For extraversion, for example, they made two kinds of ads for makeup. One kind said things like "dance like no one's watching (but they totally are)" to target those who scored high on extraversion. And another kind said things like "beauty doesn't have to shout" to target those who scored low on the trait.

Then, they ran the makeup ads for a week to over 3 million people who were active on Facebook at the time, along with ads for a free personality test. By combining the data available in public profiles, which could be run through the algorithm, with results from the personality test if people filled it out, they could get a sense of users' introversion or extraversion. And they found that users were more likely to click the ad that was targeted at their personality trait, and even more likely to buy the thing.

So getting people to buy stuff is a big part of social media. But it isn't the only way data can be used to manipulate people. Another way is by deliberately changing your emotional reaction to a site.

For example, researchers worked with data scientists at Facebook and published a paper in 2014 reporting on manipulations of the content in nearly 700,000 people's news feeds. They took users' posts and sorted them by positive and negative content— lots of happy, celebratory words versus lots of sad or angry words. And then, they randomly selected users to get more or less of each of those in their feeds.

As you might guess, people who got more positive content got a little happier, and people who got negative content got a little more negative. At least, as far as researchers could tell from the emotional content of what the people posted next. It's also worth noting that the participants didn't give express permission to be part of this experiment.

This research technically fell under Facebook's data use policy, so that's what they considered informed consent, but random users didn't know that the website was trying to make them feel good or bad ... you know, for science. Now, there do seem to be limits on manipulation, as far as we know. Cambridge Analytica seems to have developed personality-sleuthing algorithms like the ones used in this kind of research, and advertised that they could swing elections with that data.

But none of these studies have shown that companies could use Facebook data to really manipulate thoughts and feelings, like causing people to vote for candidates they weren't planning on. That being said, it's still a good idea to keep in mind how much you're revealing online, and how companies might use that information in the future to change your behavior ... which is what advertising is supposed to do, but it's a little ... creepy.

Thanks for watching this episode of SciShow Psych, which is produced by Complexly. If you want to keep thinking critically about how all media, not just social media, affects our brains, we have a Crash Course you should check out, called Crash Course Media Literacy. It is very, very good and I like it a lot and it has changed the way that I see the world. [OUTRO ♪]