YouTube: https://youtube.com/watch?v=WXsNh7QV_4Y

Statistics

View count: 614,281
Likes: 11,820
Comments: 1,311
Duration: 04:03
Uploaded: 2014-07-11
Last sync: 2024-04-01 17:00

Citation

Citation formatting is not guaranteed to be accurate.
MLA Full: "Facebook's Secret Psychological Experiment." YouTube, uploaded by SciShow, 11 July 2014, www.youtube.com/watch?v=WXsNh7QV_4Y.
MLA Inline: (SciShow, 2014)
APA Full: SciShow. (2014, July 11). Facebook's Secret Psychological Experiment [Video]. YouTube. https://youtube.com/watch?v=WXsNh7QV_4Y
APA Inline: (SciShow, 2014)
Chicago Full: SciShow. "Facebook's Secret Psychological Experiment." July 11, 2014. YouTube video, 04:03. https://youtube.com/watch?v=WXsNh7QV_4Y.
SciShow News explains the science behind a psychological experiment performed on about seven hundred thousand Facebook users, although none of them knew that they were participating.
----------
http://www.audible.com/t1/30DayGoldFT_at?source_code=PDTGB0009PD010214
----------
Like SciShow? Want to help support us, and also get things to put on your walls, cover your torso and hold your liquids? Check out our awesome products over at DFTBA Records: http://dftba.com/artist/52/SciShow

Or help support us by subscribing to our page on Subbable: https://subbable.com/scishow
----------
Looking for SciShow elsewhere on the internet?
Facebook: http://www.facebook.com/scishow
Twitter: http://www.twitter.com/scishow
Tumblr: http://scishow.tumblr.com

Thanks Tank Tumblr: http://thankstank.tumblr.com

Sources:
http://www.pnas.org/content/111/24/8788.full
http://www.pnas.org/content/early/2014/07/02/1412469111.full.pdf+html
http://www.epa.gov/oppfead1/guidance/cr-require.htm
http://psychcentral.com/blog/archives/2014/06/23/emotional-contagion-on-facebook-more-like-bad-research-methods/
http://www.nytimes.com/2014/07/03/technology/personaltech/the-bright-side-of-facebooks-social-experiments-on-users.html?_r=0
http://bits.blogs.nytimes.com/2014/07/02/facebooks-secret-manipulation-of-user-emotions-under-british-inquiry/
http://www.webpronews.com/google-runs-20000-search-experiments-a-year-heres-the-process-diagramed-2012-04
http://www.technologyreview.com/featuredstory/428150/what-facebook-knows/
https://www.facebook.com/notes/facebook-data-team/rethinking-information-diversity-in-networks/10150503499618859
http://www.scribd.com/doc/57223242/Social-Network-Activity-and-Social-Well-Being
http://cameronmarlow.com/media/massive_turnout.pdf

(Intro)

You've probably already heard about it. In fact, you might have been part of it. Facebook caused an uproar last week when scientists from Cornell University, working with Facebook's own data science team, released the results of a psychological experiment they had performed on about 700,000 Facebook users, none of whom knew they were participating.

The team was studying what they call 'emotional contagion', or how people influence each other's emotions. The experiment took place over one week in January 2012. In all, 683,000 English-speaking Facebook users were selected at random to take part. That's about 1 in 1,500 of all Facebook users.

Once they were chosen, the contents of those users' newsfeeds were filtered without their knowledge. The filter hid certain posts that contained terms deemed either positive, like 'happy', 'good', and 'yes', or negative, like 'sad' and 'angry', or maybe a frowny emoticon.

Posts were scanned for these terms using an automated program called Linguistic Inquiry and Word Count (LIWC), so they weren't actually read by anyone on Facebook's staff.

The software was set up so that one group of subjects had between a 10% and 90% chance of not seeing a post if it contained a lot of negative terms. Another group had the same chance of not seeing posts with positive words. Then, after the feeds were tweaked, the same program scanned what the subjects in the experiment posted themselves.
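Purely as an illustration (the study's actual code wasn't published), the filtering step described above might look something like this Python sketch; the word lists, function name, and example posts here are made up for the example.

```python
import random

# Hypothetical stand-ins for LIWC-style term lists; the real study used the
# much larger LIWC dictionaries.
POSITIVE_TERMS = {"happy", "good", "yes", "awesome"}
NEGATIVE_TERMS = {"sad", "angry", "hurt"}

def filter_feed(posts, suppress, omission_prob):
    """Return the posts a subject would actually see.

    suppress      -- which kind of emotional content to hide ("positive" or "negative")
    omission_prob -- per-subject chance (between 0.10 and 0.90 in the study)
                     that a matching post is dropped from the feed
    """
    target = POSITIVE_TERMS if suppress == "positive" else NEGATIVE_TERMS
    visible = []
    for post in posts:
        words = set(post.lower().split())
        if words & target and random.random() < omission_prob:
            continue  # hide this emotional post from the subject's newsfeed
        visible.append(post)
    return visible

# Example: a subject in the "reduced negativity" condition with a 60% omission chance
feed = ["I am so happy today", "feeling sad and angry", "lunch was fine"]
print(filter_feed(feed, suppress="negative", omission_prob=0.6))
```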

After scanning through 3 million posts, the psychologists found that exposing users to more positive or more negative content seemed to have only a small effect on what they posted. If a subject saw more posts with positive language, they were about 0.07% more likely to include positive words in their own posts. That's 1/15 of 1%, so, like I said, a pretty small effect. You'd have to write a couple thousand words before the effect accounted for a single extra 'awesome' in your status updates.
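To put that number in perspective, here is the back-of-the-envelope arithmetic, assuming, as the framing above implies, that the 0.07% figure applies per word written:

```python
# Rough arithmetic for the effect size quoted above.
effect = 0.0007                         # 0.07% of words
print(1 / 15 / 100)                     # ~0.000667, i.e. 1/15 of 1%, close to 0.0007
words_per_extra_positive = 1 / effect
print(round(words_per_extra_positive))  # ~1429 words before one extra positive word shows up
```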

Interestingly, the findings themselves got less attention from the public than the fact that the study was done at all. A lot of users, somewhat understandably, thought it was creepy that Facebook was intentionally trying to manipulate their emotions, even if it was only by 1/15 of 1%.

Meanwhile, the psychological community and even some government regulators were more concerned with whether or not this was actually legal. In its defense, Facebook says that when users agree to its terms of use, they agree to take part in market research. But as any of you who watch Crash Course Psychology know, the first principle of research on human subjects is that you need their consent. Like, their explicit, written, informed consent. In the United States, that requirement is part of the federal policy known as the Common Rule.

So, the Office of the Data Protection Commissioner in Ireland and the Information Commissioner's Office in Britain are both asking Facebook some very pointed questions. But the fact is, experiments like these go on all the time. In 2013, the Facebook data science team conducted an experiment on about 250 million users. They wanted to see whether friends' networks create so-called echo chambers that prevent users from learning about, and therefore posting about, things outside the interests of their group. So, for a period of seven weeks, Facebook just randomly stopped some users from seeing links that were shared by their friends. It turned out that the echo chamber effect wasn't very strong: the results showed that people get their news from a wide array of sources, and what your friends re-share has surprisingly little to do with what you end up posting about.

And it's not just Facebook that does tests like this. Google's data science teams conduct more than 20,000 experiments per year on its users, to do things like refine its recommended-links algorithm or see which shade of blue people respond to more in an ad. Information like this is valuable in a concrete, cash-money sense, so companies are probably going to keep collecting it. But have we reached a point where this is something that needs to be regulated? I mean, sure, they're doing experiments on humans, but does it count as 'human experimentation'? I wouldn't be surprised if people are debating this a lot in the next few years.

Thank you for watching this episode of SciShow News, brought to you by Audible, which is giving away a free audiobook to SciShow viewers. If you head over to audible.com/scishow, you can download Carl Jung's classic 'Psychology of the Unconscious', or practically any other book you want to listen to, any time, for free. That's audible.com/scishow.