YouTube: https://youtube.com/watch?v=M5YKW6fhlss
Previous: Mass-Producing Ice Cream with Food Engineering: Crash Course Engineering #39
Next: Why You Need Trust to Do Business: Crash Course Business - Soft Skills #1

Statistics

View count:293,195
Likes:6,168
Comments:272
Duration:16:51
Uploaded:2019-03-12
Last sync:2024-11-14 17:00

Citation

Citation formatting is not guaranteed to be accurate.
MLA Full: "Social Media: Crash Course Navigating Digital Information #10." YouTube, uploaded by CrashCourse, 12 March 2019, www.youtube.com/watch?v=M5YKW6fhlss.
MLA Inline: (CrashCourse, 2019)
APA Full: CrashCourse. (2019, March 12). Social Media: Crash Course Navigating Digital Information #10 [Video]. YouTube. https://youtube.com/watch?v=M5YKW6fhlss
APA Inline: (CrashCourse, 2019)
Chicago Full: CrashCourse, "Social Media: Crash Course Navigating Digital Information #10.", March 12, 2019, YouTube, 16:51,
https://youtube.com/watch?v=M5YKW6fhlss.
Today, in our series finale, we're going to talk about the great white whale of navigating online information: your social media feed. Social media shapes both our online and offline behavior, from how we engage in communities and consume goods and services to how we form our thoughts and opinions. So let's talk about how these platforms really function - the good stuff and also the terrible stuff.

We know that navigating our current information environment can be frustrating, and we hope this series has helped you develop the habits to navigate our digital world a bit more confidently. Thanks so much for watching!

Read more about YouTube's effort to improve recommendations here:
https://youtube.googleblog.com/2019/01/continuing-our-work-to-improve.html

Special thanks to our partners from MediaWise who helped create this series:
The Poynter Institute
The Stanford History Education Group (sheg.stanford.edu)

Follow MediaWise and their fact-checking work across social:
https://www.instagram.com/mediawise/
https://www.youtube.com/mediawise
https://twitter.com/mediawise
https://www.facebook.com/MediaWise/

MediaWise is supported by Google.

Crash Course is on Patreon! You can support us directly by signing up at http://www.patreon.com/crashcourse

Thanks to the following patrons for their generous monthly contributions that help keep Crash Course free for everyone forever:

Eric Prestemon, Sam Buck, Mark Brouwer, Bob Doye, Jennifer Killen, Naman Goel, Nathan Catchings, Brandon Westmoreland, dorsey, Indika Siriwardena, Kenneth F Penttinen, Trevin Beattie, Erika & Alexa Saur, Glenn Elliott, Justin Zingsheim, Jessica Wode, Tom Trval, Jason Saslow, Nathan Taylor, Brian Thomas Gossett, Khaled El Shalakany, SR Foxley, Sam Ferguson, Yasenia Cruz, Eric Koslow, Caleb Weeks, Tim Curwick, D.A. Noe, Shawn Arnold, Malcolm Callis, William McGraw, Andrei Krishkevich, Rachel Bright, Jirat, Ian Dundore
--

Want to find Crash Course elsewhere on the internet?
Facebook - http://www.facebook.com/YouTubeCrashCourse
Twitter - http://www.twitter.com/TheCrashCourse
Tumblr - http://thecrashcourse.tumblr.com
Support Crash Course on Patreon: http://patreon.com/crashcourse

CC Kids: http://www.youtube.com/crashcoursekids

Hi, I'm John Green, and this is Crash Course Navigating Digital Information. So, we're going to talk about your social media feed today, but, first, at the beginning of this series, I told you one of two jokes I know. And, now that we've reached the last episode, I'd like to tell you the other one.

So a moth walks into a podiatrist's office, and the podiatrist says, "What seems to be the problem, moth?" And the moth answers, "Awww, Doc. If only there were only one problem. I can't hold down a job, because I'm not good at anything. My wife can hardly stand to look at me; we don't even love each other anymore. Worse than that, I can't even remember if we ever loved each other. When I look into the eyes of my children, all I see is the same emptiness and despair that I feel in my own heart, Doc." And then the podiatrist says, "Whoa, moth. Okay. Those are very serious problems, but it seems like you need to see a psychologist. I'm a podiatrist. What brought you here today?" And the moth says, "Oh, the light was on."

We humans like to think of ourselves as extremely sophisticated animals. Like moths may just fly toward the light, but humans are endowed with free will. We make choices. Except, a lot of the time, we just go where the light is on. We do whatever feels like the natural thing. We get on Facebook, because other people are on Facebook. We scroll through posts, because the architecture of the site tells us to scroll. We become passive.

In the past decade especially, social media has fundamentally changed us. Like, take your vocabulary, for example. Silicon Valley rivals Shakespeare in its prolific additions to the English language: friend, Google, and 'gram are all verbs now; snap and handle have new definitions; and sliding into someone's DMs is a thing.

But, it's not just how we speak. These apps have not-so-subtly become embedded in our daily lives very quickly. Sometimes we don't even realize how much they impact us. They've changed our perceptions and expectations of privacy, and they've also helped to shape our offline experience. In 2016, for instance, Russian agents organized political rallies all over the U.S. by creating fake Facebook pages for made-up grassroots communities that then had real offline rallies. Just by posing as organizers against Donald Trump or against Hillary Clinton, they actually got real people to show up in Florida, New York, North Carolina, Washington, and Texas. And, those rally-goers didn't know that it was a ruse. I find that scary.

So today, for our big finale, we're talking about the great white whale of navigating online information: your social media feed.

[Intro]

So, quick note here at the start: I'm not currently using a bunch of social media platforms, which may mean that I'm no longer an expert in them. But, it's only been six weeks and I don't think anything has changed that much.

Also, it turns out that whether or not you participate in Twitter is irrelevant to whether Twitter affects your life, because what's shared online has offline consequences. Like, online shouting matches about politics can influence how we vote, and also how we talk to our extended family at the Thanksgiving dinner table. Unless you don't live in the U.S. or Canada, in which case, I guess you don't have Thanksgiving and, presumably, you never fight with your aunts and uncles about politics.

The way we interact on social media is shaping all of our offline behaviors, from how we engage with IRL communities to how we consume goods and services. That's why so many people you don't know, along with companies and organizations, use social media to try to influence your thoughts and actions. Sometimes, those who want to influence you use false identities, like those behind the Russian rallies. Sometimes, and more overtly, they buy your attention with advertising. Some just create really engaging videos about a kitten saved during a hurricane to steal your attention.

Some of these actors have relatively benign goals and act fairly, like a company sending ads into your feed for a Harry Potter mug that it turns out you actually want because you are a Hufflepuff and you are proud. But, others have terrible motives and spread disinformation, like hoax news sites, which are all run by Slytherins.

Still others aren't quite in either camp. Like, they might unwittingly spread inaccurate information, or misinformation. Like your aunt who always posts Onion articles like they're actual news. Or me, on the several occasions when I have failed to pause and laterally read before retweeting news that turned out to be false.

The big problem with all of that is that 68% of U.S. adults get news through some form of social media, and nearly half of U.S. adults get news through Facebook. And, across the globe, people between 18 and 29 years old are more likely to get their news from social media than older adults. 

When we're this reliant on a media ecosystem full of pollution, we have to take responsibility for what we read, post, and share. And to do that, we should fully understand how social media networks function, including the good stuff and, also, the terrible stuff.

First, the good side. For one thing, platforms like Facebook, Twitter, and Instagram allow us to share information and thoughts without the help of traditional gatekeepers. Prior to social media, it was really difficult to have your voice heard in a large public forum. And, because all the posts in our feeds look more or less equal, social media has allowed people to have voices in public discourse who previously would have been silenced by power structures. That's great! All tweets were created equal and everybody's faces look weird with that one square-jawed Snapchat filter, and we're all in this together!

Also, social media is great for making friends and finding communities. We can organize ourselves into these little affinity groups around special interests or organizations, which makes communication much easier than it ever was before. Like, for example, what if a group of people want to get together and figure out how to decrease the overall worldwide level of suck? Or, when I need to know what is eating my tomatoes, I can go to a gardening Facebook group. That example, by the way, is for old people alienated by my previous mention of Snapchat filters.

That said, there are plenty of problems with social media, from cyberbullying to catfishing to scams to massive disinformation campaigns to people live-tweeting shows you wanted to watch later. And, if you're going to live partly inside these feeds, I think it's really important to understand both the kinds of information that are likely to be shared with you, and the kinds of information you're incentivized to share.

Let's start with targeted advertising. So, you're probably seeing an ad in this corner, possibly this one. I don't have a great sense of direction when I'm inside the feed. Or, maybe you watched an ad before this video played. Regardless, you may have noticed that something you searched for recently has been advertised to you. Like, for instance, I'm trying to improve my collection of vintage cameras for the background, and suddenly all I see are advertisements for vintage cameras.

Social media companies make money by selling advertisements. That's why you get to use those platforms for free. But, these ads are very different from billboards or ads in a local newspaper, because these ads were crafted just for you, or people like you, based on what social media companies know about you. And, they know a lot. They can learn your interests and habits based on how you use their app, but they also track you elsewhere, via other apps associated with that company or by using geolocation features to figure out where you physically are.

Social media companies take all that information and present it to advertisers in one form or another, so that those advertisers can target their ads based on your interests and browsing history and location and age and gender and much more. Can you protect your privacy and your feed from targeted advertising? Kind of. Sometimes. You can check your favorite apps and disable data and location tracking where you can; these features may fall under ad preferences or security or privacy settings.
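To make that targeting idea concrete, here is a minimal sketch, in Python, of how interest-and-demographic ad matching could work in principle. Everything in it - the names, the fields, the example data - is invented for illustration and does not come from any real platform's systems.

# Hypothetical sketch of interest-and-demographic ad targeting.
# All names and data below are invented for illustration only.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    age: int
    location: str
    interests: set = field(default_factory=set)

@dataclass
class AdCampaign:
    name: str
    min_age: int
    max_age: int
    locations: set        # empty set means "anywhere"
    target_interests: set

def matching_ads(user, campaigns):
    """Return the campaigns whose targeting rules this user satisfies."""
    matches = []
    for ad in campaigns:
        if not (ad.min_age <= user.age <= ad.max_age):
            continue
        if ad.locations and user.location not in ad.locations:
            continue
        # Require at least one overlapping interest, if any are specified.
        if ad.target_interests and not (ad.target_interests & user.interests):
            continue
        matches.append(ad)
    return matches

user = UserProfile(age=34, location="Indianapolis",
                   interests={"vintage cameras", "gardening"})
campaigns = [
    AdCampaign("Camera shop", 18, 65, {"Indianapolis"}, {"vintage cameras"}),
    AdCampaign("Ski resort", 25, 50, {"Denver"}, {"skiing"}),
]
print([ad.name for ad in matching_ads(user, campaigns)])  # ['Camera shop']

The point is simply that the more detailed the profile, the narrower the targeting can get, which is why those tracking settings matter.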

Another potential downside to social media: how algorithms organize our feeds. So, algorithms are sets of rules or operations a computer follows to complete a task. To put it very simply: social media sites use what they know about your habits, they combine that with their knowledge of other people and the things you've self-selected to follow, and funnel all that information through an algorithm. And then, the algorithm decides what to show you in your news feed.

Generally speaking, a news feed algorithm looks for what you're most likely to engage with, by liking or sharing it. Social media companies want you to stay engaged with their app or site for as long as possible; so they show you stuff that you like, so you won't leave, so that they can sell more of your attention. And, because the algorithm mostly shows us things we are likely to like and agree with, we often find ourselves in so-called filter bubbles surrounded by voices we already know we agree with and often unable to hear from those we don't.
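As a rough illustration of that ranking idea - not any platform's actual code - here is a hypothetical scoring-and-sorting step in Python. The weights, field names, and scores are made up; real systems use far more complex, proprietary models.

# Hypothetical sketch of an engagement-ranked feed.
# The scores, weights, and names are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    predicted_like_prob: float   # model's guess that you'll like it
    predicted_share_prob: float  # model's guess that you'll share it
    outrage_score: float         # how emotionally charged it looks

def engagement_score(post, follows_author):
    score = 2.0 * post.predicted_share_prob + 1.0 * post.predicted_like_prob
    score += 0.5 * post.outrage_score   # charged content keeps people scrolling
    if follows_author:
        score += 0.3                    # boost accounts you already follow
    return score

def rank_feed(posts, followed_accounts):
    """Order candidate posts by predicted engagement, highest first."""
    return sorted(posts,
                  key=lambda p: engagement_score(p, p.author in followed_accounts),
                  reverse=True)

Notice that nothing in a score like this rewards accuracy, which is exactly the problem described next.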

This also means that most news feed algorithms are skewed toward engagement rather than truth. This is so often the case, in fact, that entire businesses have been successfully run on posting engaging, but false, news stories. Many news feed algorithms favor outrageous and emotionally engaging content, so companies looking to make money from clicks and advertisements can use that to their advantage. Hundreds of websites were built on false viral stories leading up to the 2016 U.S. election, and BuzzFeed later found that many were run by teenagers in Macedonia.

Valuing engagement over quality makes it harder for users to distinguish between truth and fiction. Like, humans tend to interpret information in a way that matches our pre-existing beliefs. That's called confirmation bias. But, even if you did somehow manage to be completely emotionally and ideologically neutral on a topic, research has shown that if there's information you know is bogus, encountering it again and again means you start to believe it. 

Warding off all these negative effects of algorithmic news feeds and filter bubbles is really hard. But, I do think you can limit these effects by, A, following people and pages that have different viewpoints and perspectives than you do, to add some variety to your feed and, B, looking for ways to turn off the "best" or "top" posts features in your favorite social apps so that they display information to you in a more neutral way.

All of these negative features of social media combine to create the feature that I personally worry about the most: extreme recommendation engines. Social media algorithms show you more of what you've already indicated you like. The way we use those apps tends to keep us surrounded by information we're primed to believe and agree with. And, because engagement is the most important thing and we tend to engage with what most outrages, angers, and shocks us, the longer we hang out on some social media apps and engage with outrageous content, the more likely those apps are to push outrageous content to us.

Researchers have found that YouTube's recommendation algorithms, for instance, consistently showed users more and more extreme, far-right channels once they began watching political videos. They called it a radical rabbit hole. YouTube was lumping together outlets like Fox News and the channels of Republican politicians with those of known far-right conspiracy theorists and white nationalists. They also found that far-left channels have smaller followings and were not nearly as visible via those same pathways. Now, beginning in 2017, YouTube started to update its algorithm to prioritize what it called authoritativeness, in part to try to stop this from happening. But, as previously noted, no algorithm is perfect or objective.

Ultimately, it's on us as users not to fall down these rabbit holes, not to go merely where the light is on. That's why I think it's so important to follow accounts with differing viewpoints, and to turn off data tracking if you can, and, in general, to try to unwind the algorithmic web around your social media life.

And, while you're in the feed, it's important to remember to read laterally about sources you don't recognize. And, also, take a break once in a while. Talk to actual people. Get some fresh air. I really think that's valuable.

But, even though I personally had to leave lots of the social internet, I do believe that social media can be an effective way to learn about news and other information, if you're able to protect yourself. Let's try this in the Filter Bubble. Oh yeah, that looks about right. Yes, surrounded by everything I love and believe in.

Ok, that's enough, let's go to the Thought Bubble.

Ok, so your cousin DMed you a link. Headline: Singing Creek Park Sold, Will Be Home to Monster Truck Rally. Wow! That is your favorite park, so that's a huge bummer. Your first instinct, of course, is to repost it with an angry comment, like "UGH we need nature WTH this is so unfair."

But, wait. No. Take a deep breath and think. Your cousin is kind of a big deal; he's blue-check verified and everything. But, blue check marks and verified profiles do not denote truth. They just mean an account is who it claims to be.

So, you click the link, and it's from a site called localnews.co, which you've never heard of. And, this is where your lateral reading kicks in. Use a search engine to look up the name of that site. Its Wikipedia entry reveals it's a recently founded independent news site for your area, but it's a very short Wikipedia article - not many reputable sources have written about the site to give us a better idea of its perspective or authority.

So, you search for their claim instead: Singing Creek Park sale. The first result is that sketchy Local News site. Let's peruse that entire search page. Aha! There you go, the seventh result is from a website you do know and trust, your local TV station, and they say the park was sold, but it's actually going to be turned into a non-profit wildflower preserve. Which, you know what, sounds pretty lovely.

You could leave it at that, but as a good citizen of the internet, you should correct this misinformation. Tell your cousin what's up, they won't at all be defensive, ask them not to share it, and then post the trustworthy article yourself with the headline, "condolences to monster truck enthusiasts." Mission accomplished.

Thanks, Thought Bubble.

So, during this series, we've talked a lot about using lateral reading to check the source, look for authority and perspective, and then check the claim and its evidence. With social media, a more flexible approach is probably best. Like, sometimes, it makes sense to find out who's behind the account you're seeing. Sometimes, you should investigate the source of what they're sharing. Other times, it's best to evaluate the claim being made. As you practice, you'll develop a better idea of how to spend your time online.

No matter where you begin, lateral reading will help you get the information you're looking for. When in doubt about anything you encounter online, you can challenge your source and your own assumptions, and see what other people have to say. And, there's one last thing I'd add: Be suspicious of information that confirms your pre-existing worldview, especially stuff that confirms that people you believe to be evil or stupid are evil or stupid.

Read laterally, not only when it comes to stuff you don't want to be true, but also when it comes to stuff you do want to be true. I know our current information environment can be frustrating. Believe me, I am frustrated by it. It is really difficult to know where to look for truth and accuracy, and I wish I could tell you there is one right way, one source you can always rely upon. But, the truth is, anyone who tells you that is selling you an ideology or a product or both.

But, by making a habit of following up and following through, we can be expert navigators of digital information, and maybe even go to places where the lights are not on. 

Thanks so much for joining us for Crash Course: Navigating Digital Information. And, thanks again to the Poynter Institute and the Stanford History Education Group for making this series possible. MediaWise is supported by Google.

If you're interested in learning more about MediaWise and fact checking, a good place to start is @MediaWise on Instagram. 

Thanks again for watching. Good luck out there in the wild west. And, as they say in my hometown, don't forget to be awesome. 

[Outro]

Thank you for watching Crash Course, which is filmed here in Indianapolis, Indiana with the help of all of these nice people. For this series, Crash Course has teamed up with MediaWise, a project out of The Poynter Institute that was created with support from Google. The Poynter Institute is a non-profit journalism school. The goal of MediaWise is to teach students how to assess the accuracy of information they encounter online.

The MediaWise curriculum was developed by the Stanford History Education Group, based on civic online reasoning research they began in 2015. If you're interested in learning more about MediaWise and fact-checking, you can visit @MediaWise on Instagram.

Thanks again for watching, and thanks to MediaWise and the Stanford History Education Group for working with us on this project.