YouTube: https://youtube.com/watch?v=EJtNmd1kV44
Previous: Searching for Betsy in Brazil
Next: On February 3rd, 2020, I Wrote a Video I Didn't Make. Here It Is.

Categories

Statistics

View count: 426,303
Likes: 28,450
Comments: 2,181
Duration: 21:42
Uploaded: 2021-10-07
Last sync: 2024-10-25 04:15

Citation

Citation formatting is not guaranteed to be accurate.
MLA Full: "We're Thinking About Facebook Wrong." YouTube, uploaded by vlogbrothers, 7 October 2021, www.youtube.com/watch?v=EJtNmd1kV44.
MLA Inline: (vlogbrothers, 2021)
APA Full: vlogbrothers. (2021, October 7). We're Thinking About Facebook Wrong [Video]. YouTube. https://youtube.com/watch?v=EJtNmd1kV44
APA Inline: (vlogbrothers, 2021)
Chicago Full: vlogbrothers. "We're Thinking About Facebook Wrong." October 7, 2021. YouTube video, 21:42. https://youtube.com/watch?v=EJtNmd1kV44.
Pizzamas products available now until the end of next week! http://www.pizzamas.com

Usually I'm up in the description being like, "here are a bunch of things I wasn't really able to fit in the video" but actually...turns out, if you let me go for 20 minutes, I get most of it out.

OK, I guess the one insight I had during this that didn't fit is to do with righteous superiority. It (and fear) are most effective on people who struggle with self-worth (which is to say... y'know... basically everyone, but also some people more than others). In particular, people who might have been told that they would have a lot of status but then ended up not getting it might be very susceptible to both fear-based and superiority-based content. Interestingly, the fear and the superiority almost always go hand-in-hand. Like, "here are the people you are superior to, and here is why you should be afraid of them" is a wildly engaging thought.

Anyway, this is really intense and I hope we all get through it.

2:00 So... Facebook is scary
3:56 We're babies
6:58 Ok, but what is Facebook?
10:43 What should we do?
16:10 Platforms don't just host!
19:02 Facebook is regulating society
19:40 The end

----
Subscribe to our newsletter! https://nerdfighteria.com/nerdfighteria-newsletter
And join the community at http://nerdfighteria.com
Help transcribe videos - http://nerdfighteria.info
Learn more about our project to help Partners in Health radically reduce maternal mortality in Sierra Leone: https://www.pih.org/hankandjohn
If you're able to donate $2,000 or more to this effort, please join our matching fund: https://pih.org/hankandjohnmatch
John's twitter - http://twitter.com/johngreen
Hank's twitter - http://twitter.com/hankgreen
Hank's tumblr - http://edwardspoonhands.tumblr.com
Book club: http://www.lifeslibrarybookclub.com/

----
Good morning John, it's Thursday.

So, remarkably, we have mostly forgotten about this by now, but there was a day this week when a lot of people had the thought, "Facebook has been down for a really long time now, maybe it won't come back."

That was accompanied by this tweet getting 400,000 likes and retweets, "So, someone deleted large sections of the routing... that doesn't mean Facebook is just down, from the looks of it... that means that Facebook is GONE."

Several people then wrote blog posts about how Facebook might be gone because of a tweet from someone who is not qualified to comment on it, because like that would be the most interesting piece of news that day, and I'm sure it got a lot of impressions. Which I guess is the goal, though it does rely on a complete misunderstanding of how websites work.

So maybe that's lesson number one: while something is happening is the worst time to learn about it, but it is also the time when we most want to know about it. And that is the root of one of many different destructive mechanisms that happen on social media. That has nothing to do with what platform you're on, it has to do with the barriers to distribution and discovery being low enough to ice skate over.

This all came on the back of Frances Haugen, a former Facebook employee, releasing a huge trove of documents in which Facebook did research on itself and discovered that it was, you know, not great: research that it then promptly ignored and kind of hid.

So Facebook's no good very bad week gave us a chance to think, "Uh, what if it didn't exist?" But actually, a lot of people really do depend on Facebook to maintain their businesses, their social lives, their families, their connections with people in other countries who they love. I'm just saying, maybe don't go to the people who are gleefully celebrating to be the ones who have the best insight on this topic.

But I do think it's important to engage in a thoughtful way with the existence of Facebook, the impact of Facebook. Hence, this very long, educational video, which makes it exempt from the four-minute Vlogbrothers rule.

Now I have, over the years, been vocally critical of Facebook for a number of different reasons, so you might just expect me to, like, launch out with how bad and evil it is.

[audio from CollegeHumor sketch] "Oh, it's just a fun little social network that undermines the foundations of our democracy. And lets me know when my dad is done building the deck."

"Why would those two things be put together?"

"I don't know."

When Justin Timberlake was bringing "Sexy Back" 15 years ago, when I was a full-fledged adult, Facebook was only available to high school and college students. Now it is the sixth most valuable corporation in the world, and possibly the most influential corporation ever.

Like yes, there have been newspaper monopolies that controlled the news, there have been energy monopolies that controlled energy, there have been telephone monopolies that controlled long-distance communication. But Facebook might know more about you than you do. Like, their internal research has shown that they can affect people's moods. They can decide whether to inform or misinform you. They can help a community thrive, they can make a community stop existing.

And they can do that not just with like, 60,000 people, like the size of Missoula, they do it with billions, billions of people. You start thinking about this too hard and you realize that Mark Zuckerberg might be the most powerful person to exist, which is not a super comforting thought.

Facebook has a lot of power. Like, a coalition of European politicians on both sides of whatever they have, I guess an aisle, probably they have an aisle. They wrote to Facebook and they said, "Your platform is forcing us into more extreme positions than we want to have, and maybe even than our constituents want us to have, because, in order to get elected, we have to play the attention and engagement game that you have set up for us."

Now you might say, "Well those politicians should have some spine, and do what they want to do, rather than what like, the mass of people on Facebook want them to do." But let's all accept together that a politician's incentive is to get elected. And if they don't engage in the attention game and someone else does, they don't get elected and then that person is in power.

Look, it's easy to not be sympathetic towards politicians, I get that, but I am also very aware that there are probably a lot of people who would be very good public servants who are not ever going to consider taking on that line of work because of the environment that the social internet has created. 

These platforms make life miserable for every politician who is not a garbage human. 

We are at the very beginning of this. It always feels, in any given moment (I'm old enough now to recognize this), it always feels like where we are is the destination. We have arrived at this moment, and that is where we were going. And then it turns out to be not true: this moment ends up being another step on the journey to the other place where we were going, which is the new present that we will experience in the future. So I'm just trying to remember that we are at the very, very, very beginning of this.

If you ask someone what the most disruptive technologies in history are, they are probably going to tell you about some weapon. Which I get, but I think the most disruptive human technologies are always communications technologies, ways that people transmit information from person to person. Stories, and books, and plays, and trade, and radio, and television. Radio was fantastic for the Nazis, the Catholic Church was fractured by the printing press. 

I used to think that I irrationally believed that the social internet was as big of a deal as the printing press. I now believe that that is 100% a rational belief. In fact, it feels like a common belief now. 

And so, if we are at the beginning of this, and it is a huge revolution in like the main human thing, which is human to human communication, then like it shouldn't be a surprise that we're bad at this. And we're bad at this.

I'm sorry if you work at one of these companies, but we need to accept this. Facebook, and Twitter, and TikTok, and YouTube, are bad at this. And you, watching this, are bad at it; and I, making this, am bad at it. Of course we're bad at it. It's revolutionary, and it's a baby. We have no systems for how to deal with many-to-many decentralized communication. It's brand new. We don't know what we're doing.

So as with apparently everything, we have to figure it out, and that means that some, not all, but some thoughtful people need to engage with it in a serious way. And if you're watching this, maybe you're one of those people. 

And actually I was a little bit impressed by the congressional hearing, the testimony that Frances Haugen, the Facebook whistleblower, gave. The questions that were asked seemed to me to show a much deeper understanding of how all of this works than I had previously seen. It means maybe we're starting to have useful conversations about how to live in a society that contains social media.

And look, it is very easy to say, "Facebook sucks." I say it all the time. But it is much harder to imagine what one would do to make Facebook better. 'Cause guess what, like, apologies to the random bloggers who thought that Facebook was gonna not exist anymore, it's gonna keep existing. If Mark Zuckerberg wanted to just like pull the plug and be like, "Woah, what a great experiment, we're not gonna do it anymore," that would be a hundo-p baller move. But it does seem unlikely, like I feel like Mark Zuckerberg has a lot of incentives to believe that Facebook does more good than harm, and so he's probably going to believe that, whether or not it's true. Which is actually, for clarity, not something I know. Honestly, I'm not even sure many people could even answer the question, "What is Facebook?"

Let's try and build a mental model of what Facebook actually is. I'm not saying it's going to be perfect, but I am saying that it's probably gonna be better than whatever jumble of hot takes you currently have associated with your Facebook neuron. 

Facebook simplified: there are three centers of power. There are the people who consume the content, the people who create the content, and the people who host and promote content; that third one is Facebook. This finger, I'm not flicking off Facebook. 

All three of these groups have different goals, but it turns out that their incentives align really well, which is how Facebook became worth a trillion dollars. 

So let's go over the incentives, let's try to understand why these people are all doing what they're doing. 

Starting with the people who create the content. They are trying to make their numbers go up, whether that is for status, or for money. And trust me when I say, watching the numbers go up can feel really good, and once you start to get them, it can take a whole lot of time and work to start divorcing your self-worth from your internet numbers. Facebook needs to incentivize the creation of content on its platform, because without that, there's nothing there for people to do. And so they provide a lot of different feedback mechanisms that make people feel good about the stuff that they make, that make people feel like they are acquiring status, and very occasionally, that they are getting revenue.

Next, let's talk about the people who consume the content, which is to say everyone, and let us all accept that we, as people who consume content, have incentives for what content we consume. We like good cats, we like good jokes.

There are a number of different incentive cycles here, but the ones that we tend to talk about and need to talk about most are the ones that are destructive to society or to people who are consuming the content.

And the number one emotion everyone talks about, when talking about this, is "outrage." And I would like us to stop talking about outrage, because I don't know what it is.

Instead I would like to talk about two other things. One is fear. People, when they are afraid, experience a lot of emotion and they experience a lot of desire to share that information and try and take on that thing that they are afraid of. Which like, we knew about that. I mean, we knew about that before social media, like news was always aware of fear.

But the second thing that social media is very good at, that we often call outrage, is actually a feeling of righteous superiority. I feel really good about myself because I know that I am better than whatever other human I have just been exposed to or told about.

Feeling like a good person who is specifically good because they are better than someone else, super seductive. Because, it means you don't have to do anything to be good. You get to feel better about yourself just by looking at how awful people are. But it is not new, like I don't wanna point fingers or anything, but it turns out that righteous superiority was a pretty big deal for Martin Luther. 

And also, people are often correct in their righteousness, and in their superiority, like, hello vegans.

And finally, the last player in the game? The platform itself. Their incentive is, or at least was, very clear: keep people on the platform, learn more about them so you can sell more advertisements at a higher price. So there's a positive feedback loop, where platforms will promote content that makes people feel righteously superior, those people will consume that content, they will feel righteously superior, more people will create that content because it will do better, because the platform will promote it more.

And there's also external positive feedbacks into the loop, because if you have a group of people feeling righteously superior, other people find out about them and then they make themselves feel righteously superior to that other group because they don't want to feel inferior.

I don't want to oversimplify this, but like if you run this positive feedback loop for about ten years, maybe you end up with everyone spun up like a bunch of Beyblades thrown into a fishbowl full of fragile institutions.
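
To make that loop concrete, here is a minimal toy simulation. It is entirely my own illustration, not based on anything any platform has published, and the content categories and engagement rates are made-up assumptions. It just shows that if promotion follows what creators make, engagement follows what gets promoted, and creators chase engagement, the content mix drifts toward whatever engages most, even though nobody decided that on purpose.

```python
import random

# Hypothetical engagement rates (made up for illustration): the chance that a viewer
# reacts to or shares each kind of post. "Righteous superiority" engages best.
ENGAGEMENT_RATE = {"cute_cats": 0.05, "good_jokes": 0.06, "righteous_superiority": 0.12}

def simulate(years: int = 10, posts_per_year: int = 10_000) -> dict:
    # Creators start out making each kind of content in equal proportion.
    share = {kind: 1 / 3 for kind in ENGAGEMENT_RATE}
    for _ in range(years):
        engagement = dict.fromkeys(ENGAGEMENT_RATE, 0)
        for _ in range(posts_per_year):
            # The platform surfaces content in proportion to what is being made...
            kind = random.choices(list(share), weights=list(share.values()))[0]
            # ...and users engage most with what makes them feel afraid or superior.
            if random.random() < ENGAGEMENT_RATE[kind]:
                engagement[kind] += 1
        # Creators chase the numbers: next year's mix follows this year's engagement.
        total = sum(engagement.values())
        if total:
            share = {kind: count / total for kind, count in engagement.items()}
    return share

if __name__ == "__main__":
    print(simulate())  # the righteous_superiority share climbs toward 1.0
```

Run it for ten simulated years and the righteous-superiority share ends up near 100 percent, even though no single step in the loop looks like a decision to fill the feed with it.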

There are also other feedback loops, which is why there are cute cats, and good jokes on all of these platforms. But hopefully this gives us a workable mental model of what Facebook is, and we can move to the next section of the video.

We've got these three power centers, and two of them are extremely dispersed and difficult to control. Like no one (no one!) is able to control what people are posting and consuming on social media platforms.

But the third one is centralized, and it's making 85 billion dollars a year, so that makes it a more attractive target, both in terms of optics and in terms of being something regulators can actually control. You might have even seen or heard Facebook kind of begging to be regulated, which sounds strange.

Here, for example, is the end of a statement that Facebook made in response to Frances Haugen sharing all this information about the platform.

"Despite all this (all this being the fact that they don't agree with anything she said), we agree on one thing; it's time to begin to create standard rules for the internet. It's been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act."

This is a good point, like it's a good talking point, it's also just a good point, but let me explain why you're gonna be hearing a lot of it and why it might not be as good as it sounds. 

This is Facebook saying, "Okay. I see that you think that we've created a lot of problems, how about you fix them? Are we going to give you advice on how to fix them? Nah. Are we gonna give you insight that will allow you to understand the platform well enough to create good solutions? Also nah."

They're basically saying, "Oh, okay, I see you, I see what you're doing, you're saying that we're the big bad, and the responsibility is on us, we're saying, 'What about you? We think you're the big bad. The responsibility should be on you.'"

And maybe it should. Right? Like, should Facebook's responsibility be to, like, impose things that control society, or should that be the job of legislators? 

This outlines a really good and interesting point, which is that Facebook, with regards to this stuff (unlike with basically everything else, where they're doing fine), is kind of in a terrible position.

So a company as big and powerful as Facebook can kind of have one of two goals. It can either be out there for profit, or it can be out there for profit and. And what "and" is gets really complicated when you are as powerful as Facebook. And I think maybe this conversation kinda terrifies them because it outlines a reality that they would like us to not notice.

They are not just a place where content goes, they decide what people see. They make millions of decisions per second about the world that people experience. And it is very easy to make those decisions when you have a particular goal, that goal being how do we keep people on the site longer, learn more about them so we can sell them more effective advertisements.

And then you can start to put on some little limits, like if this is going to incite violence, if it's hate speech, we're gonna take that off. But that's just about the law, and like what the right thing to do is. But if you get beyond that, and you start to say, "Okay, well, sort of what counts as hate speech, what counts as incitement of violence," and raising the barriers higher and higher and being like, "Ah, that's on the line, but we're going to call it incitement of violence," that's gonna constrict the speech of some people more than others, particularly it's gonna constrict the speech of some members of certain political parties more than others.

And of course they can legally do that, they're a private company, until they can't anymore. We're already seeing this, like right-wing politicians getting up there and saying, like "People are being censored on social media," in a way that of course makes the constituents feel both afraid and righteously superior. What a win!

And many of them are threatening a whole separate set of regulations that would prevent these social media platforms from policing themselves, which for clarity, would be a disaster.

So Facebook is saying, "If you want us to do something, tell us what to do, because we're afraid that if we do more than the bare minimum, you're going to regulate away our ability to do anything at all. So until you make that call, we'd rather just do what is normal and expected of a corporation, which is to focus on profit."

Now there are also other reasons why Facebook is focusing on this conversation about regulation. One is that it would regulate all internet companies, not just Facebook: the "rules of the internet," not the rules of Facebook. And regulations are often actually a competitive advantage for existing incumbent large businesses over potential competitors, because the regulations increase the barrier to entry into that industry. I'm not saying I'm anti-regulation on these companies, I'm very pro-regulation on these companies, I'm just not so pro-regulation on these companies that I will ignore reality.

But the last and maybe most important reason why Facebook is calling for regulation on itself is that then they get to go after the individual actual pieces of legislation, like the individual suggestions, which they will of course do. Like, yes, please regulate us, but then any individual regulation they will fight against tooth and nail. And yes, it is a lot easier for a legislator to just like yell about how much Facebook sucks than to actually write legislation that would make it better. Especially when regulation is complicated, whereas shouting that Facebook sucks can make them and their constituents feel, wait for it, righteously superior.

And none of this regulation would be easy, 'cause yes, legally, Facebook can control its product in whatever way it wants to, but let's accept that it's not just a private corporation in that way anymore, like we don't want it to be able to control society in whatever way it wants, and that is why they're sort of very hesitant to do that, because it's terrifying. 

Facebook is a place where communities thrive, where families connect, where businesses grow. In a very real sense, it is kind of both corporation and government, like if you spend a lot of time on Facebook, a lot of your life there, they are in charge of that part of your life. That makes them a kind of government. A fact that they would like everyone to keep not realizing for as long as possible, because it has implications.

Oh my god, I've been recording for 50 minutes now, what the hell did I do to myself, I do have more that I need to say here.

I've implied this already, but I'm gonna say it out loud right now. The real danger and power and weakness of Facebook and YouTube and Tiktok and Twitter is not that they host content, it's that they decide what content gets new eyeballs on it. Hosting a piece of content is the building block that the entire internet was built on. Promoting content is an active, editorial decision that is being made by the company. Now that decision might be being made by a computer program that no one understands, that only has inputs from the users of the platform. It is still a decision that the company makes. 
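
As a sketch of that distinction, under entirely made-up names (nothing here resembles any real platform's code): hosting just stores a post, while promoting chooses the handful a viewer actually sees, and whatever rule does the choosing, even "just rank by predicted engagement," is the company making an editorial decision.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    predicted_engagement: float  # imagined output of a model trained on user behavior

def host(storage: list, post: Post) -> None:
    """Hosting: keep the content available. No judgment about who sees it."""
    storage.append(post)

def promote(storage: list, feed_size: int = 3) -> list:
    """Promoting: decide what gets new eyeballs. Ranking by predicted engagement
    looks neutral, but the ranking rule is still a choice the company made, and
    this particular rule systematically favors whatever engages most."""
    return sorted(storage, key=lambda p: p.predicted_engagement, reverse=True)[:feed_size]

if __name__ == "__main__":
    storage: list = []
    host(storage, Post("a", "a cute cat", 0.05))
    host(storage, Post("b", "a good joke", 0.06))
    host(storage, Post("c", "those people are awful and coming for you", 0.12))
    print([p.text for p in storage])              # hosting: everything is simply there
    print([p.text for p in promote(storage, 1)])  # promoting: the outrage post wins
```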

And so, when a hateful Facebook group gets promoted, when there's an anti-vax talking point trending on the side of Twitter, when watching a YouTube video about history takes you into the alt-right pipeline, those are decisions that are made by platforms full of thoughtful people who did not think at all about the fact that that was gonna be a thing that happened.

This is a problem with every social media platform that uses algorithms. Like you can scroll all day on TikTok and it will just be full of people celebrating the diversity of humankind and their own remarkable human body and mind. And you will never be exposed to the garbage, terrible, alt-right pipeline inside of TikTok, because TikTok is really good at knowing who you are.

What I'm saying is when people say that Facebook should be held responsible for the content that's on its platform, I agree, but in a pretty limited way. Like they should not host hate speech, they should not host illegal content. But I believe that in a much more limited number of cases than the following sentence: Facebook should be held responsible for the content that it promotes on its platform.

I think that that is something that every single person who works at a social media company needs to square with, and it's not an easy thing. Because it is very difficult to create that joyous, wonderful experience that you might have on an algorithmic feed without indulging in other people's righteous superiority and their fear in a way that continues to spin us up, create division, and destroy fragile institutions.

In general, an algorithmic content platform's response to this situation is, "Okay, let's create artificial intelligences that can identify hate speech, that can identify incitement of violence, and then we will take down those individual things, but we're gonna leave up this remarkable, self-reinforcing feedback loop that is so fantastic at keeping people engaged on our platform," which is a little like saying, "We're gonna, like, follow behind the dune buggy full of flamethrowers and put out the fires that it's creating." You guys, you gotta focus on the dune buggy with the flamethrower!

And that is a conversation that I actually feel like is starting to happen. 

Now I feel like it's necessary to say there are people who are smart and who I respect who say that when you talk about regulating social media companies, you're not talking about regulating media, you're talking about regulating society, and we don't want the government to regulate society, that's scary.

My response to that argument is that Facebook is already regulating society, it is just doing it with a very narrow, short-term focus, which is how do we increase the amount that we know about people and the amount of time they spend on our platform, so that we can make more money advertising to them. 

I wish, at the end of this video, that I had, like, good advice for how to move forward, like I think that we're not gonna educate people out of, you know, response to fear and indulgence in righteous superiority.

But there are some signs that maybe, like, communities do kick in, and some people, without thinking about this actively, realize that, like, ah, that kinda content seems like overplay.

But we can't say, like, "Facebook is only bad because humans make bad decisions," because like humans are humans, and we're not gonna get out of that.

It might be that these companies start to get their platforms to recognize when they are engaging in the promotion of societally or personally destructive content. I encourage all computer scientists to work on that problem. I think that we need to say to ourselves, "It would be very hard for Facebook to make a profit in a world that has been completely ripped apart by Facebook."

And as for what legislators should do, like, I don't know, I'm not, thank goodness, I don't have your job.

But I will say I'm very much not in favor of making it so that platforms are ultimately responsible for like, anything that gets posted on their platform. That would basically make it so that none of these things could exist at all, but I am in favor of saying that they are responsible for what they promote on their platforms, because that is their decision.

I don't know if the world would be better without this many-to-many decentralized communication system that we have developed. I know that my life would be very different without it.

But I will reiterate something I said earlier, which is that we are at the very, very beginning of this. Of course we are bad at it. We're gonna look back in 20 or 30 years at the way things are right now and we're gonna think about how quaint and clunky and just hilariously bad it was. 

I don't know how we're gonna move forward from here. But I know that we will. 

John, I'll see you on Tuesday.

All of our lovely Pizzamas stuff is available only during Pizzamas, so for the next week or so, and then it will never be available again, none of it will ever be available again. You can go to https://pizzamas.com/ right now to check it out, all of the profits go to support our community's work, developing better healthcare systems in Sierra Leone.