Hank: Hello, Hankschannel. So for whatever reason, people keep wanting to talk to me about stuff, and recently I got an email from Bill Gates's...people, and they were like, "do you want to talk to Bill about misinformation and the internet?" And I was like, "Yeah, I do, I definitely do."
But I also didn't want to have a conversation that went over, like, I felt like, the same ground over and over, so -- First, I asked him for a little bit more time than he offered, because 20 minutes didn't seem like enough to me. And second, I did some more thinking and reading to try and have a more-- like, a hol-- more holistic look at how this stuff is made, why it's made, what the motivations are for people who are both creating this content and getting really hooked in by it.
Obviously, I think about this kind of stuff a lot; I wrote two whole books about it, kind of. And also, a lot of what I do on the internet is related to how to educate people, and sometimes how to educate people to, like, interface better with the systems that they use on the internet. But I think under-discussed are the reasons that people share this stuff and create this stuff and believe this stuff. There are reasons, and I think we should try and understand them. I'm very interested in those motivations, and obviously also how to influence them.
Because I think two things are important here, and keep this in mind as we talk about this stuff. First is helping people understand their own motivations, and understand what systems might be used to get them to believe things that, on the face, are actually pretty outlandish. But second, to change people's behaviors by changing what the platforms are rewarding.
So that's just a little bit of an intro, before we get into the conversation with Bill, which will begin now.
--------
Hank: First, maybe this is too obvious of a question, but why did you want to have this conversation right now?
Bill Gates: Well, we have to worry that some of this misinformation will cause people to, you know, take the wrong drugs, or not wear a mask, or we won't have enough people who are enthusiastic about the vaccine--
H: Mm-hm.
B: --to create the broad protection that will bring the pandemic to an end. And honestly, you know, I'm surprised at the phenomena--trying to think about how we get trusted voices to be as interesting as the titillating conspiracy theory.
H: Right. Well, that's the thing and something that I know very well from creating content is that like, creating a good story is one way to get people to consume your content, and I can see that. I can see in the way these people are talking about it, it's very clear storytelling tricks, very clear hooks, very clearly aiming for the kinds of things that are gonna make people feel the most invested, the most scared, the most valiant. In your sort of explorations of these things, have you found that this is pretty new? 'Cause misinformation, of course, has always been around. Conspiracy theories have always been around, but does this seem like it's, like it's bigger than it once was or is there any way of measuring that?
B: Well, it's dramatically bigger because we're in a pandemic and we have digital social media where the number of entrepreneurs that can try out those titillating techniques, you know, it's not just people who edit newspapers and try and hawk them on the corner, it's now almost anybody can be a publisher. You know, I, my bias is that the world's improving and a lot of things are getting better and people tend to miss that--
H: Yeah.
B: But here, I'm afraid uh, that the--it probably is worse.
H: Yeah.
B: That is, when you get into a pandemic and it's awfully complex and the results in the US aren't nearly what we would have expected. We've had, you know, the worst epidemic by most measures, and you get a lot of polarization. You know, we're on the verge of an election and of course the digital empowerment is stronger than ever.
H: Right.
B: I think it is kinda peaking. You know, I hope as we get smarter, as we get past the pandemic, past the election, that you know, the trusted voices can be more prevalent.
H: I had early optimism that this would be a good thing for paying attention to science, 'cause at least, hey, now like, we really have to, like, it's a life and death situation, and that, at least, feels right now to have been misguided. You used the phrase "digital empowerment," which I thought was interesting, and I wonder to what extent we--'cause I know how I feel, but I wonder to what extent these platforms on which so many people spend so much of their lives now, so much--especially their social time, and also their like, world-construction time, so the time that we spend making sense of our world. We spend more and more time on these platforms, and so I wonder if you feel like those--these, you know, very few, you know, maybe four total platforms, you know, those places are having an outsized impact on the growth of this misinformation? It's not just the political environment we're in, not just the pandemic that we're in, but also there's something different about how people are communicating and connecting and building their worlds.
B: If you'd said, you know, ten years ago that your front page will be tailored according to your interests, the sports you care about, the places whose weather you care about--
H: Yep.
B: I would have said, okay, that's just an unadulterated good that I don't have to look at sports I don't care about in my newspaper.
H: Wonderful, yeah.
B: The idea that instead of just feeding us the, you know, facts we wanna learn, that it would be more about negative thoughts and attacks, and that we would literally have people drawn into, you know, saying hey, these people you don't like, let me just make you feel good that they're really bad, it certainly is different than when we used to watch three networks and the local, you know--
H: Yeah.
B: Very few newspapers, and so we were all kind of centered on, okay, we all read that same article that says, okay, the economy's good, the economy's bad. That fracturing, at least in terms of agreeing on a strategy for the country or, you know, for democratic elections, it's a little bit scary, you know, will it self-adjust as people realize okay, when I believe those things, it causes me to do things that aren't in my interest. Hard to say.
H: So I like to think of it in terms of the people who are making the content and the people who are consuming it. Of course, there's overlap between those groups, but like, I think the motivations are a little different, and there's also this feeling, especially once you've bought into it, that you know, you are fighting against this, like, massive bad and so anything you can do, like bad acting, like, no matter how bad you behave, it's good because the thing you're fighting against is so bad. That's pretty scary. What responsibility do platforms like these have to control the misinformation, to control the spread, to certainly not be surfacing it algorithmically in the feeds of people who weren't looking for it in the first place? Do the platforms have that responsibility?
B: Well, some of the people involved in these platforms like to say that, you know, maybe they shouldn't even see what goes on. They should just use end-to-end encryption, and the argument there is that, you know, for a political dissident who then isn't subject to the government being able to see what they're saying, you know, who they are, that political dissident thing, you think, oh, well, that's a good thing, but then when it comes to, you know, Holocaust denial or anti-vaccine things, you'd say, wait a minute.
Because this medium can reach so many people so quickly, it's worse than one-to-one communication. It's more like a newspaper, and therefore society has an interest in making sure we don't, you know, essentially addict people to those things. Now, when the platforms come back and say, wow, you're putting me in a tough position, I have to admit that exactly what we want the platforms to do--that is, where should they draw the line--there are some clear cases: saying that a medicine that's bad for you, say, drinking bleach, is a great idea--yeah, block that. They say the Holocaust didn't happen, block that. But we do have to be a little bit careful to be prescriptive enough that we don't put them in an impossible situation, and I--I distinguish a bit between messages that only go to your four or five friends versus ones that go out where, you know, you can raise a mob around them.
H: Yeah.
B: So there was a video about the pandemic that was an hour long, you know, with people on the Supreme Court steps saying things that included not to wear a mask and that hydroxychloroquine clearly works super well, that got 10 million views very quickly and so I do think anything that's getting to like, newspaper publishing scale, it should be examined and it should be subject to a pretty high bar, whereas just chatting with your friends should not be.
On the creativity about where and how we draw the line in a practical way, I'm a little disappointed that we're not finding a way to throw out the bad while keeping most of the good, because I still believe this idea that you can connect up often has a lot of great things to go with it.
H: I think it's very important to recognize that we can be critical of the internet while also still loving it. We are that way with all of the things that we love, hopefully, and so you're disappointed there. Are you disappointed in like, the reality that like, just reality, the situation, or are you disappointed in like, leaders' inability to find these solutions? Like, are there people who are letting us down or is it the situation that's just bad?
B: No, I think there are solutions, you know, and that we need to debate about, you know, like micro-targeting at very granular levels where you can say one thing to one group of people and another thing to another group of people, in the political world there's that kind of deep micro-targeting where you really get people angry 'cause you understand what might rile them up.
H: Right.
B: Is that net beneficial, as opposed to okay, you can go kind of at, you know, some large zip code level or, you know, broad demographics, but not just playing on somebody's belief system, and then you don't pay the price that what you're saying to them is so crazy that other people looking at what you're saying to them go, "Wait a minute. What--you're just WAY over the top. You're claiming things that aren't true there." You know, usually in the marketplace of ideas everyone sees what's being thrown out there, and so you have to worry that you'll offend people as you try and play to these niche-type audiences.
H: Right.
B: So I do think--I wonder about micro-targeting, particularly in the political realm.
H: But the political realm can also become the influence realm, and so if like, if it's not about getting this one candidate elected but it's about just making America less stable, making, like, increasing dissent, decreasing trust, decreasing trust in expertise, if like that's your goal and you can use these tools for that, then that's frightening to me.
B: Yeah, the basic human psychological desire to say, hey, there's my group and there's this other group that we don't like and they do the bad things and we're better than them, this transference that can take place, you know, I thought we were on a trend line where because we're so much wealthier, that we can expand our circle of caring, you know, to all human beings. You know, some people would go even to all mammals or something like that.
H: Yeah.
B: But at least just try to go with human beings and there are trends in that direction that, you know, we can think about helping humanity broadly. Now the fact that these things can drive us back into small groups that really fulfill us because that's--that becomes our identity, it's almost like a tribal thing, you know?
H: Yeah.
B: It makes Balkanization look good. It's these very small tribes with these very ingrained beliefs that, as you say, have a certain amount of, okay, I have to defend this because it's who I am and it, you know, it's my honor and we know this and no one else knows it.
H: Right, and this brings me to a--to a big question which is that there are sort of two ways to broadly think about combating this. There's like, educating people to make them better at dealing with it, or helping basically deprogram people who have been indoctrinated into a kind of cult. So like, that work, and then there's the kind of nip-it-in-the-bud on the content side, on the creation side, on the exposure side in the first place, and like, those seem like two very different problems to me.
The exposure problem seems to mostly be a platform problem. What tools do they have to limit the spread of malicious misinformation in terms of how they can control it and also maybe interfacing with the fact that we now have these couple of corporations that just have a tremendous amount of power over the spaces on which people build their worlds, but you know, this other piece to me seems, like this education piece, I don't, like, I don't know how effective it can be, but it does seem to be something that we should be studying a lot and you know, you have put a lot of skin in the game when it comes to like, helping along the process of getting vaccines both developed and into people's hands. Is there a similar kind of scale project that needs to be done to understand how misinformation spreads and how to help people get to a place where they can interface with it better and understand the moments in which they're being manipulated?
B: Well, a lot of the progress we've made is where you get exposed to people who have different viewpoints or even that you've been told to think of as very different than you and you see that, okay, that's wrong. You know, the fact that, over time, more gay people felt comfortable with coming out meant that more people said wow, that's somebody I know and love and therefore, you know, being a gay person can't be a bad thing, and so we, you know, our proximity made us realize, okay, they, that's us. That's not the 'other'.
The digital realm a little bit has reversed this by allowing us to create an enclave and you know, you'd think, well, what's the equivalent of the anti-housing discrimination law, you know? Do you force people to meet somebody with different views? That sounds very prescriptive--Big Brother-ish almost.
H: Well, I mean, do you force it or do you allow it, you know? If like, the algorithms are saying here are all the things that you're most likely to engage with and spend the most time on because it's outraging you the most and it's aligning with your biases the most, do you have the opportunities to say like, or the obligation as a platform, to surface things that are like, well, actually, these people might just be trying to take your money in the form of, you know, scaring you and getting you to donate to something that might not be legitimate or buying merchandise, etc, 'cause I think that like, that is much less scary to me and I think that it--but it--I think it does however require a deep understanding of how these things work psychologically and sociologically and to create content, I think specifically to combat some of these ideas, maybe coming from people who have once been inside of the system and who have gotten--made their way out, and it seems to me that there is, there needs to be people making that kind of content and it's not profitable to make though. Like, it's not like you could jump out there and be like, oh, this is gonna get, you know, millions of views and I'm just gonna rake in the ad revenue. Like it--you have to support that content, because it doesn't tell the beautiful, weird, scary, sexy story.
B: Yeah, it's interesting to think of people who are drawn into the stuff and then later were pulled away from it. When did that happen--
H: Why, yeah.
B: And if you wanted that to happen a lot more easily, what facilitated it, you know? It might be somebody that they trusted for other reasons, you know, said to them, hey, come on, you know, those crystals don't, you know, cure cancer.
H: I have one other thing that I want to ask you about before I let you go. You talked a little bit about proximity and how that helped us with being more understanding of people who are different from us and you know, I think that oftentimes in these conspiracies, we pick the people that we have the least proximity to but also like, reasons to feel suspicious of, and that seems like now it is oftentimes those who are powerful, those who are experts and this idea of elites, and I feel like a lot of people maybe don't have that much proximity to power anymore. Maybe they once knew people who had more power, but now there's sort of more isolation between those two groups, the people who feel like they have an impact on society and people who feel a little bit like they don't. We also have a lot more people who are dealing with loneliness and isolation and so are looking for sources of meaning in their lives, and so I think, you know, I don't think that like, to combat misinformation we have to always go straight down to the roots and be like, well, we must combat loneliness and isolation first, we must combat, like, people not feeling like they're participating in the success of the country or at all proximate to or interfacing with the systems of power, but that does strike me as a thing that we, we may be feeling the negative effects of now, and I wonder what you think about that.
B: Yeah, I think the ideal is that you know, we all belong to one or more community groups, you know, where we meet face to face and you know, the decline of religious attendance a little bit is a bit of a problem, because it makes people a bit more isolated and you don't have that group that you're coming together to support and then so you kind of sit out and you expect the government to do more than it can do, so I'd love to draw people back in to those kind of fellowship groups.
H: Some kind of system, yeah.
B: The idea that the elites, you know, in some cases, you know, the salaries of, you know, lawyers, managers, doctors, tech guys in the US, you know maybe the system has been rigged so the elites are having more fun than the rest and you know, it's, there's kind of a conflict there, and from time to time, you need an anti-elitist politician, who, you know, if they're very capable, will look at some of those things and reduce some of those privileges. I think there is a natural cycle there that's okay, you know. The populism, you know, is gonna come up from time to time as part of a cycle. It's scary if you get locked into that, that the cynicism about the system and particularly the populists actually instead of knocking down the elite, you know, actually is just another elite that, you know, managed to masquerade as this anti-elite, you know.
H: Usually that's how it works. Yeah, yeah.
B: Which some people may feel that there is a bit of that, whichever country they come from. You know, some areas like you know, are we treating women better, you know, the fact is that because we're seeing more of the bad behavior, you know, that's probably a good thing and more women are going to get elected, so you say, okay, that's a trend that's positive. Whereas this, the populism and the fragmentation and the extreme right and extreme left, I don't know if that was there before, because they would have had a hard time finding each other and re--'cause they're, you know, very sparse and their family might be saying to them, hey, wait a minute, you know, it's almost like digital addiction and you know, we've always wondered okay, where should society step in to help people out of, you know, take drug addiction. There's a pretty clear case that it's good not to have people falling, you know, into heroin use 'cause it's not in their interest, yet this idea of, okay, here's my ideas, we've always thought of that as more of a protected thing that you go off in your own crazy thing and only so few people will do that that we never need to--
H: Right.
B: Intervene on that.
H: To think about the--who would be doing the intervention is the--you know, it's a difficult thing, and so I--like, you kind of have just two groups. You have the government who, I don't know if they're the right group, and then you have these kind of new kinds of, it feels to me, a new kind of government and these large platforms who, you know, control the space in which a lot of society and economy occurs. It's very clear that these platforms are much bigger than just businesses in a lot of ways, and so the question, you know, becomes when we are looking to those people for solutions, you know, how did they get that much power, how do we hold them accountable with the power that they now have, and also to some extent, do you think there's any, any necessity to try and sort of decrease the amount of power that the leaders of those platforms have?
B: Well, there is a paradox, which is, if they sort of behave in this neutral way--it's just human nature to go off into these niches, and they don't push back on that--they can say, hey, it's not us doing something.
H: Yeah.
B: That's just human nature being revealed with the flexibility of digital connections.
H: Yeah, this is (?~23:40)
B: And so, yes, of course, we'll just stand back--
H: Yeah.
B: And watch what humans do and (?~23:45)
H: We'll just stand back and--yeah, and optimize our algorithms to, to reinforce whatever will keep them on our website, which is not just what humans do.
B: Fine, fine. Fine.
H: Your algorithms are making the decision.
B: Yeah, okay, with the profit motive on top of that.
H: Yeah.
B: Human nature plus what makes us money, but if you say to them, hey, do X, Y, or Z, it gets them involved in drawing boundaries.
Some of these authoritarian governments like China, they at least tell the tech companies exactly what to censor. I'm not saying that's a good thing, but they don't define some vague thing like, oh, you must do good things for the people.
H: Right, right, right.
B: Every day they publish the list and they've hired lots of people to look at the synonyms and the various ways it's going, so they take responsibility for shaping what's out there. The idea that the US is gonna hire lots of people and publish a list of--
H: Yeah.
B: --of things, that seems very unlikely and so--
H: It would be bad.
B: You know, where is this normative "hey, you're working against your own interests," you know--is there a civil society group that somehow shows up in a way that's accepted as kind of positive and not political, and then the platforms are required to give them an opportunity to get into those dialogues? You've got me thinking about that but again, we're short of concrete solutions for a very important problem.
H: Bill, we're short of concrete solutions for a very important problem. A situation that we will continue to find ourselves in throughout the course of our lives, I'm sure. Thank you so much for spending a little bit of time talking with me today and even more than I think that you asked for, so I really appreciate it and I hope that you have a (?~25:40) rest of the day.
B: Thanks. That was fun. That was great. Important. Bye bye.
H: It's been like a week since I recorded this video now and the thing that I've been thinking about the most is when we were talking about how people might eventually change their behaviors because they see that the thing that they are doing actually negatively impacts their lives. I don't know why that was such a useful perspective for me. I mean, obviously all the stuff I said, I already knew, so when he said that, I was like, oh, that's a--that's a lens through which to look at this.
That right now, the people who are engaging with, sharing, believing, creating this conspiracy theory content, this made-up stuff, these dangerous things, they're doing it because it's rewarding, because they like it, because at the moment, it makes them feel maybe more in control, maybe more powerful, but ultimately in some way, it is a rewarding thing, but also there is no doubt in my mind that long-term it has a very negative impact, not just on society but on the individual person's life. That's not necessarily true for the people who really are at the center of a lot of the creation and sharing of this content, 'cause those people can actually make a substantial amount of money, but in this way, it's a kind of pyramid scheme where you have to get more people to believe underneath you in order to support you in this very, sort of, extra-cultural experience, because if it's a true conspiracy theory and there's like, all of the people in the world are trying to hide it from you, then you almost can't engage with the rest of the world because you believe that everyone else is either, you know, useful dupes or actually evil.
The long-term outlook for that kind of worldview is very negative. I don't know. I didn't come out of this conversation more optimistic, that's for sure, but I wasn't really beforehand. So as usual on hankschannel, we don't have any answers.