YouTube: | https://youtube.com/watch?v=i861JOJDxbs |
Previous: | #TEAMTREES |
Next: | LAST CHANCE |
Statistics
View count: | 354,982 |
Likes: | 16,905 |
Comments: | 1,997 |
Duration: | 17:11 |
Uploaded: | 2019-10-26 |
Last sync: | 2025-02-08 19:30 |
It seems to me that there are not many ways to have new social media platforms succeed, and the biggest crack in the armor of any of the big boys is social responsibility. That's bad news. Very curious about people's thoughts on this!
(00:00) to (02:00)
Hello there!
I just had a terrifying thought, so I figured, hey, I'll just come let you know about it, 'cause I'm sure you don't have enough to worry about right now. I think a lot about social media and I think a lot about content, and how content and platforms interface, and also the human behavior that exists in between those things.
And some of the economics that exists to sort of push for or against one or the other of those things. And here's something that is true of content: the more people want to do this job that I have, the harder they will work for it. And hard work functions.
And I'm sorry that I'm, like, a 40-year-old judgmental old man now, but this functions in two different ways. One: you create higher quality content. That doesn't have to mean more expensive content, and it doesn't always mean harder work; sometimes it can mean really good ideas, but it's usually some combination of hard work and good ideas and more money. And then there's this other thing. There are folks who will differentiate themselves not with money, not with quality, but by, you know, pushing boundaries. And sometimes that's fine; yes, go push boundaries, there are lots of boundaries that we don't know whether they're necessary until we push on them. But also, sometimes it's just, like, this is clearly destructive stuff for kids.
And, ahhh, like, I'm old and I'm a dad and I don't like it. So there's also that thing. And then I think back to how YouTube started, which is that it was pushing a boundary; it wasn't pushing a cultural boundary, it was pushing a legal one.
And that legal boundary was copyright.
(02:00) to (04:00)
Lots of stuff was getting uploaded to YouTube: Daily Show clips, Family Guy clips, and people were going to YouTube to watch that stuff that they couldn't find any other way. You'd have to watch a whole episode of Family Guy if you wanted to see just the funny part.
That is kind of the story of social platforms. And I think you could even imagine Facebook this way. Facebook didn't think a ton about the significance of the impact that they might have before launching, and their thesis was "move fast and break things." And you know...
I don't think they meant democracy, but here we are. Not broken but, you know, dented. And now we are entering into the era where these platforms, and really there aren't that many of them, there's Twitter, Facebook, Amazon (through Twitch), and YouTube, which is Google.
But then you've got a couple of minor players, like Snap, which has managed to maintain its independence. And once it was clear that they were going to do that, Facebook just went after them and mostly, you know, neutralized a lot of Snap's competitive advantage. But what you didn't see was Snap really moving fast and breaking things. There's a little bit of that; when they launched their location feature, there was some concern about how that might be used, privacy stuff.
But for the most part, to me, Snap feels like it's moved out of the realm where it's going to do something that's, like, super weirdly world-changing, either in a destructive way or a constructive way. It just sort of seems like it's going to exist, and do its thing, and be good at messaging and be good at certain kinds of communities. But I don't use it very much anymore, so what do I know.
(04:00) to (06:00)
And then we get to TikTok, which I think is really interesting, not just as its own thing, but for what it might mean for the future. And I think what it might mean for the future is that we in the US, for two different reasons, can't create things like this, that move fast and break stuff, and so we won't anymore. The two reasons are: one, monopoly, because Facebook and Google and Amazon won't let it happen; they'll snatch it up or they'll crush it. And two, because we care now. We're terrified. Our regulators are starting to pay attention, we've got some people in there who actually understand any of this, we've got journalists who are writing stories about it, we've got critics, we've got op-eds. We've got all the things, people being like, "Actually, we're worried and we should be more careful with this," because we have no idea what the impacts are and they could be pretty severe. We're seeing an increase in teen depression and teen suicide, we're seeing more misinformation, and it's spreading faster, and all that stuff's worrying, and so we're gonna pay attention. And these main companies are gonna invest less in this kind of content, in these kinds of companies. So Facebook and Google mostly, also Amazon, are going to invest less in content platforms. They're going to push them less, and they're going to compete them out, because they have the power to do that. So now what you have is, like...
Back to my original conversation here: the reason I brought up content is because I see things through the lens of content.
You can’t compete on quality, you can’t compete on money, and so you have to compete on being on the outskirts a little bit.
(06:00) to (08:00)
And if you can't do that inside of America, because of, you know, fear of regulation and this overwhelming monopoly that we're dealing with, then the entrants, almost definitionally, have to be from elsewhere and start and build in other markets before coming into the US. And that's really scary to me. One of the big reasons that I'm terrified of Google and Facebook is because we live there, and they are corporations. And so I live in a company town that is called YouTube, and that's where I run my business, I spend a lot of my time there, I socialize there and in other places that are corporations, Twitter and Facebook.
And, you know, a lot of socialization happens on the inside of Apple's ecosystems, and the inside of Facebook's ecosystems, so we now live (I've talked about this on this channel before) inside of corporate spaces. Places where we have no voting rights, but we do live there. They're where social interaction happens, and I don't think that you can call it anything but, like, living. So we live in spaces that are controlled by corporations. TikTok is a little more worrying than that. One, because it's already a corporation, so you have to worry about all the same stuff, but then also it's a Chinese corporation. Which means it has to abide by all kinds of China-specific rules, that mean, you know, censorship is easier, and the Chinese government can ask for data. In the US there are pretty serious rules about when and how data can be shared with the government; in China there are none. So that's a thing of concern.
(08:00) to (10:00)
And then also, just generally, because China, the state, has a lot more involvement in corporations, and the state is one that has a somewhat antagonistic relationship with other states, particularly the US and Europe. You know, you start to wonder: could this be both data and also systems of control that might be useful to a government, to a state? And that's, blah. How do you interface with that? And then the other reason that this freaks me out is that there's this thing that platforms will say that I don't want to believe, but I don't know if I actually don't believe it. What they say is: you need to be this big. You need to be Facebook big in order to deal with the problems that Facebook has created.
And yeah, so to put it another way, Facebook is only useful to us because everybody uses Facebook and everybody's there and we can see all of our cousins' kids' pictures. So that network effect is the thing that makes Facebook useful, and then that network effect also creates all of these problems with regard to how the algorithm, you know, keeps people on the site. And so we need the algorithm to make the network effect so that the product will be useful, but the algorithm is maybe taking certain people who have fairly strong political persuasions and putting them down rabbit holes that make them more extreme in their beliefs. It creates filter bubbles. And one of the things Facebook says, I love this: "People on Facebook actually see a greater diversity of perspectives than people who aren't on Facebook."
(10:00) to (12:00)
This is so insidious because it's absolutely true, but you know what? When I go on Facebook, the perspectives that I see from the other side aren't like, "Here's my nuanced take on why I should actually have a handgun." What I see are liberals posting the craziest conservative shit they can find. So I'm exposed to the shit, but I'm exposed to the worst of it, right? That's how the filter bubble actually works. That's the really insidious thing about it.
It isn't hearing your own beliefs parroted back at you. It's hearing the craziest, most disgusting, meanest, most fucked up things that the other side says. I'm being exposed to those people, but it's the way the filter bubble exposes you to the worst of it. And it makes you more scared, and it happens on both sides; like, yeah, liberals say a lot of shitty stuff too, so they'll see the shitty stuff. So just because you're being exposed to a diversity of viewpoints doesn't mean you're not being pushed into a more extreme viewpoint.
As we move along, Facebook is saying that on any platform that actually had the level of network effect necessary for it to be a useful social platform, there would be massive problems with content moderation. So you need to be very big and have a lot of resources to throw at the content moderation problem, both in terms of human moderators and in terms of machine moderators.
This is the case that YouTube, Google, and Facebook are making. And then if you fracture them up, if you break them apart, like if you take Instagram and you put it over on the side, then Instagram says, "Our entire team now is smaller than the Facebook content moderation team, and so we have fewer resources with which to tackle these problems." This has got to be bullshit, right? Except I'm afraid that it isn't.
(12:00) to (14:00)
I'm afraid that these are really big problems and that the network effects do actually require the platforms to be this big and that any platform this big would have these problems and if we have smaller networks...
There was recently, I can't remember what it was called, but there was a platform that was basically designed to sell weed but also was, like, an anonymous and open... I don't know, social network of a sort. And of course it immediately devolved into gore and bad porn (not the good porn) and then, yeah, illegal stuff, because it was out there being anonymous and a bunch of people were all like "Oh hey"... I don't know, it probably still exists. I literally can't remember what it was called, I just read an article about it.
That does make me worried, and TikTok makes me worried, that when a new platform arises, it doesn't have to be beholden to all these current regulations, or current concerns about regulations, that existing platforms need to be concerned about. And that's the only thing that's going to... it's the only chink in the armor, so...
One thing I will say about antitrust is that we need to create space for growth. If we have three big trees that are shading out the entire ground, nothing else will ever happen... but if the only hole that we allow is, like, you can be small enough to avoid regulation, or small enough to avoid the fear of regulation, if that's the only competitive advantage any platform can have, then the only platforms that rise up are going to be worse than the ones we have now. Maybe not in terms of creative ecosystem, I actually think TikTok's creator ecosystem is really great, but in terms of caring about the impact that the platform will have on the world. And I think that any platform right now in the US that's getting funding is probably trying to be more careful about that stuff, not less careful like TikTok is.
(14:00) to (16:00)
And I think that that means that none of them will succeed. Because that's the only chink in the armor. The only beam of light coming through the forest canopy is this concern about harming the world. And if you have a company that's like, "Okay, well then I won't be concerned about harming the world for the first couple of years and that will be my competitive advantage," then suddenly that's every platform that you have. If that's the only ray of light, that's the only thing you'll get. Which is why we need to cut down these trees... We need to make space.
I hesitate to even say it because I benefit from YouTube's fucking monopoly in video, because, like, I have a good position on YouTube. I have successful channels that have the subscribers. I understand how this system works, and a new one would be confusing to me, and I have good relationships with the people who work at YouTube.
(points to self) The entrenched power structures don't want to talk about this, and if I'm talking about it, it's only because I, like, fucking can't keep my mouth shut about things that I perceive as concerning, unjust, or interesting.
So this is the thing that I'm worried about: that there is no competitive advantage except caring less about the impact that the platform has on the world. In the same way that, you know, one of the competitive advantages that content creators were able to go after, as YouTube has gotten more and more crowded, is being a little more risqué and not caring as much about the impact that you have on, like, the children who are watching your video. And I don't have to say out loud the names of some of the content creators who do that stuff, but they're out there.
And also I think that it's really interesting to watch them as they get a chance to interact with the people who are fans of their content and see them sort of moderate what they do.
(16:00) to (17:11)
Slash, like, they get their audience power and then they recognize they're not going to get deals with media companies and, you know, sponsorships if they don't sort of tone it down. So ultimately the thing that keeps us all behaving on these platforms in the long run is that advertisers won't work with you if you don't fit a certain mold. And that, I think (you know, this is the 40-year-old dad talking), sometimes actually is very, very bad. And, uh, you know, I think that it's good when it's actually considering, "Okay, children have to be thought of differently in terms of what content is created for them." It's bad when it's, "Okay, but we don't want to push certain ideas because they're bad for the status quo." Um, this is less about children and more about, um, you know, protecting entrenched power structures.
So that's where we're at.