YouTube: https://youtube.com/watch?v=QKynMeYpF_o
Previous: 200 Random Facts Presented Without Context
Next: 29 Odd Old-Timey Life Hacks

Statistics

View count: 50,166
Likes: 1,449
Comments: 328
Duration: 14:27
Uploaded: 2022-05-11
Last sync: 2024-03-14 09:30
It's the 1980s. Everyone is rocking a mullet, drinking New Coke, and avoiding strangers due to the widespread kidnappings of children. Which of these cliches is fact, and which is fiction?

The 1980s were an iconic decade that some of us lived through, and some of us have absorbed through TV and movies. Let's see what society remembers correctly, from the death of hair metal to the use of payphones for criminal activity.

Host Justin Dodd (@juddtoday) breaks down some common myths and misconceptions about our favorite decade, the '80s.

Website: http://www.mentalfloss.com
Twitter: http://www.twitter.com/mental_floss
Facebook: https://facebook.com/mentalflossmagazine
Of all the oversized, regrettable hairstyles of the 1980s, one bad choice sits above the rest: the mullet. You know what I’m talking about. The squirrel pelt. The Arkansas waterfall. The ape drape. The practice of cutting your hair short in the front and sides and keeping it long in the back.

It’s a look that says you know how to party and still show up for work more or less sober the next day. It’s a look George Clooney rocked on The Facts of Life and that Patrick Swayze worked in Road House. And a look that absolutely no one actually called a mullet in the ‘80s.

It wasn’t until 1994, when the Beastie Boys released a song called “Mullet Head,” that the unfortunate hairstyle was given its equally unfortunate name. “Mullet head,” as an insult for a stupid person, dates back to 1855, according to the Oxford English Dictionary. Mark Twain used it in 1884’s Adventures of Huckleberry Finn. People were definitely rocking mullets before the Beastie Boys’ song, but they weren’t calling them mullets.

According to a couple of reader submissions printed in the Guardian, the ‘do was sometimes called a “bi-level” in the 1980s. Incredible. The idea that people rocking a mullet actually knew they had a mullet is but one of many misconceptions about the ‘80s I’ll be addressing today, from the concept of “stranger danger” to the idea that the ‘80s represented the epitome of American greed.

Let’s get started. Hi, I’m Justin Dodd, and welcome to Misconceptions. Whether you lived through the ‘80s or merely absorbed some details via Stranger Things, we’re taking a look at the myths surrounding the me decade.

But before we do that, we need to confirm—yes, ALF really was a top-ten-rated show. That is NOT a misconception. Prestige television was a long way off.

If you could fit inside a car trunk in the ‘80s, there was no doubt you were constantly warned about the perils of interacting with strangers. Newscasts and newspapers were rife with stories about missing kids and cautionary tales about child abductions. It was a public crisis. It was even given a catchy name—stranger danger.

Everywhere you went, it seemed like taking your eyes off a child for one second would result in a lifetime of regret. If you were a child, you were taught that a maniac was lurking around every corner.

But was there really an epidemic of kidnappings? There was not. There *were* some unfortunate circumstances that led the public to be afraid of one, though.

In the early 1980s, a number of missing children—including two paperboys in Iowa named Johnny Gosch and Eugene Wade Martin—received a great deal of media attention. Anxious parents and public officials were often in front of cameras and microphones, and they were understandably upset. The disappearance of Adam Walsh in 1981 only added to the concern. More than 38 million viewers tuned into a 1983 TV movie about his abduction. Cartoons had warnings about talking to strange adults. One survey estimated kids below sixth grade were about as afraid of being kidnapped as they were of nuclear war.

The widespread coverage of these incidents made it seem like the danger was omnipresent. At one point, the media was reporting that up to 50,000 children were being abducted annually, and the sight of missing kids on milk cartons meant anyone having breakfast was being confronted with the possibility of a child—maybe their child—going missing. At the time, abductions seemed like they were happening everywhere.

And who can blame parents for being protective of their children? But even back in 1985, the LA Times was reporting data that cast some serious doubt on the supposed spate of child abductions. The FBI had reports of 67 stranger kidnappings that year, and the National Center for Missing and Exploited Children said they had “firm records” of 142 cases.

Obviously every one of those cases is one too many, but the media focus on strangers risks misleading the public about the actual risks to children. In 2018, for example, the National Center for Missing and Exploited Children reported helping law enforcement with 25,000 missing children cases. Of those, 23,500 were runaways and 1,000 had been abducted by family members, some of which may have been related to parental custody issues.

In other words, there wasn’t, statistically speaking, that much danger from strangers—just a relative handful of high-profile cases that captured the public’s imagination and a much larger number of unfortunate, but less sensationalistic, stories not involving strangers at all. In 2017, the Center even called for an end to the phrase stranger danger, citing statistics that most juvenile crimes involved people the child knew and that at times it might actually be beneficial for a kid to reach out to a stranger if they need help. And in extreme circumstances, it’s even OK to approach someone with a mullet. My dad rocked one in the ‘80s, and he is a very lovely man who definitely would have offered to help.

Everyone who remembers the 1980s remembers a decade of excess. Tons of cocaine. Tons of money. A questionable number of leg warmers. But did people in the ‘80s really have an unquenchable thirst for wealth?

Well, maybe. But probably no more so than in any other decade. One way to define greed is by the amount of charitable giving being done, or lack thereof.

By that metric, the ‘80s saw unprecedented generosity. In 1980, Americans gave roughly $65 billion to charity, adjusted to 1990 dollars. By the end of the decade, that number had grown to over $100 billion. As a percentage of national income, that’s far higher than it was in the 25 years *prior* to 1980.

Was all that generosity a result of greater wealth? Could be. But the growth in charitable giving outpaced what people in the ‘80s were spending on material goods. Giving grew 68 percent over the course of the decade, while total consumer spending grew 48 percent.

It’s easy to see why people stereotype the ‘80s as the “me decade.” In the United States, income tax rates were slashed on the highest earners—but for much of the decade they were still higher than today’s top rates. In the 1980s, the number of millionaires in the country went from 2.8 million to 3.2 million.

But twice as many new millionaires were minted in the 1990s. Yes, many brokers liked flashy watches and suits. Madonna had a hit with “Material Girl.” But does flashy equal greedy?

Greed typically means hoarding as much as you can. Record charitable giving doesn’t support that idea.

Before the proliferation of the cell phone, making a call while outside of your home typically meant using a pay phone—those virtually indestructible public phones in booths or installed on streets that seemed to scream out, “Please use me to conduct illegal activity.” Many people thought that no one could trace a public phone, allowing drug dealers to cover their tracks.

Some communities even lobbied to have pay phones removed, citing concerns over criminal activity. But public pay phones actually worked a lot like regular landline phones. Inserting a coin and dialing a number created the same record of the date, time, and recipient of the phone call, making for a handy reference for law enforcement.

Now, it is true that some companies’ pay phones didn’t keep such records, but others did. And since most criminals didn’t bother making the distinction, anyone relying on a pay phone to conduct illegal business was taking a chance that their illicit activity would be discovered. The caller might be able to remain anonymous, but most everything else, like the time and length of the call, and the number on the other end, was fair game.

Some cities even removed the ability for a pay phone to receive an inbound call in order to make it more difficult for dealers to treat the phone booth like a remote office. The phones simply weren’t a foolproof method of concealing a person’s identity. Because of the stigma, though, a lot of pay phones were removed from places where they were of actual use to law-abiding citizens.

Removing them likely did far more to keep innocent people from making innocuous calls than it did to prevent criminals from keeping themselves anonymous.

Interesting sidenote about pay phones: While they’re fairly useless in today’s tech world, they were essential in the early 20th century.

In 1946, only half of U.S. homes had a telephone. In some neighborhoods, one pay phone might service multiple homes.

And yes, criminals were up to pay phone mischief back then, too. Wise guys sometimes tied strings to coins to try and pull them back out of the machine after making calls. These would-be freeloaders were often thwarted, though, by string cutters inside the phones, a low-tech security measure that started to appear around the 1930s.

And even though research proved otherwise, I still like to picture a tiny little dude sitting inside the phone, waiting eagerly with a big pair of scissors.

Mullets were not the only questionable follicular choice of the ‘80s. Many men and women teased, preened, and shaped their hair into wavy cascades using voluminous amounts of hairspray.

In 1985, this vanity seemed to have brought the world to the brink of destruction. That’s when scientist Joseph Farman and others disclosed that the atmospheric ozone over Antarctica had been reduced by approximately 40 percent. Ozone, or trioxygen, is a gas that protects us from the sun’s potent UV rays.

It’s nature’s sunscreen. If it disappeared, well, that wouldn’t be good. Like, dystopian levels of “not good.” Farman and others pointed the finger at chlorofluorocarbons, or CFCs, a type of chemical that had been commonly used in hairspray, air conditioners, and refrigerators; levels of CFCs had risen high enough to damage the ozone layer.

But even though that theory was confirmed in the 1980s, it had actually been developed in the 1970s. It was in *that* decade that manufacturers voluntarily stopped using CFCs and the United States banned CFC use in aerosol products, except in the case of certain medical applications like inhalers. So those 2-foot-tall hairdos in the ‘80s did not actively contribute to the hole in the ozone.

As for that hole? We don’t hear about it much anymore since the passage of the Montreal Protocol in 1987, which banned most ozone-depleting substances from use on a global level. With some luck, the ozone could be fully healed in the next few decades.

Hopefully.

It’s considered one of the biggest consumer products blunders of all time. In April 1985, after months of research, Coca-Cola unveiled a drink they dubbed New Coke.

It was a sweeter, more syrupy version of their classic recipe, one they hoped would better compete with the surging rivals at Pepsi. This wasn’t just an alternative. It was a replacement.

Why was Coca-Cola so confident in switching up one of the most beloved soft drinks in the world? Taste tests! Extensive market research demonstrated that subjects preferred a slightly less fizzy and slightly sweeter Coke product.

And this wasn’t a few people they cornered at a shopping mall. The company conducted a reported 190,000 taste tests, and the results were highly encouraging. Unfortunately, what Coca-Cola didn’t count on was the emotional connection people had with the taste of OG Coke.

New Coke was quickly condemned by soft drink enthusiasts, and common wisdom has it that Coke pulled the drink from shelves almost immediately owing to mass outrage. While the drink had plenty of detractors, though, none were as vocal as Gay Mullins, a semi-retired real estate agent who found New Coke so off-putting he sank $100,000 into a campaign against it. Mullins was often cited in the media, giving interviews and buzzworthy quotes like calling the lack of soda choice “un-American” and the new formula “unbelievably wimpy.” He sent out bumper stickers and set up telephone hotlines.

Gay Mullins was waging a war against Coca-Cola, and he was winning. It turns out that his motives may not have been entirely altruistic. Mullins later admitted he was hoping to cause enough commotion for Coca-Cola to pay him in hush money, or even inspire Pepsi to feature him in a campaign.

When Coke finally relented and withdrew New Coke as its primary offering in June, Mullins said he’d be happy to speak on their behalf—for $200,000 per appearance. In the ultimate sign Mullins may not have been a true devotee, he couldn’t tell the difference between Coca-Cola Classic and New Coke in a blind taste test. Mullins, you gotta do your research, like us Skittles stans who fought for and won the green Skittle’s return to lime over green apple.

Good work, comrades.

One other big misconception about New Coke: It didn’t actually go away in the ‘80s. Coca-Cola left it on shelves and let consumers decide which flavor they preferred. The company actually kept production of the product rolling until 2002 under the name Coke II.

Everyone knows the story.

The ‘80s were ruled by Mötley Crüe, Poison, Van Halen—gods of rock who sported the kind of hair that could destroy the ozone layer. Which we now know is nonsense, of course. Either way, in the early 1990s, the Seattle sound took over.

Spandex pants were traded for cardigans, and Nirvana and Alice in Chains sounded the death knell for flashy rock bands. Of course, the so-called grunge scene grew popular, but it wasn’t exactly at the expense of hair bands. No less an authority than Vince Neil of Mötley Crüe has said that he bought Nirvana’s Nevermind album and passed it around, encouraging people to listen to it, and that the business of his band didn’t change.

Grunge offered a new sound, but it wasn’t like New Coke. It wasn’t replacing other genres. There also wasn’t really any rivalry.

Kurt Cobain reportedly bought and loved Too Fast for Love by the Crüe. Alice in Chains opened for both Poison and Van Halen. So what really happened to hair rock?

Dee Snider of Twisted Sister once opined that hair bands did themselves in and were already in decline by the time grunge took over. He said, quote: “It became too commercialized, and then it got unplugged and [became] nothing but power ballads and acoustic songs, and it wasn’t metal anymore, it had to go, it had to change.”

So why did the media portray a grunge takeover? Well, it made for a pat story.

But it may have also been that hair band listeners were simply aging out of their ‘80s tastes and looking for something else, which they would have done with or without grunge. Cultural tastes change constantly. After all, you can’t rock a mullet forever.

Unless you hold out just long enough for them to come back in style.

Leave a comment below telling us your favorite ‘80s hair rock song. And thanks for watching.