YouTube: https://youtube.com/watch?v=VjuFaPAIOHw
Previous: How to Speak With Confidence: Crash Course Business - Soft Skills #4
Next: The Internet and Computing: Crash Course History of Science #43

Categories

Statistics

View count: 121,606
Likes: 3,113
Comments: 94
Duration: 09:30
Uploaded: 2019-04-04
Last sync: 2024-10-29 04:15

Citation

Citation formatting is not guaranteed to be accurate.
MLA Full: "YouTube Couldn't Exist Without Communications & Signal Processing: Crash Course Engineering #42." YouTube, uploaded by CrashCourse, 4 April 2019, www.youtube.com/watch?v=VjuFaPAIOHw.
MLA Inline: (CrashCourse, 2019)
APA Full: CrashCourse. (2019, April 4). YouTube Couldn't Exist Without Communications & Signal Processing: Crash Course Engineering #42 [Video]. YouTube. https://youtube.com/watch?v=VjuFaPAIOHw
APA Inline: (CrashCourse, 2019)
Chicago Full: CrashCourse, "YouTube Couldn't Exist Without Communications & Signal Processing: Crash Course Engineering #42," April 4, 2019, YouTube, 09:30, https://youtube.com/watch?v=VjuFaPAIOHw.
Engineering helped make this video possible. This week, we'll look at how the fundamentals of signal processing make it possible for you to watch this video. We'll explore everything from Morse code, to problems like bandwidth capacity and noise, to how we arrived at the digital age.

Crash Course Engineering is produced in association with PBS Digital Studios: https://www.youtube.com/playlist?list=PL1mtdjDVOoOqJzeaJAV15Tq0tZ1vKj7ZV

Check out It’s Okay To Be Smart: https://www.youtube.com/channel/UCH4BNI0-FOK2dMXoFtViWHw

***

RESOURCES:
Sources:
http://www.ee.iitm.ac.in/~giri/pdfs/EE4140/textbook.pdf
http://edison.rutgers.edu/transmit.htm
https://www.history.com/topics/inventions/telegraph
https://www.gaussianwaves.com/2008/04/channel-capacity/
http://www.cs.man.ac.uk/~barry/mydocs/CS3282/Notes/DC06_7.pdf
http://www.madehow.com/Volume-5/Telephone.html

***

Crash Course is on Patreon! You can support us directly by signing up at http://www.patreon.com/crashcourse

Thanks to the following patrons for their generous monthly contributions that help keep Crash Course free for everyone forever:

Eric Prestemon, Sam Buck, Mark Brouwer, Laura Busby, Zach Van Stanley, Bob Doye, Jennifer Killen, Naman Goel, Nathan Catchings, Brandon Westmoreland, dorsey, Indika Siriwardena, Kenneth F Penttinen, Trevin Beattie, Erika & Alexa Saur, Glenn Elliott, Justin Zingsheim, Jessica Wode, Tom Trval, Jason Saslow, Nathan Taylor, Brian Thomas Gossett, Khaled El Shalakany, SR Foxley, Yasenia Cruz, Eric Koslow, Caleb Weeks, Tim Curwick, D.A. Noe, Shawn Arnold, Malcolm Callis, William McGraw, Andrei Krishkevich, Rachel Bright, Jirat, Ian Dundore
--

Want to find Crash Course elsewhere on the internet?
Facebook - http://www.facebook.com/YouTubeCrashCourse
Twitter - http://www.twitter.com/TheCrashCourse
Tumblr - http://thecrashcourse.tumblr.com
Support Crash Course on Patreon: http://patreon.com/crashcourse

CC Kids: http://www.youtube.com/crashcoursekids

This video is remarkable. You're hearing these words months, maybe years, after I've spoken them, yet everything is as clear as if we were sitting in the same room. The ability to record and transmit different kinds of information is a core part of modern engineering, and the world as we know it. That's why some say we're living in the information age.

Whether you're using your phone, turning on the radio, or strumming your electric guitar, you're sending and receiving signals all the time. And, to get all that information where it needs to go, you'll need signal processing.

[Intro]

As an engineer, communicating means more than having a chat in the break room. Whether you're watching YouTube videos, using satellite navigation, or just making a phone call, there's communication happening. Signals are representations of the information we're sending when we do this. Text, sounds, images, and even computer files will all be converted into a signal when you send them. And, that's really what communication is, sending stuff from one place to another to convey information.

The basic task is to take content, turn it into a signal, transmit it, and then turn it all back into content on the other end. These steps are known as signal processing. The signal itself will be a current running through a wire or an electromagnetic wave, like radio or light. However you choose to relay it, the overall process is basically the same.

The problem with communicating remotely is one engineers faced long before digital computers came onto the scene. We saw an example of this in the history of electrical engineering with Samuel Morse's 1837 telegraph. In his design, the operator pushed down a lever, called a key, to complete a circuit and transmit an electric current down a wire. At the other end, a machine called a register would receive that current and mark a piece of paper. By pressing down the key for different lengths of time, the operator could make the register draw little dots and dashes that spelled out a message.
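To make that encoding idea concrete, here's a minimal sketch in Python of how letters might be turned into dots and dashes; the lookup table covers only a handful of letters and the function name is illustrative, not anything from Morse's actual apparatus.

```python
# Minimal sketch: encoding text as Morse-style dots and dashes.
# The MORSE table covers only a few letters for illustration.

MORSE = {
    "S": "...", "O": "---", "E": ".", "T": "-", "H": "....",
}

def encode(message):
    """Turn a text message into dots and dashes, one letter at a time."""
    symbols = []
    for letter in message.upper():
        if letter in MORSE:
            symbols.append(MORSE[letter])
    return " ".join(symbols)

print(encode("SOS"))   # ... --- ...
```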

The key and register in Morse's telegraph are both examples of what are called transducers. Transducers take physical information, like the operator's press of the lever, and turn it into a signal, or vice versa. To record this video, for example, the input transducers were the microphone and the camera I'm speaking to right now, which measure the sound and light in this environment and convert them into electrical signals. Watching the video involves output transducers, things like your headphones and monitor.

Unlike Morse's system, however, the signal won't stay in one form between transducers. It might start out as an electric current in the camera that gets converted into a file on a memory card. That's transmitted again as a signal when we send the file to a computer or upload it to the internet, where it's stored on YouTube's servers. At least, until you request that the signal be sent to you in its final form, to be converted back into light and sound.

Morse's system was popular, because it was simple and remarkably easy to use, ushering in the era of instant communication we enjoy today. The ingenious part was finding a way to take information as people understand it, in terms of ordinary letters and words, and encode it in a form that could be transmitted as electricity.

Encoding is a key part of signal processing. Signals need a transmission-friendly way of representing the information you're trying to relay. A hundred years after Morse unveiled his telegraph, it was replaced by more sophisticated and convenient forms of communication, like telephones and radios. But these methods (and everything up to the internet today) are still based on encoding. It's the way the information is encoded and how it's transmitted that's changed.

Consider radio waves, like the kind used to transmit signals between your phone and a cell tower. It's the wave nature of radio that lets your phone encode the information you need to make a call. Engineers design hardware that changes, or modulates, the behavior of the wave to encode information about the pressure of the air near the microphone, in other words, the physical effects of sound. Two of the most common ways of doing this are amplitude modulation and frequency modulation, or AM and FM; that's where the names on your radio dial come from. One adjusts the amplitude, or the strength of the wave, while the other changes the frequency, or distance between one peak and the next. Much like telegraph signals, the transmitted wave carries the information you want, which is then decoded on the other side. Similar methods can even represent sounds and images, which is how television broadcasts work.
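As a rough illustration of the difference, here's a sketch that amplitude- and frequency-modulates a carrier wave with a simple message signal using NumPy; the carrier and message frequencies are arbitrary numbers chosen for the example, not values from any real broadcast.

```python
# Sketch of amplitude vs. frequency modulation of a carrier wave.
# All numbers (sample rate, carrier and message frequencies) are illustrative.

import numpy as np

fs = 10_000                      # samples per second
t = np.arange(0, 1, 1 / fs)      # one second of time
fc, fm = 100.0, 5.0              # carrier and message frequencies (Hz)
message = np.sin(2 * np.pi * fm * t)

# AM: the message scales the carrier's amplitude.
am = (1 + 0.5 * message) * np.cos(2 * np.pi * fc * t)

# FM: the message shifts the carrier's instantaneous frequency.
# Integrate the message to get the accumulated phase deviation.
freq_deviation = 20.0            # Hz of swing around the carrier
phase = 2 * np.pi * np.cumsum(freq_deviation * message) / fs
fm_wave = np.cos(2 * np.pi * fc * t + phase)
```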

But, these methods have two pretty big limitations. The first is capacity. The signal of a radio wave can be thought of as a combination of other, simpler waves put together. Specifically, you can represent a signal as the sum of radio waves with different frequencies. The range of different frequencies you can represent is called the bandwidth, and it limits how much information can be encoded by your signal, as well as how many of them can be sent at the same time. Think of signals as fluids and radio channels as pipes. The bandwidth is like the size of the pipe, which controls how much fluid can flow at once.
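That "sum of simpler waves" picture is the idea behind Fourier analysis. Here's a small sketch that builds a signal out of three sine waves and then reads their frequencies back out with NumPy's FFT; the component frequencies are arbitrary choices for the example.

```python
# Sketch: a signal as a sum of simpler waves, and reading off the
# frequencies it contains. The three component frequencies are arbitrary.

import numpy as np

fs = 1_000
t = np.arange(0, 1, 1 / fs)

# Build a signal out of three sine waves at different frequencies.
signal = (np.sin(2 * np.pi * 50 * t)
          + 0.5 * np.sin(2 * np.pi * 120 * t)
          + 0.2 * np.sin(2 * np.pi * 300 * t))

# The Fourier transform reveals which frequencies are present;
# the span between the lowest and highest is the bandwidth they occupy.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
print(freqs[spectrum > 50])   # roughly [ 50. 120. 300.]
```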

The other problem is noise. As they travel through the atmosphere, radio waves interfere with each other and are warped by objects in their path, which both cause distortions. So the signal the other person receives usually ends up pretty different from the one that you sent. Noise is anything that changes your signal from its original form, usually in a random way. The greater the noise, the more distorted and unrecognizable the received message will be. That's why old TV sets sometimes ended up with static in the image. To go back to the pipe analogy, noise would be any contamination the pipe puts into the fluid, changing its concentration. A tiny, contaminated pipe does a pretty terrible job of delivering lots of clean water. So, as you can imagine, noisy channels with low bandwidth aren't great for sending signals that can be reliably decoded on the receiving end.
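To see what noise does in practice, here's a sketch that adds random noise to a clean wave and measures the resulting signal-to-noise ratio; the noise level is an arbitrary choice for illustration.

```python
# Sketch: corrupting a clean signal with random noise and measuring
# the signal-to-noise ratio (SNR). The noise level is arbitrary.

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1_000)
clean = np.sin(2 * np.pi * 10 * t)

noise = rng.normal(scale=0.3, size=clean.shape)
received = clean + noise           # what the other end actually gets

signal_power = np.mean(clean ** 2)
noise_power = np.mean(noise ** 2)
snr_db = 10 * np.log10(signal_power / noise_power)
print(f"SNR: {snr_db:.1f} dB")     # roughly 7-8 dB with these settings
```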

Worse still, both of these problems affect wired communications as well. The signal traveling down a wire is also a wave, where the amplitude is represented by the power of the electric current at any given point in time. That's how we modulate electric currents to carry signals, but it also means that those signals suffer from noise and capacity issues, too.

Radio and wired communications faced these sorts of problems during World War II, which brought them to the attention of engineer and mathematician Claude Shannon. In 1948, he published A Mathematical Theory of Communication, which revolutionized how engineers consider information itself and what it takes to send information reliably. Among Shannon's contributions was a mathematical formula for determining the conditions needed for sending a signal at a particular rate. Imagine sending a Morse code message down a noisy wire. Each segment of the code represents a dot or a dash, what you might call a "bit" of the message. Bit stands for binary digit, because each part of our message only occupies one of two states.

In his paper, Shannon developed a formula that determines the number of bits you can transmit per second, or the bit-rate, given the power of your signal, the amount of noise, and the bandwidth of the channel. When your internet provider advertises a speed of 50 megabits per second, that's Shannon's bit rate. He figured out that it's the ratio of the power of the signal to the power of the noise that determines the bit rate. So either the signal needs to be strong enough or the bandwidth needs to be large enough for there to be so many frequencies representing the signal that noise can't affect them all at once.
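Shannon's result for a noisy channel is usually written as the Shannon-Hartley formula, C = B log2(1 + S/N), where B is the bandwidth in hertz and S/N is the ratio of signal power to noise power. Here's a sketch that plugs in some illustrative numbers; the bandwidth and signal-to-noise ratio below are made up for the example.

```python
# Shannon-Hartley channel capacity: C = B * log2(1 + S/N).
# The bandwidth and signal-to-noise ratio are illustrative numbers,
# not figures from the video.

import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Maximum reliable bit rate (bits per second) over a noisy channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

bandwidth = 10e6          # a 10 MHz channel
snr_db = 30               # signal power 1000x the noise power
snr = 10 ** (snr_db / 10)

c = channel_capacity(bandwidth, snr)
print(f"{c / 1e6:.1f} Mbit/s")   # about 99.7 Mbit/s
```

Notice how either a stronger signal (larger S/N) or a wider channel (larger B) raises the achievable bit rate, which is the trade-off described above.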

As well as this handy formula, Shannon laid out lots of groundwork for calculating the exact conditions needed for reliable communication. Just as importantly, he worked out what kinds of signals you might need to represent the information you're trying to communicate. That work would be vital once signal processing entered the digital age.

Digital signals represent information using a small set of distinct states rather than the continuous variation of a wave. Instead of FM radio, where changes in frequency translate exactly to changes in sound, digital radio sends the data piece by piece, and everything is reassembled on the receiving end. Because the different states of the signal can be more distinct, they're much less susceptible to noise. A large difference is easier to distinguish than a small one, even when it gets distorted.
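Here's a sketch of why distinct states help: bits are sent as two well-separated levels, random noise is added, and a simple threshold still recovers them. The levels and noise amount are arbitrary choices for the example.

```python
# Sketch: two well-separated signal levels survive noise that would
# badly distort a continuous signal. Levels and noise scale are arbitrary.

import numpy as np

rng = np.random.default_rng(1)
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])

# Send each bit as a high or low level, then add random noise.
sent = np.where(bits == 1, 1.0, -1.0)
received = sent + rng.normal(scale=0.4, size=sent.shape)

# Decoding is just a threshold: anything above zero counts as a 1.
decoded = (received > 0).astype(int)
print(np.array_equal(decoded, bits))   # very likely True at this noise level
```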

Morse code, with its dots, dashes, and spaces, was an early digital communication system. But it would take the advent of computers for digital signaling to really take off. And it was Shannon's work that allowed computer scientists and electrical engineers to find ways of encoding different kinds of information in terms of 1s and 0s, what we now call binary code. Digital signals have come to form the basis of computing and every form of data associated with it, all of which are still used today.
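As a tiny illustration of representing ordinary information as 1s and 0s, here's a sketch that turns a short piece of text into binary digits and back again, using standard UTF-8 byte values purely as an example.

```python
# Sketch: representing ordinary text as 1s and 0s and recovering it.
# Uses standard UTF-8 byte values purely as an illustration.

text = "Hi"

# Each character becomes a byte, and each byte becomes eight binary digits.
bits = "".join(f"{byte:08b}" for byte in text.encode("utf-8"))
print(bits)                      # 0100100001101001

# Decoding reverses the process: group the bits back into bytes.
recovered = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
print(recovered.decode("utf-8"))   # Hi
```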

Of course, we've only just skimmed the surface. Signal processing overlaps with some serious technical challenges. There's the task of actually encoding different sorts of information as signals, and creating channels like phone lines and WiFi routers to transmit them. And, there's the challenge of building hardware that transmits the final output, like computer monitors and headphones. But, the end result is that you can stream videos like this one at the click of a button, virtually anywhere in the world. I might be a little biased, but I think that it's pretty darn cool.

In this episode, we looked at the fundamentals of signal processing. We saw the need to represent information as signals so it can be transmitted, and an example of that in Morse code. We explained how wired and wireless communications can suffer from the problems of bandwidth capacity and noise, and how Claude Shannon helped quantify the problem so that engineers could build around those limitations and bring about the digital age.

Next time, we're headed out to sea to talk about moving physical objects with ships and marine engineering.

Crash Course Engineering is produced in association with PBS Digital Studios, which also produces It's Okay to Be Smart, a show about our curious universe and the science that makes it possible, hosted by Dr. Joe Hanson. Check it out at the link in the description.

Crash Course is a Complexly production, and this episode was filmed in the Dr. Cheryl C. Kinney Studio with the help of these wonderful people. And, our amazing graphics team is Thought Cafe.

[Outro]