


Nadine the robot has been unveiled, and as robotics technology gets more advanced, humanoid robots are looking more and more human. In this episode of SciShow News we explore how Nadine works and why a lot of people find it creepy.

Images and Video of Nadine courtesy of NTU Singapore.

NTU Article on Nadine:

Hosted by: Hank Green

SciShow has a spinoff podcast! It's called SciShow Tangents. Check it out at
Dooblydoo thanks go to the following Patreon supporters -- we couldn't make SciShow without them! Shout out to Justin Ove, Justin Lentz, David Campos, Chris Peters, Philippe von Bergen, Fatima Iqbal, John Murrin, Linnea Boyev, and Kathy & Tim Philip.
Like SciShow? Want to help support us, and also get things to put on your walls, cover your torso and hold your liquids? Check out our awesome products over at DFTBA Records:

Or help support us by becoming our patron on Patreon:
Looking for SciShow elsewhere on the internet?


Hank: You are about to enter another dimension. A dimension not only of sight and sound, but of mind. A journey into the wondrous land... of robotics. There’s a sign post up ahead, next stop: SciShow News.

[Intro plays]

This week, scientists at Nanyang Technological University in Singapore unveiled Nadine, a robot receptionist with soft skin, real hair, and an uncanny, echoey speaking voice that comes from somewhere other than its face, even though its lips move when it talks. It will reach out with jerky motions to shake your hand, remember you and your previous conversations, and it can even use body language to show happiness and sadness.

Nadine is the next generation of human-like robotics, and it’s an incredible scientific achievement, uniting sophisticated motors with rapid real-time sensors and cutting-edge artificial intelligence, or AI. And a lot of people think it’s super creepy. Like me, for example. More on that in a second. First, let’s talk about how Nadine works.

Like Pepper, a personal companion robot that went on sale in Japan in June of 2015, Nadine has concealed cameras connected to facial recognition software. Those cameras tell Nadine where to look, so it can make eye contact with its users, and also feed the AI information about the user’s facial expressions, by comparing things like the shape of the user’s mouth and eyes to an internal database of facial expressions. I’m wondering if one of the facial expressions is this...
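That database comparison can be sketched in a few lines. This is a hypothetical simplification, not NTU's actual system: real facial recognition software tracks dozens of facial landmarks, but the idea of matching measured features against stored expression templates looks something like this.

```python
import math

# Hypothetical reference database: each expression is described by just two
# features -- mouth openness and eye openness, normalized to 0..1.
# Real systems use many more facial landmarks than this.
EXPRESSION_DB = {
    "happy":     (0.7, 0.6),
    "sad":       (0.2, 0.4),
    "surprised": (0.9, 1.0),
    "neutral":   (0.3, 0.6),
}

def classify_expression(mouth_open: float, eye_open: float) -> str:
    """Return the stored expression whose features are closest to the input."""
    def dist(template):
        tm, te = template
        return math.hypot(mouth_open - tm, eye_open - te)
    return min(EXPRESSION_DB, key=lambda name: dist(EXPRESSION_DB[name]))

print(classify_expression(0.85, 0.95))  # closest template is "surprised"
```

The nearest-match approach is why the system needs an internal database at all: it never computes what an expression "means", it just finds the stored example that looks most similar.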

Other cameras track the user’s proximity and body language. Some of the body language information is also sent to the AI, and some of it is sent to the programs that control the robotic motors that move and position Nadine’s arms and upper body. That’s how it knows to shake your hand when you reach out to it, take paperwork you offer it, or hand you things from its desk. Nadine’s AI works a lot like other AI companions you might have talked to before, like Siri or Cortana.

Their AIs, like Nadine’s, aren’t actually stored on your phone, or inside a humanoid robot receptionist. Those programs are run on massive servers that can handle huge amounts of information, and the device -- whether that’s a phone, or Nadine -- just connects to them wirelessly. Those servers contain the AI’s vocabulary and speech recognition software, and records of everyone it’s ever talked to, and what they’ve talked about. They also contain the bananas-complicated programming algorithms that let the AI cross-reference all of its previous interactions to figure out what was successful, and what wasn’t. That’s what lets AIs like Siri, or Nadine, learn over time. There are way too many possible combinations of words for programmers to be able to teach them how to respond to any phrase you might say. So, instead, these AIs use that stored information to identify keywords and context so that they can, hopefully, /guess/ what you want.
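The keyword-and-context guessing described above can be illustrated with a toy sketch. This is an assumption-laden simplification -- real assistants use far more sophisticated language models -- but it shows why keyword matching sidesteps the impossibility of pre-programming every phrase.

```python
# Hypothetical intent table: each known intent is a set of keywords.
# The AI scores intents by how many of their keywords appear in the input,
# rather than trying to anticipate every possible sentence.
INTENTS = {
    "greeting":    {"hello", "hi", "morning"},
    "appointment": {"appointment", "meeting", "schedule", "book"},
    "directions":  {"where", "find", "room", "office"},
}

def guess_intent(utterance: str) -> str:
    """Guess what the user wants from keyword overlap with known intents."""
    words = set(utterance.lower().split())
    scores = {name: len(words & keywords) for name, keywords in INTENTS.items()}
    best = max(scores, key=scores.get)
    # If nothing matched, admit defeat rather than guess.
    return best if scores[best] > 0 else "unknown"

print(guess_intent("Can I book a meeting for tomorrow"))  # appointment
```

Two keywords ("book", "meeting") outscore every other intent here, so the AI guesses "appointment" without ever having seen that exact sentence before.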

And as for Nadine’s soft, sensitive skin that actually lets the robot respond to touch? The scientists at NTU haven’t released its technical specs yet, but we know how most artificial skin works in cutting-edge robotics. Generally, a film of thin, super-flexible rubber is placed between two sheets of parallel electrodes. Each individual electrode has a partner on the other side of that film of rubber, and each pair of electrodes generates a small charge, which is stored by the rubber between them. When you touch this artificial skin, it’s compressed, meaning the rubber thins -- and with a thinner gap, each pair of electrodes can store more electrical charge. Information on how much charge there is between each pair of electrodes is fed to a program that creates a topographical map of where and how the skin’s being touched. Then the robot’s AI decides what to do with that information.
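That electrode-grid-to-touch-map step can be sketched numerically. All the numbers below are illustrative assumptions, not NTU's specs: for a parallel-plate pair, capacitance is C = εA/d, so pressing the skin (smaller thickness d) raises the capacitance of that electrode pair, and the map is just a threshold over the grid.

```python
# Assumed, illustrative material constants -- not Nadine's real specs.
EPSILON = 2.5e-11   # permittivity of the rubber film, F/m
AREA    = 1e-4      # electrode area, m^2
REST_D  = 1e-3      # resting rubber thickness, m

def capacitance(thickness_m: float) -> float:
    """Parallel-plate capacitance C = epsilon * A / d."""
    return EPSILON * AREA / thickness_m

def touch_map(thickness_grid, threshold=1.2):
    """Mark cells whose capacitance rose past `threshold` x the resting value."""
    c_rest = capacitance(REST_D)
    return [[1 if capacitance(d) > threshold * c_rest else 0 for d in row]
            for row in thickness_grid]

# A poke compresses the centre cell from 1.0 mm down to 0.5 mm.
grid = [[1e-3, 1e-3, 1e-3],
        [1e-3, 5e-4, 1e-3],
        [1e-3, 1e-3, 1e-3]]
print(touch_map(grid))  # only the centre cell registers the touch
```

Halving the rubber's thickness doubles that pair's capacitance, which is how the program locates the touch on its topographical map.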

So, Nadine is an incredible fusion of different disciplines. AI, robotics, computing, pressure-sensitive electronic skin... so what makes it so creepy? If Nadine gives you the heebie-jeebies, you’re experiencing something called the uncanny valley. The uncanny valley is based on the idea that the more human-like and less machine-like something seems, the easier it is for humans to respond to it emotionally. Except when something looks almost life-like, but is off by just a little bit. That makes our emotional response go way down -- our brains just think it’s creepy.

That’s because when an artificial human looks just human enough, all we can focus on is where the illusion is falling short. It makes our parietal cortex light up. The parietal cortex connects the brain’s visual processing center with the motor cortex. When you look at what your brain thinks is a person, it mirrors what you’re seeing onto your own muscular system. It’s an evolved function that helps us learn, and it’s why seeing someone else do something makes it easier for us to do it ourselves. But then we see something like Nadine. It’s human-looking enough that our parietal cortex kicks in and tries to mirror it... but its movement isn’t as human-like as its appearance, so it can’t be mirrored onto our muscles.

Essentially, Nadine fools our visual cortex, but not our motor cortex, and our parietal cortex is what picks that up. It doesn’t know what the problem is, but it knows there’s a problem, and that you should probably be scared. Does that robot receptionist want to eat you? It doesn’t know. Don’t risk it though, just run. So Nadine might be a super advanced robot, but it probably won’t be replacing human receptionists any time soon. Not until they can get it out of the uncanny valley.

Thanks for watching this episode of SciShow News. If you love SciShow and want to share SciShow with those you love, check out our SciShow Valentines that correspond to upcoming episodes. And don’t forget to go to and subscribe.