




MLA Full: "Why the Perfect Clock Is Impossible to Build." YouTube, uploaded by SciShow, 19 April 2024,
MLA Inline: (SciShow, 2024)
APA Full: SciShow. (2024, April 19). Why the Perfect Clock Is Impossible to Build [Video]. YouTube.
APA Inline: (SciShow, 2024)
Chicago Full: SciShow, "Why the Perfect Clock Is Impossible to Build.", April 19, 2024, YouTube, 10:11,
We can make clocks that keep accurate time for millions of years. We can also make clocks with such high resolution they tick one billion billion times per second. So why can't we make a clock that does both?

Hosted by: Savannah Geary (they/them)
Support SciShow by becoming a patron on Patreon:
Huge thanks go to the following Patreon supporters for helping us keep SciShow free for everyone forever: Adam Brainard, Alex Hackman, Ash, Benjamin Carleski, Bryan Cloer, charles george, Chris Mackey, Chris Peters, Christoph Schwanke, Christopher R Boucher, DrakoEsper, Eric Jensen, Friso, Garrett Galloway, Harrison Mills, J. Copen, Jaap Westera, Jason A Saslow, Jeffrey Mckishen, Jeremy Mattern, Kenny Wilson, Kevin Bealer, Kevin Knupp, Lyndsay Brown, Matt Curls, Michelle Dove, Piya Shedden, Rizwan Kassim, Sam Lutfi
Looking for SciShow elsewhere on the internet?
SciShow Tangents Podcast:

#SciShow #science #education #learning #complexly
Dr. Florian Meier, interview

Image Sources:
Why can we remember the past, but not the future?

No, I am not pretending to be a freshman philosophy major cornering you at a party. I’m pretending to be a physicist.

Because the inevitable, forward march of time is one of the big unsolved problems in physics. And to study that question, scientists have stripped clocks down to their most fundamental parts to try and understand how time truly passes. But in doing so, they stumbled upon a curious fact: It is impossible to build a perfect clock. [♪ INTRO]

Whether you’re trying to understand subatomic particles or the whole observable universe, you’re going to use some math.

Specifically, a bunch of equations that represent the laws of physics and tell you how systems change. But the fact that systems change means we have to bring time into those equations, right? Well, it turns out that every basic law of physics is time-symmetric.

That means it works just as well at predicting what something will look like, as it does saying what it used to look like. So if you know everything there is to know about something in the present, you can ‘run’ the equations forwards to predict the future, or run them backwards to ‘retrodict’ the past. Which means that from a physics perspective, it’s actually not obvious at all what makes the future different from the past.

Unless, of course, you are the second law of thermodynamics. This particular law deals with a property that all systems have, called entropy, which tells you how random, jumbled-up, and messy that system is. It’s basically a measure of disorder.

And according to the second law of thermodynamics, if you want to make some small corner of the universe a little more orderly, you must pay for that action by increasing entropy somewhere else. Which is a great excuse to not make your bed in the morning. But there’s a second part to the rule.

There are far more ways for a system to be disordered than ordered. Imagine taking a novel, tearing out all the pages, throwing the pile into the air, and then sweeping them back up as fast as possible. What are the chances you get them in an order where most, let alone all, of the page turns make sense?
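Those chances are easy to put a number on. Here’s a minimal sketch, assuming a 300-page novel (the page count is just for illustration): every ordering of the loose pages is equally likely, and only one of them reads correctly.

```python
import math

# Toy version of the torn-up novel: how many ways can 300 loose pages
# be stacked, and how many of those stacks read in the right order?
pages = 300
arrangements = math.factorial(pages)  # 300! possible stacks
# Exactly one of those stacks is in perfect reading order,
# so the odds of sweeping them up correctly are 1 in 300!.
odds_exponent = math.log10(arrangements)

print(f"300! has {len(str(arrangements))} digits")
print(f"P(perfect order) ~ 1 in 10^{odds_exponent:.0f}")
```

The disordered stacks outnumber the single ordered one by a factor with over six hundred digits, which is the whole point of the analogy: random jostling essentially never stumbles back into order.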

This jumbling-up process naturally increases disorder. So over time, the second law of thermodynamics also tells us that entropy inevitably increases. And that means we can differentiate between the past and future.

The past has less entropy. The future has more. And a couple of years ago, scientists in the UK and Austria discovered that this fact actually has important implications for how we tell time.

To measure the passage of time, you need a clock. Doesn’t matter what it looks like. It just has to have something inside it that ticks at a fixed rate.

That ticking comes from an oscillator, something that goes back and forth in a predictable way. Maybe it’s a pendulum in an old-timey grandfather clock that swings once per second. Maybe it’s a pair of broken automatic doors that keep opening and closing no matter who’s trying to go through them, just as long as they’re doing so at a steady rate.

However the clock is ticking, moving parts are involved. And those parts are made of particles that, just like those book pages we threw into the air, are jostling themselves into new, almost certainly more disordered, arrangements. So just by tracking the passage of time, a clock contributes some of its own entropy to the universe.

And for very small clocks, these researchers found that the entropy a clock produces depends on its accuracy. In this case, accuracy is a technical term. To define it, picture a clock that ‘ticks’ at a fixed rate.

By counting the number of ticks and multiplying by the interval of time between two ticks, you know how much time has passed. Or at least, you have a pretty good idea how much time has passed. Because that counting can never be perfect.

And eventually, tiny errors add up. So a clock’s accuracy is the number of ticks it takes for it to be off by one. In the case of our grandfather clock pendulum ticking once per second, we’d be asking how long it would take for it to be either one second ahead or one second behind.
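That definition is easy to play with in a toy simulation. The sketch below (all the numbers are made up for illustration) models a one-second pendulum whose every swing picks up a little Gaussian timing noise, and counts ticks until the accumulated drift reaches a full second:

```python
import random

def ticks_until_off_by_one(jitter, period=1.0, seed=0):
    """Count ticks until a jittery clock has drifted a full period
    (one second) from true time. Each real swing takes period plus
    Gaussian noise, but the clock counts every swing as exact."""
    rng = random.Random(seed)
    clock_time = 0.0  # what the clock thinks has elapsed
    true_time = 0.0   # what actually elapsed
    ticks = 0
    while abs(clock_time - true_time) < period:
        true_time += period + rng.gauss(0.0, jitter)  # wobbly real interval
        clock_time += period                          # clock assumes perfection
        ticks += 1
    return ticks

# A steadier oscillator stays accurate for far more ticks:
sloppy = ticks_until_off_by_one(jitter=0.05)
steady = ticks_until_off_by_one(jitter=0.005)
print(sloppy, steady)
```

Because the per-tick errors accumulate like a random walk, cutting the jitter by a factor of ten buys roughly a hundred times more ticks before the clock is off by one.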

And these days, the most advanced atomic clocks can run for billions of years before losing a single second. But atomic clocks are absurdly large and complex. So to understand time, the Austrian team focused on the simplest theoretical models of a clock.

But they didn’t start with something that ticks at a fixed frequency. You know, like you’d expect for even the simplest of clocks. Instead, the ‘ticks’ happened at random.

They couldn’t predict when the next tick would actually come. And that might sound odd, but the team was trying to figure out how to measure accuracy at all. They couldn’t start by assuming they had something ticking at a fixed rate.

But of course, for their clock to be useful, the ticks should be regular. So atop those random ticks, they added a repeating filter. If we return to that broken automatic door example, the random ticks would be a bunch of inattentive people expecting they’re just gonna walk right through some opening doors.

Any people who randomly approach at just the right time will be let through and get recorded as an official tick. But any person that walks into the doors while they’re closed hurts their nose and runs away. It’s not a perfect analogy.

But it shows how the doors help smooth out the ticks from the randomly approaching people. But even with a filter, you’re not guaranteed to see a tick once per time interval. Sometimes the doors are open and no one goes through.

And that allows inaccuracies to creep in. So once they were armed with this model, the researchers worked out how good this filtering process could be on a super simple microscopic clock. And they discovered that there was a cost if they wanted to improve it.
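The door analogy maps neatly onto a small simulation. In this sketch (the rate, period, and window width are all arbitrary), random arrivals from a Poisson process stand in for the raw ticks, and a door that opens periodically stands in for the filter; comparing the relative jitter of the intervals before and after filtering shows the smoothing at work:

```python
import random

rng = random.Random(1)

# Random 'ticks': arrival times of a Poisson process, about 5 per unit time.
rate = 5.0
t, arrivals = 0.0, []
while t < 2000:
    t += rng.expovariate(rate)
    arrivals.append(t)

# Periodic filter: the 'doors' open once per unit time, for a 0.2-wide window.
# The first arrival inside each open window counts as one official tick;
# everyone else bumps their nose.
period, window = 1.0, 0.2
ticks, seen = [], set()
for a in arrivals:
    cycle = int(a / period)
    if cycle not in seen and (a % period) < window:
        seen.add(cycle)
        ticks.append(a)

def spread(times):
    """Relative jitter of the gaps between events (std dev / mean)."""
    gaps = [b - a for a, b in zip(times, times[1:])]
    mean = sum(gaps) / len(gaps)
    var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
    return var ** 0.5 / mean

print(f"raw arrivals jitter: {spread(arrivals):.2f}")
print(f"filtered ticks jitter: {spread(ticks):.2f}")
```

The filtered ticks come out noticeably more regular than the raw arrivals, but not perfectly regular: some windows open with no one around, exactly the leftover inaccuracy described above.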

The better the filter, meaning the more consistent the time between ticks, the more moving parts they’d need to add. And adding moving parts is exactly what increases entropy. In other words, the more accurate your clock is, the more disorder your clock will contribute to the universe.

But that was just what the models said. So the UK half of this research collab ran an experiment with as simple a clock as they could build: a wobbling nano-scale sheet of silicon in an electric circuit. In the end, the experiment confirmed the models: higher accuracy means higher entropy.

It’s almost as if, by trying to time how long it takes the future to arrive, you’re contributing to it arriving at all. But I should stress this doesn’t affect the rate at which time flows for you. There are so many other entropy sources around you that this one can’t make a difference.

Sorry to all you freshman philosophy majors out there. But we’re not done yet. With this result in mind, the scientists in Austria realized that a clock’s accuracy isn’t the only factor they had to consider.

They also needed to worry about its resolution. That is another technical term, and it refers to how small a time step you can see with your clock. After all, you can’t use a grandfather clock to time a hundred-meter sprint.

For that, you’d need a stopwatch with millisecond resolution. But for a lot of science today, you need to go a lot smaller. Like, to watch chemical reactions take place on a molecular scale, you need a clock that can resolve a billionth of a billionth of a second.

These are attoclocks, and some of the scientists behind their development won the 2023 Nobel Prize in Physics. They’re just as extreme as atomic clocks, but they serve a totally different purpose. Attoclocks can measure the tiniest intervals, but don’t need to be accurate for super-long times.

Meanwhile, atomic clocks have much worse resolution, but can accurately track what they need to for much longer times. But could you ever build a clock that does both? A clock that combines the resolution of an attoclock with the accuracy of an atomic clock?

Well, according to that Austrian team, the answer is a resounding no! Remember that filtering process they added into their model? To increase their clock’s resolution, they needed a filter that worked at smaller and smaller time steps.

But that improvement won’t mean much if the random ticks it’s trying to filter don’t come frequently enough. So whatever’s making the ticks has to pump more of them out, faster. And a clock that makes more ticks is also making more entropy.

It’s part of the arrow of time, and costs resources to make happen. Resources that, you may have realized, aren’t infinite. So ultimately, the team found that the more accurate your clock is, the less resolution it can have, and vice versa.
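Here’s a back-of-the-envelope way to see that trade-off. This is a toy budget model, not the team’s actual derivation: it simply assumes a clock can dissipate entropy at some fixed rate, and that accuracy scales with the entropy spent per tick. Raising the tick rate for better resolution then leaves less entropy, and therefore less accuracy, for each tick:

```python
# Toy model (not the researchers' real bound): a clock with a fixed
# entropy budget per second, where accuracy grows with the entropy
# dissipated per tick. All numbers are arbitrary illustration units.
entropy_budget_rate = 1e6  # entropy available per second (made up)
k = 1.0                    # assumed ticks-of-accuracy per unit entropy

def accuracy(tick_rate):
    """Ticks before the clock is off by one, under the toy assumptions."""
    entropy_per_tick = entropy_budget_rate / tick_rate
    return k * entropy_per_tick

# Better resolution means a higher tick rate, which costs accuracy:
for tick_rate in (1.0, 1e3, 1e6):
    print(f"rate {tick_rate:.0e} ticks/s -> accuracy {accuracy(tick_rate):.0e} ticks")
```

In this toy model the product of accuracy and tick rate is pinned to the entropy budget, so improving one necessarily costs the other, which is the shape of the real result even though the actual physics is far subtler.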

If you try to probe smaller and smaller time-scales, you pay for that by being able to track them consistently for less time. In this sense, it’s sort of like the famous uncertainty principle in quantum mechanics: there’s a fundamental limit to how much information you can have at once about two properties. But here, the trade-off comes not from the weird physics that manifests at quantum scales, but from the universe needing to increase entropy.

It’s also sort of like the uncertainty principle in that it doesn’t seem to matter to our day-to-day lives. The effect only really kicks in at microscopic scales, or at the limits of extreme resolution or accuracy. More often, it’s other sources of error that limit a clock’s ability to track time.

Like how your wristwatch will lose accuracy if its quartz crystal gets too cold or wet, changing its wobbling rate. But one area where this could become a significant problem is computing. That’s because the physical computing that takes place on microchips keeps getting faster.

That means computers are computing on shorter time intervals, but they still need to run processes billions of times, for exactly the right amount of time, to complete those computations. So as computer processors get faster and faster, and smaller and smaller, they may start to run into this accuracy-resolution trade-off problem. That’s especially true for quantum computers.

While they’re still in early development, they might bring a huge revolution to computing by making calculations exponentially faster than their predecessors. They do that by manipulating quantum objects. But quantum objects are notoriously difficult to use, so you need extra-precise control over them, and you need to run lots of processes for absurdly precise amounts of time.

So one day, we may reach a point where the impossibility of a perfect clock actually matters. But if you want to use it as an excuse for why you were late to your intro to philosophy class, that’s alright with me. Thanks for watching this episode of SciShow.

And an extra big thanks to everyone who supports us through Patreon. Because we can’t do what we do without you doing what you do. And that also includes freshman philosophy major patrons.

You guys are… you’re cool, too. [♪ OUTRO]