Have a problem with many competing variables? Why not solve it with a computer algorithm based on cooling metal?

Hosted by: Hank Green

Huge thanks go to the following Patreon supporters for helping us keep SciShow free for everyone forever:

Kevin Bealer, Jacob, Katie Marie Magnone, D.A. Noe, Charles Southerland, Eric Jensen, Christopher R Boucher, Alex Hackman, Matt Curls, Adam Brainard, Jeffrey McKishen, Scott Satovsky Jr, James Knight, Sam Buck, Chris Peters, Kevin Carpentier, Patrick D. Ashmore, Piya Shedden, Sam Lutfi, Charles George, Christoph Schwanke, Greg

[ ♪INTRO ]

Say you're starting a business to make and sell…I don't know, cardboard boxes. You have some big decisions to make—like, where should you put your factories, and your warehouses, and your distribution centers?

This might not sound like a mind-bending problem, but it can be ridiculously complex. You clearly want to be close to your customers and to your suppliers to keep shipping and transportation costs down. But you also have to account for those suppliers' prices; the demand for each type of box; how much material each box uses; setup and maintenance costs; production capacity; the costs of power, and rent, and real estate; and, if you're a good citizen, the environmental impact of every choice.

It's a /lot/ to keep track of! This type of situation is called a constrained optimization problem: You're looking for the best balance of competing factors—/while/ making sure your solution is actually possible. Constrained optimization doesn't exactly get much press; you've probably never heard of it before. But it's the hidden architecture beneath practically every human endeavor you can think of, from cardboard manufacturing to the power grid, airline routing, and even molecular biology.

And one of the most popular ways to solve problems like these takes inspiration from an unusual source: the way metals cool. Now, when I say “business decisions,” metallurgy probably isn't the first thing that comes to your mind. And it also wasn't, like, computer scientists' first thought.

They initially designed lots of efficient algorithms to find the absolute best solutions to constrained optimization problems. But those methods work best when the problem has a nice, friendly mathematical form, like when the algorithm just has to find the mathematical equivalent of the highest spot on a hill. When the problem involves lots of variables related in more complicated ways, even a computer would take ages to find the best solution!
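
To see why that "highest spot on a hill" case is the easy one, here's a minimal sketch of greedy local search, the kind of simple method those exact approaches resemble. The `smooth` function is a made-up toy example, not anything from the episode: on a single smooth hill, always taking uphill steps reliably finds the peak. On a rugged landscape with many smaller hills, the same greedy rule would get stuck on whichever bump it happened to start near.

```python
import random

def hill_climb(f, x, step=0.1, iters=1000):
    """Greedy local search: only ever accept moves that improve f."""
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)
        if f(candidate) > f(x):  # keep strict improvements only
            x = candidate
    return x

# Toy landscape: one smooth hill with its peak at x = 0.
smooth = lambda x: -x * x

random.seed(0)
peak = hill_climb(smooth, x=3.0)
print(round(peak, 2))  # lands very near 0.0
```

Greedy search never accepts a downhill move, which is exactly the limitation simulated annealing was designed to get around.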

So eventually, researchers looked for inspiration in nature, which is great at finding solutions that might not be perfectly optimal, but come pretty close. For example, in 1983, three IBM researchers drew an analogy between optimization and, of all things, cooling metal. See, when a metal is heated and then slowly cooled, its atoms tend to settle into the arrangement with the lowest possible energy.

In other words, the arrangement of the atoms gets naturally optimized. That's convenient in metalworking, because if you want to work some metal into a certain shape, it's easiest if the atoms are all neatly arranged in a crystal structure. But normally, there are defects, like atoms out of place and fault lines between different sections of the crystal.

So metalworkers use a technique called annealing: They heat up the metal really hot, then slowly cool it down. When the metal is hot, its atoms move around randomly. But as they cool, their movements become less random: With less energy, they become less likely to jump from low-energy positions into high-energy ones.

So they naturally settle into the lowest-energy configuration, which is a defect-free crystal. Because the atoms cool so slowly, they have the opportunity to find low-energy positions before they get frozen in place. For the IBM researchers, this natural process wasn't just a neat fact about metalworking.
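
The physics here has a tidy mathematical form. In statistical mechanics, the chance that a system jumps to a state that's worse by an energy difference ΔE at temperature T is roughly exp(−ΔE/T), which is the rule simulated annealing borrows (the Metropolis acceptance criterion). This little sketch, with made-up numbers, shows how the same energy jump goes from likely to vanishingly rare as the temperature drops:

```python
import math

def accept_probability(delta_e, temperature):
    """Metropolis rule: moves to a worse (higher-energy) state,
    delta_e > 0, are accepted with probability exp(-delta_e / T)."""
    if delta_e <= 0:
        return 1.0  # improvements are always accepted
    return math.exp(-delta_e / temperature)

# The same uphill jump (delta_e = 1.0) at two temperatures:
print(round(accept_probability(1.0, temperature=10.0), 2))  # 0.9, hot
print(accept_probability(1.0, temperature=0.1) < 1e-4)      # True, cold
```

While the system is hot, atoms hop into higher-energy positions all the time; once it's cold, they almost never do, which is why slow cooling freezes in the low-energy crystal.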

It was also inspiration for an optimization algorithm—in other words, a procedure computers could follow to solve complicated optimization problems. They called it simulated annealing —and the idea was to imitate the natural process. Just like physical annealing goes through different arrangements of atoms, the algorithm goes through different solutions to a constrained optimization problem.

This might not sound like an obvious comparison, but here's how it works. In the example of the cardboard business, you'd start by generating some random choices of products, facilities, and locations. Then you'd keep making random changes.

Initially, you allow even some changes that make the profit margin or environmental impact /worse/—like moving a warehouse far from its nearest factory. That's because at this point, allowing detrimental changes is like letting an atom bounce into a position that temporarily puts the metal structure into a higher-energy state. Ultimately, that high-energy state might turn out to be a step on the way to the lowest-energy one.

It's kind of like giving something a good shake to get it out of a rut. Similarly, when the algorithm suggests a /bad/ solution, that can eventually lead to an even /better/ option than it would have found otherwise. Over time, though, the algorithm becomes less and less willing to accept changes that don't improve the result—just like, as the metal cools, it keeps getting harder for atoms to bounce into higher-energy states.
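
Putting those pieces together, here's a minimal sketch of the whole simulated annealing loop. It's applied to a deliberately simple stand-in problem, one made-up number to tune with its best value at 3, rather than a real facility-location model, but the structure is the same: random moves, the Metropolis acceptance rule, and a temperature that slowly cools.

```python
import math
import random

def simulated_annealing(energy, x0, t_start=10.0, t_end=1e-3,
                        cooling=0.995, step=0.5):
    """Minimize `energy` by random moves, accepting worse moves with
    probability exp(-delta / T) while the temperature T slowly drops."""
    x, t = x0, t_start
    best, best_e = x, energy(x)
    while t > t_end:
        candidate = x + random.uniform(-step, step)
        delta = energy(candidate) - energy(x)
        # Accept improvements always; accept worse moves with a
        # probability that shrinks as the temperature falls.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x = candidate
            if energy(x) < best_e:
                best, best_e = x, energy(x)
        t *= cooling  # cooling schedule: shrink T a little each step
    return best

# Toy stand-in for the cardboard problem: the best "layout" is x = 3.
random.seed(42)
result = simulated_annealing(lambda x: (x - 3) ** 2, x0=-10.0)
print(round(result, 1))  # close to 3.0
```

Early on, with T near 10, almost any move gets accepted and the search wanders freely; by the time T nears 0.001, only improvements get through, and the answer settles in place, just like the atoms in the cooling metal.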

Eventually, you converge on something very close to the best possible set of products, facilities, and locations. Simulated annealing is extremely popular. It's been used in over a /million/ research papers and real-world projects—and not just for cardboard boxes.

Say you work for an airline, planning flight routes. Deciding where to fly and which planes should fly each route is tricky—because you want to optimize profit per passenger and minimize wasted fuel. Meanwhile, you're constrained by things like travel time and each aircraft's capacity.

How do you balance all those factors? How do you make decisions in that wild world? Maybe simulated annealing!

Or maybe you're a molecular biologist trying to work out the structure of a protein—say, a protein that some deadly virus uses to replicate. You want your structure to match your experimental data as closely as possible, but it's constrained by the fact that the structure has to be physically possible. Again, you could try simulated annealing!

This method has also been used to schedule university exams, to decide when and where to build power generators, to help robots find the best route from point A to point B, and on and on. And it's just one of dozens of popular optimization algorithms—each of which works for a different class of problems. So really, we've just scratched the surface of how constrained optimization powers modern society—and one of the best methods comes from studying nature itself!

Thanks for watching this episode of SciShow! And a special thank you to this month's President of Space, Matthew Brant, who is one of the amazing supporters making it possible for us to keep doing episodes like this. To find out more about how you can support what we do, check out [ ♪OUTRO ].