Live Science spoke with Nobel Prize-winning physicist David Gross, who recently received the $3 million Special Breakthrough Prize in Fundamental Physics, about the quest to unite all the forces and why humanity might not live to see a unified theory.
For anyone running into the paywall:
https://archive.is/qjPeo
And just the part about 50 years and nuclear weapons:
DG: Currently, I spend part of my time trying to tell people … that the chances of you living 50 [more] years are very small.
Due to the danger of nuclear war, you have about 35 years.
DG: So it’s a crude estimate. Even after the Cold War ended, [when] we had strategic arms control treaties, all of which have disappeared, there were estimates there was a 1% chance of nuclear war [every year]. Things have gotten so much worse in the last 30 years, as you can see every time you read the newspaper. I feel it’s not a rigorous estimate, that the chances are more likely 2%. So that’s a 1-in-50 chance every year. The expected lifetime, in the case of 2% [per year], is about 35 years. [The expected lifetime is the average time it would take to have had a nuclear war by then. It is calculated using equations similar to those used to determine the “half-life” of a radioactive material.]
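The arithmetic behind the 35-year figure can be checked directly: with a constant annual probability p of nuclear war, the chance of getting through n years is (1 − p)^n, and the "half-life" is the n at which that drops to 50%. This is a minimal sketch of that calculation, not anything from the article itself:

```python
import math

# Half-life under a constant annual risk p: the number of years
# until the probability of having avoided nuclear war falls to 50%.
# Solve (1 - p)**n = 0.5  =>  n = ln(0.5) / ln(1 - p)
for p in (0.01, 0.02):
    half_life = math.log(0.5) / math.log(1 - p)
    print(f"annual risk {p:.0%}: half-life ~ {half_life:.1f} years")
```

At 1% per year this gives roughly 69 years; at Gross's 2% estimate it gives about 34 years, matching the "about 35 years" in the quote.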
DG: We had something called the Nobel Laureate Assembly for reducing the risk of nuclear war in Chicago last year.
There are steps, which are easy to take — for nations, I mean. For example, talk to each other.
In the last 10 years, there are no treaties anymore. We’re entering an incredible arms race. We have three super nuclear powers.
People are talking about using nuclear weapons; there’s a major war going on in the middle of Europe; we’re bombing Iran; India and Pakistan almost went to war.
OK, so that’s increased the chance [of nuclear war]. I would really like to have a solid estimate — it might be more, and I think I’m being conservative — but a 2% estimate [of nuclear war] in today’s crazy world.
DG: We’re not recommending that. That’s idealistic, but yes, I hope so. Because if you don’t, there’s always some risk an AI 100 years from now [could launch nuclear weapons], but chances of [humanity] living, with this estimate, 100 years, is very small, and living 200 years is infinitesimal.
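The "very small" and "infinitesimal" survival chances follow from the same 2%-per-year estimate. A quick sketch of the numbers being implied:

```python
# Probability of no nuclear war after n years at a constant 2% annual risk
p = 0.02
for years in (50, 100, 200):
    survival = (1 - p) ** years
    print(f"{years} years: {survival:.1%} chance of survival")
```

At 2% per year, the chance of making it 100 years is about 13%, and 200 years is under 2%.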
So [the answer to] Fermi’s question of “Where are the civilizations, all the intelligent organisms around the galaxy, and why don’t they talk to us?” is that they’ve killed themselves.
You asked me to think about the future, and I am obsessed the last few years, thinking about that — not the future of ideas and understanding nature, but of the survival of humanity.
DG: There are now nine nuclear powers. Even three is infinitely more complicated than two. The agreements, the norms between countries, are all falling apart. Weapons are getting crazier. Automation, and perhaps even AI, will be in control of those instruments pretty soon.
DG: It’s going to be very hard to resist making AI make decisions because it acts so fast. If you have 20 minutes to decide whether to send a few hundred nuclear-armed missiles to both China and Russia for “our dear president,” the military might feel that it’s wiser to make AI make that decision. But if you play with AI, you know that it sometimes hallucinates.
DG: People have done something about climate. So that’s something scientists began to warn people about 40 years ago. And they convinced people that’s a real danger.
It’s a much harder argument to make than about nuclear weapons.
We made them; we can stop them.