We know that complexity is the worst enemy of security, because it makes attack easier and defense harder. This becomes catastrophic as the potential effects of an attack grow.
In A Hacker’s Mind (coming in February 2023), I write:
Our societal systems, in general, may have grown fairer and more just over the centuries, but progress isn’t linear or equitable. The trajectory may appear to be upwards when viewed in hindsight, but from a more granular point of view there are a lot of ups and downs. It’s a “noisy” process.
Technology changes the amplitude of the noise. Those near-term ups and downs are getting more severe. And while they might not affect the long-term trajectory, they drastically affect all of us living in the short term. This is how the twentieth century could—statistically—both be the most peaceful in human history and also contain the most deadly wars.
Ignoring this noise was only possible when the damage wasn’t potentially fatal on a global scale; that is, if a world war didn’t have the potential to kill everybody or destroy society, or if it occurred in places and to people that the West wasn’t especially worried about. We can’t be sure of that anymore. The risks we face today are existential in a way they never have been before. The magnifying effects of technology enable short-term damage to cause long-term planet-wide systemic damage. We’ve lived for half a century under the potential specter of nuclear war and the life-ending catastrophe that could have been. Fast global travel allowed local outbreaks to quickly become the COVID-19 pandemic, costing millions of lives and billions of dollars while increasing political and social instability. Our rapid, technologically enabled changes to the atmosphere, compounded through feedback loops and tipping points, may make Earth much less hospitable for the coming centuries. Today, individual hacking decisions can have planet-wide effects. Sociobiologist Edward O. Wilson once said that the fundamental problem with humanity is that “we have Paleolithic emotions, medieval institutions, and godlike technology.”
Technology could easily get to the point where the effects of a successful attack are existential. Think biotech, nanotech, global climate change, maybe someday cyberattack—everything that people like Nick Bostrom study. In these areas, like everywhere else in past and present society, the technologies of attack develop faster than the technologies of defending against attack. But suddenly, our inability to be proactive becomes fatal. As the noise due to technological power increases, we reach a threshold where a small group of people can irrecoverably destroy the species. The six-sigma guy can ruin it for everyone. And if they can, sooner or later they will. It’s possible that I have just explained the Fermi paradox.