In 1983 the USA and the Soviet Union were locked in a nuclear arms race, and relations between the two superpowers were at breaking point. So what happened?
On the evening of 26th September 1983, the Soviet satellite monitoring system reported the launch of an American intercontinental nuclear missile against the USSR. This first report was shortly followed by further reports indicating that four more missiles had been launched.
Had these missiles exploded on Soviet soil, they would have wiped out the lives of hundreds of thousands of people. A Soviet countermove to intercept them might indeed have prevented them from reaching their targets, yet it would almost certainly have provoked a retaliatory nuclear strike by the USA, which could have unleashed the unimaginable cataclysm of a global nuclear war. Disaster seemed unstoppable.
What to do? The answer to this question came not from the Kremlin but directly from the command centre, from the officer on duty that night, Lieutenant Colonel Stanislav Petrov.
He chose to ignore both the orders set forth in Soviet military protocol and the reports of the early-warning system.
Disobeying orders is no easy matter at the best of times, and even in everyday private or professional life it requires inner strength and a certain degree of courage. Yet in totalitarian systems like the Soviet Union, such courage is seen as resistance and is punished. It comes with a level of personal risk that people who live in a functioning constitutional state fortunately never know.
Yet the principle of command and obedience is found in all armies, no matter which political system they are embedded in. There is no military system in the world that makes provision for an unauthorised operative-level decision like the one Stanislav Petrov made on the evening of 26th September 1983. Accordingly, in the reverse situation, an American officer would unquestionably have found himself in a similar conflict between duty and conscience.
Stanislav Petrov didn’t refer the decision to his superiors as military protocol demanded. He stayed calm, thought it over, applied logic and came to the conclusion that the data he had been given was wrong. “I don’t trust this computer,” he said, and did nothing.
An extremely courageous decision that probably saved the lives of hundreds of millions of people.
Because he was right. The alarm he ignored was indeed a false alarm, triggered by a software error in a Soviet spy satellite that had misinterpreted sunlight reflecting off high-altitude clouds at sunrise as an American missile attack.
The “error-free” system of the USSR wanted to spare itself the embarrassment of having to admit that it wasn’t so infallible after all. Accordingly, Petrov wasn’t honoured but reprimanded and dishonourably discharged, and he would have fallen into total obscurity had the totalitarian system of the Soviet Union proven more durable. Yet with Perestroika and Glasnost his case came to public attention. Even so, he has still not been rehabilitated.
Petrov never saw himself as a hero. He always considered what he had done to be pretty matter-of-fact. He was nominated for the Nobel Peace Prize, yet never received it. Only the substantial documentation of his case has kept his reputation alive long after his death.
Petrov’s decision could well have turned out to be the wrong one. Nothing in that critical moment gave him the certainty that he was doing the right thing. He trusted his own brain more than the computer. If he had been wrong, it would have entailed exactly those drastic consequences he was seeking to prevent. There was no cast-iron guarantee. Today we know that he made the right decision and we are grateful.
The trouble with every difficult decision is that it always entails a degree of uncertainty. Whether a decision is right or wrong can only be determined once its consequences – sooner or later – become visible and tangible. And these consequences are irreversible.
Whoever decides for themselves, whoever trusts their own mind and their own perceptions instead of blindly following rules and regulations, is moving on very shaky ground where no rulebook can guide them.
So should we play it safe instead?
We all must find the answer to this question for ourselves. Yet history encourages us with its countless examples, both small and large, of courageous correct decisions – from continuing to speak the truth despite public pressure to civil disobedience, from refusing to obey military orders that contravene human rights and the Geneva Conventions to the equally power-conscious and resolute decision of an American president in the 1962 Cuban Missile Crisis.
Steps forward and steps backward in this world are based on decisions, both large and small. Every day each one of us makes decisions which have an impact on others. We all bear responsibility for what we think and do. Nothing more and nothing less. Wouldn’t you agree?
Continually making this clear to yourself and bearing it in mind is already a partial answer to the vexed question posed above. At least that’s what I think.
At the end of the evening I felt confirmed in my own view that learning and innovation are only possible if we allow for and accept imperfection and error. Because perfection is not always and everywhere the optimal solution.
How come, you might well ask. After all, perfection is something we aspire to, while mistakes are bad and should therefore be avoided. We all have this drummed into us at school, where mistakes are marked in red pencil and make for a poor grade. The message is clear: everything must be correct and perfect. Whether such a message is actually useful is quite another question.
Nobody’s perfect, and even our brain makes mistakes – a surprising number of mistakes, in fact – but this is just as it should be. In fact it’s absolutely marvellous! Our brain is not a computer with a hard drive and a memory in which everything we learn, know and experience is stored in a fixed form and can be recalled at any time in exactly that form. Our brain is in continual action, linking and networking what we have permanently stored with what is new.

Our memories don’t work like a “photo album” of our personal histories; rather, they help us to deal well and successfully with what’s new in our present. In this process our memories also forget a great deal that could afflict or impede us in our present lives, without us being in the least aware of it or able to influence it. Our brains continually act on and react to what they find interesting. If they don’t find something interesting, they don’t pay attention, because whatever else they may be, they are not programmed for boredom. In order to develop, they love and need an incessant stream of new impulses, and indeed they themselves can bring forth the new, creating new ideas and innovations.
It should be obvious that mistakes happen in such a process. As an instance of this, Henning Beck cited in the Markus Lanz programme an example from a film which I came across many years ago and have always used to illustrate that the seeming weaknesses of our perception are in fact one of its main strengths. This is the famous “Gorilla Study”, in which the psychologists Christopher Chabris and Daniel Simons showed participants a film of six basketball players moving about and passing two balls among their teammates. Three of them were wearing white t-shirts, the other three black. The participants were asked to count the number of passes made by the players in white. At some point in the film an actor dressed in a gorilla costume walks across the court – an eye-catching apparition you might think simply couldn’t be missed. Yet most of the participants failed to notice him, as did the guests and the audience in the Markus Lanz programme when they were shown the film.
Why do so many people fail to see this seemingly unmissable gorilla? Because they are so intent on counting the ball passes between the players wearing white t-shirts that they block everything else out, even the gorilla. Some people do indeed notice and watch the gorilla, but by doing so they are no longer able to count the passes and at the end have no idea how many passes the players in white have actually made.
Our brains are incapable of focusing on both events at the same time. Our brain’s attention is always directed at something in particular. With regard to that particular thing, the brain may well be perfect – and that’s an essential quality. But with regard to the overall scene, our brain is far from perfect. This is why two people looking at one and the same thing are not necessarily seeing the same object. Knowing and accepting this is very important, not just for our understanding of what it means to be human but also for our understanding of ourselves as social beings in interaction with others, and for our understanding of the world.
How we can deal with mistakes constructively and by doing so accomplish astonishing things has been demonstrated for many years now by Silicon Valley. “You can’t be innovative if you don’t experiment and keep falling flat on your face” is a defining tenet of Silicon Valley culture. Every single useful outcome is preceded by a great number of failures and setbacks. Get those behind you and then go on to achieve the best possible, the perfect result. SpaceX and Hyperloop are impressive examples of this kind of mindset.
For the future I would like to see not just the widespread dissemination of these insights but also their practical application in nurturing new forms of action – in children’s upbringing in the family, in schools and universities, as well as in organisations and enterprises. The more people understand and accept them, the more chances and opportunities will open up, bringing with them totally new solutions to the challenges of our times. I really believe this will happen.
So what’s actually stopping us from opening up such possibilities ourselves?