Volume 2, No. 5, May 2020
Editor: Rashed Rahman
Professor Dr. Maqsudul Hasan Nuri
As an umbrella term, ‘national security’ encompasses many dimensions. Hence it needs to be perceived in the right perspective and proportion. A common prevailing assumption is that security leads to conditions of all-round well-being and economic development in a society. This is true only where the sine qua non is in place: balanced and well thought out security. If security is lopsided or overplayed, negative consequences can follow, impeding relations with neighbours, national harmony and economic development.
Many economic and non-economic decisions are based on leaders’ views of security-related issues. In the past, many development and welfare decisions have been sacrificed at the altar of ultra-nationalism and hyper-security. In fact, the ruling paradigm of national security is contingent upon regional developments, historical traditions, culture, instinctual reflexes, the emotional memories of people and national narratives.
No wonder perceptions are coloured by biases of all kinds. In other words, security and perceptions of security are not the same. As researchers and data analysts show, people’s subjective perceptions of the world may diverge significantly from reality. Objective security (reality) and subjective security (feelings and perceptions) often part ways, and the disparity between security perceptions at the elite and grassroots levels can be equally wide.
For example, in the development-security debate, the messages of world leaders and security specialists suggest that the world is becoming an increasingly unsafe place, impinging on socio-economic development and normal economic activity. Dire scenarios are projected to attract attention, divert focus from pressing internal issues, and justify policies aimed at remaining in power or securing future success.
Why is this so common? First, psychologically, human beings in the evolutionary sense still carry a primordial fear of harm and anxiety about survival. Second, bad news makes better copy than banal reports. Third, politicians and military analysts like to pander to security risks and to paradigms based on win-lose realist models that thrive in a ‘near-chaotic’ world; this appeals to people’s anxieties and survival instincts. Likewise, security journalists are adept at purveying dire prognoses about the state of the economy and security.
In many major world crises, world security has been invoked in stark terms: political, military, social, economic and environmental. Flash points attract more coverage: India-Pakistan tensions over Kashmir, the Arab-Israeli conflict over Palestine, North Korea-US differences and the Iran-US confrontation dominate the headlines, while many mundane yet more crucial issues end up under-reported or ignored.
Admittedly, climate change, the economy and epidemics carry stark security implications. Yet, barring a few genuinely humanitarian issues, they are more often than not exaggerated, and political interests undergird their reportage and analysis. In the past, years of wars, droughts, floods, pandemics and natural calamities caused millions of deaths as they swept across wide swathes of land.
Despite China’s progress in economic development and human welfare, issues are blown out of proportion to denigrate or embarrass the regime and, ipso facto, the Chinese model of development. But as many in the developing world admit, there is much countervailing evidence about China’s rise as a major economic power, lifting a mass of humanity out of poverty in a brief span of three decades.
Small European countries are more security-conscious due to their size, their relatively few resources and their encirclement by bigger neighbours. The Netherlands’ Integrated International Security Strategy 2018-2022, for example, states that the world has become more insecure in certain respects: shifts in the geopolitical balance of power, increasing instability and insecurity around Europe and the Caribbean, and a rise in hybrid conflicts and tensions.
This may be true, but here again the security dimensions are conveniently hyped up. National and international research has demonstrated that the world has in many ways actually become a safer place. Looking at long-term trends, the chances of someone being killed by violence have decreased significantly over recent decades. Media and technology are eager to report even minor negative developments, including mini-crises. Global trade and communications have made the world interdependent: whenever natural or man-made disasters or pandemics erupt, they are taken care of by disaster relief agencies and world bodies, and they lead to research on cures and future prevention. On this argument, the trend line declines neatly: peaks might still occur, but the overall trend in mass violence is downward.
Human perceptions of security are shaped by family upbringing, education, parental influence, regional and world developments, and press and media coverage. Another example substantiating the claim that human perceptions can differ from reality comes from Hans Rosling’s book Factfulness: Ten Reasons We’re Wrong About the World – and Why Things Are Better Than You Think. On the basis of 13 factual questions about the current state of the world, Rosling shows that common people, including the highly educated, get most of the answers wrong. More worrisome, a majority of people score worse than they would by picking answers at random. He states: “Chimpanzees, by picking randomly, would do consistently better than the well-educated, but deluded human beings” and that “every group of people I ask thinks the world is more frightening, more violent, and more hopeless – in short, more dramatic – than it really is.” These positive trends are corroborated by Steven Pinker and Max Roser, who collect all kinds of data to document slow but long-lasting positive developments. Likewise, a comparative study of foresight reports in ten different countries demonstrates that security perceptions and the corresponding actual threats to states’ national security can differ significantly.
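Rosling’s chimpanzee benchmark is simple arithmetic: his Factfulness questions each offer three answer options, so random guessing averages one-third correct, about 4.3 of 13. A minimal Python simulation (the question and option counts follow Factfulness; everything else is illustrative) makes the benchmark concrete:

```python
import random

def random_guesser(n_questions=13, n_options=3, trials=100_000):
    """Simulate a 'chimpanzee' answering multiple-choice questions at random
    and return the average number of correct answers per attempt."""
    total = sum(
        # One attempt: each question is correct with probability 1/n_options.
        sum(random.randrange(n_options) == 0 for _ in range(n_questions))
        for _ in range(trials)
    )
    return total / trials

avg = random_guesser()
print(f"Average correct by random guessing: {avg:.2f} of 13")
# The exact expectation is 13/3, roughly 4.33; Rosling reports that
# educated audiences typically score below this benchmark.
```

Audiences scoring below this line are doing worse than chance, which is Rosling’s point: systematic bias, not mere ignorance, is at work.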
These examples are manifestations of a broader trend: a discrepancy between the reality of security and the feeling, and thus the perception, of security. Hence the question arises of how this mismatch between reality and perception can be explained.
To a certain extent, these differences can be explained by geography, culture and history, but another important explanation for diverse security perceptions lies in the psychology literature. More tellingly, the cognitive biases held by elites in those states, i.e. the people who outline and implement policy decisions, can help clarify why certain policies are adopted and why certain events are perceived as threats to national security.
Security is both a feeling and a reality. The reality of security is mathematically based on the probability of different risks and the effectiveness of different counter-measures. The feeling of security is based on psychological reactions to both risks and counter-measures. Heuristics are the simplifying ‘rules of thumb’ that people use to make difficult judgments. Reliance on heuristics can cause predictable biases (systematic errors) in their predictions.
Cognitive systems are of two types, reflecting different modes of thinking. The first operates automatically and quickly, with little effort and little sense of voluntary control; it draws conclusions based on previous experiences, events and emotional memory. The second is based on rational assessment, full of mental deliberations that demand complex computations. The latter is associated with the subjective experience of agency, choice and concentration; it is much slower, calculating and deliberative, and is activated when the first fails to come up with a fast, suitable answer.
Cognitive biases that influence security perceptions are not uncommon; a gap exists between the reality of security and the human perception of security. This divergence results from intuitive trade-offs: even though humans have evolved to make security trade-offs constantly, they get them wrong much of the time.
Daniel Kahneman, the Nobel Prize-winning psychologist, studied common people’s estimates of the principal causes of death and compared them with the statistics. One striking conclusion: people thought hurricanes were bigger killers than asthma, even though asthma caused twenty times as many deaths as hurricanes. Similarly, being bitten by snakes or attacked by predatory animals is considered more dangerous, whereas automobile accidents in urban areas cause far more injuries and deaths.
This is but one of many examples showing that people’s risk assessments often differ from the actual risks. By implication, initial assessments are inadequate: the cognitive system relying on reflexive responses is quick and satisfying, but turns out to be generally misleading in situations where deliberate cognition should have been activated.
According to Bruce Schneier, there are several aspects of the security trade-off that humans can get wrong: the magnitude of the risk, its probability, the costs, how effectively a particular counter-measure mitigates the risk, and how well disparate risks and costs can be compared. The more human perception diverges from reality in any of these aspects, the more the perceived trade-off departs from the actual trade-off.
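Schneier’s list can be read as the terms of a simple expected-loss calculation. The sketch below is not from Schneier’s text; it is a hypothetical toy model with made-up numbers, showing how probability, magnitude, counter-measure effectiveness and cost interact in the ‘actual’ trade-off that perception so often misjudges:

```python
def expected_loss(probability, magnitude, mitigation=0.0):
    """Expected loss of a risk, optionally reduced by a counter-measure
    whose effectiveness is a fraction between 0 and 1."""
    return probability * magnitude * (1.0 - mitigation)

def worth_it(probability, magnitude, mitigation, cost):
    """A counter-measure is worth its cost only if the expected loss
    it averts exceeds that cost."""
    averted = expected_loss(probability, magnitude) - expected_loss(
        probability, magnitude, mitigation)
    return averted > cost

# Hypothetical numbers: a 1-in-1,000 annual risk of a 1,000,000 loss.
# A measure that halves the risk averts 500 per year in expectation.
print(worth_it(0.001, 1_000_000, 0.5, cost=400))  # True
print(worth_it(0.001, 1_000_000, 0.5, cost=600))  # False
```

Misjudge any one input, for instance by overestimating the probability of a vivid but rare event, and the conclusion flips even though the arithmetic is trivial.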
Often, insecure, authoritarian regimes play up negative news and broadcast it graphically and repeatedly. Curbs on alternative or dissenting views tend to accentuate confusion, panic and general uncertainty. In many wars, actors who committed excesses and killings are likened to Hitler, Milosevic and others, which engraves the message still further.
This calculated bias reinforces the overestimation of rare events: they attract a disproportionate level of attention, with the few occurring instances splashed all over the news. Goebbels’ principle of repeating a lie until it seems the truth is followed. Moreover, because people vividly remember those rare events, they conclude that such events are more frequent than they really are. Disinformation, misinformation and pseudo-information play a major part, as vested interests pursue their ulterior motives.
In addition, regime type is of central importance. In democracies, a free press, greater and more open policy debates, institutional checks and balances, social media and the multiple actors involved in decision-making all raise the chances that information will be checked against ingrained cognitive stereotypes and mindsets.
A bias that forms part of the broader availability bias is the ‘hindsight bias’: “I knew it all along.” The inability to reconstruct past beliefs inevitably causes one to underestimate the extent to which past events came as a surprise. Closely related is the ‘outcome bias’: one is prone to blame decision-makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact.
Hindsight bias also fosters the belief that the future is predictable. After all, in hindsight it seems obvious how the course of events led to the now-known outcome. The basic illusion here is that, believing one understands and knows the past, one assumes the future should be likewise knowable. In fact, we know the past less well than we believe we do.
In hindsight, one neglects all the signs that pointed in a different direction and overlooks the other outcomes that seemed possible at the time. Another common bias of the human mind is the ‘negativity bias’, closely linked with the ‘availability bias’. As a fundamental principle of human cognition, negative information has a greater impact than positive information across a wide range of psychological phenomena.
Why are human beings so susceptible to the negativity bias? Psychologists believe the dominance of bad over good has existed since ancient times: heightened vigilance was an adaptive trait for avoiding the lethal dangers that natural catastrophes and wild animals posed throughout evolutionary history.
This negativity bias can affect the ‘threat sensitivity’ of states: how states identify opportunities and dangers prior to conflict. States react more strongly to negative information indicating potential dangers than to positive information suggesting attendant opportunities. The negativity bias thus helps explain many critical behaviours in international relations, including the security dilemma, threat inflation, the outbreak and persistence of war, loss aversion, neglected opportunities for cooperation, and the prominence of failure in institutional memory and learning.
A bias that at first glance seems to contradict the negativity bias is the ‘positivity bias’: overconfidence in one’s capacity and abilities, overestimation of one’s control over events and over-optimism about future prospects. However, the two biases can coexist because they apply to different contexts: people often privilege negative information about the external environment and other actors, but positive information about themselves. In fact, this coexistence can raise the odds of conflict. Decision-makers simultaneously exaggerate the severity of threats and are overconfident about their own capacity to deal with them, a combination that can be problematic.
The effect of the negativity bias can be pronounced when combined with the halo-effect: the tendency to like or dislike everything about a person or a country, including things one has not observed. The negative form of the halo-effect, called the horn-effect, is a common bias that plays a major role in shaping people’s views of situations. As an illustration, one can see this effect with Russia: everything Russia says or does nowadays is viewed suspiciously. An act or statement that would be viewed as neutral coming from, for example, Germany is viewed negatively if it emanates from Russia.
The same halo-effect, both positive and negative, can be observed with US President Trump in everything he says or does: in a polarised situation, he is viewed wholly negatively or wholly positively, leaving no room for a balanced view on merit.
Yet another bias that can amplify both the negativity and positivity biases is the ‘confirmation bias’: the tendency to look for information that confirms one’s pre-existing view of the world or of a particular person, while ignoring any information that contradicts that view.
People will, for instance, seek data that is compatible with their belief systems. To take President Trump again: people who dislike everything about him will ignore a smart, sensible decision by him, whereas people who love him will conveniently ignore the lies he tells or dismiss them as ‘fake news’.
Perceptions of security risks also diverge from reality when the severity of certain risks is weighed. Utility theory assumes that actors are fully rational and make trade-offs by calculating relative gains and losses. In contrast, prospect theory acknowledges that people attach subjective values to gains and losses. This clarifies why people tend to prefer a sure gain over the chance of a greater gain, while preferring the chance of a greater loss over a certain smaller loss.
That people act this way can be attributed to the cognitive framing effect: people make different trade-offs when something is presented as a gain than when it is presented as a loss. Hence, when a trade-off is framed as a ‘gain’, people tend to be risk-averse, while when it is framed as a ‘loss’, people tend to be risk-seeking.
This outcome can be explained by two elements. On the one hand, people place more value on changes close to the status quo than on changes further away from the current state. On the other hand, people attach greater value to something when it is framed as a potential loss rather than as a potential gain.
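These two elements, diminishing sensitivity away from the status quo and loss aversion, are captured by the value function of prospect theory. The sketch below uses the parameters Tversky and Kahneman estimated in 1992 (alpha of about 0.88, loss-aversion factor of about 2.25) and, for simplicity, ignores their probability-weighting function:

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Tversky & Kahneman's (1992) value function: concave for gains,
    convex and steeper for losses (loss-aversion factor lam)."""
    return x**alpha if x >= 0 else -lam * ((-x) ** alpha)

# Loss aversion: a 100-unit loss hurts more than a 100-unit gain pleases.
print(prospect_value(100))   # ~57.5
print(prospect_value(-100))  # ~-129.5

# Framing in gains: a sure 500 vs. a 50% chance of 1000.
sure_gain = prospect_value(500)
gamble_gain = 0.5 * prospect_value(1000)
print(sure_gain > gamble_gain)   # True: risk-averse in gains

# Framing in losses: a sure loss of 500 vs. a 50% chance of losing 1000.
sure_loss = prospect_value(-500)
gamble_loss = 0.5 * prospect_value(-1000)
print(gamble_loss > sure_loss)   # True: risk-seeking in losses
```

The same arithmetic reproduces the framing result: the sure gain beats the gamble, while the gamble beats the sure loss, even though the expected monetary values are identical in both pairs.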
This also applies to countries, because countries are, after all, led by people. Under ultra- or hyper-nationalism, some countries view world economics and globalisation not as a ‘win-win’ situation but as a ‘zero-sum’ game. Countries that see the world through a ‘zero-sum’ template are likely to risk a bigger loss rather than accept what they perceive as a certain smaller one. The US’s protectionist and isolationist measures can partly be explained by this phenomenon.
When this is applied to international security, two important implications become apparent. First, people will trade more readily for security that lets them keep what they already possess; for example, a country will invest more in maintaining control over territory it holds than in acquiring territory it could potentially gain. Second, when considering security gains, people are more likely to accept a smaller but more certain gain than a larger but less certain one; when faced with security losses, however, they are willing to risk a larger loss rather than accept the certainty of a small one.
Security is both a feeling and a reality, and they are not the same. On the basis of the above, it can be said that humans are vulnerable to a wide range of biases that influence their decisions. Consequently, this also affects the policies and actions of countries, as countries are led by people. Hence, being aware of the existence of biases is the first step in overcoming them.
Biases are usually products of the fast, intuitive and impulsive cognitive system humans possess. By recognising situations in which biases might creep in or dominate, it is possible to mitigate their influence on actual decisions and on the downstream economic, social and other decisions they affect. This is done by activating the slower, more calculating, deliberative cognitive system.
‘Mindfulness’ is a relatively new term in psychology, barely three decades old, and is now recommended in almost all walks of life to prevent the unsound, reflexive decisions that lead to maladaptive behaviour. Such decisions stem from ‘gut feeling’ or common sense and, albeit quick and reassuring, are generally impulsive, temperamental and biased. This emotionalised thinking, often termed a ‘Pavlovian’ response, is a poor substitute for calm, logical decision-making. Still, even with the best of intentions and precautions, decisions can go awry due to the many variables at play.
In sum, security issues call for introspective, ruminative and measured approaches that minimise wrong decisions, which could otherwise adversely affect the lives and well-being of millions of people.
The writer is Visiting Faculty, Department of Defence and Strategic Studies, Quaid-i-Azam University, Islamabad, former Adviser, COMSATS Institute of Information Technology, Islamabad, and ex-President, Islamabad Policy Research Institute (IPRI), Islamabad.