January 2004
"Fears about risks have significant consequences both for the public and for the technologies that provoke the fears." -- Baruch Fischoff, in Managing Risk Perceptions |
That Which Cannot Be Known
Though much can be learned from fact-finding endeavors regarding the important elements of a given conflict, it is nearly inevitable that some elements will be left unknown. The effects of a specific set of actions or the repercussions of a particular strategy may not be completely understood. Intractable conflicts tend to involve so many factors, factors that interact in such complex ways, that outcomes cannot always be accurately predicted. Ultimately, the fact that a conflict involves complex elements and unknowns is often a significant reason why the conflict becomes intractable in the first place, and why these kinds of conflicts receive so much attention. Simpler conflicts are easier to resolve and are, therefore, less notable.
Where factors are unknown, to whatever degree, a good way to examine the unknowns is through the concepts of uncertainty and risk. The two are closely associated, but they are not identical. Uncertainty may involve things that are completely unknown, whereas risks can often be expressed as calculable probabilities. For example, though you don't know exactly what will happen after placing a bet at a roulette table, the probability of losing when you bet on red is known precisely: on an American wheel, 20 of the 38 pockets are not red, so the chance of losing is just over 52 percent. It must also be noted that uncertainty doesn't necessarily imply risk -- it may be that every possible outcome is acceptable, in which case nothing is truly at stake. This is seldom the case, though, so uncertainty and risk usually go hand in hand.
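To make that figure concrete, here is a minimal Python sketch, assuming the standard American wheel layout (18 red, 18 black, and 2 green pockets):

```python
# Probability of losing a bet on red, assuming an American roulette wheel:
# 18 red, 18 black, and 2 green pockets (0 and 00).
RED, BLACK, GREEN = 18, 18, 2
pockets = RED + BLACK + GREEN        # 38 pockets in all

p_lose_on_red = (BLACK + GREEN) / pockets
print(f"Chance of losing a bet on red: {p_lose_on_red:.1%}")   # -> 52.6%
```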
Uncertainty is a major factor in matters of science, technology, health, and the environment. New forms of technology, new medical treatments, the impact of certain substances on the air, water, and/or soil -- all of these have effects, both long and short term, that cannot be completely predicted. Instead of clear and distinct rules of cause and effect, the best that technical knowledge can offer is risk factors -- probabilities that certain consequences will occur (for example, a study's statement that "smokers have a 50 percent greater chance of contracting a deadly form of adult leukemia"). And it is not necessarily the case that, given enough time, scientists can eliminate uncertainty and risk. Some uncertainty is unavoidable, and even technologies and treatments that are quite old still involve uncertain elements. For example, while radiation exposure has been known for decades to be a serious health risk, the precise level of radiation that a specific person can withstand is still unknown -- a dose that causes cancer in me may leave you unaffected.[1]
Some decisions involve unavoidable risk and uncertainty. For example, suppose your local government does a study and projects a 100 percent increase in electrical power demand over the next 10 years. If the town builds new power plants to meet that projection and, due to unforeseen events, demand grows by only 20 percent, the construction will have been an economic disaster. On the other hand, if the town does not build the power plants and demand does significantly increase, it may be unable to supply the needed power. There are risks involved in any of the available options.
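A rough expected-cost comparison can make this kind of trade-off concrete. The sketch below is purely illustrative -- every probability and dollar figure in it is an invented assumption, not a number drawn from the example above:

```python
# Hypothetical expected-cost comparison for the power-plant decision.
# Every probability and dollar figure here is an invented assumption.
p_demand_doubles = 0.5        # assumed chance the projection is right

cost_build = 500              # $M to build the new plants (assumed)
cost_shortfall = 900          # $M economic loss if doubled demand goes unmet (assumed)

# Expected cost of each option, averaged over the two demand scenarios.
expected_build = cost_build                        # paid whether or not demand grows
expected_wait = p_demand_doubles * cost_shortfall  # paid only if demand doubles

print(f"Build now: expected cost ${expected_build}M")
print(f"Wait:      expected cost ${expected_wait:.0f}M")
# Neither option is risk-free; which looks better depends entirely on
# the assumed probability and the relative costs.
```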
While scientific and technological issues commonly involve uncertainty, social and political conflicts are far from immune. For example, say a long-standing territorial conflict involves a small country that has recently implemented democratic election procedures. Several of the presidential candidates are known to be sympathetic to reconciliation between the ethnic parties, yet the "incumbent" leader, a known racist, is also on the ballot. As clear as it might seem that, because of the longstanding conflict, the country would be better off if a new leader were elected, elections are inherently unpredictable. There have been many examples of sudden, unexpected "landslide" victories by seemingly unlikely candidates. Also, regardless of who is elected, one cannot know what the new leader will really do after the election. A multitude of factors, ranging from voter turnout to the use of coercion, makes it impossible to know what shape the country's politics will be in and, consequently, what turn the conflict will take in the future.
Reactions to Uncertainty and Risks
Decision-making in the presence of risk is especially problematic. While some are comfortable with risk-taking, seeing it as a sign of strength, many others are quite risk-averse. People also perceive risk probabilities in different ways -- which is especially problematic when the actual probabilities are unknown. Moreover, people's view of risks varies in light of the possible gains and/or losses that those risks entail. It can be hard to determine where the major players in a conflict fall on the risk-taking/risk-averse scale, or how they individually perceive certain risks. The subjectivity of risk perceptions alone presents quite a challenge for mediators, who may be caught completely off guard by the reactions of key conflict figures.
Yet there are deeper issues regarding decisions under risk, issues that psychologists have studied rather thoroughly. They have noticed two common kinds of reactions to risk, two sides of the same coin. One of these reactions is called "Risky Shift." Risky shift is the tendency for individuals in a group to take greater risks than they would alone. This is especially obvious among children at play, in competitive sports, and even in the military. In the need to assert oneself, to prove that one is not a "coward" or to show that one is better than others, individuals are pushed toward risk-taking. This, in turn, influences the others to up the ante and make an even riskier move. This risk-escalation dynamic can result in poor, even disastrous, decisions.
The other side is "Cautious Shift." Cautious shift is an equally escalatory dynamic, but in the opposite direction: the "bidding" is toward an ever more cautious position instead. This happens quite often in politics on policy issues that receive significant public attention. The move toward caution happens because, whether a particular risk is warranted or not, any risky decision provides an easy opening for public criticism by one's opposition. A slogan of "I am willing to take greater risks than my opponent" seems like political suicide. So political candidates can win public support by promising to be "safer" than their opponents, motivating each side to use the same tactic. The end result of this competition is a shift toward excessive caution. Being overly cautious means missing out on opportunities that reasonable risk-taking affords. Also, avoiding risks often costs resources: unreasonably cautious policies promise to be exceedingly expensive and/or extremely low on the cost/benefit scale.[2]
Excessive caution is also affected by the common view that human life is priceless. While this is certainly an honorable moral sentiment, the reality is that economic resources are finite. At the governmental level, public safety and health are but one part of the budget. The safety of human life must be balanced with economic policies that make sense. In many respects, this is really what justice is all about -- an equitable distribution of risks. Most of us would feel quite uncomfortable making decisions that put a price on human life. Yet this is the reality that people in power must face.
Decisions under uncertainty are high-stakes gambling where factors such as human life, health, economic prosperity, or the environment are concerned. Under the pressure of these factors, even people in power can become uncomfortable. Conflict decision-makers may desire to avoid the risks at all costs -- they may simply put off the necessary decisions, futilely trying to eliminate the risks entirely. One way to do this is to suggest that more study of the relevant topics is needed. Guy Burgess has called this "Analysis-Paralysis," in that the situation remains in stasis while it is analyzed. Analysis-Paralysis can subsequently lead to what Burgess calls the "Delay-Default Syndrome" -- in trying to avoid risky decisions, the difficult choices are continually pushed further into the future, preserving the status quo while waiting on study after study. In this respect, fact-finding can be used as a stalling tactic. All this can prove to be a costly game, one that ensures that problems linger in the absence of change.
Sometimes, though, it really is best to require more study and employ experts. This can be an especially good idea since human beings are notoriously bad at guessing probabilities,[3] and a bad guess in many conflict contexts can be exceedingly costly. It is necessary to know the real risks before an effective decision can be expected. But doing further studies to evaluate risks potentially creates its own set of problems. The answers that experts give can reveal even more complexity and uncertainty than previously thought. New options may need to be considered, each adding new variables. Risk probabilities will need to be understood.[4] While experts are used to evaluating variables in terms of uncertainty and are quite comfortable with probability calculations, the layman might find probabilities difficult to grasp or be overwhelmed by the number of options, each with its own costs and benefits. All this may effectively be like opening Pandora's Box -- introducing more information and complexity than the decision-makers can handle.
Another complicating factor is the public perceptions of certain risks. People have a natural inclination to avoid, and even resent, any risk that they feel has been imposed on them by an outside force. An example of this is the controversy over genetically modified foods. While there are virtually no known health problems associated with any GM food, the fact that they are "new" leads to the perception (justified or not) that there are risks involved. What opponents of GM foods tend to focus on, though, is not the fact that food makers have modified the foods but, rather, that they have done so without informing the public as to which food contents have been altered.[5] The resentment is over the perceived denial of the ability to make an informed choice regarding the risks.
This reaction becomes a potentially serious problem when the risks of some technology or practice receive more publicity than the potential benefits. For example, most experts in the field of nuclear energy production feel this is exactly what has happened regarding nuclear power. The expert opinion is, by and large, that nuclear power is extremely safe and beneficial, in that it avoids the air pollution inherent in the use of fossil fuels and provides an abundant source of power. Yet public opinion regarding nuclear power is still overwhelmingly negative -- several U.S. states have imposed moratoria on new nuclear plants, and it is virtually impossible to build a nuclear reactor anywhere else in the country because of public pressure and the "not-in-my-backyard" reaction.
What Can Be Done
The problems that uncertainty and risk pose are serious ones, not easily overcome. People have deep emotional reactions to risks. Public sentiment is not easily changed. Political leaders have their careers to consider. In general, human beings do not like being "in the dark" and do not react well to having to make decisions based on incomplete information. This is especially the case when the risks involved concern things people care deeply about.
Regardless of the potentially overwhelming data experts may uncover, it is often necessary and exceedingly important to obtain their assessments. They may alert us to important risks that had not been considered, or put our minds at ease by showing that certain risks are highly unlikely. Yet, after the experts have been given sufficient time and resources to evaluate the relevant risks, it may be necessary to carefully explain their findings to the conflicting parties in language they can understand. Scientists' tendency to be brutally honest may need to be tempered with some diplomacy, since some of the options they identify may strike the parties as completely unrealistic or offensive. The goal is to present a realistic picture of the pros and cons of viable courses of action without alienating anyone or overwhelming the decision-makers with more than they can manage.
This goal, though, may not always be attainable. The factors in a conflict may present some quite complex, but very real, risks. Options that some might find difficult to understand, or even offensive, may in fact be worthy of consideration. If this is the case, it may be necessary to convince decision-makers to let go of stigmas and to think of the greater good. Or it might be necessary to educate decision-makers, equipping them with the background skills that will allow them to understand and analyze the data. They may need a greater understanding of the principles of statistics and probability. How successful all this will be depends on their receptivity to learning new things, as well as on the experts' teaching skills. It may also be a test of the decision-makers' sincerity -- those who sincerely hope to improve or resolve a conflict will likely oblige.
In addition to facilitating the lines of communication between the experts and decision-makers, one of the most important things that a mediator can do is provide an environment for all parties to voice their fears regarding risks. People generally feel more inclined to make concessions when their concerns have been heard and acknowledged. If one or more of the parties feel their concerns are being ignored, they are likely to "dig in their heels" and oppose any risks to themselves. This tends to eliminate the possibility of compromise and, therefore, the possibility of an improvement in the conflict.
It also may be important to reduce the costs of risks by planning for a worst-case scenario. The worst possible outcome should be considered and it should be asked, "What do we do if this happens?" There are often ways to provide a "safety net" that accounts for bad possibilities. One key strategy, for example, is to allow for "mid-course corrections." It can be disastrous to have an inflexible plan in which the worst case, if it happens, means complete failure. Instead, building mid-course corrections into agreements allows people to react to bad possibilities and make them less costly. For example, recall the dilemma of constructing power plants based on growth projections. If, instead of building large, costly plants, the plan is to build smaller ones and allow for adjustments along the way, construction can respond gradually to demand, thereby minimizing the costs if the projections prove wrong.
One of the most important, yet most difficult, things to establish is that risks are largely inevitable, and that they must be balanced against the costs involved in avoiding them. It may be difficult for parties to accept, but in many cases a trade-off of risks and benefits is necessary. Managing risk has become big business. In addition to the insurance industry (which is inherently about minimizing the effects of risks), risk-management consulting firms and departments within companies are growing ever more popular. If nothing else, this trend shows that people are increasingly sensitive to risking things they value, meaning that uncertainty and risk are likely to be ever-greater considerations in conflicts as time goes on.[6]
[1] The current opinion of experts regarding radiation exposure is that there is no "safe" level. Any exposure to radiation carries with it the chance of ill health effects.
[2] An example of cautious shift can be taken from the highly publicized, highly controversial cleanup of the plutonium processing facility at Rocky Flats, Colo. Public pressure led political leaders to pursue a cleanup strategy that was largely focused on appeasing public opinion. The option that was taken ended up costing between $2 and $3 billion per life saved from the chance that the radiation might cause cancer. Compare that to the fact that the total cost of health care in the entire state of Colorado is between $2 and $3 billion, and it can easily be argued that such a solution was not remotely cost-effective.
[3] To see this, watch people at a roulette table. If the ball has come up, say, red several times in a row, people will start betting on black. This is because they think that the more red "hits" there are in a row, the greater the chance that the next spin will come up black. Unfortunately, this is completely false: since each spin is independent of the others, the odds of any given color are exactly the same as they always were, regardless of previous spins. For another example, try this test: imagine a completely random group of 23 people. What do you think the odds are that two or more people in the group will have the same birthday? Make your guess without looking ahead, and then compare it with the answer given in the last endnote.
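If in doubt, the independence claim is easy to check by simulation. Here is a minimal Python sketch, again assuming an American wheel, so each spin comes up red with probability 18/38:

```python
import random

# Monte Carlo check of the gambler's fallacy: after a streak of three reds,
# is the next spin any more likely to come up black?
random.seed(1)
P_RED = 18 / 38
spins = [random.random() < P_RED for _ in range(1_000_000)]  # True = red

# Collect the outcomes of every spin that followed three reds in a row.
after_streak = [spins[i] for i in range(3, len(spins))
                if spins[i - 3] and spins[i - 2] and spins[i - 1]]

print(f"P(red) overall:       {sum(spins) / len(spins):.4f}")
print(f"P(red) after 3 reds:  {sum(after_streak) / len(after_streak):.4f}")
# Both frequencies match ~0.4737 -- past spins don't change the odds.
```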
[4] One factor that makes risk probabilities potentially complicated is that experts often cannot provide absolute probabilities for risk factors (for example, that the risk of an American contracting AIDS is two percent). Instead, they commonly offer only relative risk levels (for example, that practicing safe sex reduces the risk of contracting AIDS by 96 percent, or that the risk of contracting AIDS in the United States is half the risk in South Africa). In other words, while experts can often offer insights into how to reduce certain risks, they can seldom tell anyone what a given risk really is.
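A small numeric sketch shows why a relative figure alone doesn't pin the risk down. The baseline risks below are invented purely for illustration:

```python
# Why relative risk alone doesn't tell you the absolute risk.
# The baseline risks below are invented purely for illustration.
relative_reduction = 0.96     # "reduces the risk by 96 percent"

for baseline in (0.10, 0.01, 0.001):   # three hypothetical baseline risks
    reduced = baseline * (1 - relative_reduction)
    print(f"baseline {baseline:.1%} -> reduced {reduced:.3%}")
# The same "96 percent reduction" yields very different absolute risks,
# so the relative figure is meaningful only if the baseline is known.
```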
[5] Thus, the major issue that opponents of GM food take up is the lack of package labeling.
[6] The answer to our birthday question: the probability is just over 50 percent (about 50.7 percent) that two or more people in a random group of 23 will share a birthday, and the odds rapidly improve as the group grows larger. Sound surprising? Most people guess a very low probability for this question (I've had some guess 1 in a thousand!), but that is because most of us do not intuit probabilities very well. For a brief explanation of "The Birthday Problem," go to http://www.mste.uiuc.edu/reese/birthday/explanation.html
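For the skeptical, the figure is easy to verify directly. Here is a minimal Python sketch of the standard computation, assuming 365 equally likely birthdays and ignoring leap years:

```python
# Exact birthday-problem computation for a group of 23 people.
# Assumes 365 equally likely birthdays and ignores leap years.
p_all_distinct = 1.0
for i in range(23):
    p_all_distinct *= (365 - i) / 365   # i-th person avoids all earlier birthdays

p_shared = 1 - p_all_distinct
print(f"P(at least two share a birthday): {p_shared:.1%}")   # -> 50.7%
```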
Use the following to cite this article:
Schultz, Norman. "Uncertainty." Beyond Intractability. Eds. Guy Burgess and Heidi Burgess. Conflict Information Consortium, University of Colorado, Boulder. Posted: January 2004 <http://www.beyondintractability.org/essay/fact-finding-limits>.