Hypotheses on Misperception

This is a work by Jervis on the study of state behavior.

Preface

In determining how he will behave, an actor must try to predict how others will act and how their actions will affect his values. The actor must therefore develop an image of others and of their intentions. This image may, however, turn out to be an inaccurate one; the actor may, for a number of reasons, misperceive both others' actions and their intentions. In this research note I wish to discuss the types of misperceptions of other states' intentions which states tend to make.

The concept of intention is complex, but here we can consider it to comprise the ways in which the state feels it will act in a wide range of future contingencies. These ways of acting usually are not specific and well-developed plans. For many reasons a national or individual actor may not know how he will act under given conditions, but this problem cannot be dealt with here.

PREVIOUS TREATMENTS OF PERCEPTION IN INTERNATIONAL RELATIONS

Although diplomatic historians have discussed misperception in their treatments of specific events, students of international relations have generally ignored this topic. However, two sets of scholars have applied content analysis to the documents that flowed within and between governments in the six weeks preceding World War I. But the data have been put into quantitative form in a way that does not produce accurate measures of perceptions and intentions and that makes it impossible to gather useful evidence on misperception.

The second group of theorists who have explicitly dealt with general questions of misperception in international relations consists of those, like Charles Osgood, Amitai Etzioni, and, to a lesser extent, Kenneth Boulding and J. David Singer, who have analyzed the cold war in terms of a spiral of misperception. This approach grows partly out of the mathematical theories of L. F. Richardson and partly out of findings of social and cognitive psychology, many of which will be discussed in this research note.

These authors state their case in general, if not universal, terms, but do not provide many historical cases that are satisfactorily explained by their theories. Furthermore, they do not deal with any of the numerous instances that contradict their notion of the self-defeating aspects of the use of power. They ignore the fact that states are not individuals and that the findings of psychology can be applied to organizations only with great care.

Most important, their theoretical analysis is for the most part of reduced value because it seems largely to be a product of their assumption that the Soviet Union is a basically status-quo power whose apparently aggressive behavior is a product of fear of the West. Yet they supply little or no evidence to support this view. Indeed, the explanation for the differences of opinion between the spiral theorists and the proponents of deterrence lies not in differing general views of international relations, differing values and morality, or differing methods of analysis, but in differing perceptions of Soviet intentions.

THEORIES—NECESSARY AND DANGEROUS

Despite the limitations of their approach, these writers have touched on a vital problem that has not been given systematic treatment by theorists of international relations.

The evidence from both psychology and history overwhelmingly supports the view (which may be labeled Hypothesis 1) that decision-makers tend to fit incoming information into their existing theories and images. Indeed, their theories and images play a large part in determining what they notice. In other words, actors tend to perceive what they expect.

Furthermore (Hypothesis 2), a theory will have greater impact on an actor's interpretation of data (a) the greater the ambiguity of the data and (b) the higher the degree of confidence with which the actor holds the theory. For many purposes we can use the concept of differing levels of perceptual thresholds to deal with the fact that it takes more, and more unambiguous, information for an actor to recognize an unexpected phenomenon than an expected one.
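Hypothesis 2 can be illustrated with a simple Bayesian sketch (my own illustration, not a model from the text): treat the actor's confidence in a theory as a prior probability and the ambiguity of incoming data as a likelihood ratio near 1. The function name and numbers below are hypothetical.

```python
def posterior(prior, likelihood_ratio):
    """Update belief in a theory given new evidence.

    likelihood_ratio = P(evidence | theory) / P(evidence | not theory).
    A ratio near 1.0 means the evidence is ambiguous.
    """
    odds = prior / (1 - prior)
    new_odds = odds * likelihood_ratio
    return new_odds / (1 + new_odds)

# (a) Ambiguous evidence barely shifts a moderately held theory ...
print(round(posterior(0.70, 1.1), 3))   # stays close to the 0.70 prior
# ... while unambiguous evidence shifts it sharply.
print(round(posterior(0.70, 10.0), 3))  # rises well above 0.9

# (b) A confidently held theory resists even contrary evidence.
print(round(posterior(0.99, 0.5), 3))   # remains near the 0.99 prior
```

On this reading, "perceiving what one expects" is not sheer bias: the more confident the prior and the more ambiguous the signal, the less any single observation can rationally move the belief.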

An experiment by Bruner and Postman determined “that the recognition threshold for incongruous playing cards (those with suits and color reversed) is significantly higher than the threshold for normal cards.” Not only are people able to identify normal (and therefore expected) cards more quickly and easily than incongruous (and therefore unexpected) ones, but also they may at first take incongruous cards for normal ones.

However, we should not assume, as the spiral theorists often do, that it is necessarily irrational for actors to adjust incoming information to fit more closely their existing beliefs and images. (“Irrational” here describes acting under pressures that the actor would not admit as legitimate if he were conscious of them.)

Abelson and Rosenberg label as “psycho-logic” the pressure to create a “balanced” cognitive structure—i.e., one in which “all relations among ‘good elements’ [in one's attitude structure] are positive (or null), all relations among ‘bad elements’ are positive (or null), and all relations between good and bad elements are negative (or null).”
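The balance condition Abelson and Rosenberg define is mechanical enough to check directly. The sketch below is my own illustration of that rule (the element names and data structures are hypothetical, not theirs): ties among "good" elements and among "bad" elements must be positive or null, and ties across the good/bad divide must be negative or null.

```python
def is_balanced(valence, relations):
    """Check Abelson-Rosenberg balance for a signed attitude structure.

    valence: element -> +1 ("good") or -1 ("bad").
    relations: (a, b) -> +1 (positive), -1 (negative), or 0 (null).
    """
    for (a, b), sign in relations.items():
        same_side = valence[a] == valence[b]
        if same_side and sign == -1:
            return False   # negative tie inside one camp breaks balance
        if not same_side and sign == +1:
            return False   # positive tie across camps breaks balance
    return True

valence = {"ally": +1, "own_policy": +1, "rival": -1}
relations = {("ally", "own_policy"): +1, ("ally", "rival"): -1}
print(is_balanced(valence, relations))   # the structure is balanced

relations[("own_policy", "rival")] = +1  # the rival praises our policy
print(is_balanced(valence, relations))   # now unbalanced
```

The second case mirrors Osgood's example discussed below: praise from a "bad" element for a "good" one is exactly the cross-camp positive tie that the pressure toward balance works to eliminate.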

They correctly show that the “reasoning [this involves] would mortify a logician.” But those who have tried to apply this and similar cognitive theories to international relations have usually overlooked the fact that in many cases there are important logical links between the elements and the processes they describe which cannot be called “psycho-logic.”

(I am here using the term “logical” not in the narrow sense of drawing only those conclusions that follow necessarily from the premises, but rather in the sense of conforming to generally agreed-upon rules for the treating of evidence.)

For example, Osgood claims that psycho-logic is displayed when the Soviets praise a man or a proposal and people in the West react by distrusting the object of this praise. But if a person believes that the Russians are aggressive, it is logical for him to be suspicious of their moves. When we say that a decision-maker “dislikes” another state this usually means that he believes that that other state has policies conflicting with those of his nation. Reasoning and experience indicate to the decision-maker that the “disliked” state is apt to harm his state's interests. Thus in these cases there is no need to invoke “psycho-logic,” and it cannot be claimed that the cases demonstrate the substitution of “emotional consistency for rational consistency.”

The question of the relations among particular beliefs and cognitions can often be seen as part of the general topic of the relation of incoming bits of information to the receivers' already established images. The need to fit data into a wider framework of beliefs, even if doing so does not seem to do justice to individual facts, is not, or at least is not only, a psychological drive that decreases the accuracy of our perceptions of the world, but is essential to the logic of inquiry.

Facts can be interpreted, and indeed identified, only with the aid of hypotheses and theories. Pure empiricism is impossible, and it would be unwise to revise theories in the light of every bit of information that does not easily conform to them.

No hypothesis can be expected to account for all the evidence, and if a prevailing view is supported by many theories and by a large pool of findings it should not be quickly altered. Too little rigidity can be as bad as too much. This is as true in the building of social and physical science as it is in policy-making.

While it is terribly difficult to know when a finding throws serious doubt on accepted theories and should be followed up and when instead it was caused by experimental mistakes or minor errors in the theory, it is clear that scientists would make no progress if they followed Thomas Huxley's injunction to “sit down before fact as a mere child, be prepared to give up every preconceived notion, follow humbly wherever nature leads, or you will learn nothing.”

As Michael Polanyi explains, “It is true enough that the scientist must be prepared to submit at any moment to the adverse verdict of observational evidence. But not blindly.... There is always the possibility that, as in [the cases of the periodic system of elements and the quantum theory of light], a deviation may not affect the essential correctness of a proposition.... The process of explaining away deviations is in fact quite indispensable to the daily routine of research,” even though this may lead to the missing of a great discovery.

For example, the astronomer Lalande did not follow up observations that contradicted the prevailing hypotheses and could have led him to discover the planet Neptune.

Yet we should not be too quick to condemn such behavior. As Thomas Kuhn has noted, “There is no such thing as research without counter-instances.” If a set of basic theories—what Kuhn calls a paradigm—has been able to account for a mass of data, it should not be lightly trifled with.

As Kuhn puts it: “Lifelong resistance, particularly from those whose productive careers have committed them to an older tradition of normal science [i.e., science within the accepted paradigm], is not a violation of scientific standards but an index to the nature of scientific research itself.

The source of resistance is the assurance that the older paradigm will ultimately solve all its problems, that nature can be shoved into the box the paradigm provides. Inevitably, at times of revolution, that assurance seems stubborn and pig-headed as indeed it sometimes becomes. But it is also something more.”