The death of former President Gerald Ford on December 26, 2006 produced a collective outpouring of sympathy and remembrances across the United States that culminated in a formal state funeral January 2, 2007. And while many dignitaries attended, the service at the National Cathedral was, in effect, an “American” one led by President Bush.
The ritual associated with the death and burial of a former leader normally is a communal or relational undertaking–unless the former leader fell from power. This latter circumstance also played out, ironically, on December 30 in the unseemly, hasty execution of Saddam Hussein, a sentence not so much of justice as of revenge, which is an individual as opposed to a collective act.
But there was another significant death at the end of December 2006 that could not be completely masked by either Gerald Ford’s death or Saddam’s execution. However much the White House might have wanted to deflect attention away from the event, it could not control the reality that on December 31st the U.S. military reached its 3,000th fatality in Operation Iraqi Freedom.
Gerald Ford and that U.S. soldier were in different wars and of different generations, but both were caught up in a “war gone wrong.” By this I mean armed conflict stemming from a fundamental misunderstanding by U.S. policymakers of foreign cultures that leads to unanticipated or unintended reactions to the application of U.S. power. Lyndon Johnson and Richard Nixon bear responsibility for the morass in Vietnam that Gerald Ford inherited when he became president. George Bush cannot claim the equivalent for Iraq. By his failure to foresee that Iraqis would react violently to foreign troops on their soil, Bush must assume full responsibility for dividing not only Iraqis but also the U.S. public.
Moreover, he seems to be attempting to hijack this fatality milestone for policy ends, transforming what otherwise would be a private or semi-private ritual of loss into a communal affair whose message is “honor the dead by more dying.” A proposal to “surge” forces, expected in an address to the nation next week, would undoubtedly initiate other clashes that could only “go wrong” in terms of lives lost and resources consumed.
Can anything be done to stop this drift toward continued armed violence in Iraq and elsewhere in the region? Yes.
War is a communal activity; this opens the possibility that a people, a community, can stop wars. What is needed first is knowledge–“Know thyself” and “Know thy enemy”–organizationally, historically, mentally, and emotionally–particularly when different cultures engage each other. To the extent that the public understands the sociology powering its communal (and individual) evolution, and grasps the reality that these differences can be critical, the better prepared it will be to resist government policies that increase the chances for or result in war.
Iraq attests to the need to be alert. It was only four and a half months after September 11, 2001, on January 29, 2002, that George W. Bush used his second State of the Union speech to identify–“create” might be more accurate–a new mini-bloc of nations: the now infamous “Axis of Evil.” It was perfect: easily remembered, demanding no more than two seconds of the typical adult’s 10-second attention span, and brief enough to go on a bumper sticker in big letters.
It was also disastrous reductionism.
Considering the World War II referent for that night’s rhetorical creation, the rulers of Iraq, Iran, and North Korea–if they were watching Bush–must have marveled at the ignorance and insanity that could lump their three countries in the same foreign policy basket. Iranians and Iraqis had been at each other’s throats–again–less than 15 years earlier in their eight-year (1980-1988) bloodbath. The super-reclusive North Korean regime may have sold conventional arms to Iran and Iraq and transferred nuclear technology to Iran, but its primary partner in the nuclear field was Pakistan, a country that didn’t even exist until after the first “axis” lay in ruins. To maintain the “war fever” that developed during the attack on and defeat of the Taliban, Bush had to find a way to re-focus the public’s post-September 11 “fear factor” on a credible “new” enemy. The World War II axis had possessed both the intention and the capability to try world domination. Applying the old image to the new targets served as a springboard in the effort to induce unquestioned acceptance of the administration’s war policy by U.S. and world audiences.
Clearly, the phrase is simple speechwriting sloganeering. Had it ended there, the subsequent military and diplomatic disasters stemming from this psychological reductionism might have been avoided. Unfortunately, as the sound-bite settled in the psyche of Bush administration officials and the U.S. public, it quickly eclipsed the relatively subtle differences in mores and culture among the three states in Bush’s axis. Worse, the phrase also created a propensity among administration policymakers and negotiators to forget (if they ever really understood) that cultural differences are not fantasies created out of thin air. They do exist and have deep sociological roots. For example, in some cultures “form” seems almost as important as “substance,” while in others only the substance matters. So, when the two collide, especially if the “substance only” culture has “required” actions or makes other demands (e.g., “with us or against us”), not only can the immediate result be less than satisfactory, the entire relationship can turn so sour as to become downright poisonous.
This is not to say that at the analytical or operative levels the Bush administration has been oblivious to the differing capabilities of each country–e.g., North Korea has exploded (or half-exploded) a nuclear device while Iran has not. But, judging from the extensive failures in advancing historic U.S. foreign policy goals, either those serving in foreign affairs positions are inept or they don’t make adequate allowance for and adjustment to cultural or societal differences.
The problem is not a lack of available information or having to spend weeks and months decoding a special sociological vocabulary. To that extent, there is still time for this administration to mitigate the effects of bad policy and programs and start the process of healing U.S. relationships with other countries and cultures. With that in mind, what follows is a short look at one small element of human interactions.
Cultural anthropologists have long understood that the form and conduct of interpersonal relations are unconsciously governed by a set of expectations or “rules” that can vary from one society to another or even within the same society, depending on its level of complexity. One such variable, variously labeled “hierarchical-relational” or “individualistic-collective,” considers the set of expectations that shape an individual’s social development and harmonious integration into structures as simple as tribal to as complex as modern bureaucracies.
At first glance, this may seem a facile dichotomy, for every human “unit” from the family to modern governmental bureaucracies and transnational organizations seems to possess a hierarchical structure. But I suggest that this dichotomy is real for the following reasons.
* A sketch of relationships within a tribal power (and thus social) structure typically is vertically compressed into only three or four levels from top (sheik, sub-chief, Council of Elders) to the bottom, where the majority of ordinary members are found–and further “ordered” by gender and age.
* Westerners are conditioned by their upbringing to accentuate any sign of “individualism” and to overlook the degree to which human relationships (as opposed to institutional ones) are actuated by the pronounced predisposition toward mutual interdependence for survival as well as mental and physical development.
* Because relational societies are vertically compressed, a large number of people can be at all but the very top levels of the organization, greatly attenuating the “irrational” struggle to beat out possible competition as there is little to gain and much to lose trying “to get ahead.”
* In relational societies, all are in their places and there are places for all–as long as one does not roil traditions and practices.
In short, what facilitates maintenance of the group is more important than any individual desire. This conditionality may never be explicitly taught, but it is part of the individual’s knowledge base by virtue of being a member of the specific society in which one is born and raised. In turn, individual survival is enhanced because each person discovers where he or she fits in the social structure; discovers what the responsibilities of their “place” are, what they are entitled to do, have, and be; and the acceptable methods for competing for influence and eventual leadership.
Such societies may appear to lack sources of motivation, inspiration, and achievement. Nothing could be further from the truth. The human drive to achieve goals and to have one’s accomplishments recognized is re-channeled from the Euro-American concern with accumulating personal fortunes and power into other areas of life. For instance, in battles between Native Americans, to rush or ride up to an enemy warrior, strike him a non-lethal blow, and escape back to one’s war-party was counted a greater accomplishment and brought greater prestige than killing the enemy–until the European settlers came on the scene. Not quite as clear-cut is the traditional labor union, an organization that exists on the basis of a shared occupation across many corporate hierarchical structures. In this case, “recognition” of member accomplishments (greater individual productivity) is sought in a greater share of the corporate profits for the union members, which in turn theoretically raises the standard of living of the members.
Relational societies regard “honor” as a non-negotiable collective or tribal attribute that must be defended and, if “lost,” regained at all cost–including in some groups the death of the one victimized (dishonored) because he or she is a constant reminder of the dishonor, as well as the one who did the “dishonoring.” But unlike the West, where swift justice and “closure” drive retribution, in relational societies the figurative “sword of Damocles” can hang for years, even decades, over the head of the offender because of time, distance, or other physical impediments. The “crime” is never forgotten, let alone forgiven.
Hierarchical or individualistic societies are another way to organize human endeavors. The more complicated and the bigger the organization, the more likely the evolution of hierarchies and their administrative support structures, regardless of whether one is in the Orient or the Occident. An important organizing feature of these entities is the “line and block” chart, either actual (as in modern bureaucracies) or virtual (as in ancient, highly structured empires such as Egypt). Particularly in modern bureaucracies, these specify the channels of authority and responsibility of each person in the group as well as which individual occupies which place on the chart. This, essentially a directed or selective process made at the top of the organization, is far different from the relational “discovery” process.
Unlike relational societies, hierarchical ones have many more levels and therefore many more channels for interaction among peers on the same level and with those on levels above and below. Another difference is the widespread development and sometimes intense use of networks held together by informal (nonlinear) communication links (the ubiquitous “back channels”).
The Roman Catholic Church and the armed forces of every country in the world are examples of hierarchical entities. Within the different levels of the organization there may be many individuals doing virtually the same type of work. Those who outperform their peers (or obtain an advantage in some other way) can be selected to fill a vacancy at the next higher level. What one does often is not as important as how well the work is done. Advancement comes when performance exceeds expectations of the contribution made to achieving the organizational goals AND because those in authority notice the achievement. Thus individual reputation becomes the critical key for advancing even as the number of positions available at higher levels becomes fewer. Eventually, there are simply not enough places for the number of “qualified” individuals in the organization since “place” is contingent on continually ascending the levels on the chart–or staging a coup and creating a new chart.
The Church, even with its “militants,” has traditionally differed from armies in that its members, once within the fold, tend toward a posture that is spiritually-based and relational. That is, once a Roman Catholic, always a Roman Catholic–except for the occasional heretics, the “Great Schism” of 1054, and the Protestant Reformation that began in Europe in 1517. The military, however, tend to be “rotational”; that is, when called to duty they gather, perform their mission, and disperse or, in more modern times, move periodically from one post to another and from one unit to another, thus attenuating the formation of a relational society.
In such hierarchical systems, less emphasis is on “honor” and “honesty” as collective or relational virtues. These become “objective” virtues. But because they depend primarily on and are observed by imperfect or seducible individuals who make decisions that are not always in the best interest of the organization, these virtues are supplemented with positive or negative (or both) incentives to help motivate observance of the “virtue” by the individual. In the end, the overall goals of the institution are met by the reciprocal integrity of each individual’s contribution.
Surviving The Encounters
Before going further, a cautionary note is necessary.
While a “civilization” may exhibit a proclivity for either the relational or the hierarchical paradigm, some societies within each civilization inevitably will be contrary to the dominant model. Moreover, where both exist within a single culture, it is possible for the paradigm to flip as older generations pass from the scene. As astrophysicist Dr. David Darling notes in his book, Equations of Eternity, Plato’s relational universe of pre-existent perfect “forms” and their flickering shadows in the material world gives way to (but is not destroyed by) Aristotle’s view that true “reality” is a combination of the singular objects that comprise the material world and the abstract characteristics of objects that allow for generalizations and classifications. In China, highly developed relational philosophies such as Taoism existed alongside the hierarchical Confucian “civil service.”
When two hierarchical societies confront each other–where “confronting” does not have to include armed conflict–one usually will be seen to “win” and the other to “lose.” Such judgment can flow from one-on-one negotiations between the disputants, arbitration–e.g., decisions of the World Trade Organization–or from intervention by outside parties trying to prevent the start or mitigate the effects of armed conflict.
But when a more relational social order encounters or is challenged by a hierarchical order, westerners invariably assume–even predict–the demise of the relational model. Yet its structural bias toward extensive interdependence ought to be a formidable barrier to such unraveling since its very existence rests on relational bonds–as in today’s Iraq. Is there some mode of thought or interaction that is so naturally a part of the culture and social development of non-western children that it parries effortlessly the assumed strengths of hierarchies? And if this is the practical reality of the 21st century, does it constitute enough of a modification to the traditional relational structure to constitute a third model to which western leadership elites have yet to respond?
One prized characteristic of a great leader is the ability to project and, at critical junctures, demonstrate near-perfect competence, defined as exerting personal influence, overcoming any opposition, or meeting the expectations of his followers–at a calculated cost that is less than the expected benefit. A classic (literally) illustration from ancient Greece is Agamemnon’s sacrifice of his daughter Iphigenia to placate Artemis and assure good winds for the deployment of his soldiers during the opening days of the Trojan War.
This is an example of the Greco-Roman-Western interpretation of the concept of “reputation” or what, in 1992, Professor Stella Ting-Toomey termed “face saving.” The Occident’s emphasis on the individual in terms of obligations, rights, responsibilities, and competency within a hierarchy reflects preoccupation with the “self”–self respect, personal credibility, and values. Thus a Westerner who “loses face” suffers first and foremost personal humiliation.
The more traditional relational societies understand “face saving” in a collective context. It is the sense that one does not do or say anything that would reflect poorly on the carefully crafted honor or image of the organization, group, or family. Thus while a dishonorable act can be purely individualistic, that the offender was not constrained by the group’s values and peer pressures to uphold collective values and practices is seen as a failure of the group rather than of the individual.
This is not an inconsequential distinction for business leaders, diplomats, and presidents who have to work with people whose cultural paradigm is less hierarchical. The individualistic western model tends to trap westerners in a rigid dichotomy: when challenged (or challenging) on the basis of individual status, the context becomes win-lose. One either saves or loses face. And, because the apparent choices are limited to two and are so intimately tied to self or ego, those who come from hierarchical societies or even sub-cultures rarely can disengage sufficiently to see that a third option is possible when the “self” no longer is the focus.
This third option Ting-Toomey calls “face giving.” Although the term sounds like something from Eastern philosophy, the idea it describes is not completely unknown in the West–it simply is as rare as hen’s teeth in the conduct of U.S. diplomacy. “Face giving” is the ability to sustain a sufficient breadth of context so that, when the participants complete their discussions, each can claim a positive outcome–e.g., a potential confrontation has been avoided or a dispute resolved. But this can happen only if rigid positions are avoided, if a discussant does not press an advantage so far that others feel backed into a corner. Moreover, only if face giving extends beyond an individual problem can it establish the type of “win-win” mentality that will keep the discussions going until a resolution agreeable to all emerges. And that solution may be as simple as redefining the issue so as to provide a more inclusive context that draws in the resources of additional stakeholders.
But for someone who sees only black and white, good and evil, in the world, face giving can easily be demonized as a form of “evil” appeasement–Neville Chamberlain’s sin. He tried to mollify Hitler by relinquishing something valuable (e.g., the Sudetenland’s liberty) that was not his to give. Fred Hutchinson, writing in 2005 about the two forms of appeasement, saw Winston Churchill and Christ as “virtuous” appeasers because their actions were undertaken at significant personal cost and generated enormous “good.” Yet here too, there is danger, for an evaluator may consciously or unconsciously incorporate the premise that whatever provides the greatest happiness for the greatest number of people is ipso facto virtuous.
Face giving is also part of the process of recovering face in non-Western countries. Again, the East looks at “win-win” as the goal. Because westerners tend not to be practiced in this art, they are more prone to move from the initially confused state that accompanies the loss of face to a defensive and even an “offensive” stance–each of which is an individualistic response rooted in a confrontational, hierarchical culture.
The result is predictable: growing if not immediate distrust of motives, means, and goals.
Unless quickly reversed, these negative influences can whirl out of control–not in the emotional sense but through intra- and inter-personal alienation. The complex, hierarchical society founders on centrifugal alienation that finds each person standing alone relative to every other person. But this is an exposed and therefore inherently dangerous, even potentially self-destructive, scenario. Escape is necessary and is achieved by rationalizing the experience of alienation as the more admirable Western myth of “standing tall” or “standing on principle.” Conversely, alienation in more relational or traditional societies tends to be intra-personal and centripetal. It is a loss of self–in extremis a loss of identity as “self” is subordinated in the process of conforming to the collective.
One could, in fact, do much worse than to survey the myths of a people or culture to gain an appreciation of the balance between hierarchical and relational propensities that distinguish a particular society. For example, a prominent U.S. myth is the inevitable triumph of the “exceptional” experiment summarized in the phrase “Manifest Destiny.” The April 1859 Democratic Review opined:
“We are governed by the laws under which the universe was created, and therefore, in obedience to those laws, we must of necessity move forward in the paths of destiny shaped for us by the great Ruler of the Universe.”
This is the substance–the laws that govern the course of nature. The form, the image might change and did change: for the Puritans it was “the city on the hill,” for the colonial revolutionary leaders it was Enlightenment principles. Ralph Waldo Emerson advised “hitch your wagon to a star,” while John Soule, editor of the Terre Haute, Indiana Express, titled an 1851 editorial “Go West, young man, and grow up with the country.” By the end of the 19th century, “Manifest Destiny” served as the guiding narrative of the American experience.
While clearly glorying in the power the U.S. amassed, late 20th and early 21st century U.S. leaders too often seemed to disregard the fact that the ascendancy of the U.S. was in part due to the widespread devastation of Europe, Asia, and much of Africa in 20th century armed conflict–made possible by building the cold, efficient, rigid organizational structure that in turn built the means of destroying the planet many times over.
But this is not the worst failure. That can be found in Nathaniel Hawthorne’s “Ethan Brand,” a short story whose title character embarks on a quest to identify the “unpardonable sin.” Eighteen years later, having indulged in every imaginable evil, he returns home, his quest ended. Ethan Brand found that the unpardonable sin, “a sin that grew within my own breast [and] nowhere else,” is the excising of all human emotions and relational feelings in an unrelenting, highly focused pursuit of an idea or ideology.
And this brings the issue full circle back to the Bush administration. For in the act of developing and applying the neocon orthodoxy, the true believers became so invested in attempting to make events conform to their vision that they could not admit even the smallest variation or interpretation to their creed. They mercilessly manipulated every available force–including fear of an enemy that allegedly had the intent and capability to cause extensive death and destruction–in defense of their “truth.” In such an ideological fog, ordinary people and even government officials were not inclined or motivated to try to understand or to meet the “other”–which of course left no opportunity to “give face.”
For the neocons, but for no one else, it was “win-win.” Success was its own justification. Failure did not mean the vision was flawed but that the public lacked the will to carry through and reach the vision offered them. Everything depended on isolating the U.S. public from contact with the “enemy.” For without contact, it was easier to exaggerate threats to the point that fear became communal hatred–cold, calculating, the type that consumes all feeling, sunders all charity, and excises all mercy as it tramples all justice.
This is the emptiness, the unpardonable sin, of Hawthorne’s Ethan Brand.
Will it also be the sin of George W. Bush’s administration–and the nation’s?
Col. DAN SMITH is a military affairs analyst for Foreign Policy In Focus, a retired U.S. Army colonel, and a senior fellow on military affairs at the Friends Committee on National Legislation. Email at email@example.com.