Center for Strategic Communication

[ by Charles Cameron — it occurred to me to ask ]
.

I have a question for the assembled horde — but first, the shoes:


Getting your feet x-rayed and fitted for a new pair of shoes, ca. 1950

**

You know the way they say (Elisabeth Kübler-Ross, eg, with no implied claim of veracity here, just interest) that you go through various stages of grief: denial, anger, bargaining, depression and acceptance?

Suppose there are stages of response to terror that governments, agencies, leaders, pundits, analysts & journos tend to go through. Suppose at the start they lean to the vengeful and are therefore prone to see things in black and white, no nuance, confrontational, their response intense & military rather than diplomatic — and in later stages get calmer, begin to see motives less single-strandedly, catch details previously missed, suggest responses that are more measured, more proportional, etc.

If we got really clear on how this tends to work, could we begin to have an understanding of the ratio between “heat of the moment” and “after the fog of war clears” thinking, which in turn could allow us to discount initial reactions, look for “next stage” signals in the cognitive periphery, and get a more accurate read through the fog from the start?

We know now, eg, that the first reaction to OKC was to expect Muslim culpability, but it became clear that McVeigh did it — and first expectations were dashed. With WMD in Iraq the clearing of the fog took longer, but it still happened.

I’m suggesting that people who have just been affronted or attacked will understand better, later, and that for a more appropriate response, some time lag may be required. But does the lag time have formal features, styles of assumption that give way identifiably and reliably to more nuance and accuracy as certain formal issues are addressed, so that there could be a checklist, and a kind of two-week, two-month, two-year, two-decade look-ahead / look-back methodology devised, charted, and implemented, eg as part of scenario planning and / or red teaming?

Is some of this implicit in the second O in the OODA loop? Can we take it usefully further?

**

Yes, when I was a boy, you stepped up to the x-ray machine in the shoe store, pushed your feet in and peered into the viewer at the top of the machine to see how well your new shoes fit your ghost-of-a-skeleton feet.

Later on, this was viewed as an unhealthy way to judge the fit of a shoe, and life and choice in shoe stores became more complicated.
