(by Adam Elkus)
Apropos of a conversation I had with infosec provocateur The Grugq last night and a previous conversation with Nick Prime, a short comment on this piece on US covert aid to Colombia (mostly quotations from others):
Only then would Colombian ground forces arrive to round up prisoners, collecting the dead, as well as cellphones, computers and hard drives. The CIA also spent three years training Colombian close air support teams on using lasers to clandestinely guide pilots and laser-guided smart bombs to their targets. Most every operation relied heavily on NSA signal intercepts, which fed intelligence to troops on the ground or pilots before and during an operation. “Intercepts … were a game changer,” said Scoggins, of U.S. Southern Command. The round-the-clock nature of the NSA’s work was captured in a secret State Department cable released by WikiLeaks.
In the spring of 2009, the target was drug trafficker Daniel Rendon Herrera, known as Don Mario, then Colombia’s most wanted man and responsible for 3,000 assassinations over an 18-month period. “For seven days, using signal and human intelligence,” NSA assets “worked day and night” to reposition 250 U.S.-trained and equipped airborne commandos near Herrera as he tried to flee, according to an April 2009 cable and a senior government official who confirmed the NSA’s role in the mission.
The piece mainly focuses on the use of intelligence, precision weapons, and targeting to kill off key FARC leaders, even if the quoted paragraph talks about a drug lord. I included it because it was the most succinct summary of the methodologies used. What the piece really shows is the exporting of “industrial” counterterrorism and counter-guerrilla targeting methods pioneered in Iraq by special operations forces. These methods differ from older ones in Vietnam in their speed and technological sophistication, and they differ from the “Killing Pablo” mission in the sheer scale of the problem (a resilient insurgent group, not just a drug kingpin). And it’s based in large part on metadata, as Jack McDonald argued:
For me, the importance of Prism, and like efforts, isn’t the question of government invasions of privacy, but rather the ability of the government to use violence against a population. Regardless of the strategic end, analysis of metadata allowed the American government to pull apart Baghdad bomb networks in a way that would have been far more difficult without it, if not impossible. If a couple of thousand special forces soldiers could do that in a foreign country, think what the same capability could do in a domestic context. This capability, I think, is what re-writes the social contract in favour of the government. The reason for this is that it alters the latent balance of violence between the state and the population. I think, however, that this takes place alongside another changing relationship, which is the balance of violence between individuals and the population. In the security/liberty debate we tend to focus on the former, sometimes forgetting the latter. We don’t like big states because they can oppress us, but at the same time, these days, individuals can do that, too.
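McDonald’s point about metadata is worth making concrete: even with no message content at all, who-contacts-whom records expose a network’s structure. A toy sketch of the idea, in Python — the call records and identifiers here are invented, and a simple degree count stands in for the far richer graph algorithms real systems use:

```python
from collections import defaultdict

# Hypothetical call-metadata records: (caller, callee) pairs only --
# no content, just who contacted whom. All identifiers are invented.
call_records = [
    ("A", "B"), ("A", "C"), ("A", "D"),
    ("B", "E"), ("C", "E"), ("D", "E"),
    ("E", "F"),
]

def degree_centrality(edges):
    """Count how many distinct contacts each node has."""
    contacts = defaultdict(set)
    for caller, callee in edges:
        contacts[caller].add(callee)
        contacts[callee].add(caller)
    return {node: len(peers) for node, peers in contacts.items()}

centrality = degree_centrality(call_records)
# Nodes with the most distinct contacts are candidate "hubs" --
# the people a network-mapping effort would look at first.
hubs = sorted(centrality, key=centrality.get, reverse=True)
print(hubs[:2])  # prints ['E', 'A']
```

Nothing here required reading a single message: the structure alone singles out the best-connected individuals, which is exactly what makes bulk metadata so potent against a bomb network — or a population.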
The NSA caper is one outgrowth of the increasing legibility of social systems (in one respect, at least) brought about by the rise of graph analysis technologies, databases, and improved intelligence collection techniques. Here’s a bit on legibility, with Venkatesh Rao riffing off James C. Scott’s Seeing Like A State:
The book is about the 2-3 century long process by which modern states reorganized the societies they governed, to make them more legible to the apparatus of governance. The state is not actually interested in the rich functional structure and complex behavior of the very organic entities that it governs (and indeed, is part of, rather than “above”). It merely views them as resources that must be organized in order to yield optimal returns according to a centralized, narrow, and strictly utilitarian logic. The attempt to maximize returns need not arise from the grasping greed of a predatory state. In fact, the dynamic is most often driven by a genuine desire to improve the lot of the people, on the part of governments with a popular, left-of-center mandate. Hence the subtitle (don’t jump to the conclusion that this is a simplistic anti-big-government conservative/libertarian view though; this failure mode is ideology-neutral, since it arises from a flawed pattern of reasoning rather than values).
The book begins with an early example, “scientific” forestry (illustrated in the picture above). The early modern state, Germany in this case, was only interested in maximizing tax revenues from forestry. This meant that the acreage, yield and market value of a forest had to be measured, and only these obviously relevant variables were comprehended by the statist mental model. Traditional wild and unruly forests were literally illegible to the state surveyor’s eyes, and this gave birth to “scientific” forestry: the gradual transformation of forests with a rich diversity of species growing wildly and randomly into orderly stands of the highest-yielding varieties. The resulting catastrophes — better recognized these days as the problems of monoculture — were inevitable.
The picture is not an exception, and the word “legibility” is not a metaphor; the actual visual/textual sense of the word (as in “readability”) is what is meant. The book is full of thought-provoking pictures like this: farmland neatly divided up into squares versus farmland that is confusing to the eye, but conforms to the constraints of local topography, soil quality, and hydrological patterns; rational and unlivable grid-cities like Brasilia, versus chaotic and alive cities like Sao Paolo. This might explain, by the way, why I resonated so strongly with the book. The name “ribbonfarm” is inspired by the history of the geography of Detroit and its roots in “ribbon farms” (see my About page and the historic picture of Detroit ribbon farms below).