
Applied Thinking for Intelligence Analysis: A Guide for Practitioners (2014) [pdf] - yarapavan
http://airpower.airforce.gov.au/APDC/media/PDF-Files/Air%20Force%20Publications/AF13-Applied-Thinking-for-Intelligence-Analysis.pdf
======
Animats
Interesting. Chapter 5, on "Knowledge, Information, and Evidence" is probably
the most useful.

There are two things I looked for there and did not see. These are standard
issues in intelligence analysis.

1. Duplicate confirmations. If you have two reports which agree, are they
from different sources, or from the same source via different paths? If the
latter, it's not a confirmation.
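The idea can be sketched in a few lines. This is a minimal illustration with a hypothetical data model (the `origin` and `path` fields are assumptions, not anything from the paper): agreeing reports only corroborate each other if they trace back to distinct ultimate sources, no matter how many different paths relayed them.

```python
def independent_confirmations(reports):
    """Count distinct ultimate sources among a set of agreeing reports."""
    return len({r["origin"] for r in reports})

reports = [
    {"origin": "agent_A", "path": "embassy"},    # same origin...
    {"origin": "agent_A", "path": "intercept"},  # ...arriving via a different path
    {"origin": "agent_B", "path": "liaison"},
]

# Three reports, but only two independent sources, so only one real confirmation
# of the first source's claim.
print(independent_confirmations(reports))  # -> 2
```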

2. The difference between noise and deception. Some information is noisy, and
statistical methods for filtering may help extract signal from noise. Some
information is deliberately deceptive, and noise filtering will not help. The
average will still be biased.
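A toy simulation makes the distinction concrete (the numbers here are illustrative assumptions, not from the paper): averaging filters out zero-mean noise, but a deliberate offset passes straight through the same filter.

```python
import random

random.seed(0)
truth = 100.0

# Noisy reports: zero-mean random error. Averaging converges toward the truth.
noisy = [truth + random.gauss(0, 10) for _ in range(10_000)]

# Deceptive reports: the same noise plus a deliberate offset. Averaging
# converges toward the biased value, not the truth.
deceptive = [truth + 25.0 + random.gauss(0, 10) for _ in range(10_000)]

def avg(xs):
    return sum(xs) / len(xs)

print(round(avg(noisy), 1))      # close to 100 -- noise averages out
print(round(avg(deceptive), 1))  # close to 125 -- the bias survives filtering
```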

We've reached the point in politics where you need to know this just to read
news.

The paper does mention the strong human tendency (shared by all the higher
mammals, really) to see patterns in noise. This is a useful survival instinct, in that
fleeing before you're certain there's a threat is a benefit. It's a big
problem in complex situations. It's also the basis of religion. There are so
many classic examples of that in intelligence history. The paper mentions
over-analysis of V-1 bomb targets, although I'm not sure that was a real thing
at the level of people who had access to maps of the hit points. The
inaccuracy of the thing was clear; "Greater London" was about as good as it
could do. Air bases were never attacked; too small a target. If the V-1 had
been accurate enough to hit air bases, it would have been used against them in
the Battle of Britain.

There is, however, a classic example of that mistake. During the 1962 Cuban
Missile Crisis, US analysis of aerial photos of the missile sites showed a
layout more suitable to the Russian tundra than a tropical island. So there
was a theory that the USSR now wanted the US to recognize them as missile
sites, following a long period of deception as the equipment was moved into
place. That was considered an open threat.

In 1987, there was a sort of Cuban Missile Crisis reunion, with people from
both sides present. That subject came up. The officer who'd been in charge on
the Soviet side said, "No, we just did it that way because that was what the
field manual said to do". Everyone on both sides who'd ever worn a uniform
nodded in understanding.

------
jonahbenton
I have not seen this one (AUS-based) but can strongly recommend a similar
piece from a US intel analyst pairing: Critical Thinking for Strategic
Intelligence, by Katherine Hibbs Pherson and Randolph Pherson. Niche domain,
but for discussion of the work of strategy production of any kind, very
valuable.

------
handedness
Good guide.

The late Dick Heuer (referenced a few times in this paper) wrote the
Psychology of Intelligence Analysis, the gold standard:
https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/PsychofIntelNew.pdf

------
tschiller
Great resource, just added it to the reading list wiki for Open Synthesis:
https://github.com/twschiller/open-synthesis/wiki/Reading-List

------
schappim
Thanks for sharing

