As anyone who’s watched Homeland knows, deciding how to respond to intelligence is a serious challenge for military organisations. Analysts may have a wealth of conflicting information from sources of varying reliability, yet must make decisions, without knowing all the facts, on which many lives depend.
Sometimes, the call made isn’t the right one. Back in 2003, the US invaded Iraq based on erroneous intelligence that suggested Saddam Hussein had weapons of mass destruction.
The person who made the call did so based on an array of information, no doubt including everything from satellite photos and human intelligence to defector reports and surveillance data. Not all of it was reliable, but it all had to be taken seriously.
“Current intelligence analyses are often based on information with a considerable degree of uncertainty. Intelligence analysts are constantly struggling with the reliability of circumstantial evidence,” explained Audun Jøsang, professor at the Department of Informatics, University of Oslo in Norway.
“The sources may be unreliable or directly misleading. When intelligence services in one country attempt to find out what another country is planning to do, they need to take into account the credibility of the information.”
Jøsang believes he may have the solution, in the form of a new type of mathematical logic designed to handle the uncertainty that is an everyday part of intelligence analysis.
This mathematical logic is being used to develop a new form of intelligence analysis based on subjectivity. At present, the models used by intelligence organisations require each piece of circumstantial evidence to be paired with a measure of how reliable it is: its probability.
However, Jøsang sees this as a flawed approach, as it incorrectly assumes a sense of certainty about how reliable a piece of information is.
“What they really ought to say is ‘we don’t know this’. However, such input arguments are not permitted in traditional analytical tools,” he said.
The solution is to attach an estimate of how certain each probability is, adding an extra level of subjectivity to intelligence analysis and bringing quantification to uncertainty itself.
“With subjective logic, the input arguments may be completely uncertain and estimates can be made with these probabilities, even though they are fraught with uncertainty,” he said.
“Unless this is done, the uncertainty is swept under the carpet. We humans are stuck in our preconceived notions and follow the beaten track. We are unable to see things objectively.”
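In Jøsang’s published subjective-logic formalism, a statement is represented as a “binomial opinion” whose belief, disbelief and uncertainty components sum to one, alongside a prior base rate; independent sources can then be combined with his cumulative fusion operator. The sketch below illustrates the idea (the class name, helper function and example figures are illustrative, not taken from any official tool):

```python
from dataclasses import dataclass

@dataclass
class Opinion:
    """Binomial opinion: belief + disbelief + uncertainty == 1."""
    belief: float        # evidence supporting the statement
    disbelief: float     # evidence against the statement
    uncertainty: float   # lack of evidence either way
    base_rate: float = 0.5  # prior probability absent any evidence

    def projected_probability(self) -> float:
        # Uncertainty contributes via the base rate, so a fully
        # uncertain opinion falls back to the prior rather than
        # masquerading as a confident probability.
        return self.belief + self.base_rate * self.uncertainty

def cumulative_fusion(a: Opinion, b: Opinion) -> Opinion:
    """Combine two independent sources (cumulative fusion)."""
    k = a.uncertainty + b.uncertainty - a.uncertainty * b.uncertainty
    return Opinion(
        belief=(a.belief * b.uncertainty + b.belief * a.uncertainty) / k,
        disbelief=(a.disbelief * b.uncertainty + b.disbelief * a.uncertainty) / k,
        uncertainty=(a.uncertainty * b.uncertainty) / k,
        base_rate=a.base_rate,
    )

# A shaky defector report and a weak satellite indication:
defector = Opinion(belief=0.6, disbelief=0.1, uncertainty=0.3)
satellite = Opinion(belief=0.2, disbelief=0.2, uncertainty=0.6)
fused = cumulative_fusion(defector, satellite)
```

Note how the fused opinion retains an explicit uncertainty component: the analysis never has to pretend that a shaky source yielded a firm probability.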
The system is designed to visualise uncertainty using triangles, where greater height indicates greater uncertainty. This gives decision makers a strong visual indication of how reliable the evidence is, reducing risky decisions based on incomplete data.
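In Jøsang’s standard visualisation, each opinion is plotted as a point inside a fixed equilateral triangle whose corners represent pure belief, pure disbelief and pure uncertainty; the height of the point corresponds to how uncertain the opinion is. A minimal sketch of that barycentric mapping (the function name and layout are assumptions for illustration):

```python
import math

def opinion_to_point(belief: float, disbelief: float,
                     uncertainty: float) -> tuple[float, float]:
    """Map an opinion (components summing to 1) to a point in an
    equilateral triangle: disbelief vertex at (0, 0), belief vertex
    at (1, 0), uncertainty vertex at the apex. The higher the point,
    the more uncertain the opinion."""
    apex = (0.5, math.sqrt(3) / 2)
    x = belief * 1.0 + uncertainty * apex[0]  # disbelief pulls toward (0, 0)
    y = uncertainty * apex[1]                 # height grows with uncertainty
    return (x, y)
```

A fully uncertain opinion lands at the apex, while a fully confident one sits on the base, which is what makes the display easy to read at a glance.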
“When we implement this logic, we can see the aspects of the theory that are incomplete and need to be straightened out. If the Americans had been in possession of this tool, perhaps they would have found that there was too much uncertainty with regard to the hypothesis saying that Saddam had weapons of mass destruction before taking such a momentous decision as to invade Iraq,” said Jøsang.
“Unless the uncertainty of the circumstantial evidence is taken into account, the analysis tool may erroneously conclude that there was a clear probability that Saddam had access to weapons of mass destruction.”
The research, which is sufficiently advanced to be made into workable analytical tools for active intelligence analysts, has attracted considerable interest from military organisations, including the US Army and the Norwegian Military Intelligence Service.
“By incorporating uncertainty, subjective logic has the potential to revolutionise automated probability reasoning and improve intelligence operations. The method may enable the decision-maker to realise when the responses are too uncertain and that more information needs to be collected,” said Dr Lance Kaplan, from the Networked Sensing & Fusion Branch, US Army Research Laboratory.
The US Army has even invested in the research, providing Jøsang with NOK 2m ($262,000 / £173,000).
“We still need answers to a number of fundamental questions,” added Kaplan. “The US Army Research Lab is therefore collaborating with Professor Jøsang through the project ‘Advanced Belief Reasoning in Intelligence’ to determine whether and how his idea can be realised.”