Lessons in Intelligence with Sir David Omand GCB

This is a podcast episode titled "Lessons in Intelligence with Sir David Omand GCB." The summary for this episode is: Producing high-quality intelligence depends on discerning meaning from information - being able to describe what is happening, why, and what will happen next. This is the art of the intelligence officer, and in this episode of the Janes podcast we talk to Sir David Omand, former head of GCHQ and the UK's first security and intelligence coordinator, about his new book, "How Spies Think: Ten Lessons in Intelligence," and his model for how individual analysts can produce better intelligence. The lessons and skills discussed here are just as valuable for anyone who wants to apply intelligence methods to making better decisions in all areas of life.
How has intelligence changed over the last 10 years?
02:04 MIN

Terry Pattar: Hello, and welcome to this episode of the Janes World of Intelligence podcast. I'm Terry Pattar. I lead the Janes Intelligence Unit. I'm joined on this episode by a very special guest, Sir David Omand, who has agreed to come and talk to us about his latest book, How Spies Think, which is a fascinating read. I want to say, actually, that if I were to name the book, it would have a much less catchy title, but one that's probably more descriptive: "How Intelligence Officers Should Think If They've Had the Right Training and Have Applied It." But David, it would be great to get some of your thoughts and your description of the book and what went into it. Just to give a brief introduction: as a former director of GCHQ and somebody who's held various senior roles within the intelligence community in the UK, you've seen it and done it all, really, I think, in terms of intelligence work. And having worked in academia over the last decade or so, I think you've probably also had a lot of interactions now with people outside of the intelligence community doing similar things. So, this is a conversation I've been really looking forward to for a number of weeks, actually. Thank you for joining us, and welcome.

David Omand: It's a pleasure to be here. I'm going to hold up the front cover of the book.

Terry Pattar: I have a copy here with me, yup.

David Omand: There we are.

Terry Pattar: I particularly liked the cover, actually. I like the sort of cipher style, and it's a little bit of a marketing thing, no doubt. But the cipher is quite nice.

David Omand: That's true. [crosstalk] That's the bookplate. That's the front cover. But if you wrap the cover over it, of course, you then get "How Spies Think." And I do-

Terry Pattar: So, for anybody who's listening to the podcast and hasn't seen it, there is... Yeah, so there is a wraparound cover that is quite a neat indication of, I think, the art of intelligence, in terms of trying to pick out what you're looking for in amongst the mass of information.

David Omand: Yes. You're quite right that the title may be slightly over-the-top.

Terry Pattar: Well, I suppose it's [crosstalk]

David Omand: There is an important message. So I want people to read the book, because it's an important message. I mean, I started thinking about this book after the Brexit referendum, and then the 2016 US presidential election. And I found I was getting, as I think many people would be, rather angry at the way in which, on social media, these very important events were being reflected, and at the kind of rising tide of half-truths and distortions, some outright lies. Not all coming from Russia, I may say. And the way in which social media was being used to widen divisions in society and set us at each other's throats. An important driver was that we're beginning to lose touch with what I call rational analysis, in favor of the emotional impact, which is what catches attention on social media. And when we have to take a decision, any of us - on where to live, or what job to apply for, what to make of a situation, whether to wear a mask in the street this afternoon - there are two kinds of thought that you have to hold together in your mind and somehow bring together. One is what you do, which is rational analysis of the situation, trying to advise customers on what's going on and why. And the other part is emotional. When you come to a decision, what do you want to get out of the decision? Why are you taking it? What are your hopes and fears? Do you fear something, and hope the decision will avoid something bad happening? And of course, you have to bring those two together. And my fear is that it's really in the nature of social media that the emotional is tending to leach over into the rational, and rational analysis is increasingly being distorted, because the basic information is not accurate enough, or not correct, or downright deceptive, and because the quality of thought being applied to it is driven by these irrational impulses. Now, you have to have the dispassionate and the passionate.
The interesting thing about government is the lengths to which we go to separate those two kinds of thought. So, we have the Joint Intelligence Committee. I spent seven years sitting on the Joint Intelligence Committee, desperately trying to be as impartial and rational as possible in the judgments we administered. Then you've got ministers with a democratic mandate. They are rightly the people who get the final say on what is to be done. And of course, the book was virtually complete before COVID-19 hit us. But that is a perfect example of this tension between the dispassionate, rational analysis of the SAGE committee scientists, and the government attempting to respond to the mood of the people with a democratic mandate, knowing they are going to be held accountable, very visibly accountable, for the decisions that get taken. So, it is genuinely hard. But my fear, and why I wrote the book, was really to say, "Can we please have a little more on the analytical side?" And that's why I've spent quite a lot of the book explaining some of the analytical tools that intelligence analysts use, so that we can all, I think, have better grounded, evidence-based decisions.

Terry Pattar: I think that's something that chimes very much with my own thoughts, but also with the thoughts of a few other people we've spoken to on this podcast in previous episodes, where a lot of people have lamented the lack of critical thinking that we're increasingly seeing people display. And not necessarily people working in the intelligence field, but people we encounter in all walks of life. One of the things that's come across strongly in a lot of the conversations we've had around topics like disinformation, and how people are working with the current information environment that we're all involved in, using social media as you've mentioned, and trying to make decisions using all of that information, is that there just isn't enough attention paid to basic critical thinking skills sometimes, which are really at the heart of a lot of the analysis and the techniques that you've outlined in the book. And I think the book is a welcome contribution to that, in terms of trying to get people to think more carefully, and to think in a way that is more structured, and to be able to do that... which, as you rightly said, is difficult. It is difficult to separate the more rational, analytical mode of thinking from the emotional drivers that sometimes shape decision-making. And that's the same for individuals as much as it is, I think, for organizations and governments. What I like, though, about the way you've gone about explaining the process and those techniques is that you haven't tried to bore people with too much theory. You've made it very practical, in the sense that you've talked through some case studies, some real examples, things that you've been involved in during your career, but also things that are, I think, more tangible for people to understand. Was that a deliberate thing that you wanted to do?
You wanted to try and communicate that and get across some of these ideas by not being too theoretical or too opaque in the way you describe them?

David Omand: Yes, I think it's very important. And indeed, explanatory skills are something that, in this sort of area, you really need to work on. I mean, we can all think of the way some of the COVID situation has been explained. Putting a graph up on television for about 10 seconds and simply pointing to a line that is going upwards is meant to send an emotional signal. It's not a rational signal, because you can't actually see what the scale is.

Terry Pattar: Yeah.

David Omand: This is very basic. But we need both kinds of thinking. I mean, don't misunderstand what I'm saying in the book as meaning, "You have to be cold and hard-hearted." It's the passion that makes us human, and woe betide us if we ever end up with government where you don't have leaders who are passionate about the things that we elect them to do. But I'm not naïve. Politics is a contact sport. The public has always aimed off, in traditional debate, for exaggeration, a bit of political swagger. We know that rivalries and personal ambitions are part and parcel of democratic politics. But that's not the problem. The problem is that we are increasingly faced with politicians and others who pontificate on these things, who blur, and on some occasions even deny, the very nature of fact. The RAND Corporation wrote a very interesting report a year or two back on this, where they called it the spread of "truth decay."

Terry Pattar: Yes. Yeah, it was a fantastic report.

David Omand: And that's a lovely way of putting it. And you can actually see it happening. Some statement is made, and the supporters of a particular point of view think, "I'd like that to be true, because it fits with my worldview." And with constant repetition on social media, that becomes, "Well, it might be true." And that slides too easily into, "Well, as far as I'm concerned, it's as good as true." And that is extremely dangerous for a healthy democracy, let alone for stability within a nation, and for keeping people from falling into conspiracy thinking, another subject that I tackle in the book.

Terry Pattar: I think you've touched on a few really important points there, just in that description of what you were aiming at with the book. And you touched on the graphs we've seen recently in some of the government briefings on COVID, and you've hit my thought exactly, the one I had when I saw them, which is, "Unless somebody's got a really good understanding of statistics, they're not going to understand those graphs, quite frankly, and there's no point displaying the data in that way." But what struck me was that the briefings have led with that data, that they've started with it. It's almost like a blizzard of data being thrown at people, and I wonder how well-planned the communication strategy has been. As you said, part of getting across the thinking behind a government's decision-making is to explain it clearly to the public, and I don't think they've really helped themselves in this case. But that's probably an aside in terms of what we're talking about here, but...

David Omand: I can remember Tony Blair and the speech he gave at the [inaudible] festival one year, coming out with this statement: "What matters today is impact." And it is that emotional impact. The graph is going up, or, if you're trying to convey the opposite impression, you show a graph that's going down. In the book, I call this "framing." And every television producer knows that the intro music, the little clip you show at the beginning, frames the viewer's expectations of what is going to follow. Now, if you're an analyst, on the other hand, or if you're just a citizen and you've got a big decision to take, you can't afford to think that way. You've got to separate out and say, "Is the price of the used car that I'm being offered comparable to prices for similar models in the marketplace? Am I getting a bargain, or the opposite?" That kind of very basic rational thinking takes you into more advanced thinking, where you're looking further ahead: "Should I insure against the possibility that my house burns down?" That's a careful calculation. In some instances, you might say, "Well, for the old banger that I'm driving, I have to take out third-party insurance, and I'll make sure I do that. Do I want expensive, comprehensive insurance? No. If I have an accident, I'll bear the loss myself." These are everyday decisions that we can all understand. Which is why, as you say, in the book I've tried to illustrate throughout with some, if you like, rather corny examples, but ones which bring home that this is not an esoteric art for people in Janes Intelligence alone. This is what you do. You have to keep telling other people, "This is how you should be thinking." The book, essentially, is in three parts. The first part explains a very basic model of analytical thought. The second-

Terry Pattar: Well, you say "very basic," but I actually found it very useful. And I liked the fact that you started with... I mean, I think you're referring there to the SEES model that you describe, which might be worth just unpacking a little bit for people listening. Because I really like it, and I think it's a great way for analysts, and anyone who's trying to think through problems that are similar to intelligence problems, to be able to understand the world around them and make decisions. And I liked the fact that, for me, it's a more practical application of intelligence methods. What I liked was that you didn't start off by talking people through the intelligence cycle, which for me is one of those intelligence cliches, and I think it's not necessarily useful or relevant anymore.

David Omand: Well, I wrote some articles about the intelligence cycle, saying it was completely redundant in the digital age, because everything... It's a network. It's not a cycle. But that's by-the-by. Yes. I decided that the best way of trying to put over rational analysis was to think about the outputs. So, if you're an analyst, you have customers, whether they're in government or the private sector. If you're an individual, of course, you're your own analyst. But analysts have customers. So, what are the potential outputs that an analyst could provide for a customer? Hence this acronym, SEES. The first S in "SEES" is "situational awareness." Can you tell the customer what is going on, where, and when? Very basic stuff. Particularly important in cyberspace. Who's in the network? When and where did you detect this intrusion? Or whatever it might be. And we all know that situational awareness is not always straightforward. Some of the material you may pick up is deliberately deceptive. Some is just confusing. So, the subtitle of my book is "Ten Lessons in Intelligence." And the first lesson is that our knowledge of the world is always fragmentary, incomplete, and sometimes wrong. And that's the starting point. There are very few areas where you have complete certainty. And even if you think you have, just remember: the only way information can ever get inside your head as a customer, or indeed as an analyst, is through one of your senses. You saw it. You read it. You saw the dial on some sensor twitch. So, it's your own senses. And the moment you start talking about information reaching you through your own senses, you've got to recognize that your brain is unconsciously filtering that information, which is what I call "framing." So if you see something on the ground, you are liable to jump to a conclusion pretty quickly. Or there may be some bits of information which fit your prejudices.
Those are the bits you select to present to the customer. And the bits that don't quite fit, well, you sort of downplay those. Every historian knows that's what you do. Your choice of sources to go and research will condition the answers you come back with, which is why you keep getting books about the origins of the First World War that all disagree with each other. Because if you pick a different bit of the history to focus on, you're liable to come out with a different take on what was happening. So, situational awareness is the first one, recognizing that it's fragmentary, incomplete, and sometimes wrong. The first E in "SEES" is really important. That's "explanation." Because facts are dumb. Correlation is not causation. We all know that. Facts are dumb and they need explaining. So, let's take a fact. There's a young man in front of a magistrates' court, accused of throwing a bottle at a police patrol. His fingerprints are on the fragments of the bottle. Does that mean he threw it? Or did the mob, as they rushed past his house, pick up a bottle from his recycling bin, which is the explanation that the defense lawyer will produce to counter the explanation from the prosecution lawyer? Every lawyer knows that narrative, which connects facts together and gives them explanatory value, is what you are trying to convey. So explanation, I think, is the hardest bit of the analytical art. And when you think of some of the situations British troops have been in, to explain why a patrol was attacked in some corner of Helmand Province, you would need to know the local languages, the local culture, psychology, social psychology, history, the geography of the region. All of that, really, to come to a convincing explanation of why some event took place. And it's even more difficult if you start to think about geopolitical issues, whether it's Syria, Crimea, or the South China Sea, and so on. It is difficult. But you need to do it.
And then, the second E in "SEES" is "estimation," a word that analysts, I hope, all prefer to "prediction."

Terry Pattar: Yes, most definitely.

David Omand: "Prediction" conjures up, in the mind of the customer, the idea that you've got a crystal ball, and that they paid good money for this prediction. It's actually an estimation of how events might unfold on different assumptions. So I tend to bracket together in the book "estimation" and "modeling." And we see this perfectly with COVID-19. You've got the SAGE group of scientists. They make different assumptions about the rate of spread, about mask-wearing. On the basis of those assumptions, you then model what's likely to happen over the next month of lockdown. If you choose slightly different assumptions about compliance with, say, mask-wearing, you come out with a different answer. And so, you make those assumptions explicit to the customer: "It's on this basis that I'm offering you this estimate of how things will unfold." And that takes you into probabilistic language, and how you describe something as being likely or very likely or probable or whatever, which we deal with in the book. So, estimation isn't always necessary. Sometimes, for the purposes of the customer, it's enough just to know what's going on and have a decent explanation. And some things you just can't call. It's too difficult. It's too hard to call. But nonetheless, it's still really valuable to give the customer that grounding: "This is the evidence, this is what we really think, and this is how we explain why we're seeing what we see."

Terry Pattar: A lot of that comes down to the customer and what sort of decision they need to make off the back of the intelligence, doesn't it? Whether it's something current, in which case they just need the situational awareness and the explanation, or whether they do need the estimation to plan for the future.

David Omand: Where it gets very complicated, of course, is where you've got some international relations situation. My favorite example here is the 1990 Gulf War, where Saddam has his tanks poised on the border with Kuwait, and the job of the analyst is to tell the foreign secretary, or the secretary of state in Washington, "Is he going to cross the border with these tanks or not?" And of course, the determinant is probably what Saddam himself thinks the United States will do if he does cross the border. If he thinks the United States will stand aside, he'll do it. If he thinks he's going to end up with an international task force trying to depose him, he won't. So, here you've got a situation which is a genuine mystery, because you can't get inside his head. Even if you've got good spies close to him, you're still not really going to know the answer to that mystery. So, in the end, you have to make some sort of assumption, and that's where assumptions come in. You have to say, "On the assumption that he expects the United States to intervene, this is how we think he'll act. But if, on the other hand, he is assuming that the United States will stand by and not intervene, he may well cross the border." That, rationally, is how you go about dealing with those sorts of complexities. But of course, while you're working away on your situational awareness, your explanation, and your estimation, something creeps up behind you and hits you on the back of the head that you weren't expecting. And that happens time and time again.

Terry Pattar: Always.

David Omand: Which is why I put the final S in "SEES," which is "strategic notice." So, some part of your mind, or the corporate mind in business or in government, has to be looking out for the next nasty coming over the horizon, or the next big opportunity. And if you start early enough, you can position yourself. You can carry out the right kind of research. You can tune up your sensors, your intelligence community, to look out for the first signs of it happening. And if it's COVID, you might decide to invest a little bit in public health, and stockpiles, and plans for rapidly expanding track and trace. And the truth is that, certainly when I was in the Cabinet Office after 9/11, and we published the government's risk register, with risk matrices of likelihood against impact, a coronavirus pandemic was always in the top right-hand corner. It was the most dangerous potential event, and it dominated terrorism, major accidents, and so on. Yet we ended up without enough stockpile and without those plans. Now, the characteristics of most strategic notice events can't really be predicted in advance. We couldn't have known the exact characteristics of the new disease, because it hadn't occurred. But you could have inferred, from having strategic notice, that that kind of species-jumping disease mutates, and that if we get one, we will need plans to do A, B, and C, and we will need to have some PPE stockpiles, and so on. And the same is true of many other potential disasters. So, if you have a nuclear power station in your district, the local authority, the police, and the emergency services all have plans for what to do in the case of an accident, and they rehearse those plans regularly. As I mentioned, insurance companies know there are certain kinds of risk. They can insure. They can buy their foreign currency forward. There are things they can do to manage those kinds of strategic risks, which, if they happen, they happen.
For the most part, there's nothing you can do to stop them.

Terry Pattar: No, indeed. And this is probably an aside to the main topic we're discussing, but to what extent do you think decision-making has shifted around planning for those kinds of events - what would at one point have been considered low-probability, but what we know will be very high-impact events? Planning and preparing for them requires a focus on resilience. From what you've observed over the last decade or two, whether inside or outside government, do you think there's been a cultural shift away from being prepared for that type of event and having that resilience, because we've become more focused on the here-and-now, rather than thinking about the sorts of things that might come up and be massively disruptive?

David Omand: After 9/11, we put together CONTEST, the government's counter-terrorism strategy. And the preparation for those kinds of major, major events involved thinking about resilience. The preparation, the final P in the famous four Ps of CONTEST, involves two things. One is preparing for the immediate. Do you have the emergency services trained together? Do you have the right immediate resources? Can you manage the disruption very quickly? But the second is investing in resilience, so that normal life is restored as quickly as possible, as well as just being able to absorb the initial impact. How quickly will it be before the telecoms system is back online, London Underground is running, airlines are flying again, or whatever it might be? And we know that you can, over time, invest in more resilient infrastructure. You can't afford to do it all at once; it's when something comes up for replacement that you say, "If we're going to replace this electrical switching center, let's move it off the flood plain. Let's put it on a hill." If you're building a telecoms system, the 5G system, a lot of thinking is going into, "How can we make this resilient, so that even if one or two components or nodes or even suppliers fail, we will still have a system?" It's a little more expensive, and it takes more safety engineering, but it's well worth doing. And in some areas, such as the physical construction of buildings, that's been taken to quite a fine art. The slogan is "secured by design." So if you put up a skyscraper in London these days, you have to apply certain standards, which will ensure management of the major risks of something going wrong: the thickness of the glass, the positioning of the car parks, or whatever else it might be.
And those kinds of secured-by-design, resilient improvements aren't necessarily visible to the public, which is a good thing, because the objective of British counter-terrorism is normality. It is to maintain normality, so you deny the terrorists what they are seeking, which is to dislocate and disturb us. If they're not doing that, we're prevailing and they're losing. So, if you can improve the resilience of the infrastructure, but not do it with, as it were, barbed wire and barriers and armed guards, then you're both enhancing resilience and reassuring the public that normal life can safely continue. That's just an example of the way this kind of thinking, as you imply, leads you to some very practical conclusions about things to do. What the nation suffered, of course, was the long period of austerity that followed the great crash of 2007 and 2008 - the fact that resources for public services, and indeed even for planning and thinking about this, were severely restricted. So, we've had a bit of a pause, but I'm sure we'll get back to really working on where best to invest in resilience.

Terry Pattar: So you've described the SEES model, and that's in the first part of the book. But then, it would be useful to get an idea of how you then progress through the other sections, to talk about that whole application of intelligence analysis skills.

David Omand: Yeah. Well, the second part of the book is, if I can paraphrase, how you can get it wrong. Even with the best of intentions, analysts slip into error. And sadly, in the intelligence world, the general public knows more about the things that go wrong than about the things that go right, because the things that go right, the agencies want to preserve and keep secret so they can do it again. So there's an inherent bias in your situational awareness of what's going on in the intelligence world. But just think about the run-up to the Iraq War in 2003, and all the problems there were in correctly assessing what Saddam Hussein was up to, what ambitions he still retained for weapons of mass destruction, particularly chemical and biological, but also his wish to reconstruct his nuclear program. And then you think about the data points that were pored over by intelligence analysts around the world, and how some of those were deliberately misleading, like the Iraqi refugee Curveball. And some of them were over-interpreted, so something which could have more than one meaning was assumed always to have the worst possible meaning. And then there's another phenomenon, which we see in everyday life, which is that you explain away the evidence that doesn't fit your preconception. So if you have a firm view grounded in the reality of 1990 and the first Gulf War, and the arms inspectors who went into Iraq after the war and were astonished to find how much chemical and biological capability he had, and his nuclear ambitions, then when something comes along a few years later, that's the framing you're going to give it. I was on the Joint Intelligence Committee from September 2002. And we all fell into those traps. So the second part of the book is really about what you need to watch out for. And one of my lessons in intelligence, I think it's the fifth, is: it's our own demons that are most likely to mislead us.
Which is about cognitive biases, and there's a vast literature of applied psychology research showing just how easy it is to fool yourself into seeing what you want to see. So, confirmation bias. But, as I explain in the book, you have problems at an individual level: the individual analyst, their prejudices or preconceptions - perhaps "preconceptions" is a kinder word than "prejudices." You've got group-level biases: you have an analytic group - it's very, very rare that it'll be an individual; it'll be a group from across the intelligence community - and they can fall into groupthink very easily. And then you've got the institutional level, where government itself or certain agencies may be rivalrous. They may be arguing a point, not willing to concede a point. So you have these potential sources of bias at an individual level, a group level, and an institutional level. And you can find historical examples of all of them. But of course, once you begin to explain that, a lot of the risk evaporates, because you know you may be falling into groupthink. And it only requires a sensible leader of a group to say, "Look, let's pause this conversation here. And blokes, when we resume, I want you to make the case against." So you empower somebody to argue the flaws. And you create a safe space - this is one of my hobbyhorses [inaudible] - because if you want good decisions, there have to be safe spaces where people can genuinely speak their mind without fear of retribution, or of privately being marked down as a troublemaker.

Terry Pattar: Yes. I couldn't agree more. I think that's such a vital concept.

David Omand: I'm well out of government service now, but I hear that it is harder and harder, with the relationships between civil servants, ministers, and special advisors, to have that safe space where you can actually talk truth to power. And that, of course, is one of the key things intelligence analysts are trying to do with their customers: to talk truth, and not shy away from writing something down because you know it's going to annoy the customers - because they were about to spend a lot of money investing in a particular country, and here you are saying that actually, that may not be such a wise idea. There are two other chapters in that section of the book. One is about obsessive states of mind, and...

Terry Pattar: It'd be worth unpacking that a little bit, yeah. Obsessive states of mind.

David Omand: Occasionally, during the Cold War, particularly the early part of the Cold War, you could see that we were falling into this obsession with the Soviet Union as being 10 feet tall. And even in the later Cold War, as the economy of the Soviet Union began to really come under pressure, the state of training, clothing, morale, the diversity of different ethnic groups that were present in the Soviet Armed Forces, tended to be underestimated. And the sheer number of tanks or the caliber of some of the excellent weapons that the Soviet Union produced, those tended, of course, to be highlighted. So, without really knowing you're doing it, you're systematically falling into this sort of obsession. The example that I quote in the book in some detail was James Jesus Angleton, the long-term head of CIA's counterintelligence department. And he became convinced that there was a Soviet master plan. And of course, there was some evidence there. Burgess, Maclean, Philby and all of that. But it led him seriously astray. And when defectors arrived, the suspicion of those defectors was extreme. "They must be plants. They're trying to divert us from seeing what's going on." And Angleton really infected Peter Wright of MI5, the author of Spycatcher.

Terry Pattar: Spycatcher, yup.

David Omand: That conspiracy, this obsessive state of mind, "There are spies under the bed." And so, Wright became convinced that Harold Wilson was a long-term Soviet agent. He then had to convince himself that the director general of MI5, and the deputy director general of MI5, must both be Soviet agents to have covered up the fact that the Prime Minister was a Soviet agent. And then they convinced themselves that Hugh Gaitskell, leader of the Labour Party, had been assassinated by the KGB, or its predecessor, in order to allow Wilson to become Prime Minister, because he was a Soviet agent. And it's entirely circular logic. We know now, very firmly, and indeed from very convincing Soviet sources, or former Soviet sources, that this was all nonsense. And indeed, in Moscow, they couldn't understand any of this, because they knew he wasn't. But nonetheless, people's careers were destroyed. We had the same effect in the United States, with the purges of un-American behavior and McCarthyism, these obsessive states of mind. And then the third chapter of that section is really about manipulation and deception, and having to be aware that there may be people out there who do want you to draw certain conclusions, and they're manipulating the evidence to persuade you. And then the third part of the book is really about, given all of that, how can you actually apply both the methods and these little warnings about falling into error when it comes to, for example, negotiations? How do you get win-win negotiations? And when it comes to partnerships, whether these are industrial partnerships, the US-UK intelligence relationship, or just your personal choice of partner in a relationship, what does that depend on? And again, trustworthiness. Regular, reliable behavior is key to that. And then, at the end, my 10th lesson is that subversion and sedition are now digital.
So, very traditional ways of trying to undermine public confidence in the state: set one group of citizens against another, distract governments. These can all be done digitally. And therefore, we have to learn to live safely online, including, as I think you alluded to right at the beginning of this conversation, teaching critical thinking. And I would start in schools, because the upcoming generation has known nothing but the digital world. They've got no background of the analog world to set it against. All they know is that they're in a magical world where information is almost infinitely available at the touch of a button. And the very nature of the internet, the ad tech whose workings we are now beginning to understand, is selecting, through auction, the most clickworthy material, and is selecting the advertisements, including the political messages that are directed personally at you as the holder of anything up to 1,000 different dimensions of characteristics. "You're in that group, therefore you're the intended target of this message." So that's the book.

Terry Pattar: Yeah. No, no. And it's a fantastic journey through all of those aspects you've mentioned. And it's so important and so vital right now, because we have come through certainly the last... just over a decade, I would say, probably a decade and a half, of having seen this real increase in the availability of information that you touched on there. And I think for a long time, people haven't necessarily understood how manipulated some of the information is that they're seeing, whether it's by an algorithm which is simply trying to get them to click on an advert, or whether it is deliberate deception that's trying to get them to think differently about an issue, or to make a different decision when it comes to walking into a polling booth. These sorts of issues are much more prevalent now, and I hope that some of the recent stories we've seen, things around the Cambridge Analytica scandal, and how these things have been described in various places, have raised people's awareness that actually, what they're looking at on the screen isn't always going to be reliable. I mean, in the open source intelligence field, we used to joke about people saying, "Well, I read it on Google, so it must be true." But I think there has been an element of that, certainly. That people are just taking at face value information which they should question more thoroughly. And I hope that some of the techniques and methods that you've outlined in the book will help people to do that. But also, what I liked about it was that your focus is not necessarily on the obsessive collection of information, which I think too many organizations, and too many analysts as well actually, fall into, that trap of trying to collect all of the information. And it was described quite wonderfully by John Graves, a guest on our previous podcast episode, where he said, "It's like, as analysts, we're trying to gather up all of the atoms.
But it's not like Pokemon; we're not trying to catch them all. We don't necessarily need everything to be able to build out that picture sufficiently to make a decision." Has that been a change that you might have seen over your career? That analysts generally, and maybe people in the general population, have become a little bit too obsessed with just gathering the information, rather than thinking about what they're going to do with it?

David Omand: Yeah. If you were to look at the intelligence communities of the major nations, certainly the US and the UK, you could probably, I think, say that we've over-invested in the collection side, as against the analytical side. So we are still resting on quite a narrow base of analysts. I'm very pleased that over the last few years, the number of analysts in government has increased. There's now a professional head of intelligence analysis in the Cabinet Office. There are more analysts, and the sort of training that is provided is light years ahead of the almost negligible training which was around when I was sitting on the Joint Intelligence Committee and watching the analysts at work. So, that's all very positive, I think, that the pendulum is swinging back a little bit. There are some areas of investigation, like counterterrorism, where you have to go for bulk.

Terry Pattar: Right, of course.

David Omand: And then rely on the cleverness of the people writing the algorithms to be able to filter and then selectively question enormous quantities of data. What actually appears for the human analyst is still a manageably small amount of information, but with a reasonably high probability that it's relevant to an investigation. I don't think that's going to change, and the same is true of work on serious crime, where to get at communications, communications data, internet usage and so on, you have got to go into the big numbers. But I still come back to what I said a while ago, that it's explanation that, in the end, is what counts.

Terry Pattar: So I guess the art there is, as much as you say it's vital to collect information in bulk, within that... I think what people sometimes misunderstand, and this leads to that fear of bulk collection of information, is that people think it's all being looked at and it's all being read, which is not the case at all. I mean, that physically wouldn't be possible in any case. But what is selected and what is picked up... you describe the algorithms, and in the book, you talked a little bit about getting the balance right between the machine and the human, and ensuring that the analyst is able to see information that otherwise they would miss. If they were trying to do this manually, conducting investigations just by following one lead to the next to the next, it would be too slow, in the instance of a counterterrorism investigation, to effectively prevent some of these incidents taking place, or to identify people who are liable to carry out a terrorist action. And so, there is that aspect of it, I think, which is that the algorithms themselves have to be well-trained in analytical thinking, in that sense. And they've-

David Omand: And the data that they are fed in order to get machine learning has to be representative of the population that's going to be sampled, and there have been some well-publicized cases where people just haven't been able to get the right kind of data, so the results you get are biased. But over time, as these lessons are learned, people will get better at that. But one of the important reassurances, I think, for the public in a liberal democracy like the UK, is that right from the start of even thinking about designing such a system, the designers have to have in mind privacy considerations, and our Human Rights Act. They've got to have safeguards built in all the way through, and audit trails. And that's one of the differentiators between how we would use intelligence, and how, for example, the Chinese would use bulk data, which of course, they also have and access. It's that point, and care that, at every stage, you're not actually falling into the trap of mass surveillance, and just roaming through data, hoping somebody's done something wrong and you will find it. It's specific investigations for specific purposes, with algorithms that are targeted on a probabilistic basis that have the best chance of pulling out data that will help an investigation. So, I'm reasonably confident that we'll go on being very responsible. There's a final point that I would quite like to make, which is, we've talked a lot about the problems of the modern internet and social media. And those are driven by the business model of the internet. And that's not going to change. But I have tried to bring out in the book that we are completely dependent for our future economic and social growth on the internet. Just imagine COVID without Zoom and Teams and all of the [inaudible], and how we would have lost touch with our nearest and dearest. So, it's a huge boon and a benefit. There's going to be a further big expansion of internet usage.
It will mostly be in the global south, and it's going to be liberating in every sense. So it's a great thing. And my final plea in the book is, "Just learn to live safely in that world, so that we get the benefits, but we don't have to pay the price of all that misinformation and disinformation and threats to our democracy."

Terry Pattar: Yeah, indeed. And that's such a useful and optimistic point to make. Because you're right, sometimes we do focus too much on the problems. You alluded to this earlier, when we were talking about intelligence and learning about intelligence: we focus too much on the failures, which tend to be more public, while the successes don't tend to be talked about. The other aspect that struck me was that, within the book, all of the lessons you're describing, and the challenges inherent in doing analysis and conducting intelligence work generally, haven't necessarily changed over time: dealing with an incomplete picture, not having all of the information you would want, or having information which is contradictory, and then needing to make sense of it. But... Well, firstly, obviously as you talked about, the information environment has changed, which has meant that intelligence methods and the way organizations deal with intelligence have had to adapt. And as intelligence analysts, we've got to be more conscious of how we're taking in information, and we discussed critical thinking skills being one example of that. But what do you think has changed in maybe the last 10 or 20 years in intelligence and the way that it's conducted? What are people doing now which perhaps they weren't doing 10 or 20 years ago, which are improvements, ways of doing things better, that analysts can apply?

David Omand: It's a very interesting question. One of the ways of looking at this would be to say, "What's new, that the Bletchley Park veterans wouldn't recognize?" For example, direction finding is as old as radio. Now it's mobile phones, and they call it "geolocation." Imagery is as old as photography. We had traffic analysis in the old days; today, we talk about communications data analysis. It's the same thing. But the bits that are new, I think, are firstly, scale. You can look at things at scale, if it's in numbers, because you can crunch numbers and process them and display them in a way that, in the analog era, you couldn't. And the best example of that is imagery. So you had Constance Babington Smith poring over, in 1944, a small number of photographs of strange rockets on launchers on the coast of Europe, the famous V-1 launchers, looking at them through a stereoscope by hand. Now, you have high resolution imagery of virtually every corner of the world. There is no conceivable way that you can look at it all. So, it is only possible to do that at scale if you digitize the process.

Terry Pattar: Yes, indeed. And we try and do-

David Omand: And if you apply some very smart algorithms, so that when it is known that this farmstead in Syria is associated with a certain rebel group, and a 4x4 drives up at the front door, you want a bell to ring and the analyst to be shown that photograph. That's sort of in the realms of the just about possible. So, scale. Timeliness is another, because it's all happening at just under the speed of light. And that makes a big difference to what customers actually expect. So, gone are the days when you had to transmit the news of the Battle of Waterloo by horseback. Now, that's quite dangerous, because it tempts people to think that remotely, you can, as it were, take charge of situations and so on, and disempower the person on the ground if you're not careful. But that scale and pace, I think, provide entirely new dimensions to the business of analysis. [inaudible] the old days, but the rest of it, what you're trying to achieve, and my four outputs from the SEES model, would have been recognizable to Francis Walsingham advising Queen Elizabeth I.

Terry Pattar: Yeah, indeed.

David Omand: He wouldn't have expressed it that way at the time.

Terry Pattar: No, no. Probably not.

David Omand: But that's essentially what he would be trying to do.

Terry Pattar: Yeah, yeah.

David Omand: Right. Well, it's been wonderful.

Terry Pattar: Well, that's been fantastic. Yeah, this has been wonderful. Thank you, and thanks for taking the time to come and talk to us about this. And like I said, I really enjoyed the book, and I'm sure plenty of our listeners will too. We'll hopefully get you back at some point in the future to talk more about intelligence, because there are so many things I'd love to spend more time delving into that we've just touched on in this discussion, but that deserve entire episodes of their own. So, thanks again for joining us and for taking the time.

David Omand: It was a great pleasure, and thanks for having me on the podcast.