Technology, innovation, and intelligence for future warfare
Terry Pattar: Hello, I'm Terry Pattar. I lead the Janes Intelligence Unit. Welcome to this episode of The Janes World of Intelligence Podcast. I'm joined on this episode by my colleague, Davey Gibian, who's our Head of Innovation at Janes. Hi, Davey.
Davey Gibian: Hey, Terry.
Terry Pattar: We're also joined by Tyler Sweatt, who has a lot of experience working in the tech sector as well as previous military experience, and we're going to bring that together to talk a little bit about what that means now, what it means for the future, and what implications it has for defense intelligence. So, Tyler, thanks for joining us, it's great to have you on the podcast.
Tyler Sweatt: Thanks a lot, Terry and Davey. I'm really looking forward to the conversation.
Terry Pattar: I've probably not done you justice with that quick introduction. It'd be great to hear from you about your background, the company you're involved in, and some of the work you're doing right now, so we have some context for the discussion we're going to have, and then we'll go on from there.
Tyler Sweatt: Absolutely. I started my career in the US Army, all focused on counter-explosives, and I was the recipient of a whole bunch of supposedly cutting edge technology while I was deployed. What I found is that, while a lot of stuff came over, very little, if any of it, was actually useful. When I transitioned out, I made it a mission to figure out: what does that chain look like? How do people buy technology? How do we program the funding for it? How are we setting priorities for the organizations that are building or investing in it, from the large government labs and science and technology departments, all the way down to the entrepreneur who's trying to pitch a venture capitalist for funding? Then once I figured that out, it's been about how do we transition it quicker? How do we get the right technology into the right war fighters' hands, in time for it to be useful? That's the mission of Second Front Systems, where I'm currently at. We focus on what we call acquisition warfare, because the belief is that the next war is going to be won by the entity that can optimize that loop of finding, procuring, and integrating technology quicker, with greater precision and with lower costs, while imposing effects on the technology OODA loop, if you will, of our adversaries.
Terry Pattar: In terms of the questions you were asking there, about how those technological developments get into the hands of war fighters, the detailed, systematic questions you were going through made me think you must have had some pretty deep frustration at the time. Most people would think, I don't want to go through that process, I don't want to really dig into military procurement in that way. It sounds like you've taken that frustration and gone on to do more with it. There's a concept I want to unpack a bit more in what you're saying about what's coming next, through the work you're doing with Second Front and the work you've done up to now. What you're saying is that whoever is quicker at speeding up that loop is going to have the advantage. Maybe give us an example, how do you think that plays out?
Tyler Sweatt: Yeah, it's a really good question, and I think it represents one of the biggest transformations that's required to get large governments better synchronized with commercial industry. Think about traditional commercial software, Software as a Service: you build that back end, you build the infrastructure, and you really build it once, and then you can sell it or customize it for different use cases a whole bunch of times. Now, flip that on its head, and the government will build something once, highly customized, and then if they need something that's just a little bit different, they will rebuild the entire thing, again highly customized. Right now, in US government major defense acquisition programs, I think you've got upwards of 72% custom code in the software being developed, which, for anybody listening who understands how tech is built and scaled and improved, does not bode well for speed. Then you start to think about interoperability with allies, because nobody goes to war alone. None of this is a one-on-one type of conflict. The same way we historically thought about the need for similar baselines in terms of ammunition caliber, for example, we've got to start to think about that with software guardrails and codebases. That is, I think, one of the largest barriers: you have an industrial-age approach to major platforms and technology coming out of the government, and you have the rest of the world moving in a knowledge age with a more agile approach. That lack of synchronization causes quite a bit of friction.
Terry Pattar: In terms of that difference in approach and how it's changing, is a lot of it driven by a continuing mentality of, we've got to be able to do this for ourselves, we can't rely on others to do it for us? Not wanting people to come in from outside because of security reasons? Is that still part of the mentality behind the reluctance, sometimes, to outsource more effectively, if that makes sense?
Tyler Sweatt: It does make sense. I think it's a combination of things. One, it's way easier. Think about the whole buying chain, all the way up to Congress, and all the way down to the procurement officer who signs the contract: it's way easier to picture what you're doing and what you're trying to build if you've got a highly rigid, very well defined schedule, where here's the widget, the thing we're going to build, you can reach out and touch it. When you're thinking, hey, I'm not really going to have requirements for this, it's going to be agile, it's going to change, and oh, we're going to let the users define what this looks like, that just doesn't fit into the paradigm that so many folks have come up through and understand. Then, if we're being honest, the F-35 is a great example. Some part of it is built in, I think, every congressional district in America, which again probably makes voters happy, but for folks who understand supply chains, it's chaos, and it creates such a large attack surface that it's inevitably going to result in delay. I keep going back to the industrial era, long factory lines, when the power sat with the person who saw all the information and controlled it. It goes back to outdated perspectives on control: if I define this, and I set the schedule, and I control the budget, I am in control, versus the new paradigm of, if I set conditions for success, and let it go and let it change and don't control it and just bring the right people together. That's where I think we get a massive breakdown. And there's a tech literacy problem across government, where there's a lot of buzzwords. Davey and I have seen it with artificial intelligence, right? Sprinkle some AI on it, and you'll fix it. AI has become a buzzword like that.
Terry Pattar: I can see that. That's certainly been my experience outside of the US. Just as an example, one of the massive frustrations: we often deliver intelligence analysis training, in regional analysis and open source intelligence. With open source intelligence, although it's called intelligence and is part of the intelligence sphere, it's really focused on information collection for most customers. That's actually what they're thinking about when they think of open source intelligence. They become so technology dependent, and so hardware bound, that rather than thinking about it as an information or an intelligence problem, it becomes a technology problem, and they overcomplicate it, I often find. I don't know if that's replicated in the US, or if that's something you guys have seen, but it strikes me that the lack of tech literacy you mentioned is universal.
Tyler Sweatt: Yeah, I would agree. I think that's how you end up with the highly customized software, which then, if you're thinking about it from a smart-buyer standpoint, results in vendor lock. There's only one vendor who can probably come in and fix that, which, while it may feel secure, if I were to take an adversarial look at it, shows me one point I need to penetrate. Versus a broader open source community that's got different sets of eyes on the code, and that's getting run through different de-risking and controls. If I want to compromise a major weapon system, I've got to get inside one team and one entity, and the software's probably three years out of date, so it's probably not secure. But yeah, I would agree.
Davey Gibian: I also end up seeing that the lack of tech literacy limits the sense of what's possible, because people aren't very good at future scoping. They think, okay, what is my problem today, versus what is the technology? There's a mismatch there, where the answer, especially in open source intelligence, tends to be, we'll just collect all of it, which doesn't really make a lot of sense, and just adds complexity down the line, plus a lack of usable open source intelligence and a lack of context around what you're creating. When it comes to artificial intelligence, it tends to be the same: well, just give me the click button to kill terrorists. Why can't the AI do that? The lack of tech literacy means people are making these illogical leaps between what they want to see as the end state and what they think technology should be doing. It's not just about working within the bounds, because I don't think we have to do that, we can build the future as well. But they have to understand, what are the actual capabilities here, and then be able to craft with the developer, or craft with the company, what that solution looks like from a user-base perspective. Because of the limitation in technological sophistication across government, we look for silver bullets when none really exist.
Tyler Sweatt: Let's talk about the failure of imagination. I think you've both got it, and it's a massive limitation. To your point earlier, Terry, that's why we default to things we understand, and we spend billions of dollars on big platforms. The Navy is a great example here in the US. Instead of building a bunch of highly attritable fast boats or drones, we're building a whole bunch of aircraft carriers. Now, I am not a Navy guy, and I often make fun of the Navy, but I have yet to see a situation, or any type of future scenario, where the technological advantage is going to lie with a very slow, very high-signature floating city over a technology capability. Then, to Davey's point, we try to just throw technology on these platforms without thinking about what types of environments we're going to be moving in, what A2/AD actually means in practice, and what the adversarial capabilities are.
Terry Pattar: Yeah, that's really interesting. Like you said, the failure of imagination, the lack of tech literacy, and, I guess, the inability to switch from that one procurement mindset to something more flexible for the current era, let alone the future and whatever that's going to be like. But to what extent is that lack of tech literacy down to customers in the defense sector, customers in government, being bamboozled by the offerings that are out there? Why do they buy solutions from technology companies that look like silver bullets but turn out not to be? Because you do see that... that happens a lot, too.
Tyler Sweatt: Yeah, no, it's... Davey and I have been in a few meetings together where we made the joke that artificial intelligence is on PowerPoint and machine learning is in Python. That's how you can tell the difference. But, honestly, I think it's two-sided, right? You have the smoke and mirrors, the snake oil salesmen calling things AI or advanced technology when they really aren't. You've got some buyer's remorse from people who've made large investments and maybe haven't seen the returns, and that could be from snake oil, or it could be from the tech literacy problem, where there are just misaligned expectations. On the other side of that, you've got a pretty heavily entrenched defense industrial base that's been attached to these programs, that benefits from building large platforms through highly inflexible procurement that takes decades and billions of dollars, and there's probably a 20-year [inaudible] sustainment contract behind that. You have a very interested party in having minimal change to the business model. Anytime someone new wants to come in, the contractor sitting there on that program of record, or at that program executive office, is whispering, no, we can do this. We can do this. We can do this. Here's my team of 50 analysts, here's my cube farm of people, you don't need to build technology for that. You've got that coming from both sides, and it almost starts to build, or reinforce, the bias of, we can do this ourselves. I think it makes the buyers a little more shy, because they don't want to rock the boat. The incentives for those folks right now are to spend their budget and get more budget. Not to innovate, not to generate effects, not to save money: to spend the budget. That's where you get into the Pavlovian problem of why the market is the way it is.
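The joke lands because working machine learning genuinely is just code. As a purely illustrative sketch, with toy data and a from-scratch perceptron rather than anything a defense program would actually field, "machine learning in Python" can be as small as this:

```python
# A deliberately tiny "machine learning in Python" sketch: a perceptron
# trained on a toy linearly separable dataset. Illustrative only -- real
# systems would use a library such as scikit-learn, but the point stands:
# working ML is a few dozen lines of code, not a slide deck.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn weights w and bias b so that sign(w.x + b) predicts the label."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != y:  # misclassified: nudge the boundary toward x
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y

    def predict(x):
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

    return predict

# Toy data: points above the line y = x are labelled +1, below are -1.
X = [(0.0, 1.0), (1.0, 2.0), (1.0, 0.0), (2.0, 1.0)]
y = [1, 1, -1, -1]
predict = train_perceptron(X, y)
print([predict(x) for x in X])  # recovers the training labels: [1, 1, -1, -1]
```

The contrast with "AI on PowerPoint" is exactly the tech literacy gap described above: a slide can promise anything, while code like this makes its actual capabilities and limits visible.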
Davey Gibian: I think one thing that's very true when it comes to problems with defense acquisition, especially of next-gen products or capabilities, is that the problem is fractal. We could probably investigate this problem space all day, from procurement, to education, to the incentives, to the long-term incentives of staying in government versus moving to the private sector. There's a podcast in each one of those fractal [inaudible] that you can go down. But I do want to ask, Tyler, because you're focused on it these days: what is the bridge? Because there are a couple of macro Schrödinger's boxes here. We're in this Schrödinger's technology state, for example, when it comes to government. On one hand, Silicon Valley hates working with the DOD. At the same time, massive Silicon Valley companies are competing for massive DOD contracts. The DOD is behind in technology. At the same time, we have the F-35 and other very sophisticated supercomputers that fly. There are these competing narratives at all times between what is happening in technology and what is happening in government. I have two primary questions for you that we can dig into. One is, which side of Schrödinger's technology are we on? Is the DOD fully failing, or is it just a narrative problem? Then two, where gaps exist between the technology sector and government, what are practical methodologies through which we can bridge those gaps?
Tyler Sweatt: Yeah. To stay on brand with the Schrödinger answer, I don't think we'll know until we open the box. I'll joke-
Davey Gibian: That's fair. You need the observer.
Tyler Sweatt: But I think the big challenge is, can we scale? I keep going back to scale. Whether it's an F-35, or whether it's ABMS, or JADC2, or all these new innovations we have going on, I want to see the scale. Because on the other side, we've got all these new acquisition pipelines, like SBIRs and OTAs and things like that, and there's hundreds of millions of dollars going in there, which sends a very good initial signal. But if you start to parse out whether these technologies are transitioning, whether they're coming into programs of record or systems of record, all of a sudden that number starts to go way, way, way down. At the risk of sounding pejorative, the question becomes: is it vanity? Are we actually doing it? I don't know the answer to that, and I don't think we will until we're forced to prove it on more than just a PowerPoint slide. On the competing narratives, I see the valley increasingly wanting to work with the Department of Defense. I think it's in vogue to say, hey, there's this massive divide, because there have been a couple of public blow-ups. But you can't walk 10 feet in the Pentagon right now without seeing folks in hoodies and ripped jeans walking around, looking like they're in a valley coffee shop, just to make sure they don't lose the image. So, it's happening. To get to your second question, from a practical standpoint, there are a number of barriers to scale where, if you take the perspective of the venture capitalist or of the startup, I think they cause folks to self-select out of the process. That's things like burdensome cybersecurity requirements that aren't really aligned with what's going on today and also just cost a significant amount of money and time to meet. If you're a startup, are you going to invest millions of dollars in achieving some certification on the hope of maybe getting some money?
I have not met many venture capitalists who would be comfortable investing that much money in getting a certification just so we could then go hunt in a market, without achieving product-market fit and the customer [inaudible] and all that. Then there's the authority to operate: how you actually get your software into production in a customer environment is so complex that if you go to three or four different digital warfare offices in the government, they will give you three or four different answers. Increasingly, they will tell you not to use their own solution, because it's so complicated. You end up with another process that you can't really time-box, that nobody can tell you the cost of, and that usually requires some reconfiguring of whatever the commercial technology is. Those two, I think, are the lowest hanging fruit. Then, as I alluded to earlier, it's the acquisition pathway: what's my path to transition? If I can understand the path to transition, then I've got a way to bring enabling support in, to think about how to actually scale the technology. But right now there's a desynchronization. The Department of Defense and the government say, hey, we're giving these grants out, we're investing in technology, it's great, we're connecting with the market. Then the market sees, hey, this is interesting, but in order to build a government prototype, I have to configure a whole bunch of stuff, which is time, which is money, which is focus taken off primary product development. Then it's probably going to take me four or five years to figure out even how it gets into a program. That's fighting the folks who are already entrenched in the program, that's understanding some of the congressional funding and procurement challenges, and it's making sure that someone doesn't grab your IP along the way. If you asked me to pick, the ATO would be the biggest one. I think that's a silly problem to have.
Terry Pattar: With those issues in mind, if we take an almost blank-sheet-of-paper perspective on it: is the current system set up to help solve the problems that exist? What are the problems right now that technology solutions are well geared to solve, where they're not being used? And conversely, what solutions are out there that are not being used in government, or that are perhaps being used, but in the wrong way? Where, as you described, the procurements haven't been done correctly, and maybe people are trying to build things themselves which are actually fairly commoditized and could be bought in. Where is the mismatch between what's actually needed and what's available out there? It speaks a little to that failure of imagination you mentioned earlier: are there technologies available now that defense needs to be thinking about, that it's not already thinking about? And Davey, I think you made a great point that with so much going on, it's easy to assume defense is behind the curve, but actually there's a lot going on within defense which is at the cutting edge, right?
Davey Gibian: Well, I'd almost want to ask an even pokier question [inaudible], which is: will this problem be solved without a war?
Tyler Sweatt: No. I think the challenge is, we're going to have to realize we're in the war. That's the question. Will we know we're in a war?
Davey Gibian: Okay. Because I completely agree with you. I think ATO problems on new technology adoption, whether that's a new piece of hardware, a hardware and software integration, a new UAS capability, a new cyber tool, or a new artificial intelligence widget, all of those get solved if there's a practical or tactical or operational reason to suddenly deploy that technology. I fully agree. Now, let me poke you on that: what will war look like in the future if we don't know that it's a war? What is your fear, or what were you hinting at, when it comes to not knowing if we're in conflict?
Tyler Sweatt: Yeah. I will try to make this not too theoretical [inaudible], but I think we've struggled with things that aren't kinetic for as long as I've been in any way related to national security, and we're going on 20 years at this point. Things like Unrestricted Warfare, written more than 20 years ago at this point, which was thinking about intellectual property and non-kinetic effects, and commercial espionage, and things like that. I think if you went around, and I include the UK, our partners here, and we had a robust discussion on trying to define war, you'd either end up at the simple Clausewitz definition that just encompasses everything, or there'd be a really big argument about where the boundaries are, and what warfare is. We used to explore this all the time: if I'm an external party, and I hack Boeing, or Raytheon, or BAE, is that an act of war? Is that an act of commercial espionage? Is that just phishing, a criminal act? What is that, and where does it fall? Because I could make an argument that we're in a war right now, and we're not paying attention to it, because it's hard to paint a picture of it. I think that's what scares me the most: that by the time we realize it, it'll be too late. I spent two years at [inaudible] going down this rabbit hole, arguing with big service strategists on what advanced technology and warfare means. For me, the easiest part is the kinetic side of it. It's everything else that everybody struggles with, because you can't see it, you can't touch it, you can't smell it, you can't hear it.
Terry Pattar: Is it the case that we'll only feel the effects of it after it's happened?
Tyler Sweatt: I think some of this goes back to the literacy. This is why education is so important, and why education at a practical, readily consumable level is so important for leaders across all of our nations. Those in the respective chambers who are setting laws and regulations and allocating funding, those in the executive branch, and heads of state, have to understand what all of this means. That, hey, it's not just about a bomb getting dropped. We're years past it now, but when Russia shut down the power in Georgia, things like that: that was science fiction before then, we talked about it, and nobody thought things like that were real. So I think there's an education on how this all fits together and what it means that we've got to get into. Then I think there's an understanding of tech literacy where it's not purely a preventative measure: most of those technologies that would increase the resiliency of critical infrastructure will also allow us to scale the effects of service delivery from a government standpoint. It's not a different widget all the way across. It's thinking about it with an integrated backbone, where you've got information sharing, and then you can sprinkle some AI on it after that, because you've actually got the data flowing in the right way. That's where, if I had a hope for how we wrap our arms around this, there's a high amount of education, and there's some legislative work that has to occur. If we're being honest, a lot of these behaviors are reinforced or incentivized by how funding is allocated at a national and at a service level. I think that's an area to push. While I do believe that a significant event will be what changes this, the education is so much more important, especially if that is the case, so we don't overcorrect like post-9/11, where everything overcorrected to one side of the spectrum. I spent a lot of time over there; we don't have too much to show for it.
We learned a lot, I think. But if this next one has the ability to be at scale, where technology allows us to reach around the globe just from a click on our computer, how much more important is understanding how to work the different focus levers there? That's where I think the valley can help a lot, and that's where I think commercial industry can help a lot, because you're starting to think about scale and reach and the ability to generate insights. The fact is that right now, when we send a soldier to war, they have more technology in their pocket with their iPhone than in the entire platform we send with them, their entire weapon system. There's more tech in the $1,400 phone, I think that's the cost of the new one, than in the probably $14 billion ecosystem they're delivering effects in. One, that's not sustainable from an investment standpoint, and two, that should never be the case. That's just a terrible way to send people to war.
Terry Pattar: That's been the case for a while, I guess, in terms of the device in somebody's pocket when they go to war being more powerful, more useful to them, having more utility than all of those more expensive items they're sent with. But it's also a vulnerability, people taking that always-connected device into a war zone with them. Is that something that's just inevitable? Is it a risk that's always going to need to be catered for, because you're not going to stop war fighters taking electronic devices in when they need them, right?
Tyler Sweatt: Yeah, and I think it's a balance. This is a never-ending conversation: what's the balance between access and security, and how do we get data to decisions? Where are we willing to accept the risk and say, hey, it's just more important that we get this insight at the right time, and where are the areas we can't? Because there are also commercial solutions for low-bandwidth, low-latency environments, where you can bring petabytes of data out to the edge and run models locally. So there are situations where I think it's perfectly appropriate to have just commercial tech out there. Again, it's about getting the right answer. In Afghanistan in 2008, 2009, the locals out in the east understood what was going on in American politics better than I did, because they just Googled it. They'd be like, your president said this, and I'd be like, "Hey, that's news to me, guys."
Terry Pattar: Yeah. That's definitely not going to change, is it? Access to information is now much more open. I won't say a level playing field, but an open playing field, in the sense that everyone's going to have access to the kind of information that in the past they wouldn't have had, and they'll have more sophisticated abilities to do more with that information. But just coming back to what you described at the outset, about being in that situation where, as somebody on the frontline, you're conscious that you're not being supplied with the right technology, that there's a gap between what's available, what the right solution for you would be, what would give you better capabilities, and the fact that you're not getting it. Has that situation improved? Or, because the nature of war is changing, is this something that government and defense in general are just going to struggle to keep up with? Even if there is a war, are they going to be able to gear up quickly enough, to catch up fast enough, I guess, to not be left entirely behind or vulnerable?
Tyler Sweatt: Yeah. There are definitely efforts happening. There's an increased amount of participation from the users. When you're building software, you're bringing the users in, you're doing personas and stories and figuring out what that journey looks like. While that concept is pretty simple for those of us building tech, it's a completely foreign language in defense. There's this entire group in the middle that is just supposed to magically know what everybody at the tactical level wants, and you can understand how that happens, right? A PowerPoint slide gets sent, probably through 63 different offices, a little bit of change is made at each one, and it's made smaller and less flexible and less flexible. You asked for an iPhone at first, and when it comes back, you get a 12-year-old IBM computer that runs MS-DOS, and they're like, hey, here you go.
Terry Pattar: Yeah, put on some wheels so it's mobile.
Tyler Sweatt: That's right. Now you've got some actual user-centered design and innovation happening. You've got the actual system users and the generators of the requirement involved in the process. Areas like AFWERX, what Dr. Will Roper is working on, and Hondo Geurts in the Navy. There are efforts to do it, and you need folks like that, who are willing to push the envelope, who are willing to thumb their nose at the institution and say, hey, this is the wrong way, I'm going to keep moving the right way until you just kick me out of the game. I think that's got to continue to happen. Just to give you an anecdote: we were doing some research on software for a major defense weapon system, one of the top three or four. We went down and did interviews with the actual firing officers, the folks responsible for delivering the effects, literally pushing the button, to understand what that process looked like. Instead of just the weapon system interface, they had two laptops next to it, one on either side. They had a grease pen, and then to finish the kill chain, they had to move chairs twice, to other computers. We're like, wait, what? That's the magnitude of the challenge: literally, people are moving chairs, unable to share information or access information at their firing terminal. Then you asked about catching up. I think, I hope, and I wouldn't be doing this if I didn't believe it. It will be hard, it will be a challenge, and it will require letting go of a lot of notions, like the ones you just alluded to about security. If we lock up information and put it on a top secret network or a secret network, then nobody finds it. Information is only valuable if it's consumed and used to support a decision. If it's locked away somewhere in a vault, we've already removed all of the value from it. So, what's the point in securing it? With the amount of information that moves right now, the latency on that value isn't what it used to be.
It's not, hey, I have a secret, I can keep this locked up for years and years and years. We're talking about seconds, and minutes, and hours, and days, not months and years anymore, on the value of a piece of information. I think once we get our minds wrapped around that a little [inaudible] to scale.
Terry Pattar: Interesting, that relates to what you were talking about earlier, when you were discussing whether it will take a war for this to change. I think we've all seen... We've seen it numerous times at Janes: whenever there are urgent operational requirements, procurement processes go out of the window, right? Things can get purchased quickly and easily. Then as soon as that's over, it goes back to the previous system. There may be good reasons for all of that, without digging into the details there. A war situation is going to drive demand, or change the nature of demand. But is there anything on the supply side that will change this dynamic? In the sense, is there ever going to be, or do you foresee in the near-term future, a technological development that offers such a great leap in capability for defense that it's just a no-brainer that they've got to go out and buy it? There's no point trying to build it themselves. They've got to switch how they're doing things because of the way that technological capabilities are changing. Whether in the US or the West generally, or from a competitor or adversary standpoint?
Tyler Sweatt: Yeah, crosstalk
Davey Gibian: ...change in tech coming along there.
Tyler Sweatt: Yeah, I think there is. I've spent far too many hours debating quantum, to go back to Davey's Schrödinger example. But you start to think about what quantum-resistant encryption, and what the processing power of quantum computing, allow you to do. That offers potentially a pretty significant breakthrough, which might actually solve some of the problems just by its sheer capability. Now, that's an area where everyone's always arguing: is it coming? Is it not? Is it worth it? Is it not? I recognize that. The other side, and it's less a specific technology and more a vertical or a subset, I don't know the right term for it, is around inaudible teaming, where we can build trust in autonomy. Even just the appropriate level of trust. If you had one manned fighter pilot and seven autonomous aircraft that were enabled to follow, and they understood how to work together; or instead of 5,000 sailors on 20 different ships, if you had a few sailors and then an entire swarm following them. The key is things like that, where it starts to change the human cost. You think about the true cost of war: so much of it goes into personnel, into training and readiness, and then the tail behind it all. Decreasing the cost of that ramp-up and really flattening that acquisition kill chain, where I can deploy swarms at scale, and if I lose half of them, that's fine, I can throw out another 100. That type of transformation, I think, really closes the gap. But it comes... There's a ton of bias. If it was easy, we would already have done it. There are security questions. There's the argument, and I think the new Top Gun is probably going to have some of this in it: is it manned or unmanned? What's the value of a pilot versus the value of a machine? But all that being said, if we're really talking about generating global effects at scale, we've got to let go of a human having to wrap their arms around everything from a control standpoint. That, to me, is the next big leap ahead.
Davey Gibian: Yeah, Terry, to your question, I don't think there's a single piece of technology, a single technological innovation, that is going to fundamentally drive everything forward. Quantum, maybe, in the same way that nuclear weapons certainly changed the way in which we operate; but even with nuclear weapons, it's still about the infantry on the front lines in eastern Afghanistan, actually doing work in the field. It changed great power competition in some regards, but it didn't really change how war is fought. I think what's really going to change the way war is fought in a similar capacity is that teaming. It's going to be the direct integration between the human and the machine, and that's going to need to be not just... I know the big push right now is around the hyper-enabled operator. But it's going to be the hyper-enabled everything: airman, Marine, sailor, operator, whatever. It's going to need to be hyper-enabled across the entire board, with the capability to draw on lots of different tools and lots of different capabilities, in the same way you have a toolbox, but these are going to be tech tools, tech innovations. I think that teaming element is where we're going to see the biggest shift in the coming few years. It's going to be everything from a single pilot on an aircraft directing 12 unmanned aircraft, all the way down to the infantry officer with their fleet of robots that they're sending out for a variety of close combat operations as well. I think that's where it's going to be very interesting, because the country that gets that right, that's where we suddenly move back into asymmetric warfare capabilities, even within great powers. The country that is able to effectively human-machine team is able to close the kill chain much faster, as well as accelerate it, and at the same time put fewer individuals in harm's way, thereby decreasing the human cost of war. I think that's going to be where...
In turn, elevating the human cost of war for the opposition. I think that's where we're headed. To the point prior, though, I don't know if we're going to see that effectively until we're forced to in a combat or conflict setting.
Terry Pattar: What does all of that mean for intelligence in terms of... What does it mean for the way intelligence is conducted, the way it's produced, and the way it gets to those on the front line? There's a whole lot there that we've obviously talked about, and that's a huge topic in and of itself. But do you see that being affected by those same drivers, or is there something independent going on in the intelligence field that means it's going to improve regardless of whether some of these other technologies come on stream?
Tyler Sweatt: I think in a similar vein. Where we talked about teaming in a more traditional kinetic way, on the intelligence side it's also about effective teaming, but in a slightly different way. Because, as we've alluded to a bunch during this conversation, the sheer volume of data that's out there makes it easy to end up with information overload, and the buyer-
Terry Pattar: I guess, we're only heading to a world where we're going to have more sensors, right? More sensors, more information coming in. What happens with all of that?
Tyler Sweatt: I keep going back, and Davey's heard me say it 1,000 times, I keep going back to trust. Because if you can get trust, the same way we talk about it for autonomous systems on kinetic capability, if we can get to trust in the fusion and the analytics behind sifting through all this big data, it allows us to move faster and to get to a decision quicker. And it's all about context. Honestly, that context only matters at the point in time you're making that decision. The ability to establish that trust, where we can move at a relevant pace, and we're willing to accept risk. It's never going to be perfect. In my lifetime, we're never going to have perfect intelligence; I doubt we've ever had it before. The pursuit of that is probably unnecessary. But again, if we get comfortable with the fusion, and trust machines to do some of it, whatever the appropriate share is, and we're willing to let go a little bit, because it's less about control and more about enabling the right person to make the right decision at the right time, then I think we're in business. Now, again, that goes back to the same reason we buy giant planes and boats: because it's easy, and we understand it. We've got to get off just trying to grab a volume of information as a discriminator, and get down to quality.
Terry Pattar: Yeah, because especially in a world where there's going to be so much more information-
Tyler Sweatt: And misinformation.
Terry Pattar: Yeah, and misinformation. While we might be better at processing it digitally, cognitively is where we're still going to lag. It's still going to be hard for people working in intelligence, analysts, et cetera; they're still going to have the hard job of actually making sense of all of this. Like you said, the context, putting it in context, and the speed at which we need to make decisions.
Tyler Sweatt: One more stump speech for education and literacy. The one side of the triangle we haven't really thought about yet is society, those who vote in and vote out everybody who sets this. If we're struggling to understand it on the intel side and on the kinetic side, we've got to make sure that we're educating society on what all of this means, because our struggle with information is only exponentially greater for the average citizen in society who's just being bombarded with noise. That's why I think being able to separate signal from noise is critical.
Terry Pattar: But AI is going to save us, right?
Tyler Sweatt: Somebody's selling that, yes. There's definitely somebody cutting a check right now who believes that.
Davey Gibian: Yes. Given that Tyler and I spent about two years of our lives learning how to hack and disrupt artificial intelligence systems in a variety of operational contexts, I think we're both a little terrified of that.
Terry Pattar: I was going to say, what does that do for the trust element when you know there's somebody cutting that check?
Tyler Sweatt: Yeah. That's the downside, right? There is going to be no panacea; there's no perfect tech. The same way we've got to accept potentially imperfect capabilities, or imperfect information, we've got to recognize it: artificial intelligence is vulnerable. It's as vulnerable as any other technology. I can trick a person, and I can trick a computer. We've all seen the public examples at the conferences: hey, I've got a t-shirt that makes me not show up on facial recognition, or I put a sticker on a stop sign and fooled a Tesla. That's the easy stuff. That's the vanity: I want to get some likes or some retweets, or whatever it is now. That's not the real stuff. If that's the vanity stuff, and it's breaking multimillion-dollar systems, what do you think's happening behind the scenes?
Davey Gibian: One thing... Just going back to the intelligence element, Terry, that both you and Tyler talked about was the context. I am excited to see what that looks like in robotics as well. With intelligence on the battlefield, there are the tactical, the operational, and the strategic levels of intelligence, but on the battlefield you're looking at tactical intelligence, and then obviously you pull that up to try to understand what's happening. Within robotics, you also have this added element of the supply chain. How do you start to disrupt that, not just through large-scale bombing of weapons facilities? What additional targets suddenly land on the strategic list when it comes to disrupting an adversary's capability? And how do you more rapidly understand that in context? Right now, you see a Russian weapon system, an S-400, some big bulky piece of equipment. In the future, it could be a small bot or an individual robot. Who built it? Who designed the software? What are the global supply chain elements that get affected by that? Then your kill chain suddenly needs to be put into that context: okay, if this is having an outsized capability on the battlefield, how do we suddenly start to disrupt this globally, or how do we start to eliminate the adversary's capability? Because I have the feeling that in almost any country, or I'd say probably in every country, there's never going to be a single, fully intra-country supply chain for a piece of highly sophisticated technology equipment. That suddenly means that in order to disrupt what we're talking about, these human-machine teams, your disruption capabilities aren't just going to be firebombing Dresden. It's going to be disrupting a highly complex set of hardware capabilities, which will have spill-on effects. Disrupting software development capabilities will have spill-on effects of targeting civilian companies.
Suddenly the context of human-machine teaming spills globally, and into society, far faster than I think we're willing to admit right now, when it comes to waging war against those capabilities. Because it's not just shoot down the robot; it's disrupt your adversary's capability to effectively deploy those robots on the battlefield, which is going to take both a societal and a global view. That's where, I think, warfare looks very different, because it greatly increases what is a legitimate target when it comes to intelligence collection, and then obviously, the action against that intelligence.
Tyler Sweatt: Yeah, you get into some of the supply chains. Remember the rare earth minerals, and all the arguments around one country's investments in a whole bunch of mines, and a few other countries not making investments, and what that was going to mean. That's a really nice framing on, hey, how do we drive up the cost and drive down the speed of the ability to produce? It goes right back into that whole... We call it acquisition warfare here; that is the goal of it. How do we preserve the integrity of that chain for our friendlies, our partners, our peers, all of that? How do we adversely impact that for adversaries? Whether that's information, a manufacturing capability, a subcomponent for additive manufacturing, rare earths, or the human side of it, it's understanding what that actual supply chain is, and what the critical paths are. It gets into a little bit of a consulting drill on what matters and what can you affect. But we've got to understand that ourselves. To keep going back to tech literacy and education, I don't know if we do, if we understand what increased battery power means and where that comes from. If we understand lighter sensors, and different optical capabilities, and SAR and all that, what actually goes into that, and how water inaudible That's a great point.
Davey Gibian: We've been trying to understand what that looks like at Janes quite a bit. You pull in a piece of intelligence content, and that content has the name of a unit. Okay, cool. A unit's pretty easy: you have the ORBATs, you have the commanders, you have the capabilities, and then the effective range. But what happens when it comes to a piece of dual-use technology? What does the context for dual-use technology look like, and how does that suddenly play into the intelligence collection lifecycle? It's wild. You're looking at systems and subsystems and manufacturers, and then also the fact that, what is it, 96% of everything is shipped on a boat at some point? You're looking at the global perspective that gets these components places. Context balloons, but within that context, it means you have to have, at all times, a global understanding of what those linkages are before you can start to make sense of things; otherwise, you're just collecting all this information. But what does that global, sophisticated map look like? It looks like ORBATs on steroids, or ORBATs on crack inaudible standing out all over it.
Terry Pattar: Isn't that the name of our new product, ORBATs on crack?
Davey Gibian: ORBATs on crack.
Tyler Sweatt: Sign me up. I think, Davey, you guys get at it: what is dual-use anymore? I can argue that everything's dual-use, and then by default, nothing's dual-use.
Davey Gibian: I've always argued that all technology applications are dual-use, because you can always... With any bow and arrow, you can hunt and you can kill. Same with the first person to pick up a rock to make a beat, banging on another rock to make music; they could also club someone to death with it.
Tyler Sweatt: You make a joke about the iPhone: drop a pin on something, and now your iPhone is a targeting device. Does that make the telephone dual-use?
Davey Gibian: I think it was some of the Islamic State actors who had stolen a piece of the ATAK codebase. Well, not stolen, but found it on the internet after it was released, and they were actually able to use-
Tyler Sweatt: Borrowed.
Davey Gibian: ...yeah, borrowed. They were able to use relatively global Android Tactical Assault Kit capabilities on their own devices on the battlefield against coalition forces.
Terry Pattar: That code was released, wasn't it, to allow NGOs and other actors like that... It was released for benign reasons, if I remember correctly.
Davey Gibian: Yeah-
Terry Pattar: Rather than being leaked, or hacked or whatever, it wasn't. But you're right, it's taking something that's-
Davey Gibian: That doesn't make the fact that they had it any better, I guess.
Terry Pattar: Right. But it was benign... It was something that was put out there for a benign purpose, but then misused for nefarious purposes, which I guess is at the heart of everything these days, right? Like you said, everything could be dual-use. For me, it comes down to this: I think of all of that context you've just described, Davey, and I'm thinking, as an intelligence analyst, how do I make sense of all of that, and how do I understand the significance of these developments? If something happens at a factory in Taiwan, what does that mean for the supply chain, for the components that go into... That's impossible for even a team of individuals to understand, or even a division of intelligence analysts, without having the right technology available to help them condense all of that information, right?
Tyler Sweatt: No, I think we talk about that ecosystem: how everything's increasingly connected, how there are things that are inherently commercial and inherently military, and there's no longer a line between them. That's where it becomes critical to have the ability to rapidly drill down into, or to automate some of that drilling into, each of those nodes and their capabilities. Because then it's the analyst's job to contextualize those capabilities in the operational scenario they're dealing with, but not necessarily to have to go dig up what exactly those are. Hey, what does it do? Okay, I can use a little bit of human ingenuity to think about why this would matter in this scenario. But the ability to have that information at your fingertips, to get you there faster, shortens that OODA loop, which in effect flattens the kill chain, which is good; to use all sorts of buzzwords in one answer.
Terry Pattar: I think the only ones we haven't mentioned-
Davey Gibian: OODA loop, kill chain, AI.
Terry Pattar: The only one we haven't mentioned there is blockchain. They're going to play a role somewhere, surely.
Tyler Sweatt: Yes. I'll bite my tongue.
Davey Gibian: Yeah inaudible I don't even want to.
Tyler Sweatt: Yeah, don't make me do it.
Terry Pattar: Brilliant. This has been a terrific discussion, guys. I really enjoyed this. I realize we're up against time, but was there anything else that you thought actually, you really wanted to add into or go back to that you think would be of interest to our audience, and we should cover in this episode? I'm conscious that there's probably a lot we could cover in other episodes. But yeah, it'd be great to have some final thoughts from you guys.
Davey Gibian: We went from what the problems are in technology procurement, to what the actual situation is between technology and the Department of Defense, all the way to the future of war. I think the really interesting thing is that we all agree, at least Tyler and I, Terry remained a little bit neutral, that barring a conflict, and a conflict that we admit is a conflict, we may not innovate fast enough, and we might not actually be able to change the situation. What I really want to start exploring, though, is how do we, in the community, innovators and practitioners in the intelligence community at Janes and in the national security community at Second Front with Tyler, how do we start to at least remove the barriers, so that when that happens, as I believe it probably will, we are in a better position than if we had done nothing? For us on the Janes side, I think it's that context piece: how do we take these massive global supply chains and dual-use technologies and emerging capabilities, how do we start to put them in a heuristic and in a taxonomy, such that we can close that kill chain faster when there is a threat resulting from those capabilities? And how do we better integrate the massive sources of disparate, unstructured, and semi-structured inaudible out there, and then be able to put that against that context? I think that's our challenge: how do you put something as complex as a dual-use capability into a usable, actionable intelligence context on an ongoing basis, so that when a conflict comes, we're able to act against it? If we can solve that, then I think we have done the best job we can, barring purely operational acquisitions, where, hey, we just need to throw money at the problem because we have a true risk.
Terry Pattar: Yeah, that's a good answer.
Tyler Sweatt: So many of the questions or scenarios that I'm faced with have to do with, how do I bridge the gap between emerging technology and national security? It's usually containerized down at the Silicon Valley-to-DC level. I think we've got to extend our framing of that problem and think about what partnerships and jointly developed technology start to look like in the future, and how we build things that will allow everything Davey's gotten at, everything we've talked about today, not just across a US-built or UK-built or Australian-built platform, but across all of them simultaneously. Then we start to get to true situational awareness, because none of us go to war alone. None of the intel communities are working in silos, without working with international partners. I think that's where we've got some really interesting opportunities. One, from a policy standpoint: what does control look like there? But two, when we think about where there are really fantastic opportunities for commercial technology to scale, I think it's in some of those platforms and information sharing capabilities that will set the foundation for what Davey is talking about, and allow for global context across all of our partners simultaneously, which would be pretty transformative.
Terry Pattar: Yeah, I think there's an entire episode, if not more, we could devote to that interoperability element, because that strikes me as a really key thing. Otherwise, if everyone just keeps doing it in their own silos, we're only going to compound our problems. This has been a great discussion, guys, I've really enjoyed it. It's given me a lot of brain food, and I'll go away with a lot to think about after this; I'm sure everyone listening will too. So, thanks again for your time and for joining us. I hope we can follow up on some of these discussions and delve into some of these subjects in more detail another time. Davey, always great to talk to you, and Tyler, thanks for taking the time out to come and join us on the podcast.
Tyler Sweatt: Absolutely. Thanks for having me, guys. This was a fantastic conversation.