Discussions with DTRA: Episode 3


Discussions with DTRA Podcast: Around the Microphone

DTRA, the premier agency for meeting the challenges of WMD and emerging threats.

The DTRA Podcast series gives agency members a platform to discuss mission-related, morale-boosting, and special-interest topics. The goal of the program is to deliver cross-talk that educates and informs audiences, supports employee engagement, and identifies potential outreach opportunities. Listeners can expect director-supported conversations that amplify the agency's core functions and convey mission intent in segments ranging from 20 to 40 minutes.

Episode 3: Artificial Intelligence, Challenges, and Advice

Length: 22:08

This episode covers challenges in, and advice on, the emerging practices associated with acquiring artificial intelligence. It also provides specific guidance for running successful AI projects and examines the reasons some AI projects fail.


Interviewer:
Michael R. Howard
Chief, Acquisition Systems, Training and Support/PM/COR
Acquisition Management Department, Defense Threat Reduction Agency

Interviewee:
Rhonda Maus
Professor of AI Software Engineering and Agile Coach Instructor
Defense Acquisition University

Public Affairs Facilitator:
Darnell P Gardner
Public Affairs Specialist
Defense Threat Reduction Agency


Acronym Terms
DAU – Defense Acquisition University
DTRA – Defense Threat Reduction Agency
WMD – Weapons of Mass Destruction
DTRA PA – Defense Threat Reduction Agency Public Affairs
AI – Artificial Intelligence
ML – Machine Learning
IT – Information Technology
DevSecOps – Development, Security, and Operations
DoD – Department of Defense
CDAO – Chief Digital and Artificial Intelligence Office

Transcript

Announcer:

Welcome to Discussions with DTRA, where the Defense Threat Reduction Agency brings together subject matter experts to discuss meeting today's challenges of WMD and emerging threats, increase awareness, and deliver morale-boosting information. And now, today's show.

Darnell Gardner:

Greetings, and welcome to Discussions with DTRA. I am Darnell Gardner from DTRA PA, and I will be facilitating this event. Today's podcast host will be Mr. Michael Howard, Chief, Acquisition Systems, Training and Support at DTRA, and he will be accompanied by guest speaker Ms. Rhonda Maus, Defense Acquisition University AI software engineering and agile coach instructor. Our topic for today will be artificial intelligence, challenges, and advice. Please, if you will.

Michael Howard:

Well, good morning. This is Michael Howard with the Acquisition Management Department. I'm here today with Rhonda Maus of DAU. Professor Maus, it is a delight to finally see you in person today after much communication over the past year. Would you share a bit of your background with our audience? Thank you.

Rhonda Maus:

Certainly, Michael. Yeah. Thank you to you and DTRA for having me in today. I've been with DoD for about three years, and I'm part of a cadre of people from the software world who are moving over to DoD to help with our software problem. I had a 31-ish-year career in Silicon Valley and on Wall Street, all in software engineering or software program management. I retired early, as we often do from Silicon Valley jobs, and decided to come back and help. I'm a 27-year military brat and wife, so I was raised exclusively on military bases, and I really feel at home here and wanted to come back and help.

Michael Howard:

So DTRA is very engaged and, as you know, has a recurring artificial intelligence, machine learning, and data science sync meeting of all interested parties in the community here at DTRA, along with some of our external partners. With that stage set, and before I ask the first question: I was listening to the Chairman of the Joint Chiefs of Staff's presentation at the West Point graduation the other week, and he made a comment that I think is relevant today. He stated that artificial intelligence is the mother of all technologies, where machines are actually developing the capacity to learn and to reason. The Chairman said these rapidly converging developments in time and space are resulting in profound change, the most profound change ever in human history.

Rhonda Maus:

Wow, that's impressive.

Michael Howard:

It was. That's why I wrote it down. With that, there's a lot of discussion about AI, as you know, in the media, within DoD, all around us. However, what is the correct definition of artificial intelligence?

Rhonda Maus:

Well, Michael, that is sort of the $64,000 question, really, isn't it? I think it's something that you and I spoke about in our very first phone call during COVID, a year and a half or so ago. There really is no single definition of AI that the entire world agrees upon. Even science and academia don't agree, so it makes it tough to explain AI. In some ways it's a buzz term, or an umbrella term, that has been around since the 1950s. The techniques, so to speak, that fall under that AI umbrella, those capabilities that we innovate, are transient, meaning something we would have considered AI five or 10 or 20 years ago is no longer AI today. Something like traditional autopilot, which virtually all airplanes have today, used to be considered AI, and it isn't now because it's become commonplace. Google Maps, something we use on our phones almost every day, used to be considered AI.

Rhonda Maus:

And they do use some AI technology, but generally people won't put Google Maps under the AI umbrella anymore. So we have a situation where we don't always agree on the definition, and the technology that lives under that umbrella is transient. That's a challenge for DoD, and it's one reason it's difficult to come up with best practices, procedures, or education for AI: it's hard to separate AI out from the rest of IT. I will give you a couple of definitions, though, that DoD has settled on and that I think work for us.

Rhonda Maus:

The DoD strategy for AI says it's the ability of machines to perform tasks that normally require human intelligence. That one leaves a lot to the imagination. There's a better one, I think, in the 2020 National AI Initiative Act. It says AI is a machine-based system that can make predictions, recommendations, or decisions influencing real or virtual environments. And that's key: predictions, recommendations, or decisions. It goes on to further describe it, but I think ultimately that is the better definition if you're a program trying to figure out whether you're buying AI, or whether you already have AI in your systems.

Michael Howard:

Very good. That leads me to my next question. How do modern software practices apply to artificial intelligence?

Rhonda Maus:

So at its foundational level, we could think of AI as modern software running on modern hardware, hopefully using better data. It's not always better data today, but we will get there. So all AI is some form of software running on some form of hardware, whether the hardware is an autonomous vehicle, a server, or a robot. So we like to try to separate those. But in reality, at the end of the day, for the folks who have to implement and maintain these systems, they really are better software running on better hardware with better data. That means that modern software practice is crucial for AI, absolutely crucial.

Rhonda Maus:

Iterative, agile software development, developing in small increments, and automating the software development life cycle, which we call DevSecOps in DoD: those two practices are really what we mean when we talk about modern software. They are absolutely the bargain-basement foundation needed for AI. The reason is that AI brings new automation and new reasons we need to be fast, to turn releases around quickly. If we don't have those modern software practices in place, it's tough to get started with an AI program.

Michael Howard:

That makes sense. How can we optimize artificial intelligence at DoD? And second part, how should we be thinking about AI as an enabler for DoD?

Rhonda Maus:

Well, we almost have a mandate not to use AI if we don't need it, but certainly we need to be thinking about AI all the time. Developing AI technology for the battlefield is absolutely a top priority for the department. We know our adversaries are doing it, and they continue to scale up their own AI and machine learning capabilities at an exponential pace. So as the innovations come at us, we don't have five or 10 years anymore to get started using a new technology. We have maybe two years before it might itself be obsolete and we're on to the next thing, or the newest version of it. Our adversaries aren't waiting as long as we are; at least China isn't. I think we've seen with Ukraine that we're not really sure whether everything we've heard over the years about Russia modernizing their weapon systems is actually happening.

Rhonda Maus:

But we certainly know there are a lot of capable Russian and Chinese software developers because many of them worked in the United States. So I will say in my 30 years in software development, the majority of people I worked with were here on visas from India, China, or Russia. So in some ways you can think that they built a lot of what we have today, whether it's in the private sector or DoD. So the imperative, the enabler for us is we've got to keep up with our adversaries. AI is already really ubiquitous across the department today if you think about it.

Rhonda Maus:

Tools that we use every day: when I talk with you on Microsoft Teams, that has AI in it. Even the latest version of Microsoft Excel, if you use the Microsoft 365 version, has AI in it. So AI is already ubiquitous in our business systems and in the systems we use in our everyday work. Much of DoD's AI will be for defense business systems, I believe. That doesn't mean we won't use AI in the weapons systems and special systems that DoD builds; we absolutely will, and we are. And we certainly look to enable machines to become trusted, collaborative partners of warfighters in the future.

Michael Howard:

So then, is it best to buy or build AI?

Rhonda Maus:

That's a great question. You and I are both in the acquisition business, acquisition education even. I would say initially, most of what we will do is buy AI. The vendors, particularly the cloud software vendors, are coming out with more and more packages; you'll hear of buying AI as a service or machine learning as a service. So you'll be able to get packages of AI that you can use against your problem. Certainly for defense business systems, I think we will buy AI most of the time. Many of the vendors we already use today are touting AI. We call this applied AI.

Rhonda Maus:

There are sort of two ways to think about AI. There's applied AI, where you're applying AI to a problem, and it could be an everyday problem: how to use AI to better engineer a system, or to better test software. That's applied AI. Certainly most, if not all, of our vendors are touting AI, so we're buying it already. And in the cases where they can employ experts who understand AI and data science better than we can, we will buy more AI, initially I think, and probably for a very long time.

Michael Howard:

Okay. That leads me to my next question, and it fits perfectly. What type of support structure does AI technology require for full implementation in DoD?

Rhonda Maus:

That's a great question. Josh Mark Hughes, when he did his time on the Defense Innovation Board creating early guidance for DoD on AI, said that we need to pay attention to the AI ecosystem. And by the way, I think they ended up calling it AI readiness: are you AI ready? What he meant by that was four pillars, really. People: do we have the right skill sets? Data: data is so important in AI, and it's important to get your data ready long before you ever plan to implement AI. Organization and culture. And technology: the technical infrastructure. Those four areas of support for AI are key. It's difficult because, as Josh so wisely said, the average program manager in the department today who starts an AI project is almost certainly starting with a deficit.

Rhonda Maus:

And that means the infrastructure and the data they will need, and potentially the skill sets of the people, the culture, and the organization, aren't AI ready yet. So that program manager will not only own the eventual AI solution; they will also own getting their part of that AI ecosystem up to snuff before they can effectively implement an AI solution. An example might be networking. We've all noticed some slow networking occasionally around the department, and by the way, that happens everywhere sometimes. In order to pass around the large amounts of data that an AI project will need, and to pass it around quickly, whether we're pulling it in off of sensors or rerunning machine learning operations over and over to improve a model, we'll quite likely need significantly better bandwidth and infrastructure than we have today. So it will be on that program manager to get that upgraded for their part of the world, all the way out to the edge, in order to effectively create and implement that AI solution.

Michael Howard:

Well, it sounds like infrastructure is critical to AI.

Rhonda Maus:

Absolutely. Yes.

Michael Howard:

Very important. So I hope that everyone gets that message. In our time left, I've got just a couple more questions. What are the five most common reasons AI projects fail?

Rhonda Maus:

That's a great question. There's a lot of research on this, and it mostly comes from Silicon Valley, but the number one piece of advice really is: ensure that the problem you're trying to solve really needs AI. Sometimes in DoD we like to issue mandates; we really want to move everyone onto emerging technology, and somebody might say, implement that AI system by the end of the year, or use AI by the end of next fiscal year. We can't skip the part where we discover whether that problem needs to be solved by AI. There are a lot of data analytics problems in the world, and in some cases there's a fine line between data analytics and AI, where we can really just use data analytics and quite likely have an easier project on our hands. So don't use AI when you don't need to, or until you need to.

Rhonda Maus:

And I'd say the other four areas really fall within those AI readiness pillars that the Defense Innovation Board created for us. Data. There are a lot of sayings about data in AI; you've probably heard that data is the new oil, or data is the new electricity. Data, data, data. We have some very strong data strategy statements, one of them being that all leaders in DoD will treat data like a weapon system. So what does that mean? To me, it means we had better start thinking about our data before we get to the AI project, because it's the same data we're using today in our other systems; we're just going to have access to more of it. So not preparing your data, failing to get it cleansed and labeled, failing to get access to it and to everything you need for your project, is a big failure point for AI projects.

Rhonda Maus:

Sometimes a project will start up and say, we just can't get the data from the contractor that we need. So when we're contracting for new systems, we're always thinking about data and intellectual property and what we'll have access to. Think also about needing access to data you may not be thinking about today that you're going to put into an AI system. We can help teams go through that if needed: call DAU or come to the DAU website, where in the next month or so we'll be putting up job aids to help programs figure out what they need to ask for in terms of data. Technology infrastructure, same thing. We know we have large infrastructure projects going on in DoD. Make sure you have the infrastructure you need to pass that data around, to bring it in, and to get it back out to the edge so the warfighter can use it; make sure you've got that bandwidth and those pathways.

Rhonda Maus:

People. Don't start a project without AI-knowledgeable people, or without identifying which ones you need and where to get them. They're scarce and they're expensive, but there are a lot of good folks in DoD working on that problem. The Joint AI Center, which as of two weeks ago we now call the CDAO, has data experts you can tap into. We don't think we will be able to hire a lot of data scientists from out in the world, but we certainly have some within our agencies who are tasked with working on AI and who can be loaned out or can advise your data teams. So there are ways to get around some of the roadblocks for skill sets, and you can also start upskilling your own people; I think, Michael, that's what you and I talk about quite a bit, and we're looking at that constantly. And then your culture. So those really make up the areas that contribute to AI project failure.

Michael Howard:

On the flip side, and you've actually alluded to this to some degree, can you provide some advice for achieving successful AI projects?

Rhonda Maus:

Yeah. Sort of circling back to what we talked about at the beginning: at the foundational level, let's think of AI as better software running on better hardware with, hopefully, better data. So let's make sure we have things like agile and DevSecOps running; we think they are virtually required for AI. I say that because in the department today, some of us might think of DevSecOps as something new we need to get to, but in the software world, that's something we should have had running like an eight-day clock five years ago. We should have automated our software development life cycle. I will tell you that if you take on an AI project, it will most likely involve some machine learning; we know today that most of our AI projects are machine learning. And machine learning in itself has its own new operational cycle that you have to add to the software development life cycle.

Rhonda Maus:

You're going to need to do some things that you don't do today in DevSecOps, and you're going to need to do them iteratively and very quickly in order to get those models to work in machine learning ops, in order to get the outcome to be what you want it to be. If you haven't already implemented DevSecOps, you're going to have a tough time implementing machine learning ops. So our advice for achieving successful AI projects is: modernize your software process and increase your skill sets. So you're right, it is pretty similar to what we just talked about.

Michael Howard:

Well, that actually concludes this session, and I want to thank you again for sharing with our digital audience today. It's been a pleasure, as I said, especially for me to meet you personally, as we've been communicating by landline and online. But again, we're delighted.

Rhonda Maus:

Thank you, and I feel the same way, Michael. I have to say that I see a lot of AI work across DoD, and I think you do too. DTRA's community of practice, so to speak, that AI, ML, and data science meeting that you have, is tremendous. I have not seen anything else like it across the department, where an agency values emerging technology, learning about it, and staying current on it as much as DTRA does. So hats off to you and your agency, because you're doing a great job with your AI readiness.

Michael Howard:

Well, thank you. I will bring that message back to our chair of the working group, Aaron Smith.

Rhonda Maus:

Please do.

Michael Howard:

He'll be delighted to hear that.

Rhonda Maus:

Please do. Thank you.

Announcer:

Thanks for listening. To hear more podcasts, don't forget to subscribe on Google Play or wherever you listen. You can find out more about DTRA at dtra.mil.

ABOUT DTRA

DTRA provides cross-cutting solutions to enable the Department of Defense, the United States Government, and international partners to deter strategic attack against the United States and its allies; prevent, reduce, and counter WMD and emerging threats; and prevail against WMD-armed adversaries in crisis and conflict.  
