Episode Transcript
Robert Wood (00:01)
Hello, everyone. This is Robert Wood. I'm the CEO with Sidekick Security, and I am joined here today by Joe Lewis, who is a friend and former peer who is the CISO over at the Centers for Disease Control, the CDC. I will turn it over to Joe and let him introduce himself because he's going to do a way better job than I will. And I don't want to steal all of his thunder before we get into it.
Joe Lewis (00:28)
Well, not much thunder to be had, but my name is Joe Lewis. I'm the Chief Information Security Officer and Director of the Cybersecurity Program Office at the U.S. Centers for Disease Control and Prevention. In that capacity, I'm responsible for the cybersecurity and privacy program for the entire agency, which is spread throughout 60 different countries and multiple domestic locations.
Robert Wood (00:48)
I love that. I love that. Thank you, sir. And I appreciate you being here. And what I wanted to focus on today is really how you lead through a complicated organizational structure. Like there's all this pontification in the industry around how to build a security program, what the most effective security program looks like. You go to a Gartner conference or any RSA, Black Hat, whatever, and you're just berated with all these things, all these point solutions. I liken it to, my kids are really into Legos. They're trying to build some kind of masterpiece and they've inherited my old Legos. They bought some from the Lego store recently. Maybe there's some off-market, kind of off-brand Legos that maybe don't fit together, and they're trying to build their masterpiece. And it's kind of like that. You're trying to build the security program that
does something for the organization, and we'll get into that, and you're trying to do it given all of these crazy work dynamics. And so that's what I want to focus our conversation on today. And so my first question, to kind of tee things off here, is you guys were obviously very involved in the world of COVID response and play a big part in the overall functioning of the health care system for the United States. Now,
As part of that, how do you think about balancing this technology and mission enablement with security, like the very real needs of security to make sure that things don't blow up in anyone's face, but at the same time, people are able to get things done?
Joe Lewis (02:36)
Sure, so first and foremost, you threw out the Lego thing nice and early, which I'm glad we're gonna get into, because I am a huge Lego fan. Like the nerd in me wants to tell you that Lego is both singular and plural. So that's the level of diligence that I'm gonna apply to this conversation as it relates to Lego. No, and actually the metaphor is pretty apt, right? Because you have an existing structure that you inherit. And then while...
Robert Wood (02:52)
I
Joe Lewis (03:01)
you're trying to make that structure fit what you envision the program to be. You have all these outside requirements and needs and kind of evolving changes. And so thinking about how do you rapidly and continually evolve, especially in the government space where we are notoriously overly bureaucratic and things take too long to get done. I think that's a really, really apt metaphor. You asked me about how do you balance security against the needs of
being able to allow the mission to get done, especially in such a complex space. And I do that by going back to your textbook definition of risk. Risk is probability times impact. And one of the things that is often lost on cybersecurity professionals is understanding what impact means. Impact is not, well, you're going to be in violation of a policy, you're not going to follow it, you're going to have bad FISMA numbers, right? Impact is the mission does not get done, right? Hey, there are real
public health impacts to decisions that I make. And so I'm the agency's authorizing official. So I'm the signatory on all federal information systems and their authorization to operate, their authority to operate. And so I make decisions by weighing those real, very real operational impacts against what is perceived to be in some cases, pure compliance versus security. And that's a different space I think is worth getting into as well. Where is the balance between security and compliance?
But for me, I have to approach this job, one, as the ignorant outsider. I'm not a public health professional. I jokingly say I'm not a public health professional, but I play one on TV since I work for the CDC. But in reality, I don't know public health. I'm not a public health professional. I'm not an epidemiologist. I'm not a virologist. I'm not a disease pathologist. But what I am is smart enough to know that I'm dumb in those areas, to ask the right questions, develop the right relationships, and understand the missions, so that way I can try to tailor the program to meet those needs.
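A minimal sketch of the risk calculus Joe describes above, risk as probability times impact, with impact scored in mission terms rather than compliance terms. The scales, field names, and example findings are hypothetical illustrations, not CDC data or tooling:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    name: str
    probability: float   # likelihood the issue is actually exploited, 0.0 to 1.0
    mission_impact: int  # 0 = paper finding only ... 5 = the mission does not get done

    def risk_score(self) -> float:
        # Textbook definition: risk = probability x impact.
        return self.probability * self.mission_impact

findings = [
    Finding("Missing policy acknowledgment (paper finding)", probability=0.9, mission_impact=0),
    Finding("Unpatched flaw in a case surveillance system", probability=0.4, mission_impact=5),
]

# Rank by mission risk rather than by compliance noise.
for f in sorted(findings, key=lambda f: f.risk_score(), reverse=True):
    print(f"{f.risk_score():4.1f}  {f.name}")
```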
Robert Wood (05:08)
Yeah, and I love the focus on mission impact when we use the term impact. One of the things that always drove me a little bonkers when I was at CMS was the excitement that would happen over FISMA findings or policy gaps and things like that. Paper findings, basically, is how I refer to them.
I could never really reconcile that with real, exploitable issues or some other disruptive impact. One of the other things that I think is really interesting about the federal space, and I think this exists in a lot of organization types, it's just at a different scale and there's slightly different politics sprinkled in, is
The authorizing official or the CISO, the head of security, the big cheese, if you will, in security gets looked at as this individual, and I think it's reinforced through the authorizing official title or designation or responsibility, whatever we want to refer to that as, where you are accepting risk on behalf of the agency or on behalf of some system. And this was another thing that I wrestled with a lot is like,
it in some ways alleviates the ownership of risk from the people who are building the systems or operating the systems or handling the data. And it kind of shoves it over in your court as the authorizing official. Now, there's a few things in there that I think are really interesting. One is,
What do you think about that dynamic of risk transfer, and is that the right thing? And then two is, as the CISO, how are you trying to immerse and educate yourself into the mission areas and understand them well enough so that if you are playing this role, you're playing this health expert on TV, how do you understand that mission well enough to make well-informed
risk decisions instead of just, you know, kind of treating everything as a generic, best practice kind of decision.
Joe Lewis (07:38)
Sure. So risk transfer is a really interesting concept. And if you go back to doctrine, right, it says that the authorizing official should be the one that can truly accept risk on behalf of the business, which, you know, it could be argued that I don't necessarily have that level of fidelity as maybe a business owner would. But what we found, I think, and your mileage may vary, this is my third federal agency, but a federated authorizing official model is extremely complex and very, very cumbersome because of the skill and knowledge gaps that exist the closer you get to mission and the further you get away from cyber. And so we toyed with the idea very early on in my tenure here about creating very, very complex, comprehensive training programs and then federating out the authorizing official role to bring it closer to the business, because who else understands the business better than the business owner? The problem with that is there are already existing skill gaps that I think would only get exacerbated. And then what we would have is unmitigated, unchecked, unmeasured risks that are being blindly accepted by folks that don't really understand that. So whether I'm, as the authorizing official, the right person, I'm what we've got, right? And at least I understand the true nature and the understanding of how cyber really does enable business to get done. And to your second part of the question, the way I figured that out is
My first year here, I went on what I would call a very, very comprehensive grip-and-grin tour, where I just met as many people as I possibly could. And I wanted to know as much as they were willing to tell me: tell me about your business, tell me about your mission. I visited a number of CDC locations. I went to the Pittsburgh office where the National Institute for Occupational Safety and Health lives. And I got to do a tour of the coal mines.
And they were showing me how they were doing respirator certification and how they had a sweating dummy that they could use to validate perspiration. It was fascinating work. But then you juxtapose that against very, very broad case or patient surveillance systems where we're gathering data from state, local, tribal, territorial governments. And then we're using that in order to make informed analyses and visualizations and decisions
on how and where does CDC intervene in order to protect lives and save health. I'm sorry, protect health and save lives. And so there's a very wide mission set at CDC, I guess is the point. And I did as many introductory crash courses into those as I could by meeting all of the executive leadership, meeting the mission owners, talking to them about what are the pain points that my office presents them with and how we can help them with those.
By virtue of those conversations, I learned more about CDC than I think I probably anticipated, and I use those. And then I'll also say this. The other thing I think that I have going for me, and I would be remiss if I didn't mention this, is I have a fantastic deputy. My deputy, Nathan, we hired him from program. He worked for many, many years, I think it was like 10 or 11 years, in the Center for Global Health. And so he understands the uniqueness of CDC's global mission. He understands program impact. He understands the challenges on the submitting side for those that are having to participate in the authorization process. So early on, I made it pretty clear to him that any decisions that we were going to make about the landscape or the way we approach cyber would always be informed by his perspective as the program person. All right, tell me what this does if we make this choice. And is that good, bad, net zero?
and then we would adjust based on that. And I can tell you that many, many times we adjusted fire because I didn't want to unduly negatively impact the programs.
Robert Wood (11:40)
I like that a lot. I also cannot over- or understate, like, the value of a good team, and a deputy in your case. And I mean, I have a fantastic deputy myself, just tremendously, tremendously valuable. Now, we did touch on this whole, you know, mission impact, and understanding the mission is obviously important. Being able to filter all that risk signal that you get through some kind of prism of understanding that you have and make a decision on the other side is important. The reality of working in health care, and of course working in the federal government or any big regulated space, is that compliance is the elephant in the room and it's always there. Whether we love it or hate it, most of the time we hate it. But how do you think about balancing compliance, and I don't want to call it risk, but compliance gaps, with mission impact? Is it a prioritization thing?
Yeah, is it a prioritization thing, or do you have a process in place for kind of suppressing or accepting compliance gaps, or at least communicating compensating controls or something to that effect?
Joe Lewis (13:10)
So I'll tell you, you know, the relationship between compliance and security is one that I think is very personal. Everybody has their own interpretation of how those two, what some people say are competing forces, work against each other. But for me, I very much see them as complementary. And what I do is I say that compliance is the floor and security is the ceiling, right? So at the very, very least, if we're compliant, if we're doing compliance right, then we at least have foundational baseline security in place, right? You go through the NIST 800-53 catalog and you say, well, I'm compliant with these AC controls. Well, those are access controls. That means that you're assigning accounts properly and you're doing principles of least privilege. I mean, if done properly, compliance can at least get you a start towards good security. But what I think the difference is, I'm not hamstrung by compliance. I can't be, okay?
The mission of the organization is such that, you know, and I'll give you a perfect example of this. FedRAMP, right? There are 300-some, last time I checked, 315, 320 cloud service providers in the FedRAMP marketplace. How many hundreds of thousands of potential solutions are we excluding from the federal government by demanding everybody only use FedRAMP-approved solutions? And then further complicating that to say,
Robert Wood (14:21)
Mm
No, for sure.
Joe Lewis (14:35)
you know, even for vendors that are FedRAMP approved, the feature set on the commercial side is so much better because of the slow adoption on the federal side for the FedRAMP approvals, where we're actually accepting more risk because we can't get the latest and greatest tooling. It takes six to 12 to 18 months for us to get those things. So I don't approach my role as being hampered by compliance. I'm informed by compliance, we aim to be compliant, but I treat compliance risk like any other kind of risk. In the private sector, you'll see similar things with compliance risk or regulatory risk or legislative risk, legal risk. Those are things that in the grand scheme of things get elevated to a common risk register, and common definitions and tolerances are identified, and then risk decisions are made, right? If the cost of becoming compliant exceeds the value to the business, then you need to accept the risk of not being compliant. And that's, I think, a really hard thing for somebody in the federal space to say out loud, but we need more people to be willing to say that out loud, so that way we can make smart decisions rather than those that are purely informed by the compliance standards.
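A minimal sketch of the decision rule Joe describes here, treating a compliance gap like any other risk and accepting it when the cost of becoming compliant exceeds the value to the business. The function name, thresholds, and dollar figures are hypothetical illustrations, not an actual CDC process:

```python
def compliance_gap_decision(cost_to_comply: float, value_to_business: float) -> str:
    """Coarse recommendation for a single compliance gap, per the floor/ceiling framing above."""
    if cost_to_comply > value_to_business:
        # Accepting the risk still means documenting it: rationale, compensating
        # controls, and an entry on the common risk register.
        return "accept risk and document it on the risk register"
    return "remediate and bring the system into compliance"

# Example: a control that would cost far more to implement than the value it protects.
print(compliance_gap_decision(cost_to_comply=750_000, value_to_business=120_000))
print(compliance_gap_decision(cost_to_comply=40_000, value_to_business=500_000))
```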
Robert Wood (15:50)
Yep, I agree with everything that you just said. I'm going to try to flip the question around a little bit here. Do you think that there's times in which, in the pursuit of security, in the pursuit of building these programs, where
compliance can be a really useful tool. Normally people are kind of wrestling against it and making it this kind of output of necessary activities and all of that stuff. And I guess, how would you think about using it as a point of leverage, as a tool, as a, yeah.
Joe Lewis (16:29)
Yeah. Yeah, no, no, no, that's a great point, right? Because ultimately, I try to think of my role, and anybody that has any true authority or power in an organization, if you have to wield said power, then you've already diminished it to some degree, right? And I think a lot of people, they think the opposite, they have this big stick, and they're going to go wield it, and they're going to go bash in skulls, and people are going to do what they say, because, damn it, I'm the CISO. But in reality,
I think you've lost the battle if you're approaching the job from that perspective. However, I think there are instances where my statutory authorities, the compliance mechanisms that are in place that I have to adhere to become strategic enablers of decision making. And I think that's where it gets really, really fascinating. If you can get to the conversation early enough.
where you can have these compliance and security requirements identified and baked in at procurement. Because I think that we don't state enough how important it is to get everything at procurement. Because if you can bake these things as security requirements for developers or for a contractor or whomever, then it doesn't have to be bolted on in some Frankenstein monster later.
Right? And so those are the places where compliance really helps because during those requirements generation meetings, during those discussions with procurement, you say, you can't really do that because SSDF says this and NIST says that and all these other things. Then all of a sudden you can massage and get those things into a better state prior to any actual action getting done, which I think is the most important place to get it done.
Robert Wood (18:13)
Let me pull on that a little bit more, having gone through a few federal procurements myself, at least, like, overseeing the preparation for them. And I think this is one area where the federal and public sector world is quite different than the private sector world. Unless, you know, I'm kind of discounting or setting aside big enterprise, which is almost like mini governments in terms of their process and bureaucracy.
There's already a decent amount of work that goes into writing an SOO or an SOW, a PWS, all those necessary artifacts that go into preparing for a procurement. I want to do my justifications, do my market research, do this and that. So you're laying the groundwork for something to happen in the procurement process.
Do you think that sunk cost fallacy sets in in a meaningful enough way that it can be combated through compliance and security shifted upstream to the procurement cycle, versus Mr. or Mrs. business owner?
I did all this work, I did all this research, I'm invested. I can't pivot away now, whether or not that's real or just more of an unconscious bias working against them.
Joe Lewis (19:49)
That's pretty fascinating, right? The idea that, I guess I could see that perspective, right? Now, well, and I'll tell you, you have the benefit of having sat on the other side of the table where I have not yet, right? So, I mean, I can totally see that coloring your opinion to present that as a very real and valid concern. What I will say is this, I think I look at it more along the lines of internal stakeholders that are
Robert Wood (19:55)
That's what they just play with, as advocates.
Joe Lewis (20:20)
wanting to procure or develop tools who don't necessarily understand. And that's where I think my job is to help get them to the left. Now on the procurement side, like on the external side, I can totally appreciate how that might be a limiting factor to some, right, especially when you get into the very small businesses, the 8(a)s, HUBZones and the like, where they say, well, I'm already trying to do all these other things, and now you're adding more requirements, and that just makes it harder for me to compete. And I think there are mechanisms in place to try to help with that, things like the set-asides and the, you know, I mean, there are mechanisms for that. But I can totally appreciate how that would color that side of it, too. But for me, I approach it more on the internal side. Like I said, it's very much like, hey, I need to go and buy a disease surveillance system. It's like, okay, well, let me get to you all of the things that you need to get to in that procurement cycle, so that gets done properly, right?
Robert Wood (21:10)
Yep, yeah, you want to build a data system, some kind of data environment, these are the things that we are going to want to think about. Yeah, I like that. All right, now I'm going to unpack this or build on this whole procurement-security-compliance triad here. And I want to look at it through the lens of an organization innovating.
Innovation is a hot topic. I mean, it's always kind of a hot topic, but especially with government crises, nationwide crises, really, like COVID, there's a lot of stuff happening really, really, really quickly. It was kind of unparalleled on many levels. And so the interesting thing about that is not that process could be short-circuited and things could be sped up.
But I think, or at least the interesting thing for me, is the trade-off that ends up happening between mission and getting things done and the potential buildup of tech debt, process debt, some kind of debt that you're going to have to pay down later. You're kind of swiping the card just to make things happen faster. And then you have this kind of risk as an undercurrent to all of that:
what's riskier, not achieving the mission, or achieving the mission with a clunky thing on the other side of it that might fall down later? I don't know. So I'm curious how you think about, building on all the procurement, once you're in there and you're building.
How do you kind of balance those things? And maybe it's, maybe it's an education thing back with the business owner, or maybe it's something that is happening inside of the technology or the security office, but go.
Joe Lewis (23:11)
Well, so no, I'll tell you that historically, I think security, and especially in the federal space, we have not been good stewards of agility and being responsive. Okay. So when you talk about innovation, I think, I inherently think of a rapidly evolving, rapidly iterating process in order to achieve some expected business goal. And, you know, I don't necessarily think security has ever been the place where you're like, hey, those folks are quick, hey, they're going to help us and we're going to get right on schedule. It's usually the opposite. They're like, well, let's work around security because they're going to slow us down, they're going to tell us all the things we can't do. And so what ends up happening is we get boxed out of conversations and we accrue that debt that you talked about, whether that's technical debt, whether it's commitment to legacy process debt, or worse, it's an accrual of risk unmeasured, because it took place outside of our purview, so that we can't accurately report on those things. So the way I've approached security for CDC is very much a yes-and or a yes-but, right? Yes, but you're gonna do these compensating, mitigating controls. Yes, and in addition to this thing, we're also gonna do these other things, and that's gonna help facilitate things. And what I found over the last two years is that that has gotten us back to the table. Okay, and
certainly we're not in a position to be able to effect every change we would like to see at every conversation, but at least we're at the conversation, which means we know about it, which means that we can advise, we can consult, and we can try to, what I call, enable the secure delivery of the public health mission, right? The public health mission is going to get done. I would much rather it get done as securely and as compliantly as I can, and I have to be at the conversation to do that. That said, when you get into response mode, right, we get back to probability times impact. The impact of not moving fast enough, the impact of not doing something, is so grand, it's so, you know, potentially catastrophic, that yeah, we end up accepting risk that we end up having to clean up later. And I think that that is an accepted part of how CDC does business, right? We move quickly when we need to. And then, you know, when the dust settles, the public health emergency was rescinded last year, for example, in May.
And once that happened, it's like, OK, everybody take a deep breath. Now, what do we need to be looking at? What did we do for the last two and a half, three years that needs a second look? We went and revisited all of our privacy impact assessments. We wanted to validate that we understood the data types that were in all the new systems that stood up during COVID. We needed to understand, you know, the technological landscape that we were operating in. Have we accounted for everything in inventory? Are we tracking all the risks that are associated? So,
I don't know. That's just the way we've approached it. Culturally, I think CDC is, that's just how we've operated, and it seems to work pretty well for us.
Robert Wood (26:09)
I love that. Thinking about the definition of, and I love the textbook definition of risk, I think it's most appropriate in our context. One thing that I, this is another thing I wrestled with, was the security industry uses risk as a very general term. Everything is just risk.
You know, whether it's a pen test finding, an audit result, you know, some kind of IOC thing coming out of, like, a threat intelligence activity, you know, whatever it is, it's all just risk. And in reality, there's a lot of different categories of risk. Like one of the things that, you know, like there's this surge of interest in trying to quantify risk in the private sector, you know, doing like Monte Carlo kind of modeling and all that, like loss...
Joe Lewis (27:07)
FAIR and all that other stuff, yeah.
Robert Wood (27:08)
Yeah, super interesting. In the government, though, that doesn't really matter. Not on the same level. Not that money doesn't matter, that's certainly not the implication I would want to leave anyone with. But a couple-million-dollar residual risk in a trillion-dollar organization is like a rounding error. But
risk that might be, like, political exposure or media, inviting media scrutiny. The Washington Post is going to write an article about this thing, what happens? That kind of stuff ends up getting people's attention in a lot more jarring, immediate way. So do you think that there's any, do you think that we as an industry are hamstringing ourselves by the labels and...
like language that we're using when we're talking about things going wrong. I mean, when you get into the senior leadership ranks, you do start to talk and behave and interact differently. But everything in the more tactical and technical parts of our organizations end up, we use the very technical and kind of the standardized vernacular to do our jobs and interface with one another.
Joe Lewis (28:29)
Yeah. So first and foremost, I think we absolutely overplay the word risk, and not everything is risky, or to your point, maybe some things are riskier than others. And I think that's really ultimately where we fail. We don't do a good job of being able to calculate and quantify. I think there are different risk profiles is what there are, right. For example,
even within vulnerabilities. Vulnerability management's whack-a-mole; you're never gonna get rid of all of them, right? Anybody who tells you you will is absolutely out of their mind, right? And they've got no concept of how complex an answer that really is. But even something like, well, you have to go get rid of all your criticals and highs. Like, that's not helpful either, right? So for us, we focus on, is it internet-facing? Is it publicly accessible? Is it a critical or a high? Is it on the KEV? Is it in the NIST exploitable
vulnerability database, right? Those kinds of things where we can try to prioritize things so that way we are applying our efforts in the places that they make the most sense because there's never enough people, never enough money, there's never enough time. And so you have to, you have to balance those things. And I think similarly, we need to be able to kind of explain those different risk profiles within different portions and parts of our areas of responsibility. So that way senior leadership can understand.
Like, you know, I work for the CIO, luckily, or the acting CIO here, and he was a former acting CISO. So he understands when I explain things to him. But when I go talk to the chief operating officer, who is, by career, an HR professional, he doesn't understand all these technical nuances. And I have to, and this is another, I think, skill or trait that we need to be developing more in our up and comers, is the ability to translate what we do into business terminology that is consumable and understood
by senior leaders and do so in a way that's also not talking down to them, right? Cause nothing will get you kicked out of the room faster than trying to talk down at somebody for fear of trying to dumb things down too much, right? So all that to say, I think we overuse the word risk. I think we need to focus on having risk profiles and then being able to translate those into ways that make sense. Now for me, I have the benefit of being on the enterprise risk management governance board for CDC, right? I'm a voting member of the enterprise risk management board and we escalate.
information security risk as a strategic agency risk, right? And so it's incumbent on me as the risk owner to go and explain what that risk means. And the board members have to have an opportunity to ask questions and we have a good dialogue. And then we vote on the residual risk levels. How is that going to get reported to the enterprise risk register? And then what are we going to do about it? Right? So I think if you have those mechanisms in place, there are ways for you to have that conversation if you're able.
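A minimal sketch of the vulnerability prioritization filter described above, surfacing first what is internet-facing, publicly accessible, rated critical or high, or on the CISA KEV list. The field names, weights, and sample CVE records are hypothetical illustrations, not the actual CDC scoring model:

```python
from dataclasses import dataclass

@dataclass
class Vuln:
    cve: str
    severity: str             # "critical", "high", "medium", "low"
    internet_facing: bool
    publicly_accessible: bool
    on_kev: bool              # listed in CISA's Known Exploited Vulnerabilities catalog

def priority(v: Vuln) -> int:
    """Higher score means work it sooner; a crude stand-in for a real scoring model."""
    score = 0
    if v.on_kev:
        score += 4
    if v.internet_facing:
        score += 3
    if v.publicly_accessible:
        score += 2
    if v.severity in ("critical", "high"):
        score += 2
    return score

backlog = [
    Vuln("CVE-0000-0001", "high", internet_facing=True, publicly_accessible=True, on_kev=True),
    Vuln("CVE-0000-0002", "critical", internet_facing=False, publicly_accessible=False, on_kev=False),
    Vuln("CVE-0000-0003", "medium", internet_facing=True, publicly_accessible=False, on_kev=False),
]

# Work the backlog in descending priority order.
for v in sorted(backlog, key=priority, reverse=True):
    print(priority(v), v.cve)
```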
Robert Wood (31:08)
Yeah.
Yeah, I love that. And that concept of not talking down to somebody, I think that's something that's really easily misunderstood when you're like, all right, I have to basically put this in non-technical terms. It doesn't mean you're talking to a six-year-old. It means you're talking to somebody who is very senior. And this drives me nuts when I see people ranting on LinkedIn about how,
you know, it's so annoying that, you know, these CEOs or whoever don't understand cybersecurity and whatever. And I'll tell you what, like, I talk to our accounting folks or anybody who does, you know, anything non-technical in this organization, and I struggle to understand some of the deep nuances of what they're doing. Like, we switched over to an accrual-based accounting system recently. I was like, yeah, that sounds great.
Joe Lewis (32:11)
I've done that by the way. That's, that's not fun.
Robert Wood (32:15)
Yeah, I mean, I definitely wasn't beyond answering questions, because I'm useless in that endeavor. Let the professionals handle it. I feel like when we translate things to business terminology, I think there's a tendency to want to
dumb it down. So I love that you called that out, because you could keep things very, let's say, technical, but they're being expressed in a different kind of form. You're talking about lost opportunities, pipelines slowing down.
Maybe it's the public exposure. You might be talking about material risk that impacts stock value, things like that that are still very technical in terms of there being, like, jargon and domain-expertise-oriented language involved. But they're not, like, IT, cyber, technical. And I think that's the nuance.
Joe Lewis (33:16)
Yeah. Yeah, no, I'll say it again, right? I think we have not done a good job of training up and comers that want to be somebody when they grow up in cyber one day, right? Actually, the best skills I have that make me an effective CISO came from working in the private sector and had nothing to do with IT or cyber. I was a production manager and I was managing a profit and loss. I operated a cost center, right? So as a cost center, I didn't make any money for the organization, but I cost everybody.
And so guess what I had to learn really, really quickly: you had to drive value for the people that were out there making the money. Otherwise, you were quickly made irrelevant, right? And so those kinds of things, understanding profit and loss, capital expenses versus operating expenses, all of the nuance of managing to a percent of sales, right? I mean, all these purely business or communication skills, written and oral. Those are all things that we don't do a good job of teaching our up and comers. And it's something that
actually drove a lot of my PhD research, right, which was around non-technical skills and abilities that people need if they want to be successful as a CISO one day. And in looking at that, you know, what we found is that good non-technical skills are also mitigators to stress and burnout, which is a problem that's plaguing our industry as well. So what I do in order to help curb that in my own environment is I have a cadre of,
I want to say, four to six GS-14s, mid-level frontline supervisors, their first real supervisory job, and we meet monthly and we talk about strategic decision-making. We talk about...
Robert Wood (34:59)
Duty calls.
Joe Lewis (35:01)
Sorry about that. It connected to both my phone and my headphones at the same time, so it was ringing in Teams, which was hilarious. But yeah, so I have about four to six GS-14 frontline supervisors, and we meet monthly. And we talk about strategic decision making. We talk about communication. We talk about the importance of relationships and how do you build coalitions, right? Those are things that I had to figure out the hard way, but I don't want them to have to figure out the hard way.
Robert Wood (35:05)
God. That's funny.
Yeah, because it really is like a yin and a yang. I mean, you go to any conference, any security conference ever in the world, and the training, the talk tracks, all that, it is 90 plus percent technical, cybersecurity, like technical, and not soft skills related, not personnel, not organization dynamics, that kind of thing. One of the people I read a lot of and kind of follow is Adam Grant.
If you're familiar with him and what he loves to do. And it's just, like, so thoughtful, so intentional, with his words and his kind of phrasing of things. You know, there's so much to this idea of, like, intentional, well, A, kind of building people up and doing the proper work of, like, grooming and growing and investing. But then also,
Joe Lewis (35:55)
Yep.
Robert Wood (36:22)
Getting, like, designing an organization very intentionally. And so this was one of the other things that I wanted to double-click on with respect to the whole innovation kind of enablement. Is there any... well, let me back up, because I think there's a context I want to share first. There's a pretty standard, I think,
body of thought, the kind of idea of, like, what a typical security organization should look like. You've got your CISO, you've got your deputy, you've got security operations, you've got policy, you've got offensive security stuff, you've got compliance. There's all the kind of typical components and departments and hierarchies.
And those aren't necessarily, like... you can have a lot of siloing that occurs even within a security organization. You know, pen testers can't talk to compliance, security operations people can't talk to pen testers. Like, you know, we can talk purple teaming all we want, but, you know, it doesn't happen naturally. And so, is there anything that you do with respect to, like, intentionally designing your
security organization to align within itself and then also to align some of those components with other parts of the CDC.
Joe Lewis (37:47)
Yeah, so first and foremost, I think intentionality is something that is often lost. I think, especially if we take it back to the Lego metaphor, right? Because I inherited a freestanding Lego structure. But then I say, well, you know what, this isn't really, this doesn't meet my current or future state needs, and so now I need to adjust. Well, in the government, that takes far longer than it should, right? Reorganizations have to be approved and published to the Federal Register, for example.
Changing the lines on an organizational structure does not fix organizational problems, right? So thinking intentionally about those things, and then having a long-term strategy, is vitally important. And I've been in this job for about two years now. And when I took the job, I told the team that we were on a three-to-five-year journey, and in that three to five years, we would completely evolve from current state into whatever our future to-be state is.
And so we've been doing that. We've been, for example, shedding legacy workloads and constricting resources and saying, you know, I'm not going to get any more people, so as I generate vacancies, I'm going to take vacancies and reappropriate them to other areas where missions are either underserved or unmet. Right. For example, supply chain risk management is a brand new function, and we were not funded for that, we didn't get positions for that, so I had to make those out of hide, right? So intentionality, I think, is really important. And that three-to-five-year plan is something I'm still baking on, right? We're still
iterating on it. And then, you know, to your point about silos within security and how an organizational structure can, I think, contribute to those things: one of the things I do to combat that is we have a centralized theme for every year. Okay, so 2023, my first year, was our role as a service provider. Okay, it was a cultural shift from being a compliant cybersecurity organization to, we are evolving into a managed cybersecurity service provider for the agency, and all that comes with that:
the understanding of our role as a service provider, customer service, stakeholder engagement, you know, treating everything as a problem to be solved rather than a booger to be flicked at somebody else, right? So that was '23, whereas '24 was the year of FISMA. Now, I love to hate FISMA. Okay, I love to hate FISMA because FISMA is the only 100% unique requirement that the federal government has that everybody else does not. And so by virtue of that, the leaders in a cybersecurity organization within the federal government
should all be damn good at FISMA. Okay, because it is the one unifying thread that we all have, that if done properly, everybody can make FISMA easier, but if done wrong, everybody can make FISMA harder. Okay, and I'll give you a perfect example. When I took over, there were 470 FISMA-reportable systems. Of those, over 200 of them lived on shared infrastructure, lived on common SQL Server farms and common IIS web servers.
It's like, why do we have a full authorization for these things? Right? So, you know, 2024 was famously the year of FISMA. Every leader in the organization had FISMA modernization and support for the authorization process in their performance evaluation. And everybody was charged with thinking, from their own perspective, how do they break down silos? How do they help support the modernization of the authorization process? And as a matter of fact, next month we have our strategy session to write our '25
and '26 strategies and what our themes are going to be for those, right? So that's, I think, one of the ways that we try to combat those organizational silos: we find a way to weave them all together through some common thematic lens, and then everybody is thinking in the same kind of general way.
Robert Wood (41:23)
Yeah, I love that. The theme concept is fantastic. so it sounds like you're doing that very collaboratively with your team, with your senior leadership. How do you go about, what does that process end up looking like? Is there typically consensus, or is there a lot of debate?
Maybe a more important question is how do you operate in those meetings so you're not sucking the air out of the room versus letting the team interact and decide and grow and do all that stuff?
Joe Lewis (42:01)
So I'll tell you that the first two years were entirely mine, right? And it was based on my 100-day observations in the role, seeing where I felt like we could make the biggest cultural impacts the soonest, right? So, I inherited a cybersecurity program that was very much comply or die, black and white, policy says, and that...
It just was incongruent with the mission of CDC, right? So year one's theme was very much me saying, no, this doesn't work any longer. We are evolving and we need to change our perspective. And then 2024 was, it was almost a brainchild of my smaller circle, if you will: me, my deputy, my enterprise resource manager, and a few others thinking about, you know, how do we solve complex problems?
Right, and the FISMA consolidation project was kind of born of that. And by virtue of that, '24 became the year of FISMA so that everybody could help support that effort. And it actually united us in ways that I wasn't really expecting. But it was directive, and it wasn't as collaborative as I would have wanted. And that's why we're doing this strategy session, where we're going to sit down for two full days and we're going to have a guided kind of discussion on rewriting our strategic plan.
What are our strategic priorities going to be? What are the goals going to be that support those? And then we're going to charge each of the leaders in our organization to go and produce their own initiatives that are going to support the goals and objectives that we lay out, right? And so that's how we get to it. But initially, it was me, and it was me trying desperately to take that Lego house and say, I need it to look differently than what it looks like right now.
Robert Wood (43:40)
Thanks, man.
Yeah, OK, I love that. Now, one of the other things that I find refreshing and interesting in that description that you gave, or in that answer you gave, is the process of eliminating or reducing wasteful or burdensome process. I found it very hard to
let things go. Not like me personally, I can drop it like it's hot. But the people who are stuck on tools, or this is, like, a process that, this is their baby. You're kind of getting them, like, asking them to stop it, 'cause it's not like you ran out of budget and you can't do it anymore or whatever it is.
You're forcing some kind of change where that sunk cost fallacy really latched on there. How do you go about approaching those conversations? Is it collaborative? Is it directive? Is it you're trying to get them to build consensus on their own to let the thing go? Or is it something totally different?
Joe Lewis (45:14)
Well, so it all depends, right? So you mentioned two specific examples I think are worth investigating. One is like tool rationalization. Boy, people get very emotionally connected to tools that they brought into the environment that they operate and they just desperately don't want you to let go of. However, we do operational analyses of these tools every year. And I want to know, is it being used to its fullest? You know, how is it being used? You know, and there was one in particular where
Robert Wood (45:29)
Yup.
Joe Lewis (45:45)
We went through many, many, many gyrations. And at the end, I also had somebody going around individually asking the teams that were supposed to be using this tool: Hey, what happens if Joe tells you to turn this off tomorrow? What breaks? Right? 'Cause ultimately I was worried that the information I was getting was colored by this emotional investment over time. And that's exactly what it ended up being. The information I was getting versus the reality of, if I turn this off tomorrow, nobody would notice, was very real.
And so finally we came to a point where they came to brief the outputs of the operational analysis. And there was a lot of this, well, it could do this and it could do this. And I had to kind of stop them and say, are we in some perceived to-be state, or is this happening now? Well, if only these people would do this thing. I'm like, okay, well, how long have we been paying for the tool? Four years. I said, pull the plug, right? If we haven't adopted it in four years and it's costing us X number of dollars,
Robert Wood (46:29)
Yeah.
Joe Lewis (46:41)
then we have failed in its implementation and we need to look at other things. So tool rationalization is hard, but it's necessary. And sometimes you just got to rip that bandaid, because they're not going to give it to you straight if you don't. Now on the legacy workloads, that's a lot harder. And the reason I say that is because people have invested their time and effort into producing a workload that you're now saying is not value-add. And people can take that very, very personally. And so for me, I approach it from a, hey, we have this new opportunity,
Robert Wood (47:06)
Yes.
Joe Lewis (47:11)
and we want you to go and contribute in a different way. And why don't you go identify training needs in order for you to be successful? Here, I'm gonna partner you with your soon to be supervisor who's gonna inform you about all of the work that we're gonna talk about. And then you're gonna put together your individual development plan and we're gonna set you off to succeed in that space. And I make it less about the work and more about the people, right? Because I worry more.
about the emotional and mental health of the individuals that are adversely affected by a management-directed reassignment than I do about stopping doing this administrative function that we shouldn't be doing. And I think because I approach it from the people's side first, the adoption has been much, much better.
Robert Wood (47:53)
OK, yeah, I like that. So one thing that is common in banks, for example, is you have these rotational programs where you kind of shift people around every so often. Somebody who's in policy goes and does a detail over in operations and vice versa. Do you think that something like that
could be effective in the, I mean, details happen all the time in the federal space. Do you think something like that could be effective in basically exposing people to different viewpoints, perspectives, voices of influence, stuff like that?
Joe Lewis (48:37)
So I think leadership in particular desperately needs that, right? Especially when they've only come up through.
Robert Wood (48:43)
Like, leadership detailing, or leadership needs their people to detail?
Joe Lewis (48:47)
I think the leaders need to detail, because especially those frontline GS-14 supervisors, they are now team leads because they were good at that technology. And so now they're a team lead of that technology, so they've never seen anything outside their own little realm, right? It's part of their broader development; I think we need to expose them to different things. And then in addition to details, one of the things I try to do is I try to put together what I call crisis action teams, right? We call them CATs, where I have a big, complex problem to solve.
I need subject matter experts from multiple disciplines. And so I pull those folks, and this is not in addition to, this is in lieu of, existing duties. We pull them into a dedicated team. And this is what we did with the FISMA consolidation. I had 10 people and they were subject matter experts from privacy, from ops, from architecture, from GRC, from everywhere, right? We built this team and we said, go
and make me recommendations and policies and procedures and documentation in order to achieve this output. And my goal is I want a 50% reduction at six months. And so they squirreled away into a room. And let me tell you, the development that those people got was better than anything we could have done any other way, because they all learned something outside of their own area of expertise. And guess what they also learned? FISMA. And they learned it really, really, really well, right? And so we actually achieved a 52% reduction by doing this,
and everybody that was on those teams was all better for the experience, right? So, and there's no end to these complex, you know, agency-wide problems that we can solve using this construct. You know, I don't want to create the impression that I came up with this. I totally stole this from my time in the Army, by the way. They used to do these all the time. And I can see why. Me, as the senior leader,
I accept operational risk for the things those folks are not going to be doing while they're on a crisis action team. But the upside is, if I can invest a fixed period of time on a complex problem, we can solve that inherent systemic problem for good. And that's exactly what we did with our consolidation efforts. So those are opportunities where I look for up and comers, those that have high potential, and say, okay, let me bring you onto this team and expose you to something new. And then that helps kind of spur their own
intellectual thoughts on, is this... maybe I'm not in the only space I want to be in, and maybe there's something interesting for me to do someplace else, too.
Robert Wood (51:12)
Yeah, that's really interesting. Do you find that, and I can totally see this being the case, people take that experience, those relationships, all of that stuff that they gained from being on that team, back to their current team, and then that starts to help that team align better with other teams and vice versa?
Joe Lewis (51:35)
Absolutely. So we have one gentleman in particular, he's a team lead, he's a real up and comer. I told him I'm just glad to have him while we have him, because he's a spitfire. He's got a mile-long career runway ahead of him in whatever he wants to do. Yeah. And so for me, we put him on that and he actually co-led, co-chaired that effort. And the development I saw out of him in particular can be seen in how he approaches all of his current duties, everything else. I mean, it's just, it's
Robert Wood (51:45)
I'll expect an air cleaner.
Joe Lewis (52:02)
It has paid such dividends and it broadened his perspective in such a way that truly, I think, it really does set him up longer term for some real impressive long-term career success.
Robert Wood (52:11)
I love that. So thinking about that longer-term career success, I don't know if you get this, but I get asked by a nontrivial number of people how to grow and how do I be a CISO? How do I do this? How do I do that? And what I've tried to impress upon people is
that that is not the only pathway to success in this field. It's also very stressful, or it can be in certain orgs. Some people just, like, they enjoy being hands-on-keyboard. They enjoy being very close to technology. So there's a lot of nuance that ends up getting thought through. And so as you're
As you're thinking about your organization, what attributes are you looking for when it comes to identifying those up and comers? Because I imagine it's not the person who's rock solid on the Splunk command, or the Splunk dashboard, or whatever.
Or the person who can, like, deploy such-and-such technology and configure it perfectly. So, like, what are those things that really jump out to you?
Joe Lewis (53:37)
Sure. So first and foremost, I think it's important and just taking you half a step backwards, not everybody should grow up to be a CISO, right? And I think the idea of being a CISO is probably greater than the reality for some people. And I've met a number of people that are absolutely 100 % comfortable and want to be technical subject matter experts their entire career. And that's to be lauded, right? You figured out where your area is and you are fulfilled and you're great at what you do, good for you, right? You don't need to be a CISO someday.
And I'll also say, and this is something I always tell people, that the path to the CISO's desk is not a straight line. There is no one clear developmental path to get to the CISO's desk. Really, and like I said earlier, the best skills that I learned to be effective had nothing to do with technology or cyber, right? And so, when I'm looking for up and comers, I'm looking for a couple of things. One, especially given the fact that I inherited an organization that was compliance only, I was looking for those that could culturally adopt
our role as a service provider that could make that mental shift and say, we are going to approach this from a customer service, customer experience perspective, recognizing the pain points and then trying to address those. Right. So that's part one. Two, I'm looking for good communicators, right? If you can be the most technically competent, you can be the smartest person in the room, but if you can't articulate in writing or orally your points in a way that is understandable, then you have lost the audience. And so
cultural adoption, communication skills, and somebody that understands strategic thinking. It took me way too long, Rob, way too long to figure out that organizational politics and relationships were the only way I was ever gonna get any big stuff accomplished at this agency, or in my career rather. By the time I got here, I think I had figured that out, but prior to that, it took me a lot. And I would say things like, I just don't understand why we can't get this thing done. It seems like this should be easy.
But once you build relationships and then you kind of trade on that political capital, then big things start to happen. And it took me a long time to figure that out. So understanding and appreciating organizational politics and relationships, and understanding the ability to build coalitions, I think that's so huge. And then finally, I'm looking for somebody that can keep one foot in the tactical and one foot in the strategic and understand both. You're going to have to make tactical decisions, right? I have to do it all the time, even when I really don't want to.
But when I do make tactical decisions, it is informed by my strategic goals and the priorities. And I'm thinking about, if I have to pull this lever on the right over here, what's that going to do to these 66 other levers? It's going to look like a complex game; you've got to make sure that everything stays, like, going in the right direction. So being able to appreciate how tactical decision making affects, supports, or detracts from strategic priorities, I think, is another key important piece.
Robert Wood (56:29)
Yeah, I think that makes a whole lot of sense. Now, how do you go about developing that ability to understand and navigate organizational politics?
Joe Lewis (56:45)
So I'm glad you asked that, because that's a key part of my mentorship program with some of these people that are on my teams. So first and foremost, I ask them to identify a half a dozen to maybe 10 key relationships that they rely on in order to get their job done, to be effective. Identify those. And then, which are your top three most problematic relationships? Well, it's these three people. Why? Well, he doesn't like to do this, or I don't like to do that.
I think he should do these things and okay, so we start unpacking those things. And then we say, okay, well, why don't you approach this differently? Have you made outreach efforts beyond transactional exchanges? Have you tried to get to know this person as a human? Have you tried to understand the landscape from which they're making their decisions? No, okay, well then let's unpack that, right? So we really get into, and we use, you know, specific examples. We get into how that transactional versus
you know, cooperative relationship, how you can get different things out of relationships in doing so. Too many people see things as, I need this thing. So here, I gave you your thing, now you give me this thing, and then I'll talk to you in a month whenever I need that next thing. But that's not really how relationships work. You know, you need to treat these work relationships as true relationships and get beyond transactional into more collaborative and cooperative relationships.
We do that, and when we do that specifically, especially for those that have been here for a long time that have a lot of really complex relationships, let's tackle the hard ones. I want to talk to you about this person. Tell me about your relationship with them. Okay, let's fix that one, and then understand and appreciate how that might help you longer term, right? And the other thing I always like to say is, sometimes you've got to lose the battle to win the war. If you're willing to plant a flag on every hill and die on every hill, then you're approaching the game wrong, right?
Robert Wood (58:36)
Yeah. Well, I want to ask one final question to kind of round us out here. So we talked about everything from risk to innovation, the security-compliance battle and complementary stuff. We talked about personnel development and org design. If you had to leave
anyone who watches and/or listens to this with one piece of advice for building an effective security program that enables the mission, because I think that's really what, at least in my opinion, security is there for, to enable mission: what is the one key thing that you want to kind of drive home?
Joe Lewis (59:24)
So that's complicated, but I'll answer it from my perspective as a CISO. I think the one thing, the one piece of advice I would have given myself five years ago is humility. It is not my job to be the smartest person in the room. It is my job to build a team of smart people and then listen to those smart people. Give them good guidance, good direction, good support, resources, intent.
and then let them go and do, right? The breadth and depth of material I would have to be a subject matter expert on to be, you know, the subject matter expert on everything is unsustainable, right? It's ridiculous. So if I want to build a good security program, you start by building a good team and you build a good team by surrounding yourself with people that are smarter than you. And that takes a certain level of humility that frankly, we're not conditioned to do in IT and cyber because of the types of people that we
Robert Wood (1:00:05)
Unreal.
Joe Lewis (1:00:22)
we attract, right? Christian Espinosa wrote a great book called The Smartest Person in the Room. And he talks about how cybersecurity is losing the overall war because everybody in cyber wants to be the smartest person in the room, and they want to be right rather than be a good team player. And for me, that's just not my style. What is it Michael Dell says? If you're the smartest person in the room, you're in the wrong room, right? That's me, right? I would much rather be surrounded by smart people, enabled, empowered, held accountable. That's the other thing that
my time in the private sector, I think, taught me, that maybe most IT and technology people don't see: the ability to hold people accountable to standards, right? You're not a bad person if you have to hold somebody accountable. That's just the nature of the relationship. But yeah, just, I think approaching the job from humility and building a good team is the only way you're ever gonna be successful.
Robert Wood (1:01:06)
you're not a tyrant.
Yeah, yeah, yeah, no, I love that. And one of the things, so I've found along those lines of rapidly and ferociously humbling myself is anytime I go and study outside the field of cyber, like most of the books I read are, like I've got this on my desk right now. And I mean, I am very excited. So there's this book and then there's this other one.
Joe Lewis (1:01:32)
Okay.
Robert Wood (1:01:41)
The Accidental CIO, that I've started kind of reading together. And it's all about, like, org design and just kind of tying it with business models, with strategy and all this stuff. I'll read stuff like that, or I'll read about sociology or different soft skills topics or whatever, or finance, so I'm not a total idiot with our accounting team.
And I always find that my learning is so much, like I can make such a bigger jump and understand so much more and realize how much I don't know when I read stuff like that or learn stuff like that, as opposed to trying to, like, learn the very fine-grained nuances of some application security topic or zero trust this or that. And it's just...
Yeah, so I love that advice. And with that, I wanted to say thank you so much for your time. I know you are a busy, busy man authorizing things and protecting the country as you do. So I appreciate your time. And yeah, I hope we get a chance to talk soon and offline of all the podcasty stuff. Thank you so much. All right, see you everyone.
Joe Lewis (1:02:57)
All right. Thanks. Appreciate it, Rob.