Drowning in Data, Starving for Insight: Cyber Risk Quantification in Action

Episode 4 February 19, 2025 01:21:03
Security Program Transformation Podcast



Show Notes

In this conversation, Robert Wood and Mads Bundgaard Nielsen delve into the complexities of cyber risk quantification, exploring Mads' journey into this niche field, the importance of a business-first approach to risk management, and the distinctions between compliance and effective risk management. They discuss foundational steps for initiating risk quantification, the significance of stakeholder engagement, the challenges of measuring non-financial impacts, and the limitations of existing risk assessment tools and scoring systems, emphasizing the need for a more nuanced understanding of risk in cybersecurity. They also cover the challenges organizations face in prioritizing vulnerabilities, inefficiencies in third-party risk management, and the future of cyber risk quantification. Mads emphasizes the importance of understanding organizational attributes for effective risk management and shares valuable resources for those looking to enhance their knowledge in this field.
 


Episode Transcript

Robert Wood (00:00.976) Hello everyone. This is Robert Wood with the Security Program Transformation Podcast. I am thrilled to have this interview today. I'm joined by my friend Mads. We are going to talk about cyber risk quantification, which is one of those somewhat elusive topics in cybersecurity, especially if you've ever tried to get one of these programs off the ground from jump. You've read all these things, you've watched all these videos.

Robert Wood (00:28.999) And then you go to put it into practice, and it's a little more challenging than standing up a scanner or something like that. It requires some real deep thought. And so that is one of the reasons why I am psyched to have this conversation today. And I'm not going to do it justice, so I'm going to turn it over to Mads to properly introduce himself.

Mads Bundgaard Nielsen (00:49.797) Thanks, Robert. Yeah, my name is Mads. I work as a risk lead currently, and I'm trying to do my best at implementing quantitative risk management wherever I can, big and small. And I really, really like to speak about it. So I'm very happy that you invited me to talk on this topic, and I'm sure we'll dig into some interesting insights.

Robert Wood (01:17.176) So that is the plan. Well, to kick us off: in the States, and I think this is really taking off all over the world, most people got into cyber without coming up through a university cybersecurity program. You came up through networking or systems engineering, software development, computer science. And now you have these set-aside, dedicated cybersecurity programs. And most people, when they're getting into the field, they kind of go into the compliance branch, they go into pen testing, they go into being a security operations analyst, those sorts of pathways into the field.
And I'm curious, how did you find yourself doing what you're doing now? Because it's a very niche part of the field. Very important, but very niche.

Mads Bundgaard Nielsen (02:21.838) Yeah. So I think there are two inflection points on my path towards cyber risk quantification. The first is that my journey into just information security as a general area was almost completely by accident. During my final years in university, I worked a lot, and so I'd gotten a bit behind on my thesis proposals and that kind of stuff. And when I ended up having to write them, very few tutors were available. There were only two, and one of them suggested writing about GDPR. And at my work at the time, they had a project that they sort of wanted to get off the ground. So I matched those two, my work and my thesis, and wrote about GDPR. And I graduated in 2018, when GDPR took effect. So that was very helpful in getting the initial employment.

Robert Wood (03:17.286) Thank you.

Mads Bundgaard Nielsen (03:18.922) So before that, I'd actually studied anthropology, and then moved to the IT University of Copenhagen for more of the process support and tooling side, the IT side of business processes. I ended up writing about GDPR and seeing if it would diminish process efficiency. I concluded it wouldn't. It probably has, generally speaking. But from there, I went on to more information security management. And in there, I had this very big frustration with the way we did the analysis. It was tech first, not business first. And especially when we were to analyze probabilities in these matrices, where we have impact and probability, it seemed completely arbitrary. Was it a 5? Was it a 3? Was it a 4? What kind of systems were below that? OK, so we have these five parameters that we... And it was never satisfying to me.
And so, at one point, my then boss came back from a certification where the instructor had just mentioned, there's this Monte Carlo something where you can simulate scenarios, and I thought, OK, that sounds pretty interesting. So I looked more into it, read a couple of books, and now I've become a militant activist on the quant side: if you're not quantifying your risks, you're not doing any risk management at all. I try to be nice about it, but my opinion is there's no way around quant if you're talking about risk management. So after university, I've done a lot of studying myself, online training, reading books, building a ton of small models with my own hands. That's where I am now, with a lot of help from a lot of people, of course.

Robert Wood (05:19.908) I love that. And for anyone who wants to get in touch with Mads, this was actually how you and I first came to meet: I stumbled across something that you had posted on LinkedIn and thought, that's really freaking cool, I dig that. Because I had run face first into the wall on a few different occasions trying to, you know, dip my own toes in the water on cyber risk quantification. And you shared a model with me, and I went and put it into practice, and I know that was very, very helpful. So what's one of the most interesting things that you have learned in your journey, going from university to learning about this particular niche and the approach and the methods and then starting to put it into practice? What's been the one big significant takeaway, maybe something that you didn't expect to learn along the way?

Mads Bundgaard Nielsen (06:30.987) Yeah. So I think I'm still learning this, and I'm still learning my way around it.
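The "Monte Carlo something" Mads describes is, at its core, just repeated sampling. A minimal sketch in Python (every parameter here is invented purely for illustration, not data from the episode): draw a yearly incident count and per-incident losses many times, then look at the distribution of total annual loss.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_annual_loss(freq_lambda, loss_median, loss_sigma, trials=50_000):
    """Monte Carlo: draw a yearly incident count (Poisson) and per-incident
    losses (lognormal), returning the simulated total annual loss per trial."""
    counts = rng.poisson(freq_lambda, trials)
    return np.array([
        rng.lognormal(np.log(loss_median), loss_sigma, n).sum()
        for n in counts
    ])

# Hypothetical inputs: ~3 incidents/year, median loss 50k, wide uncertainty
losses = simulate_annual_loss(freq_lambda=3, loss_median=50_000, loss_sigma=1.0)
print(f"Mean annual loss: {losses.mean():,.0f}")
print(f"95th percentile : {np.percentile(losses, 95):,.0f}")
```

The point of the exercise is the full distribution, not a single number: a tail percentile tells a decision-maker something a 3-out-of-5 matrix cell never can.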
So the biggest problem I see for risk management in general is that we haven't learned to do business first when we do any kind of analysis, whether it's just mapping out the problem space or mapping out different strategies for different solutions. We haven't figured out what business first means. And that's not entirely the fault of risk management and InfoSec and that kind of stuff. It can also be the fault of the business, because they sometimes don't know what they're after. So that's one thing. Business first means: what are the actual business outcomes, the outcomes of the business endeavors that we're doing? What are those? How do we measure their success? And when we do risk management, what kind of factors can diminish our probability of achieving that success? That's actually the only question we are talking about.

One other thing that I've learned, and what I can see from a lot of the resources that I've used, is that as soon as you go into quantification, you move more and more from cyber risk management into decision science, to see what are good decisions, because ultimately this is what it comes down to. You don't need risk management if you don't have a decision to make, because you don't need to analyze anything. If the path is set, you don't need to analyze whether to go left or right. You just go with the path. So every type of risk management is also a decision that needs to be analyzed, just to be technical.

The other thing is, and this I have learned and I don't need more learning on it: compliance and risk management, those are two different things. Business and risk management can be connected. Compliance is a completely different exercise. It has different stakeholders, different standards, different expectations, and usually it has nothing to do with the efficiency or effectiveness of business processes or your risk management framework.
Robert Wood (08:41.318) Do you think then, so in a lot of organizations, and I mean, we even have an acronym for it, we have GRC, governance, risk, compliance. Do you think that the industry is kind of kneecapping itself, or setting itself up for a difficult time, given that tension, given the divergence of goals and motives and whatnot?

Mads Bundgaard Nielsen (09:07.316) Yeah, I think to the extent that organizations don't realize that compliance is different from business, they're going to spend way too many resources on GRC. Because GRC is fulfilling all the requirements, legal, regulatory, all of the controls, and it takes a lot of data collection, a lot of data processing, a lot of mapping, a lot of manual FTE hours. And what you are essentially after is just reducing the number of observations that your auditors and supervisory authorities make. And the game of avoiding observations is fairly easy, because auditors and supervisory authorities are also busy. They want to make some observations to prove that they've done some work, but everyone can probably recognize that those observations are not the ones that you know are critical, and usually they stem from some kind of misunderstanding of the material that you've provided. So there's a huge disconnect. What I'm saying is, don't chase perfect compliance. Do as little as humanly possible to just satisfy the auditors. Don't lie about anything, but don't think that it's compliance that saves the business. The business requires some kind of compliance, but it's very hard not to have it as a cost center.

Robert Wood (10:33.988) Yeah.

Mads Bundgaard Nielsen (10:34.791) So if you really think, we're gonna go all in on GRC as such, I think you're kneecapping your business and you're spending resources on paper tigers instead of actually creating business value.

Robert Wood (10:46.446) Yep.
I like the way you put that. And, you know, one of the things that I have personally found as a point of tension along those lines is that so many businesses, or I should say organizations, maybe more accurately, are very motivated by compliance, for probably the obvious reasons: it helps unlock new markets, it keeps the auditors off your back. I was a federal CISO for just shy of four years, and in that world you have the Office of Inspector General, you have Congress, you have all of these kind of intimidating third parties that are out there potentially putting pressure on your agency from a compliance and regulatory scrutiny standpoint. So compliance was a big motivator. However, as a leader, leveraging compliance to spark conversation, but then not leaving it there when you're trying to get something done, or trying to convince people of going above and beyond, of really satisfying a need, it felt a little bit like pulling apart a peanut butter and jelly sandwich and trying to put the ingredients back on the shelves. And so along those lines, if somebody was just getting their risk quantification journey started in an organization, what are some of the foundational things that you feel need to be in place, or that they need to start doing, to kind of go through that crawl, walk, run cycle? Like, if you're at the crawl stage, where do you start?

Mads Bundgaard Nielsen (12:52.553) Yeah, okay. So usually I have these four phases, or four principles, for getting into quantification, or doing any kind of quantification project if you haven't done one before. To begin with, you need to define what exactly it is we want to quantify here. What kind of outcome? And don't quantify inputs. That's what we usually do: how many hours would it take to make X. That's not interesting.
We want to be interested in how many Xs we can produce over time. So that's one thing. It could also be, when we come from compliance: do we want to reduce the number of observations, or do we want to reduce the severity of them? How many hours, how much cost would each observation cost us? Is it units produced? Is it uptime minutes? There are tons of metrics that are relevant to whatever level of business you're looking at. If it's operational, it's usually uptime. It can be quality. And organizations will usually have metrics that they use already. So first thing: define what it is we want to measure. And then you want to simplify how you measure it, because measurement is an entire field of study, and they don't even agree on how accurately we can measure things. So you want to keep this simple. Simplify what kind of inputs actually affect the outputs that we're interested in. Simplify the interaction between the elements. That's actually the qualitative analysis. Qualitative means: what are the qualities of the factors that we're considering? Is it good or bad? When is it good or bad? Is it whether it has MFA, or whether it has endpoint detection and response, all these factors? So simplify that. And then you would like to baseline: find out what is usual. Because what you're usually interested in is some kind of deviation, or the marginal effect of something. So one example could be, if you want to quantify the strain on your operations and incident response,

Mads Bundgaard Nielsen (15:15.589) you can say how much of the strain on our incident response and operations comes from incidents. But you know that that staff is also used by other functions of the business, so you've already hired some of them. So what you're actually interested in is: what is the probability that we have so many incidents that it will exceed our capacity to respond adequately to incidents, right?
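That capacity question translates almost directly into code once you have a baseline. A sketch with made-up numbers (the baseline rate, the capacity estimate, and the Poisson assumption are all hypothetical):

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical baseline: ~100 incidents/year; team can respond adequately to ~130
baseline_rate = 100      # historical average incidents per year
annual_capacity = 130    # rough estimate of adequate-response capacity
trials = 100_000

incidents = rng.poisson(baseline_rate, trials)     # one draw per simulated year
p_overload = (incidents > annual_capacity).mean()  # fraction of years overloaded

print(f"P(incidents exceed response capacity): {p_overload:.2%}")
```

The output is exactly the marginal question Mads frames: not "how much do incidents cost", but "how likely is the year where we can no longer respond adequately".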
Or it could be customer service: inadequate customer service phone capacity, or whatever.

Robert Wood (15:50.31) Like if there's some kind of opportunity cost as a result of an incident, things they could be doing otherwise.

Mads Bundgaard Nielsen (15:55.217) Yeah, yeah, that could be it. But it's also, you know, baseline: what's normal in our business? Do we usually have around 100 incidents a year? Or is it one? Is it 500, depending on your size? And what could cause a change in the wrong direction? When you have these things, you can quantify, you can actually begin to put units on it and go from there. I know this is a bit conceptual, but it's to say that quantification is sort of the last step, at least in building models. You want to be very good at understanding the problem space and the actual outcomes you want to know about. So before you start to quantify anything, you don't need to be an expert in probability theory or Monte Carlo simulations or building models in Excel or Python or wherever. You just need to know the problem space and define things very clearly. Do that first, and then you're like 90% of the way, because most of the questions will already have been resolved before you get to the actual quantification. That would be my assumption, at least.

Robert Wood (17:05.51) OK, so let's get specific then. If I were in a head of security type role, a CISO position, and I was trying to make a decision between different forms of multifactor authentication, for example, and which one was going to be more effective versus have a better user experience, and you're factoring in the cost of said solution, that sort of thing. There's a few different trade-offs that I'm trying to balance. And of course you want some kind of safe outcome. You want, maybe, phishing-resistant authentication. You want users to be able to log into stuff and not hate you. Whatever it is, or whatever set of things, that you're interested in. Who do you think needs to be, and this may very well be a really situationally dependent answer, that's okay, but who needs to be in the room for some of those foundational steps that you

Mads Bundgaard Nielsen (18:24.751) Yep. Yep.

Robert Wood (18:31.098) mentioned? As you're trying to figure out the outcomes that you're measuring, the units, as you're fleshing all of those things out, what stakeholders do you want to make sure are absolutely at the table?

Mads Bundgaard Nielsen (18:51.102) So the first obvious one is the sponsor. There are two situations, right? Either the CISO already has budget for making this change, implementing one of several MFA solutions. If they have it, they'll probably not need the sponsor right there at the beginning. If they don't, obviously they'll need to go out and ask for budget, and whoever will grant the budget, the CFO or the CTO, will be in there. Then we would probably summon the experts who know about...

Mads Bundgaard Nielsen (19:30.074) You know, the general problem space. So whenever we implement MFA, it's usually to reduce the number of incidents. It can also be the magnitude, but we want to reduce the number of incidents. So that could be one outcome measured. It could also just be the expected loss in general. But say we want to reduce the number of incidents. Yeah, yeah.

Robert Wood (19:52.708) Yep. Account takeovers.

Mads Bundgaard Nielsen (19:55.799) Exactly. That comes with a price. As you said, there might be some friction that has a cost, you know, internal billing hours, general user satisfaction. That's very hard to measure, but you could try to quantify it.
So you could have a representative from your identity and access management, if that's a team, or from your internal IT; those guys will do the technical implementation. If you really want to go political, you can also have a representative of your users, but that's more on the change management side. From an estimation side, you want to get the people in the room who know what the cost of implementation is and who will actually reliably help predict the change in outcomes. That could be internal cybersecurity experts, yourself, it could be from operations. It could also be industry data that says, well, if you implement whatever solution, you get X percent less exposure. Something like that.

I have another example where there's this interesting dilemma. In one of the places I've worked, there was an acquisition. And this acquisition had a lot of legacy end-of-life infrastructure. And there was a dilemma because, of course, it's expensive to replace all of this. So some people didn't want to do that. And other people thought, not replacing it leaves a big exposure for us, so that if something happens, we can't recover.

Robert Wood (21:31.174) Now we've just basically purchased all of this residual risk.

Mads Bundgaard Nielsen (21:35.118) Yes, yes. And that's also because the responsibility of this decision was not well defined. There wasn't one person who could take it. It was different functions arguing over what would be the right choice. And usually the right choice for me was the one that inflicted less resource strain or load on me and more on you. And so in these cases, you want to call one of the grown-ups and say, OK, so we have different outcomes. We have maybe transitioning speed, we have some kind of consideration of sunsetting these systems or migrating systems. What is the cost of that? What is the probability that something really bad happens before we get through the transition?
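Both decisions in this stretch of the conversation, picking an MFA option or picking between migration strategies, reduce to comparing simulated total cost across options. A sketch using the MFA example (the reduction rates, costs, and loss distribution below are invented for illustration, not industry data):

```python
import numpy as np

rng = np.random.default_rng(3)
TRIALS = 50_000

def annual_cost(incident_reduction, option_cost):
    """Simulated total annual cost of an MFA option: residual account-takeover
    losses (Poisson frequency x lognormal magnitude) plus the option's cost,
    where option_cost folds in licensing, implementation, and user friction."""
    counts = rng.poisson(12 * (1 - incident_reduction), TRIALS)  # baseline ~12 takeovers/yr
    losses = np.array([
        rng.lognormal(np.log(30_000), 0.8, n).sum() for n in counts
    ])
    return losses + option_cost

# Hypothetical options: (fraction of incidents prevented, total annual cost)
options = {
    "SMS OTP":       annual_cost(0.60, 20_000),
    "Authenticator": annual_cost(0.85, 40_000),
    "FIDO2 keys":    annual_cost(0.98, 90_000),
}
for name, totals in options.items():
    print(f"{name:13s} expected annual cost: {totals.mean():>10,.0f}")
```

With numbers like these, the comparison the stakeholders argue about stops being "which rating is higher" and becomes "which distribution of total cost do we prefer", which is the decision-science framing Mads described earlier.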
And so this could be a case where you would want to quantify different scenarios, whether it's a clear strategy to prioritize migration, or just the hope strategy. So yeah, that's another case where who knows what's right, because it can be very expensive and straining to make these migrations.

Robert Wood (22:45.572) Yeah, that's a really interesting example. We had done a risk assessment relatively recently. One software company bought another software company. There was confusion around, well, maybe not confusion, more intended ambiguity, around the adherence to a particular compliance standard. It was HIPAA in this case, so healthcare related. There was confusion on the side of the company that was purchased as to whether or not HIPAA was really applicable, based on the way that they had designed their software and the kind of data they were collecting and things of that nature. The purchasing organization felt that it was applicable. Well, they wanted to figure out if it was applicable; they had suspicions. The result of all this is it ended up being applicable, due to some somewhat nuanced terminology in the HIPAA rule. However, the follow-on to this was, well, hey, we just purchased this technology, this company, and because we are selling to healthcare companies, we have to now figure out what the implementation cost is really going to look like for getting from where we are now to where we need to be, which is compliance, to the extent that you can be, with HIPAA. There's contracts to sign, there's technical changes to be made, there's policy changes to be made, procedural changes to be made, training to be done, all of that stuff. And so they're going through this big quantification, kind of forecasting, exercise.

Robert Wood (24:52.624) to arrive at some level of understanding of the impact, both the financial impact as well as the hours-spent impact, of the effort.
And then that was going to be an input into, exactly, exactly. So that was gonna go to the lawyers and to the grownups.

Mads Bundgaard Nielsen (25:12.728) Yeah, to get it repriced in the acquisition. Yeah.

Robert Wood (25:22.438) to figure out how they wanted to deal with all that. So what stands out to me about that experience, and listening to you talk through some of these examples, is the measurement of things that are not explicitly financial. And that was one of the things... So when I first got exposed to this discipline, if I could call it that, I think that's a good enough term, it was through FAIR. I remember sitting in, I was at an appsec consulting company at the time, and they brought in a FAIR dude and he walked us through kind of the risk exercise to figure out where your preferences are, you know, where you were kind of oriented. I forget the exact term that he used, or that FAIR uses, to do this. And we kind of learned about the whole standard and all that, and it sounds amazing. But, you know, in the business world, it's much easier to talk about dollars. And there's always an answer for everything. How do you measure your reputation? Go look at the stock price, go look at customer orders. There's always an answer for something that seems to point back to dollars. And going along the journey, when I was in the federal CISO role, money really was not that much of a motivator. Talking about a million dollar impact in a trillion dollar organization is a rounding error, which sounds crazy, but it's true. It's like if I was a kid and I had just saved up 20 bucks, 20 euro, after shoveling driveways, shoveling snow all weekend, and I lost five dollars of it,
I'd be heartbroken, you know, my life is over. But if I lost five dollars right now... you know, they're always taking money out of my wallet.

Mads Bundgaard Nielsen (27:42.272) Yeah, good life.

Robert Wood (27:50.488) Unfortunately they leave it at the ones and fives. But I guess the point being that the financial impact didn't really resonate in that setting. So one thing that I felt was really lacking was this measurement of other units, or tying the measurement of other units to business outcomes. We did something very rudimentary in this risk assessment I mentioned, with the hours forecasting, but talk to me a little bit about your experience effectively identifying those other measurable units in a business setting. Because I think that's one thing that I'm really hoping listeners of this, or watchers of this, get out of it: they don't have to be stuck on the money.

Mads Bundgaard Nielsen (28:46.996) No. And usually, actually converting or translating whatever risk you have to money sometimes requires an additional translation, where you could actually be very satisfied with whatever other metric you're looking at, as long as it's something that's countable. Yeah.

Robert Wood (29:06.63) Right. Parking hours or something. I feel like uptime, or rather maybe downtime, in the middle of a Black Friday big retail spike is a decent example of that.

Mads Bundgaard Nielsen (29:23.83) Yeah, exactly. And it also depends who you talk to. So if you're talking to an operations manager, they are not immediately worried about the budget or budget overruns. They're worried about their uptime. They're worried about delivering whatever services, whether it's a web application or infrastructure or whatever. So I can provide a couple of examples of what to measure out of the gate, or what to look for.
I want to start by saying that, as with many other things, I'm really trying to collect a large amount of things to measure. When I did one of my first certifications, one of the questions that I failed on was metrics. I just couldn't think of useful things to measure. This was in the context of security management systems, which are inherently... that's another discussion. So now I'm really into finding out what's useful to measure. I have one specific example. I built, together with some colleagues and consultancies, a quantitative risk management project for my former employer. And we also defaulted to expressing the risk in terms of loss of money. The problem was that the organization was customer owned. We provided software and infrastructure to the financial sector, and the financial institutions all owned it. So we were sort of internal IT for a bunch of banks and financial institutions, which meant that the costs wouldn't really be incurred by the organization that I was in. The costs would be incurred by our customers, which was very hard for us to quantify, because how many of them, which customers, used which services, and at what times were they critical, and that sort of stuff. So instead of quantifying to a dollar amount or monetary value, we found out that number of incidents and downtime were much better at communicating the actual risk. And then you can always do all sorts of translations. So if our entire organization was one system,

Mads Bundgaard Nielsen (31:45.737) how many incidents would we expect a year, a month, and on average, how long would that system be down? And then we could talk: would it be our online consumer banking solutions? Would it be transactional systems? Would it be core systems that maintain everything?
And then it's suddenly much more tangible for everyone to see: okay, so if this system clearing payments is down at certain points in the day, that's really, really critical. We don't want that. So that makes it very tangible. So find the stakeholders who want or mandate this risk assessment, and what's important to them. And downtime is usually, especially in IT risk, very tangible for anyone. Another thing that can be fairly easily quantified is just customers affected, or data subjects, if you're talking about GDPR. So, for example, in Europe, we have this critical infrastructure regulation now, NIS2. A lot of service providers are now critical infrastructure, which means it's not only their business that's regulated, it's also the reliability of the service they provide, on a societal level. So you want to consider: how many of our customers will not be able to get clean water or electricity or DNS services, as is the case with my current employer? And for how many minutes will they be without that? Or what would be the service degradation? If you usually have around 100% throughput, how much slower would the water pressure or the internet speeds be, or the cleanliness of water? There are tons of things that are inherently very obvious when you look at the qualities of the service that you provide, where these metrics just pop out at you, and then you just select the most important ones. So: minutes, number of people affected, it could be water pressure, it could be number of interruptions. Service degradation is also fairly easy to quantify. I don't know if that covers it, but you know, yeah.
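The downtime framing Mads describes drops straight into the same simulation machinery: instead of simulating money, simulate minutes. A sketch with invented parameters (outage rate and duration distribution are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(5)
trials = 50_000

# Hypothetical: ~8 outages/year on a payment-clearing system,
# outage duration lognormal with a median of 20 minutes
outage_counts = rng.poisson(8, trials)
downtime = np.array([
    rng.lognormal(np.log(20), 0.9, n).sum() for n in outage_counts
])

print(f"Median annual downtime: {np.median(downtime):.0f} minutes")
print(f"P(more than 4h total) : {(downtime > 240).mean():.1%}")
```

A statement like "there's roughly a one-in-N chance the clearing system loses more than four hours in a year" lands with an operations manager in a way a monetary figure often does not.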
Robert Wood (34:04.87) I feel like, bringing it back to one of your earlier responses, making sure that you have the stakeholder who is the expert in the outcome, the expert in the subject, as part of the conversations, so they can really break down and specify the problem space for everyone. And then you can pick measures out of that and work it into the modeling that you're doing.

Mads Bundgaard Nielsen (34:30.887) Yep, yep.

Mads Bundgaard Nielsen (34:38.919) Yeah, and usually you can just ask, you know, what are you worried about? And there's usually something. And then if someone says, well, we can't really quantify that, then Douglas Hubbard and Richard Seiersen have provided this very, very clever response: what would we see if that bad thing were to happen? You know, something that's actually tangible. It might be secondary, it might not be correlating 100%, but what would we actually see if that bad thing you're worried about happened? Would it be customer churn? Would it be employee churn? Would it be health and safety? That's also one thing, health and safety. And of course, for projects, it's budget overruns, it's delays. Just count the hours or count the days. Everyone can count, right?

Robert Wood (35:31.398) That's a great question and follow-up question. I really like that. I'm definitely gonna steal that.

Mads Bundgaard Nielsen (35:35.069) Yeah. And at some point, if you ask it enough, you will come down to something that's countable, you know? Yeah.

Robert Wood (35:43.19) Yeah. So. One thing, another thing that is really challenging in this space, or I think that leads to challenges in this space, is you've got all of these tools in the security space that are producing these ordinal risk ratings. Nessus and a pen test, you've got your SIEM tools, you've got all these scanning tools, all these WAFs.
They're all producing high, medium, low, number scales, color scales. And everything is called risk. It all gets the same label. Given the limitations of, I mean, maybe it's not a limitation, because I think those things are very intentional features, but given maybe the state of security tools, and maybe the general skill level around risk quantification being on the more immature side, generally speaking, across the industry.

Robert Wood (37:19.49) Is there a point inside of a typical security team that you might see as a tipping point, or a big driver, to say: all right, we have all of this stuff, we have our pen tests and our scanners and all that, and we're going to keep that, but we're going to use this situation, this opportunity, to be intentional about starting a risk quantification exercise? And maybe you start with a specific scenario or something. But is there, I guess, yeah, is there a point at which it makes sense for the typical team, the average team, to go beyond all the ordinal labels?

Mads Bundgaard Nielsen (38:17.52) Yeah, yeah, yeah. So, yeah, I want to start with saying you don't want to quantify everything all the time, because it takes time to make a proper analysis. But what you can do is make some high-level analysis of the scenarios that you're actually worried about. Is it ransomware? Is it protection of IP? Is it whatever? Take those. And then whatever operations sit behind that. Let's say, for example, vulnerability management, where you have all these detections of things to mitigate and that kind of stuff. You don't start with a bunch of vulnerabilities, rating them on their attributes, and then say, okay, so we need to be able to mitigate a hundred of them a week to be secure. That's the wrong way of going about it. What you do is you say: okay, we have these scenarios.
Let's just say, for example, ransomware. What is the influence of open vulnerabilities on that scenario? Let's just start with saying small, medium, large. Is this something we should consider, given the other things we could also consider, like access control and hardening and all of these other things? Is vulnerability management... Sorry? Yeah, exactly. Yeah, yeah, yeah. And you'll use your in-house experts for this because, you know,

Robert Wood (39:33.638) It's like a directional input into our vulnerabilities. Yeah, okay.

Mads Bundgaard Nielsen (39:41.693) they know what's probably more important to focus on, right? And then you say, okay, so let's say that managing vulnerabilities is critical. How many resources do we even have to manage those vulnerabilities? Well, we have like two and a half guys a week. Okay, how many vulnerabilities can they usually mitigate, by themselves or by drawing on dev teams or operations teams or configuration management teams, whatever? How many can they realistically mitigate within whatever time horizon? Now you're beginning to be very operational, very specific about what you are forecasting. And when you know how many resources you have and what the workload is, you can say: how many mitigations can we realistically do within a given timeframe? Then you'll get a number. And that's probably much lower than what your initial assessment required. So now you can start to have a really, really practical discussion about what the actual value is, in loss avoided, if we add more vulnerability management resources. There's the other side of how do we even classify the vulnerabilities, and are they compensated for by other measures? That's also a discussion. I'm not an expert on that, but that's one thing, you know, how much is even critical. Is it 10 or is it two items per week, whatever, right? But what you do is you say: okay, let's say we have two and a half guys a week to do this. They can do around five mitigations a week.
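The capacity planning Mads sketches here, roughly five mitigations a week against a larger demand, can be run with probability distributions rather than single-point averages. A hypothetical Monte Carlo sketch; the means and spreads below are invented assumptions, not figures from the episode:

```python
# Hypothetical sketch: compare uncertain weekly mitigation capacity
# against uncertain weekly demand, using distributions instead of
# averages. All parameters are illustrative assumptions.
import random

def shortfall_probability(trials=10_000, seed=1):
    """Fraction of simulated weeks where demand exceeds capacity."""
    rng = random.Random(seed)
    short = 0
    for _ in range(trials):
        capacity = rng.gauss(5, 1.5)  # ~5 mitigations/week, uncertain
        demand = rng.gauss(10, 3)     # ~10 critical items/week, uncertain
        if capacity < demand:
            short += 1
    return short / trials

p = shortfall_probability()
print(f"P(weekly capacity falls short): {p:.0%}")
```

With these made-up numbers the shortfall probability comes out above 90%, which is the kind of concrete figure that turns "we need more people" into a discussion about expected loss avoided per added FTE.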
I don't know the going rate for those, but say we need 10 in order for us not to have the exposure that we're afraid of. What's the cost of adding those FTEs, and does it reflect actual business value on the other side? And there it's a very financial quantification, because you want to compare it with the expected losses. But you break down the problem into something very specific, and this is just a completely regular planning exercise. The difference is you don't do it from averages, you do it with probability distributions. I'm not going into that level of technical description, but that's how you do it. That's how you can actually get a real, practical discussion on

Mads Bundgaard Nielsen (42:00.825) how we do our vulnerability management. And then someone comes in, sells a tool that can do it all automatically, and then the discussion goes away.

Robert Wood (42:10.672) So I want to pick on that a little bit more. I don't know if this conversation is happening internationally; I assume it is on some level. But we have this initial starting point of scanning tools leveraging scoring systems like CVSS. Everything started with just arbitrary highs, mediums, lows, criticals, whatever. Tools started producing some kind of CVSS score as a means of trying to add some level of specificity or weighting to the scores, which I can certainly appreciate where that comes from. And that was deemed wildly inadequate by many, the talking heads, influencers, and the actual experts. And so

Mads Bundgaard Nielsen (42:59.503) Yep. Yep.

Robert Wood (43:08.728) Okay. So we set CVSS aside. Then you have these newer methods coming out. You've got the CISA KEV, the known exploited vulnerabilities catalog. You've got EPSS scores, the Exploit Prediction Scoring System, I think is what the acronym stands for. You've got all these other methods to try to add some level.
It's not true risk quantification, but some level of more specific measurement of vulnerabilities coming out of these tools. And I definitely hear and agree with you that you don't want to measure and quantify everything, because then you could find yourself doing nothing but risk assessment and not actually doing any work, and therein lies a big problem in and of itself. I guess, do you have any strong opinions on that process of trying to prioritize? Because I think there's a real noble effort in EPSS and KEV and all of these vulnerability prioritization methods. Because it's like, we have

Mads Bundgaard Nielsen (44:06.572) Yep. Yep.

Robert Wood (44:33.989) 10,000 vulnerabilities and we need to figure out, we only have resources to mitigate a hundred per week, which ones do we start with? You know, some people turn to pen tests, or bug bounties, let's get the exploitable ones. Some people prioritize on specific systems. But there's a whole kind of art and science to figuring out which ones you do. So it's like bringing these two...

Mads Bundgaard Nielsen (44:40.963) Yep.

Robert Wood (45:02.852) like smushing these two worlds together, at the risk of, you know, doing the peanut butter and jelly thing. What do you think about that?

Mads Bundgaard Nielsen (45:16.801) Yeah, okay, so, you know, every vulnerability management system has a bunch of unmitigated vulnerabilities that just live on for hours and hours, or, you know, months and years. And that just shows us that our ambition of mitigating them is much larger than our capacity. And that comes down to two problems. Either we don't prioritize all of the efforts that need to go into this problem, or we categorize a lot of things that aren't really vulnerabilities as vulnerabilities. And there are two dimensions to that. So, one thing is...
A vulnerability is technically a vulnerability from a technical understanding. So we know that this type of code can be exploited by this technique, and that is a technical vulnerability of a system. But we're not really interested in technical vulnerabilities. We are interested in vulnerabilities that can affect our business. Just by default, we have all of these businesses who have all of these unmitigated vulnerabilities that haven't brought down the business yet. If they were so goddamn important, any business within a certain time period would be reduced to rubble due to these vulnerabilities. So the other side is the very, very difficult problem of categorizing the vulnerabilities as critical or high or low or whatever we call them: something that needs attention with some kind of urgency and effort. I'm definitely not an expert on how to classify vulnerabilities, but I'm very inspired by how Richard Seiersen speaks about this: what is the actual business risk of this library of open vulnerabilities we have? We need to be able to speak about that, whether it's in financial terms or in any other terms. We can't just speak

Robert Wood (47:12.398) I, yep.

Mads Bundgaard Nielsen (47:26.101) speak about it from a technical vulnerability standpoint. It doesn't make sense, because there are literally infinite vulnerabilities.

Robert Wood (47:34.512) Well, and I feel like, kind of pulling on that thread a little bit more, this is, I think, a fundamental problem of just taking a tool and using it out of the box. If you drop a scanning tool into an environment, it's programmed to do certain things. It's somewhat deterministic: if it finds X, it's going to report it as Y. And there's not a lot of intelligence in that.
You know, intelligence maybe is all relative here, but there's no context behind the thing that it's scanning and why it's important to the business. And so you almost, on some level, need a filter that everything runs through, to interpret and ingest all of those things and be able to deduce what systems are impacted. One of the things that I thought of as you were speaking a moment ago was: perhaps instead of risk ratings, certain types of vulnerabilities paired with certain systems, certain parts of the environment they exist on, those things together might be much more important. An access control issue, for example, in a financial setting might be a "holy smokes, this is really not good," versus an injection attack or something of that nature, a missing patch, outdated libraries, et cetera. And so, yeah, it's a gnarly problem, for sure, because I feel like the industry kind of drowns in these findings, the thousands to tens of thousands of findings that come out of these tools, and you're just trying to figure out how to deal with it all. And then at the same time,

Robert Wood (49:52.934) that's where the compliance needs come in and all of that. You know: why haven't you mitigated all this stuff? And therein it creates a conundrum for the poor CISOs and the CIOs, who are left trying to pick up all the pieces. And so,

Mads Bundgaard Nielsen (49:59.647) Yeah.

Robert Wood (50:15.655) I wanna shift gears ever so slightly and talk about a little bit more of an abstract kind of vulnerability that our field is also unhealthily obsessed with, I think. And that is the space of third-party risk.
Now, we have this tendency in our field, you know, we love to send questionnaires. We love to monitor and report on the state of another entity's security posture, whether it's through passive monitoring, pen test reports, questionnaires, policy reviews, SOC 2s, ISO 27001, whatever it is. We love to do that. And we derive vulnerabilities from it, even though we have no means of actually mitigating them, no means of actioning any of those findings, other than to say to a stakeholder... You know, if you and I are working together and I want to go buy a thing that is riddled with vulnerabilities, you could just say, no, Rob, you can't do that. And then I don't want to work with you anymore because you tell me no, and I'm going to go to the other parent, I'm going to go to mom instead of dad, because I'm going to get what I want. Or I'm just gonna do it anyway. So, in the space of third-party risk, and thinking about risk quantification, what do you think is a more appropriate way to approach third-party risk through risk quantification? Things that I was thinking about leading into this conversation were identifying critical suppliers that map to some

Robert Wood (52:18.47) critical part of your organization. Like it's part of a value stream, or it's part of a customer-facing system, or somebody who's a data processor, perhaps. Like there's a mapping or an initial classification step that you do there. But yeah, I guess, how do you think about these two things coming together, through your lens?

Mads Bundgaard Nielsen (52:23.464) Yep. Yep.

Mads Bundgaard Nielsen (52:35.464) Yeah. Yeah.

Mads Bundgaard Nielsen (52:43.198) Yeah. Okay. So first of all, I think there's an insane amount of wasted effort in today's third-party risk management regime. It's insane. And I've been on both sides of that.
I've been sending the questionnaires, I've been designing the questionnaires, I've been responding to the questionnaires. And the insurance providers, they send insane questionnaires that they have no chance in hell of using for anything useful.

Robert Wood (52:51.526) but not a good one.

Robert Wood (53:12.677) Jason, you're.

Mads Bundgaard Nielsen (53:12.775) So, yeah, yeah. And it comes down to, you know, after defining, and I talked about the four steps, after you define something, you simplify, and you find out what's actually important here. And this is done in parallel. So this is one of the areas where the divorce between compliance and actual operations is most ugly. Because from the compliance side, by law or by regulation, you need end-to-end requirements. Whatever requirements you have, they need to go throughout your entire value chain and supply chain. So you need to ask them whether they have, whatever, access control implemented, whether they have a security policy, whether they have assigned roles, all of that. You need to ask it, because you are obliged to do so. We can always discuss whether additional questionnaires are necessary if you already have the certification; that's an entirely different discussion. So that's the compliance side. Then if we look at the actual business side, you want to see: what kind of output from this supplier is actually critical to our business? Is it just time and materials? Is it some kind of solution that needs uptime? Is it actual resources, like machinery if you're a mining company? What is the actual output, or outputs, that this supplier provides us, in increments or continuously? That's the first thing, and it's usually not very many things. Then you say: we want to be able to predict whether this supplier is reliable, and how do we ensure reliability, upfront and ongoing in operations?
And then you want to define three to five things that you think are the most important uncertainties for predicting whether that supplier can actually provide the services that you procure from them. It's usually not very many that are the most critical. I made this 80/20 post a couple of months back; the same applies everywhere. So 20% of the factors usually account for 80% of the variance. It's the same with your supplier. So find out what are actually the important things with this supplier, or this general supplied service.

Mads Bundgaard Nielsen (55:35.536) And then you manage that much more harshly. And what I've seen is that in operations, they usually do that. So if you have a very, very close relationship with a supplier, you maybe have weekly or monthly meetings with them where you constantly adjust procedures, adjust the services. You say: we want more of that, more of this, less of that. This is ongoing. That's actually risk management in practice; it's just not mandated by a matrix. But find out what the supplier is actually supplying, find out what are the biggest uncertainties to the reliability of that service, and then you manage those. And don't go to your standard control catalog; it won't help you. There's a business logic to this, and most of these relationships are managed by a lot of other things than security controls. So yeah. And we are probably never going to get there.

Robert Wood (56:27.333) What?

Mads Bundgaard Nielsen (56:35.537) But if more and more security people would realize that this is a business problem and not a security problem... Security's uncertainty contribution to that entire third-party uncertainty is so small compared to a lot of other things. There are liabilities and all of these other contractual things that I'm not too familiar with. But in what actually regulates the business relationship, there's a lot more uncertainty that's not security. And we need to realize this. It's not security that's the most important.
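The 80/20 screening Mads describes for supplier uncertainties can be made mechanical: estimate each factor's rough contribution to variance, rank them, and keep the few that cover most of it. A sketch with invented factor names and numbers, purely for illustration:

```python
# Hypothetical sketch of the 80/20 screening: rank a supplier's
# uncertainty factors by estimated contribution to variance and
# keep the handful that dominate. All values are invented.

factors = {
    "delivery reliability": 40.0,
    "key-person dependency": 25.0,
    "financial stability": 18.0,
    "geo/regulatory": 8.0,
    "security posture": 5.0,
    "contract ambiguity": 4.0,
}
total = sum(factors.values())
ranked = sorted(factors.items(), key=lambda kv: kv[1], reverse=True)

cumulative, shortlist = 0.0, []
for name, contribution in ranked:
    shortlist.append(name)
    cumulative += contribution
    if cumulative / total >= 0.8:  # stop once ~80% is covered
        break

print(shortlist)  # the few factors to actively manage
```

With these made-up weights, three of six factors cover over 80% of the variance, which matches Mads's point that the list to manage harshly is usually short.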
Obviously you want to avoid the completely unreliable ones, but yeah.

Robert Wood (57:15.802) Yeah. Well, well said. Now, given the state of cyber risk quantification and its adoption across the industry, what do you think is going to help us get to a state where more organizations, and more people within those organizations, are leveraging it and putting it into practice? Is it an education problem, a tooling problem, you know, AI magic? What's going to get us there, in your view?

Mads Bundgaard Nielsen (57:54.257) Yeah.

Mads Bundgaard Nielsen (57:57.639) Yeah, I think this newest AI buzz around agents seems pretty interesting. And I've seen, for example, we have this Professor Hernan Hüller on LinkedIn who's speaking a lot about that. I would recommend looking him up. So that's one thing that I don't know too much about. Then... Yeah, definitely. Definitely. Then...

Robert Wood (58:19.43) Would you send me his details? I'll include it in the... Yeah, cool. I gotta follow.

Mads Bundgaard Nielsen (58:26.586) There's one push. A lot of cybersecurity comes from just legislation. So in Europe, we have GDPR, we have NIS 2, we'll have the AI Act, we have a bunch of other regulations. It says it's risk-based, but it basically just mandates that you have all of the controls of the ISO standard, if I'm being very, very blunt. But that pushes cybersecurity, and some of it mentions quantification too. And I think the guiding materials that will come along tend more and more towards quantification. So that push will obviously also push for quantification in those areas. I'm not too optimistic about it, though, because there's a competence gap. I've spent a couple of years now learning basic probabilistic methods, and it is, I would say, difficult. It's not as difficult as it seems, but it is difficult. And if you're not used to doing it, it will take time to train. So it's also a training issue.
The funny thing is that in all the other risk domains in finance, liquidity risk, market risk, currency risk, all of these established financial-sector risk domains, you have quant guys, and only quant guys, doing it. And then in the financial institution, you can go just across the aisle to the operational risk manager, who's never even used a calculator. And to me, that's completely insane. So I think, from what I've seen, risk management will just be eaten up by compliance. I hope that it will be eaten up by the more traditional risk management professions instead, so that we can have more mature and effective analysis of the risks. Obviously, I want to say that it's going to blow up, and the people who can actually do it will be extremely valuable, and that sort of stuff. But I'm hesitant to say this will explode anytime soon. It's difficult, and there's no call for it. And that's the final factor: no one expects cybersecurity professionals or risk managers to be able to provide quantified risk assessments and decision support. They don't expect it. So if you bring it up, they maybe don't even know what they're looking at.

Robert Wood (01:00:28.387) Yeah.

Mads Bundgaard Nielsen (01:00:51.662) So if no one requests it from you, you're going to present a response to a question that hasn't been posed. That's also a problem. So this space is, yeah, riddled with a lot of... Yeah. Yeah.

Robert Wood (01:00:51.686) Hmm.

Robert Wood (01:01:04.71) It's in its nascent stages. Yeah. And the financial equivalent with the quant guys, that movie The Big Short comes to mind, with Steve Carell. That is such an interesting point; I didn't think about that as I asked the question. Because we have compliance people, we have IAM people, we have vulnerability management people, we have SOC analysts, we have testers.
But in the financial world, if a firm's doing mergers and acquisitions, the whole business kind of revolves around that thing. It's a very vertical kind of problem space, whereas cyber is very horizontal. It cuts across all of these things; there are all of these kinds of sub-disciplines. And it makes me wonder if that's one of the reasons why having the equivalent of the quant guy or gal in an organization would be really hard in our field, or maybe just very much ineffective, because of all of the other sub-disciplines. And so it's like, do we need to get to a place where there's some kind of 101-, 201-level of awareness across the board? So that way it can start to get worked into more situations, more scenarios, universally, as opposed to reserving specialist cases for the actual specialists.

Mads Bundgaard Nielsen (01:03:12.034) Yeah, yeah, I think so too. So, you know, for the vast majority of organizations, the small and medium-sized enterprises, their loss exposure is very, very similar. You can do the analysis on each of them and there might be some variance, but a lot of them will have... If you're in manufacturing at a certain size, you probably have the exact same loss exposure as another manufacturing company of almost the same size. And the controls that usually account for most of the security are almost always the same: access control, MFA, backup, perimeter control, firewalls and that kind of stuff, and hardening. It's like five things; those are the ones you should focus on. And until you have those nailed down, you don't need another analysis to tell you that you also need whatever additional control. So that's one thing where you don't need to quantify what we already know. We have a pretty good idea, at least, of what the basic controls are that you should apply across the business.
And that's the horizontal thing you're talking about. Then we have the more vertical thing, where if you're an M&A company and you are the CISO, what is your objective there? Is it to help the M&A team price in the exposure of the companies you're looking at? Or is it to protect the M&A process as such, so that you don't lose your IP, so that others can't come in and take the deals, or it fucks up the pricing or something? So I think that's something to be decided in the organization. If you have a quant guy from security, do they help assess the exposure of the companies you're looking at? Or do they simply help protect your virtual document space, to make it very concrete, right? But that's an example of what the role of that quant person would be.

Robert Wood (01:05:09.69) Yeah. And, you know, take risk quantification out of it. That's a thing that I think some CISOs, some security leaders, have been put in interesting positions on in maybe the past 10 years, maybe longer, maybe shorter, 10 years-ish. Where you have, let's say, a product company, Amazon Web Services, Google Cloud, a specific software company. They go and hire a CISO, and really that person's there to kind of be a sales enablement or a business growth enablement resource, as opposed to working on internal operations. And that example that you just gave, with, you know, is that quant person

Mads Bundgaard Nielsen (01:05:55.716) Yeah. Yeah.

Robert Wood (01:06:07.482) helping with the value that the organization delivers to customers or outside stakeholders, or are they helping with internal operations? And I've seen a lot of people get stuck between that rock and hard place, where they're kind of expected to do both.
Because it's like, we're screaming and pounding the doors: we must provide business value, we have to speak the language of the business. And at the same time, we have to protect the business and make sure that everything doesn't implode and get set on fire. So, yeah.

Mads Bundgaard Nielsen (01:06:46.186) Yeah, yeah, yeah.

Mads Bundgaard Nielsen (01:06:51.298) Yeah, so it's not specific to the CISO role that the role is not very well defined. But given the churn of CISOs in general, it seems like they specifically burn their candles at both ends, exactly as you say there. Some of it comes down to personality, some of it comes down to just organizational expectations. And I would argue that most of the time it comes down to what the actual business objectives are that the CISO can

Robert Wood (01:07:06.042) Yeah.

Mads Bundgaard Nielsen (01:07:20.098) influence, that he has the responsibility for. Because a ton of times you're just told: protect the enterprise. But they have no mandate to change operations, to change internal IT, to change protection of sites. Just protect everything, with no mandate, no budget. That's one problem. So if you are a CISO, you want to find: what can I even influence

Robert Wood (01:07:25.048) Mm, yeah.

Mads Bundgaard Nielsen (01:07:48.598) that's important to the business? That's usually a very narrow domain, because the product people will take care of the products, right? And finance will take care of the finance problems, and so on and so forth. So what levers do you even have, that you have access to, that are even important to the business? Because you can implement MFA from here to the moon, but if it doesn't really affect anything, then what are you even doing, bro?

Robert Wood (01:08:17.392) Yeah, hitting those compliance checkboxes.

Mads Bundgaard Nielsen (01:08:19.233) Exactly. Which can be valuable too, but only a very small proportion of people find that exciting, I would think.
Robert Wood (01:08:32.122) Yeah. So, is there any industry, or a size of organization, that you feel is better suited to explore risk quantification? Maybe in terms of readily available measures? Although, based on our conversation, I'm more convinced than ever that that's sort of a non-problem, that you can find something to measure that's important in any scenario.

Mads Bundgaard Nielsen (01:09:10.017) Yeah, so I think it's hard for me to speak to industries or sizes, but there are organizational attributes that would predict whether this would be successful. I would say those that have strong, mature governance, the decision-making processes, they would have an easier time, because risk management is ultimately about making decisions, and who can make what decisions, and what information needs to be provided. Yeah. So if you have very immature or ineffective governance, where no one knows who can decide what, it's very difficult to come in and quantify anything, because you can't speak to the stakeholders that can actually make the decision, allocate budget, and say: this is the important thing we want to analyze.

Robert Wood (01:09:44.486) and all of that stuff.

Mads Bundgaard Nielsen (01:10:06.291) And you can have immature or mature organizations of any size, you know, when it comes to governance. I would say, though, that if you want to do risk quantification where you haven't done it before, that means change management. And change management is inherently easier for small organizations, or smaller business units or smaller functions within a bigger organization. But if you want to change an internal process of a large enterprise, that's difficult. You'll need a project or a program or whatever. And I haven't even been in those big projects. If you're a federal government, for example.
So you probably want to start small, and you want to start where there is an ambition of good decision-making processes, or where one already exists. Yeah, I think that's the best I can say to that.

Robert Wood (01:11:00.006) I think that's a wonderful answer, because my initial inclination coming into this was kind of geared around certain industries being a little better set up for success, maybe because they're more familiar with hearing risk expressed in certain terms, and that sort of thing. And my perspective on that is different now than it was, whatever, an hour and 10 minutes ago. So yeah, I like the organizational attributes. Because I think for the CISO who's thinking about maybe getting started with this, they

Mads Bundgaard Nielsen (01:11:41.715) Yeah. Yeah.

Robert Wood (01:11:55.184) They only have so much political capital, time in the day, resources at their disposal. And so if they're looking at their culture and it's like, this is not the right place to invest now, I can do decision support and risk management in other ways that maybe spend my resources more efficiently and effectively. Great. And they kind of put this one in their back pocket, for either a future state at that organization, or another organization in a future role or something like that. I think that's important, because in this industry we tend to talk about things in absolute states. Like, you know, if something's a good thing, then it has to be done everywhere.

Mads Bundgaard Nielsen (01:12:50.355) Yeah. Yeah, all at once.

Robert Wood (01:12:52.614) Yeah, it's just like, you gotta go all in. And so, my kids are really into those little juice squirty things that you add to water to make it taste different. My younger son in particular is obsessed with these, and he has quickly learned that adding too much is going too far, too fast.
Mads Bundgaard Nielsen (01:13:21.951) Yep. Yep.

Robert Wood (01:13:22.054) Not a good thing, and we're going to waste a lot of water that way. Or if he adds it to things like milk, if he has milk and he adds orange flavoring to it, then it might not actually be that delicious. So he's kind of figuring this out as he goes. But, setting the silly example aside, I feel like in our field we could benefit from not approaching everything as a universal, this-is-a-good-idea-in-all-situations-and-we-have-to-make-it-work kind of thing, and just being a little bit more situationally and contextually aware.

Mads Bundgaard Nielsen (01:14:06.013) Yeah, definitely. Definitely. I want to return to one of the first things I said, which is that whatever management you're doing, you want to do it on outcome metrics, right? So quantification can be advanced modeling, but it could also simply be counting instances within a given timeframe. You know, a common security management system metric is: have we updated all of our policies within the last year, right? Which is...

Robert Wood (01:14:33.35) Mm, yeah.

Mads Bundgaard Nielsen (01:14:34.151) Okay, that's fine, but how does that change your security posture? How is that the important part? So find out: in our team, our function, our division, whatever, what is the actual output that we want to have measured? In products and in engineering and in operations, these are usually already very well defined. But security needs to do this within every organization. And maybe these metrics will converge on the same five or so metrics, but find out what the output is, what the value is that we provide to the business, that we can actually measure, and that actually changes given our effort and resources. Find that out, and you don't need big models to do that.
You can simplify, simplify, simplify to begin with, but find out how your success is measured, and don't measure something that's completely beyond your influence, because that just produces burnout. So, given that, you need very specific metrics to measure your performance, because you're interested in outcomes. Yeah.

Robert Wood (01:15:35.12) Yes.

Robert Wood (01:15:45.37) Yep. Yep. I like that. So I have one final question for you, and it's not explicitly related to cyber risk quantification. You mentioned you kind of self-taught your way into this position. Do you have any really solid books, people to follow? If somebody wanted to get their own capabilities off the ground and start learning this space, in addition to following the stuff that you put out, what resources would you recommend?

Mads Bundgaard Nielsen (01:16:29.127) Yeah. Okay, so the first thing I would recommend actually has nothing to do with cyber risk management. It's a book by the author Robert D. Brown called Business Case Analysis with R. Even though it presents the problem in the programming language R, it also runs through concepts of how to quantify things in general and how to map out a problem space. And where it's particularly useful is that it introduces the idea of the influence diagram. I think the influence diagram is the main tool for anyone who needs to bridge understanding gaps. It's so useful. That will also introduce you to basic quantification. Yeah, yeah, yeah.

Robert Wood (01:17:21.926) Get the concepts out of that.

Mads Bundgaard Nielsen (01:17:24.711) And if you're really interested in R, go ahead, but you can get a lot from that book without ever writing any code. So that's one thing. And then for cyber risk quantification, I would go for Richard Seiersen's The Metrics Manifesto.
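Mads' point that quantification can "simply be counting instances within a given timeframe" can be sketched with a couple of outcome metrics computed over incident records. The data below is invented purely for illustration; nothing here comes from the episode:

```python
from datetime import date
from statistics import mean

# Hypothetical incident records as (opened, closed) date pairs.
incidents = [
    (date(2024, 10, 3), date(2024, 10, 9)),
    (date(2024, 11, 12), date(2024, 11, 14)),
    (date(2024, 12, 1), date(2024, 12, 20)),
    (date(2025, 1, 7), date(2025, 1, 10)),
]

# Outcome metric 1: incidents opened in a given timeframe (Q4 2024).
q4_count = sum(
    1 for opened, _ in incidents
    if date(2024, 10, 1) <= opened <= date(2024, 12, 31)
)

# Outcome metric 2: mean time to close, in days.
mean_days_to_close = mean((closed - opened).days for opened, closed in incidents)

print(q4_count)            # 3
print(mean_days_to_close)  # 7.5
```

The point is that both numbers describe outcomes the team can influence, tracked over time, rather than activity metrics like "policies updated this year."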
I would go for Douglas Hubbard and Richard Seiersen's How to Measure Anything in Cybersecurity Risk. I would also go for Douglas Hubbard's other books. Then, actually, I found it very useful to do Khan Academy's course on statistics and probability, especially the unit on random variables. It'll take you probably an afternoon, and you'll think, well, this is pretty useful for predicting the number of vulnerabilities, the number of incidents, fairly easily. And then from there, I would go into more statistics and probabilistic methods with zedstatistics on YouTube. He provides the best explanations. He even shows you how to do it in Excel with different distributions, that kind of stuff. I think that's enough resources for now. I have a huge library of things too. Yeah, yeah, yeah.

Robert Wood (01:18:35.717) Yeah. It sounds like you can get dangerous. A fantastic dinner party conversationalist, if you're working through these things.

Mads Bundgaard Nielsen (01:18:46.3) Yeah, definitely. But I would say, you know, the first book, you can read through that in an afternoon, a couple of hours. It's not very thick, and it's very, very accessible. Khan Academy is free. zedstatistics on YouTube is free. And I don't think these cybersecurity books are too expensive. So you can really go a long way with free resources too. Yeah.

Robert Wood (01:19:11.152) Amazing. Yeah, that's great advice. All right, well, I want to say thank you so much for spending my morning, your afternoon, with me having this conversation. I found it absolutely fascinating. And we have more stuff to talk about; we'll take that offline. But are there any last parting words you want to leave people with before we hit stop here and resume our days?

Mads Bundgaard Nielsen (01:19:41.948) Yeah, yeah, yeah. So I'm really trying to provide useful resources on my LinkedIn page.
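The combination Mads describes, random variables for predicting incident counts plus working with distributions, can be sketched as a small Monte Carlo simulation. All parameters below (incident rate, loss distribution) are illustrative assumptions, not figures from the episode:

```python
import random
from statistics import mean, quantiles

random.seed(42)  # reproducible runs

def poisson_sample(rate: float) -> int:
    """Draw a Poisson count by counting exponential inter-arrival
    times that fit within one unit of time (one year here)."""
    count, t = 0, random.expovariate(rate)
    while t < 1.0:
        count += 1
        t += random.expovariate(rate)
    return count

# Assumed inputs, purely illustrative: ~4 incidents per year on average,
# per-incident loss drawn from a right-skewed lognormal distribution.
INCIDENT_RATE = 4.0
LOSS_MU, LOSS_SIGMA = 10.8, 0.9  # parameters of the log of the loss

trials = 20_000
annual_losses = []
for _ in range(trials):
    n = poisson_sample(INCIDENT_RATE)  # how many incidents this simulated year
    annual_losses.append(
        sum(random.lognormvariate(LOSS_MU, LOSS_SIGMA) for _ in range(n))
    )

print(f"expected annual loss ~ {mean(annual_losses):,.0f}")
print(f"95th percentile loss ~ {quantiles(annual_losses, n=20)[18]:,.0f}")
```

This is the same frequency-times-severity idea used in FAIR-style models and in the Hubbard and Seiersen book, done here with only the Python standard library; the same logic can be reproduced in Excel, as Mads notes.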
Hopefully you can link to that in the notes. So go look it up. There's already a bunch of templates you can request, and I plan on keeping up that quality in the future. So go in there and see if I'm worth following. And just write me if you have a specific problem, as you can probably...

Robert Wood (01:19:50.374) Absolutely.

Mads Bundgaard Nielsen (01:20:10.011) I'm really enthusiastic about this. I love speaking about it, I love writing about it, and I love helping people solve their problems pro bono. So reach out, and I'll be happy to speak to anyone who's interested in the field.

Robert Wood (01:20:25.19) Yep, I can 100% vouch for the quality of the posts on there. Anyone who knows me knows that I frown on the influencer community. This man is not an influencer in that sense of the word. He puts out actual good, useful stuff, and you can take it and put it into practice. Well, sir, thank you again.

Mads Bundgaard Nielsen (01:20:46.875) Thank you so much, Robert.

Robert Wood (01:20:51.43) I hope you have an awesome rest of your week, and yeah, we'll talk soon.

Mads Bundgaard Nielsen (01:20:55.727) Yeah, we will. Thank you so much for inviting me. I love these opportunities.

Robert Wood (01:20:59.418) All right, appreciate it, sir. Have a good one, and thank you everyone for listening.

Mads Bundgaard Nielsen (01:21:02.011) Take care.

Other Episodes

Episode 2

October 30, 2024 01:16:11

From Cost Center to Business Driver: Making Security a Strategic Asset

In this conversation, Robert Wood, CEO of Sidekick Security, interviews Tyler Healy, CISO of DigitalOcean, discussing the evolution of security leadership, the importance of...


Episode 3

January 15, 2025 01:15:00

From DMZs to DevSecOps: Building Modern AppSec Programs with Gunnar Peterson

In this conversation, Robert Wood and Gunnar Peterson delve into the complexities of application security (AppSec), discussing its evolution, the importance of building effective...


Episode 1

October 01, 2024 01:02:59

Tech Debt, Compliance, and Strategy: A Deep Dive with the CDC’s CISO

Summary In this conversation, Robert Wood and Joe Lewis discuss the complexities of leading cybersecurity efforts within a large organization like the CDC. They...
