June 29, 2022

Better User Awareness Training with Tim Silverline

by Cyber Ranch

Show Notes

Tim Silverline, VP of Security at Gluware, joins host Allan Alford on the Ranch this week for a discussion about user awareness training and the latest and greatest (as well as not the greatest) methods around phishing simulations. Tim and Allan get into the nitty gritty of how your company can improve user awareness results through avoiding basic click-through models, considering advanced warning for certain training exercises, and understanding risk quantification when evaluating employee metrics.

 

Timecoded Guide:

[04:30] Running the right phishing simulation for your user base and gauging your results appropriately

[10:08] Pushing boundaries in the tactics used in phishing exercises and encouraging employees to pay closer attention to their everyday emails

[15:10] Calling out unlikely and unhelpful phishing strategies and simulations, including the harm of impersonating employees without any warning

[21:04] Realizing which methods of user awareness are no longer effective and shifting away from the mindset of just “checking the box” in these training exercises

[25:54] Changing security for the better with increased awareness and a better understanding around the value of risk exposure amongst employees

 

Sponsor Links:

Thank you to our sponsor Axonius for bringing this episode to life!

Manual asset inventory just doesn't cut it anymore. That's where Axonius comes in. Take control of security complexities by uncovering gaps in your organization. Sign up for a free walkthrough of the platform at Axonius.com/Get-A-Tour

 

What, to you, are the biggest highlights, the high points, the critical bits of user awareness training?

Tim has seen the good and the bad of user awareness training, and has found the best results for his users in interactive training sessions, especially when paired with gamification. Allan compares this method and approach to modern virtual escape room sessions, and Tim agrees that the more interactive and hands-on a training can be, the better the learning experience will be. Instead of framing our user awareness and phishing exercises around checking boxes for cyber insurance companies, we should be striving for active learning engagements that demonstrate the value of security to our users.

“After those trainings, users have come up to me and talked to me about how they weren't aware of this particular risk and hearing about it in a real-world use-case was very effective for them to really understand why it's important and why they should be behaving in a slightly different manner.”

 

If the users never fall prey to attacks, is there a reason to continue performing them? 

Hearing Tim talk about his success, Allan was curious about how he chooses to approach successful user bases. If someone isn't falling for Tim's phish, does he still see the need to perform these exercises? The short answer was yes, but Tim explains that user awareness training should be customized to the needs of a user base. Testing new employees is a must, along with refreshing successful users on their skills a few times a year. Additionally, scheduling different exercises that home in on different phishing scenarios exposes employees to a variety of learning opportunities and encourages them to see training as more than just a yearly test they might as well "get over with."

“If you've tested all your existing employees, and they haven't fallen or been susceptible to it, that doesn't mean that the next employee you hire is also going to be of that same mindset.”

 

What ineffective methods are there in security awareness?

Throughout the episode, Tim and Allan keep coming back to the simple fact that checking boxes no longer works. Having employees read documents or watch videos and take "common sense" knowledge tests turns user awareness training into grunt work rather than a learning experience. While you never want to disrupt your employees' workflow, stepping outside the box with interactive activities that are explained in advance shows users the value of these exercises, instead of making them feel that you're yet again wasting their time with another gift card scam.

“I find that there's the typical thing a lot of people do to hit compliance, which is having their users watch videos, and answer questionnaires. My feeling is that most people just try to get that done. Their goal is really to get it completed, so they can check the box and their company stops bothering them to complete it.”

 

You are given a magic wand and you are told you can wave it and change any one thing in cybersecurity you want to change. What do you change?

There’s so much in cybersecurity that Tim and Allan would love to change, especially when we look at cutting edge approaches to user awareness training. However, Tim makes one thing clear: if he could change anything, he would change our mindset. Instead of seeing security as just someone’s job, we should encourage our users to see themselves as an instrumental part of their company’s security. When everyone concerns themselves with following the right protocols and caring about security beyond simulations, companies will find themselves in a much stronger, less vulnerable place.

“I think ultimately, a lot of the weaknesses inside of our organization are our users. If I could just increase the level of carefulness, or the level of interest that everybody has in keeping their own companies secure, I think we would overall improve the posture of all companies.”

-------------

Links:

Learn more about Tim Silverline on LinkedIn and the Gluware website

Follow Allan Alford on LinkedIn and Twitter

Purchase a Cyber Ranch Podcast T-Shirt at the Hacker Valley Store 

Continue this conversation on our Discord

Listen to more from the Hacker Valley Studio and The Cyber Ranch Podcast



Transcript

Tim 00:00
Because in a previous company of mine, I was actually told by senior leadership inside of the company, that we could never have any phishing simulation that was sourced from anybody saying that it was HR. It didn't matter if there was an external banner on the top, it didn't matter if this employee didn't exist in the company, it didn't matter anything. It was just that we had such a high click rate for any email at all that had anything to do with HR at all, they're like, "No, you can no longer send any of those emails because people are now afraid to click on emails."
Allan 00:28
Howdy, y'all, and welcome to the Cyber Ranch podcast. That's Tim Silverline, VP of Security at
Gluware, who has held a variety of other roles as well, including CISO at a couple of places in a couple of different industries. Tim and I have been chatting about user awareness training and its nuances and technicalities. Come join us in this great conversation. And, Tim, thank you so much for coming on down to the Ranch.
Tim 00:28
Thanks for having me, Allan. I appreciate it.
Hacker Valley Studio 00:29
Welcome to the Cyber Ranch podcast, recorded under the big blue skies of Texas, where one CISO explores the cybersecurity landscape with the help of friends and experts. Here's your host, Allan
Alford.
Allan 01:15
As a quick disclaimer, I'm on the advisory board over at Living Security and, because our topic is security awareness training on this show, Living Security is going to come up a couple of times. It's not a commercial, I promise, they're not a sponsor of this show. And I just wanted to be totally transparent about my relationship with them, given that they do come up a couple of times. With that said, please enjoy the rest of the show. Alright, so why don't you get us started with just a little bit about your background in cyber and a little bit about your day job.
Tim 01:40
Yeah, sounds great. I've been in the cyber industry for close to 20 years now, started out in the network security space, back when Cisco Security was a big deal, and, you know, worked on consulting on all sorts of different networks. And then, over time, I went into some larger companies, and eventually, I was hired back by one of my customers to come be their CISO and build out their entire security program, and that's really what got me into the CISO field. And, you know, as you mentioned, I was the CISO for a couple of different companies until I wound up here at Gluware. The reason I came over to Gluware was because, after witnessing a few different cyber-attacks, I started thinking about some of the ways that I felt like there were some shortcomings in the industry, from a product perspective. I thought about actually going to start my own company and then, you know, after chatting with the CEO of Gluware, who I've known for a while, he convinced me to come over and help them build out and actually turn their product into a more full-fledged security product, in addition to what it does today, which is really intelligent network automation. And, you know, we focus on helping companies that have, let's say, tens of thousands of different network devices all across the world maintain consistency from a firmware perspective and a configuration perspective, as well as ensure that they're compliant with standards and things like that.
Allan 02:51
So, drift management for the network infrastructure. Absolutely. That's cool. That's cool, man, that's real cool. Alright so, user awareness training, you and I chatted a little bit before the show, and we kind of bounced some ideas back and forth here. I guess, let's start at the high level, like: what are the best methods of user awareness training? I know we've got phishing training, but there's also, and I have to disclaim I'm on their advisory board, but Living Security has the really cool virtual escape rooms. There's gamification, there's all kinds of other techniques, risk quantification, and cross-referencing to the tech stack, and all kinds of cool stuff going on. What, to you, are the biggest highlights, the high points, the critical bits of user awareness training?
Tim 03:27
Yeah, I had actually never heard of the Living Security escape room concept until you mentioned it to me. So, I looked into that a little bit and it sounds very interesting, because in my experience, the most successful user awareness trainings that I've been part of have always been live, in-person, very interactive trainings, right? You know, I find that there's the typical thing a lot of people do to hit compliance, which is they just have their users watch videos, and sometimes answer questionnaires, and that stuff, and my feeling is that most people just try to get that done, right? Their goal is really to get it completed, so they can check the box and their company stops bothering them to go and complete it. But when we've done it in-person and interactive, especially when you can do gamification and get people excited about what you're talking about, that has led to some really, really great outcomes. I've often even had, after those trainings, users come up to me and talk to me
about how they weren't aware of this particular risk and hearing about it in a real-world, use-case was very effective for them to really understand why it's important and why they should be behaving in a slightly different manner.
Allan 04:30
I get it, I get that, and that interactivity and that relevancy piece is there. What you were describing, that people just try to get through it and check the box, right? I call that the annual click fest. You get the training in front of you, you just start clicking right through until it stops and forces you to answer a question, and then you use what is hopefully common sense. You're probably right, you might be wrong occasionally and have to go back and re-click, but you can easily zoom through the whole thing without even thinking, right? That, I think, is the worst possible model. So, let's get into phishing simulations. Let's talk a little bit about that. How often do you think you should run phishing simulations with your user base?
Tim 05:05
So, I think it's an "it depends" answer. I've worked in a variety of different companies and what I would say is, in my current company, Gluware, we have generally a pretty intelligent user base that is aware of what they're doing. I have worked in other places, though, where the company wasn't necessarily focused on technology, or necessarily on security, and a lot of times it's focused on sales, and those users are not really that in tune with it. For those companies, I think you need to end up doing it, at least more frequently, until you get to a point where people are starting to be more cautious. So, I think it really depends on your user base and you can get a gauge on that, as you start conducting them, by the failure rate. Depending on that failure rate, I think that's really what should guide how often you should do it, because you want to get to a point where your users are pretty effective at identifying the
malicious emails.
Allan 05:54
Yeah, okay. I'm gonna give you my hot take on this one though. Phishing simulations, I'm a big believer in, should only be run with advanced warning. Attention users, you will sometime in the next week, get a fake email, that's going to be a phishing test, that's going to be us doing it, your security team, right? And I also think that, instead of focusing on failure rates, we should focus positively on the reporting rates. How often do users actually report the fake phish, right? So, I'm a big believer in giving them warning, so it's not a trap kind of thing, and focus on the positive, not the negative, so there's nothing punitive there and so, that you keep their morale and keep their confidence and their will in the game, if you will. Does that make sense?
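Allan's "focus on the reporting rate, not the failure rate" idea reduces to two simple ratios per simulation. The sketch below is purely illustrative; the action labels and the sample data are hypothetical, not taken from any real phishing platform:

```python
def simulation_metrics(results):
    """Summarize one phishing simulation.

    `results` maps each user to what they did with the fake phish:
    "reported", "clicked", or "ignored" (hypothetical labels).
    """
    total = len(results)
    reported = sum(1 for action in results.values() if action == "reported")
    clicked = sum(1 for action in results.values() if action == "clicked")
    return {
        "reporting_rate": reported / total,  # the positive metric Allan favors
        "click_rate": clicked / total,       # the traditional failure metric
    }

# Hypothetical results for a five-person team
metrics = simulation_metrics({
    "ana": "reported", "bo": "reported", "cy": "ignored",
    "di": "clicked", "ed": "reported",
})
print(metrics)  # {'reporting_rate': 0.6, 'click_rate': 0.2}
```

Tracking the reporting rate over time gives you a number that can go up, which fits the non-punitive, morale-preserving framing Allan describes.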
Tim 06:30
I absolutely agree with a lot of the concepts that you just described, and I think positive is always right. Positive enforcement for behavior, for pretty much anything in life, is typically better than negative enforcement. What I will say, though, is sometimes you have users that just care about their day job. Regardless of how much you try to give them positive reactions to what they're doing and how they're identifying those emails, if they don't care about it, and they just continue to click on things, there needs to be some other way that you can convince these users to start actually paying attention to what they do. So, that's the only thing: it would be great if companies could just do that positive stuff and people were on board and wanted that positive feedback, but I feel like, again, in certain companies, you just don't get that reaction unless there's something, not necessarily punitive, like you're gonna do something to them, but put them on a list and make sure at least people are aware that these are the users we need to work with to improve their behavior in identifying those emails and to stop clicking on them. I mean, I've been in companies where you do the phishing test, and they click on it, and they go put in their credentials on those sites, right? Sometimes, it's a high enough percentage that it's pretty scary to think about: this email came in and more than 30% of my user base just gave away their credentials. So, what do you do in those situations? I'm actually kind of curious.
Allan 07:55
Yeah, so, I'll tell you this before I answer what I do, I'm going to share another story with you on the absolute opposite end of the spectrum. When I was in the data services industry, I had a client who literally had a "three strikes and you're fired" rule. And it was enforced all the way up to the board level. The number one metric this CISO reported to his board was the phishing results, and if any one individual got phished successfully three times, they were gone. It was that draconian: perpetual hammering and constant challenging and testing. I think he did an every-three-months cadence. So, you know, three clicks would be three times in a year, or three times in nine months where you clicked, and you were out. So, that's the polar opposite of my approach, right?
Tim 08:33
Right, but it reset on an annual basis? Or, was just like a rolling record of every click?
Allan 08:38
Oh, yeah, it reset on an annual basis: three strikes within a single year. He reset the clock each year, so you could click three times over three years and be fine.
Tim 08:45
Yeah, your employees are too important to let somebody go over just that, especially if it's a hardline rule. Like, you'd let your best employee go because they were rushing to finish a project and clicked on something too quickly? That's too much.
Allan 08:56
Right, exactly. To your point, you were saying people are trying to do their jobs. They've already got a job, they've got a mission, they've got a job description, they've got a set of obligations and expectations towards the business that they're trying to tackle, and here you are, off to the side, annoying them with the security stuff, right? Like, I always try to keep that in my brain when I'm implementing anything with my user base, right? So, as to how I deal with those negative stories: if you truly have a Timmy or a Billy, and I always call him Timmy or Billy, the perpetual clickers, the ones where it's like, "Oh my god, dude, we just trained you five minutes ago and here you are, clicking again," right? If you've truly got some cases like that, I will work directly with their frontline manager and come up with a strategy jointly, and I hope I've only ever got one or two of those in a bigger organization and that it's not an endemic or systemic problem. I'll go to the frontline manager and just say, "Look, we've got to talk about Billy. Billy's got a clicking habit we have to break, and we have to work with him." I'll actually bring the frontline manager in so it's a sit-down. To your point, they care about their job, they care about their responsibilities, and if their manager is sitting in the room with them saying, "This is a big no-no for you," they're gonna listen to that person a lot more than a random guy, even with a CISO title. "Oh, whatever, you've got a C in front of your title." You're still some random guy showing up and trying to tell them who's who and what's what, and they'll just say, "Whatever, I've got a job to do," right?
So, I engage the frontline manager and work with them that way, and do it non-publicly and non-punitively, but definitely with a firm, "You need to improve this, this is important, this matters, and we are looking at you specifically, and we are measuring you specifically going forward," and definitely make it almost like a performance-plan kind of approach. But one on one, not a big public shame fest, right?
Tim 09:16
Yeah, that makes sense.
Allan 10:08
How about the boundaries we stay within, right? Like, when we're doing a phishing training, and this is where we're getting into some of the technicals now: should you change SPF records to allow the phishing solution to spoof your company domain, right? Even though that may not be possible in the real world, because you've got DKIM and SPF and all the goodness wired in, and you're set up for anti-spoofing and all that. Do you deliberately bypass your anti-spoofing controls to allow phish to be more effective and trickier? Is that one you do?
Tim 11:04
This is one that I'm actually starting to warm up to. Previously, I felt like, no, that's not something you should do, because it's not a real-world scenario, but I wonder if the reason that I felt that way previously is because of how susceptible users were in previous companies of mine to clicking on them. I almost wanted to make it easier for them, so they would notice it, but I feel like if you have a decent user base, and they're pretty good at this, it's a good practice to have them analyzing their emails a little bit more closely. I don't think it's over the line. I do like the idea you mentioned, about giving advance warning. I think that makes sense. So, you said you do it within a week; I wonder what the right timeframe is.
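For readers unfamiliar with the mechanics being debated here: an SPF record is a DNS TXT entry listing which senders are allowed to use a domain, and "changing SPF records to allow the phishing solution to spoof your domain" typically means adding the vendor's sending range to that list. Below is a deliberately simplified sketch of how a receiving server evaluates such a record. It handles only `ip4:` mechanisms and the trailing `all` qualifier; real SPF (RFC 7208) also covers `include`, `a`, `mx`, `redirect`, and macros. The record and IP addresses are hypothetical documentation values:

```python
import ipaddress

def spf_allows(record, sender_ip):
    """Simplified SPF check: only `ip4:` mechanisms and the
    trailing `-all` / `~all` / `+all` qualifier are supported."""
    ip = ipaddress.ip_address(sender_ip)
    for term in record.split()[1:]:  # skip the leading "v=spf1"
        if term.startswith("ip4:"):
            if ip in ipaddress.ip_network(term[4:], strict=False):
                return True   # sender is in an authorized range
        elif term in ("-all", "~all"):
            return False      # fail / softfail: not authorized
        elif term in ("+all", "all"):
            return True       # domain allows anyone (a bad idea)
    return False

# Hypothetical record: whitelisting one sending range, rejecting everything else
record = "v=spf1 ip4:192.0.2.0/24 -all"
print(spf_allows(record, "192.0.2.10"))    # True: inside the allowed range
print(spf_allows(record, "198.51.100.7"))  # False: blocked by -all
```

Whitelisting a simulation vendor would amount to adding another `ip4:` (or `include:`) term, which is exactly the real-world-fidelity trade-off Allan and Tim are weighing.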
Allan 11:45
Yeah, sometime in the next week is what I always said. I would send that on a Monday and the
phishing email would be out the door by Thursday. You want it still relatively fresh in their minds, I would say a week, and it would usually be 2, 3, 4 business days tops.
Tim 11:57
Generally, I think it's okay. I think the one thing I would probably move away from is trying to spoof individual users and make it look like it's coming from their email accounts. I know that there still is, theoretically, the risk vector that their account could be taken over and an attacker is actually sending email from a legitimate account that they've compromised, but I feel like you're almost aggravating users at that point, creating friction to the point where you're slowing them down in their daily activities, because now they're worried that every email could theoretically be that way.
Allan 12:27
Right, and it's not just the user perspective you have to keep in mind when you're saying that, because you're also talking about all the entities in the business that send out spoofed emails on purpose. Like, I remember the very first time I ever really got a good, aggressive anti-phishing training program in place, and was starting to track metrics, and was seeing a genuine improvement where reporting rates were going up, click rates were going down, everything was happening the way it should be. And then, my marketing department bought some new tool and sent out a company-wide thing that pretended to be the CMO, but was actually from the outside, and it was all done on purpose and it was all legitimate, but it completely undermined everything I was training my users on: if you get an email that looks like it's from Sally, except it's not from Sally, don't frickin' click. And this was a marketing email saying everyone needs to click and take the following survey kind of thing. And then, HR turned around and did the exact same thing with yet another system. So, I found myself constantly telling my users, "Don't click impersonated emails," and yet, my own business partners were constantly sending out impersonated emails. And so, finally, just like I sent the warning in advance on a phishing simulation, I worked with the other departments in the company to say, you're going to send warning in advance whenever you do this: "Hi, this is the real Sally CMO, you're about to receive a fake Sally CMO email, but it's legit. It's from me, it's part of this program, and blah, blah, blah, and please click," right? So, you get some sort of advance warning, a, "You're gonna see something fishy, but it's cool, this time it's legit and I'm telling you up front and ahead of time."
And if that kind of advance notice works on the deliberate spoofing, then I think you can keep your user base trained on "spoof bad," but if you don't have the rest of the business on your side in doing that, I think it just completely undermines the entire cause.
Tim 14:01
Right. Yeah, I think the one case I can see where the spoofing, again, makes sense, or the one way I would use it, is with emails that don't exist. It's coming from like, a support email that's not a real email inside of the company, that you never used to send anything out from. I can see using that and spoofing from that, just to kind of get your users to understand that this isn't a legitimate email. It looks like it might be coming from inside of the company, but you should know, we have our official communication channels and this is not one of them. You should at least check with somebody before you start interacting.
Allan 14:31
Let's pause right there and hear a brief word from our sponsor.
Hacker Valley Studio 14:34
When it comes to IT and security, we can all agree on two things: complexity is increasing, and the manual asset inventory approach no longer cuts it. It's time to adapt, and that's where Axonius comes in. Axonius correlates asset data from existing cybersecurity and SaaS solutions to provide an always up-to-date inventory, uncover gaps, and automate actions, giving you the confidence to control complexity. Sign up for a free walkthrough of the platform at Axonius.com/Get-A-Tour.
Allan 15:10
I've got a new employee on my team, he's only been with us for a couple of weeks, but he's already received three text messages from "the CEO" with the gift card scam: "Oh, I'm in a meeting, I need you to go get some gift cards." Yeah, okay, and these same things happen by email. It's amazing: with LinkedIn and everything else today, they find you, they figure out it's the new guy, they trick you, and they'll impersonate somebody. It's important, I think, for all the true somebodies in the company, the CEO, CFO, CMO, anybody with any rank who's going to be sending out regular emails, possibly spoofed, possibly legit, whatever, to set some boundaries and expectations. The CEO sends out an annual reminder, twice a year, or even a once-a-quarter reminder: you are never going to get an email from me about gift cards. That is simply not going to happen. Worst-case scenario, it might be my admin, but even then, that is incredibly unlikely. Just sort of set some expectations there as well, right?
Tim 16:08
Yeah, absolutely. I don't know any CEOs that ask their employees to buy them gift cards.
Allan 16:12
Right. I've never seen it in the real world ever. So, alright, if the users never fall prey to these attacks, is there a reason to continue performing them? Like, if the users are not getting caught by real phish, is there a reason to continue sending fake phish?
Tim 16:29
Yes, but with much less frequency, and the reason why is because typically, you're still getting new employees, right? So, if you've tested all your existing employees, and they haven't fallen or been susceptible to it, that doesn't mean that the next employee you hire is also going to be of that same mindset. And so, I think it still is useful from that perspective. And then, the other thing is insurance companies; cyber insurance companies are looking for you to do this kind of stuff now. Sometimes it's written into your ISO compliance policies, or other various regulations that you might be following. So, yeah, there are reasons to continue doing it. Also, a potential customer may have you fill out a security questionnaire, and again, those are the sorts of things that sometimes you get asked, and so it just generally makes it look like you have a more mature security program if you're doing those kinds of things, even if you know that it might not be moving the needle that much in terms of overall security posture for your company. I still think you should do it, minimally at least once a year, right? Just kind of double-check your employee base, and again, hit some of those regulations and be able to answer those questions a little bit better.
Allan 17:37
And you're right about the cyber insurance policies, I hadn't thought about that one. They're getting so much more prescriptive these days, and insisting on so much more granular and specific action. It's no longer, "Do you have a firewall?" It's, "How is it configured?" right? It's no longer, "Do you have endpoint protection?" It's, "Which brand have you chosen? Pick one of these three." I'm seeing more and more prescription coming out of these cyber insurance policies, which reminds me, in terms of that kind of regulatory compliance type stuff, I was, again, in the data services industry and we had so many questions coming in from folks about antivirus, and we were on an EDR solution. We ran into a customer who flat out refused to acknowledge or accept that EDR existed; their policy said antivirus, you better darn well have antivirus. So, we turned on the free antivirus in Windows and complemented our EDR solution with that, simply so we could check a box, for no reason other than, "Yes, we have that, too, check the box. Look, there's antivirus out there, now go away." It was literally a box check.
And so, I think if your security awareness training is getting driven by that, it's maybe gone too far, but I think you're right. I think the regular recurring reminders, even if the crowd is doing what you need them to do, it doesn't hurt to have the refresher on occasion, right? So, how about clicking links on legitimate emails, right? We already talked about marketing and HR sending these automated systems from outside the company, posing as them, like there's that whole willful spoofing on purpose, business-aligned spoofing, but then there's also just links, period, right? I mean, how many emails today don't have a link in them? And how do you possibly get on top of that, when you're talking about security awareness training?
Tim 19:06
So, I think one of the reasons that I came up with this question, or I thought of this question was because, in a previous company of mine, I was actually told by senior leadership inside of the company that we could never have any phishing simulation that was sourced from anybody saying that it was HR. It didn't matter if there was an external banner on the top, it didn't matter if this employee didn't exist in the company. It didn't matter anything. It was just that we had such a high click rate for any email at all that had anything to do with HR at all. They're like, "No, you can no longer send any of those emails because people are now afraid to click on emails." And I thought that was a ridiculous response personally, and I was like, "We have a problem. This has identified a problem, and what you're telling me is that we should go around this and basically create a workaround, so that our users no longer have to address the actual root of the problem." I thought that was insane. And so, personally, I think that, it's good for users to spend 3 seconds mousing over the link and seeing where it's going. I don't think it really disrupts their day that much, or slows down their progress or their efficiency that much at work, to be a little bit more thoughtful about looking at the emails.
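Tim's "spend 3 seconds mousing over the link" habit boils down to comparing a link's visible text with its actual destination. The sketch below automates that comparison with only the standard library; the email snippet and domains are hypothetical documentation values, not a real phishing sample:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect (display text, actual href) pairs from an HTML email body,
    mimicking the 'mouse over the link' check described above."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.suspicious = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            shown = "".join(self._text).strip()
            # Flag links whose visible text looks like a URL but
            # doesn't match where the link actually goes.
            if shown.startswith("http") and not self._href.startswith(shown):
                self.suspicious.append((shown, self._href))
            self._href = None

# Hypothetical email body impersonating an HR portal
audit = LinkAudit()
audit.feed('<a href="http://evil.example/login">https://hr.example.com</a>')
print(audit.suspicious)  # [('https://hr.example.com', 'http://evil.example/login')]
```

A mail gateway does far more than this, of course, but the core mismatch check is the same thing a user performs by hovering.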
Allan 20:19
Yeah, no, I agree. People will retort to that one about the phone: "Well, everyone's on their phones these days." You know what, if you see the email on your phone, and it looks even halfway dicey, wait till you're on a computer and do the hover-over, right? You can wait 30 minutes to get on your computer, right? I mean, just some basic slowdown, check it out. I agree with you. I think that stuff is just some basic human behavior required. Given the logic you were just presented with, the metaphor I came up with was: you're walking in the backyard at night with a flashlight, looking for the dog poo, and someone says, "Don't shine a light on that poo and I won't step in it," right? I mean, that's what they're saying: "Turn that flashlight off, we might step in dog poo, wait a minute." So, that seems to be the logic they're espousing there. That's my metaphor, the best I came up with on the fly. So, how about ineffective methods? What ineffective methods are there of security awareness?
Tim 21:04
Well, I think we talked in the beginning about the point-and-click, video-quiz thing, and in my opinion, I just don't think it's effective at all. I think that, again, people just try to get it over with. And so, it checks the box a lot of the time, because, again, cyber insurance and different compliance standards and cyber questionnaires all want you to be doing the security awareness training. So, you check that box, but it doesn't tend to accomplish a whole lot. Yeah, I'd actually be interested: you're involved with this at some level, on their board. Can you talk a little bit more about the escape rooms and how those work and your experience with that?
Allan 21:37
Yeah, you were talking about how physical and in-person used to be the fun part, and that's how these escape rooms started: they literally toured around and did physical escape rooms. You would sign up and have six or eight employees in a room, and it was literally like one of those real escape rooms, where you had to figure out the clue and open the drawer and find the thing, and that's the next clue, and that leads into the next, but it was all about cyber, and every little lesson and every little clue was something about cyber. So, you were given some brief lessons upfront and then you had to use what you learned from those lessons to get out of the escape room. They virtualized that and have it now online in a virtual format, and again, you form teams, and if you're gonna organize it correctly in the company, there's a couple of ways to do it, right? You can say, "It's HR versus marketing," right? You can set up that kind of rivalry, or you can deliberately pick a team of folks from all over the company and say, "You eight, who've never even been together before, you're now joined at the hip, trying to escape this virtual escape room," right? So, you know, there's those kinds of dynamics. And then, the other thing, and I don't want this to turn into a Living Security show, but there's some really cool new stuff where they're actually starting to talk to the tech stack and harvest data and add to the human element of measurement, by way of: How does the human interact with the tech stack? So, imagine you're a human being in a shop, and that shop has firewalls, web browsers, EDR, antivirus, all these various tools. Your specific and personal interactions with those tools could be calculated and contribute
towards an overall score in assessing your security awareness and posture. Training could be targeted based on those specific things, right? So it's that kind of idea as well, right? I think that's where security awareness is headed.
Tim 21:39
Cool. You also mentioned earlier risk quantification and how that interacts with security awareness training. I didn't follow that one. Can you go into that a little bit more?
Allan 23:19
Yeah, so, at the end of the day, why are we doing user awareness training? Because at the end of the day, we're saying there's some sort of a risk represented here, right? Let's say I've got exactly 100 employees, we'll use nice round numbers. I've got 100 employees, and 4 of them are recurring clickers, 2 of them click occasionally, and the other 94 seem to never click at all, and they all seem okay. Good, so we're in fairly decent shape. What is the risk, the actual risk to the business, represented by those 6 clickers, especially with 4 of them being recurring? What are their roles in the company? Are you simply just deploying security awareness training? Or are you saying to yourself, "Well, these two guys are in finance, this guy's in payroll, this gal has got access to this sensitive medical data"? To me, you have to start not just blindly throwing security awareness training out there, but actually tailoring it specifically to: Who is the person? What data do they have access to? What is their
role in the organization? If a CEO mis-clicks something, or a CFO mis-clicks something, that's a lot more critical than if the guy in the mailroom does it, right? So, there's a certain amount of risk quantification that you should be doing, in my mind, as you roll out a security awareness training program, to not just blanketly and blindly do the same thing, because, again, it comes back to how you report. Okay, great, so click rates are down and reporting rates are up, but if the people reporting are all ones where really no damage would occur, and the handful where great damage would occur are not the ones reporting, you've got a much different problem than the reporting rates alone suggest, right? That's kind of where I get to with that, and I'm always trying to find, if anybody's got one, a clever way of dealing with that intersection: it's not just that we're gonna blindly roll this out to the whole shop, but we're going to specifically think about who this user is. And that's kind of that Living Security thing, right? Where they're going in and saying, "It's your interactions with the tech stack." Well, what I'm saying is, "It's your interactions with the business." That's a variable and a factor that should be on the table as well.
Tim 25:02
Yeah, that's an interesting point. I mean, I'm not aware of any of the user awareness training solutions where you can go and assign a criticality score to the individuals inside of the organization, but it'd be interesting if you could do that on your own and base it on either the department or the level of title that they're at, and then use that to generate some additional statistics around that risk quantification, based upon the criticality of the user and their click rate.
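The do-it-yourself scoring Tim is imagining could be sketched roughly like this. Everything in the sketch is a hypothetical illustration: the department weights, the formula, and the `user_risk` function are invented for the example, and no awareness-training product is known to expose this exact model.

```python
# A minimal sketch of criticality-weighted phishing risk, as discussed above.
# All weights and figures are hypothetical illustrations.

# Hypothetical criticality weights by department (higher = more sensitive access).
CRITICALITY = {
    "payroll": 1.0,
    "finance": 0.9,
    "executive": 1.0,
    "marketing": 0.4,
    "mailroom": 0.1,
}

def user_risk(department: str, click_rate: float, report_rate: float) -> float:
    """Combine a user's simulation click rate with the criticality of their role.

    click_rate and report_rate are fractions (0.0-1.0) from phishing-simulation
    results; reporting suspicious email offsets some of the clicking risk.
    """
    criticality = CRITICALITY.get(department, 0.5)  # default weight for unknown roles
    behaviour = max(click_rate - 0.5 * report_rate, 0.0)  # reporting offsets clicking
    return round(criticality * behaviour, 3)

# Allan's point: at the same click rate, a recurring clicker in payroll
# represents far more business risk than one in the mailroom.
print(user_risk("payroll", click_rate=0.4, report_rate=0.1))
print(user_risk("mailroom", click_rate=0.4, report_rate=0.1))
```

The same per-user scores could then be aggregated per department to decide where targeted, custom-tailored training is worth the effort.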
Allan 25:30
Yep, the first time in my life that I was given free identity protection, 2 years' worth for free, was because a guy in payroll got a phish, thought it was legit, and sent the entire name, salary, address, social security number, et cetera, of all North American employees out the door to somebody who turned out to actually be, I think it was in Nigeria, posing as the CEO.
Tim 25:53
That wasn't the Seagate one, was it?
Allan 25:54
No, no, no, no, no, no, I won't say where this was. I worked there, I was an employee, and I was one of the people whose info went out the door, all because one person in payroll, who had access to that most sensitive and confidential employee data of all, fell for a phish. I'm way more worried about that guy clicking than I am about, I don't know, whatever, the guy in the mailroom clicking, right? So, that's where I think we need to get into some risk quantification aspects of this and not just blindly be always talking about user awareness. It's like, "Well, let's train and target, let's ramp up our efforts with these people. Let's have custom tailored solutions for that department," that kind of thing. Last question, and I changed it up on you because, literally as of this morning's recording, we have a new final question for every guest. It used to be: What have you learned outside of cybersecurity that's helped you in cybersecurity? It's a new one now.
So, this is gonna surprise you, you're not prepared. This one's gonna be off the cuff, let's see what you come up with. You are given a magic wand and you're told you can wave it and change any one thing in cybersecurity you want to change. What do you change?
Tim 26:56
This kind of goes along with, I would say, our talk track here. I would like to make every individual user actually care about their own cybersecurity as it relates to the overall security of the company. And I'm not saying that the users inside of my company don't care about that today, but I've been in lots of places where it hasn't been something that mattered to them, and I think ultimately, a lot of the weaknesses inside of our organizations are our users. It's the administrative users configuring the products, or it's the end users doing their day job and not worried about what they could be causing from a risk exposure perspective. If I could just increase the level of carefulness, or the level of interest, that everybody has in keeping their own companies secure, I think we would overall improve the posture of all companies.
Allan 27:48
I like that. I like that. Just if everybody's investment ratcheted up just a little bit, that result would be powerful. I like that a lot. I like that. Well, Tim Silverline, thank you so much for coming on down to the Ranch. Thank you, listeners, y'all be good now.
