The cyber awareness industry - phishing simulators and such - is almost all complete garbage, just so y'all know.

If you look at your proxy logs, you'll quickly discover you've got entire departments whose job involves opening links and documents from people unknown (also almost every manager does it, when reviewing CVs etc).

If your security depends on nobody clicking a bad link, security and IT fucked up their jobs, and awareness training is just a sticking plaster on your own poor choices.
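
A quick way to test this against your own environment is to summarise the proxy logs yourself. Below is a minimal sketch; the CSV layout and column names (user, department, dest_domain) are hypothetical stand-ins for whatever your proxy actually exports.

```python
# Minimal sketch: summarise a web proxy log export to see which departments
# routinely browse to external domains nobody in the org has visited before.
# The CSV columns (user, department, dest_domain) are hypothetical stand-ins
# for whatever your proxy (Squid, Zscaler, etc.) actually exports.
import csv
from collections import defaultdict

def first_seen_domains_by_department(log_path: str) -> dict[str, int]:
    seen: set[str] = set()                  # domains already observed anywhere
    counts: dict[str, int] = defaultdict(int)
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):       # expects user, department, dest_domain
            domain = row["dest_domain"].lower()
            if domain not in seen:
                seen.add(domain)
                counts[row["department"]] += 1
    return counts

if __name__ == "__main__":
    summary = first_seen_domains_by_department("proxy_export.csv")
    for dept, n in sorted(summary.items(), key=lambda kv: -kv[1]):
        print(f"{dept}: {n} first-time external domains")
```

Run against a day of logs, the departments at the top of that list (recruitment, procurement, support) are usually the ones whose job is opening things from strangers.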

@GossiTheDog it was a good general idea that went horribly wrong, just beyond the pale in practice

@hacks4pancakes yeah. It would be awesome if orgs got into a culture of educating people that they will be targeted and how to report things when they've been seriously targeted... but the number of orgs I've seen who've spent $100k+ a year on phishing training services and then have nobody actually report MFA bombing etc. (as nobody knows how, or thinks they need to) is insane.

It should be an internal culture thing, not a tick box service.

@GossiTheDog I have had clients that just blanket fire anyone who clicks. Doesn’t matter their tenure or role.

@hacks4pancakes

Here it's simply smaller bonuses, but the same admission of failure.

@GossiTheDog

@hacks4pancakes yeah.. I've had two jobs with opposing views.

One: during security incidents, nobody would ever admit to clicking on anything, doing anything or talking to anybody. Why? Poor culture, firings. As such, we couldn't manage security incidents, because human intelligence about what happened is good intelligence.

The other: almost everybody admitted what happened, security incidents were much easier, and security culture was great, as no end users got blamed and the architecture got fixed.

@GossiTheDog you can educate users about threats without being a dick, for sure

@hacks4pancakes @GossiTheDog
my observation on phishing training is that it only serves to create 'numbers that can go up' and that's all that matters for some types of leaders. numbers must go up, that is all. (or down, in this case)

@hacks4pancakes @GossiTheDog and, oddly enough, people learn better when you have a collaborative relationship rather than an adversarial one.

@GossiTheDog @hacks4pancakes One of the most interesting talks I’ve seen in recent years was about how an org had applied their overall culture of zero blame in safety incidents to security as well.

Just as the driver for safety is to use incidents as a learning exercise, taking the same approach with security incidents led to an environment where people felt safe to report mistakes, knowing that they wouldn’t be reprimanded or suffer any consequences.

Result?

Everyone feels comfortable even reporting issues proactively, which has put the response teams on the front foot and measurably improved their overall security posture.

They also scrapped the fake phishing campaigns, which were felt to be creating a feeling of “they’re always trying to trick us or catch us out.”

@richardstocks @GossiTheDog @hacks4pancakes

My most severe criticism of most corporate fake phishing test campaigns is that the same orgs can't be bothered to make their actual, legitimate corporate email campaigns clearly distinguishable from phishing.

When you force us to take training to identify phishing, you should also consider how your corporate campaigns match (or not) the "signs of phishing" in your training!

@JeffGrigg @GossiTheDog @hacks4pancakes

That’s another good one. We get all manner of official internal comms that for some reason still obfuscate links through trackers.

Sender address that doesn’t match our domain? Check!
Imparts a sense of urgency? Check!
Wants me to click on an unverifiable link? Check!

But it’s my fault if I don’t read the content that looks like spam…

@JeffGrigg @richardstocks @GossiTheDog @hacks4pancakes one of my pet peeves. And often it takes actual engineering work that nobody will approve in order to fix it.

Even if that weren't the case, somehow we're supposed to know the specific ways each company will interact with us. Otherwise, we won't realize that something is out of the ordinary. There need to be some industry standards around this stuff just so that people can't be so easily fooled.

It's as if every city had a radically different design for road signs and traffic signals. How are you supposed to know whether you're expected to stop or go? Or which side of the road to drive on? If you can't easily figure it out, you'll get it wrong, with potentially devastating consequences.

@hacks4pancakes @GossiTheDog Sounds like a good reason to tailor the exercise to things senior management folks are most likely to click on ('your Ferrari registration {their registration} is overdue a service and insurance cover will become invalid unless you...').

@hacks4pancakes @GossiTheDog I ensure all phishing campaign data is anonymized because this is a fear of mine.

"Well how can we know who needs remedial training if you won't tell us who fell for the phish?"

"I refer you to the RoE you signed, which clearly stated that names of specific individuals would not be stored or even captured. I would suggest a general awareness training, as this will assist those who need assistance, while reinforcing the success of those who saw through the email."

Gophish absolutely keeps track of that data if you ask it to, but fuck that noise. I'll be goddamned if some middle management puke takes my findings and uses them to fire people so he can retain a yearly bonus.
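
For anyone wanting to do the same with Gophish, a minimal sketch of the anonymised-reporting idea is below: pull campaign results over the REST API and keep only aggregate counts per status, dropping identities on the spot. The endpoint and field names follow the Gophish API docs as I understand them, and the URL and key are placeholders; verify against your own instance and version.

```python
# Minimal sketch: aggregate Gophish campaign results per status ("Email Sent",
# "Clicked Link", etc.) without ever keeping per-person data. Endpoint and
# field names (/api/campaigns/<id>, "results", "status") are taken from the
# Gophish API docs and should be checked against your version; URL and key
# below are placeholders.
from collections import Counter

import requests

GOPHISH_URL = "https://localhost:3333"   # placeholder admin URL
API_KEY = "REDACTED"                     # placeholder API key

def campaign_summary(campaign_id: int) -> Counter:
    """Return counts per result status, with no names or email addresses."""
    resp = requests.get(
        f"{GOPHISH_URL}/api/campaigns/{campaign_id}",
        params={"api_key": API_KEY},
        verify=False,  # Gophish ships with a self-signed cert by default
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    # Keep only the status field; identities are discarded immediately.
    return Counter(r["status"] for r in results)

if __name__ == "__main__":
    for status, count in sorted(campaign_summary(1).items()):
        print(f"{status}: {count}")
```

The aggregate numbers are all you need to justify general awareness training, and there is nothing in them for anyone to weaponise against individuals.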

@GossiTheDog @hacks4pancakes I was in an org that started down the education route and then got made to switch to punishment by the CIO and his primary stooge, the head of ops. Not that long after, the ops manager clicked a phish.

No consequences for him, unlike the other poor sods who did so.

@GossiTheDog @hacks4pancakes Always seems to happen when something becomes a compliance mandate: phishing prevention becomes a cost center to minimize rather than a security risk to mitigate. 🤷‍♂️

@GossiTheDog @hacks4pancakes believe it or not, we actually have users reporting MFA bombing. Obviously, technical measures are in place to catch this and take automatic action.
Most orgs I have seen are just too scared to do this.
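
A minimal sketch of what "catch this and take automatic action" can look like: flag any user who gets a burst of denied or unanswered MFA push prompts in a short window. The event format here is hypothetical; in practice you would feed this from your IdP's sign-in logs and wire the alert into whatever lockout or response playbook your org uses.

```python
# Minimal sketch of MFA-bombing detection: alert on any user with an unusual
# burst of denied/unanswered MFA push prompts within a short window. The event
# tuples below are a hypothetical format standing in for real IdP sign-in logs.
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=10)
THRESHOLD = 5  # rejected/ignored prompts within WINDOW before we alert

# Hypothetical events: (timestamp, user, mfa_result)
events = [
    (datetime(2024, 5, 1, 9, 0, 0), "alice", "denied"),
    (datetime(2024, 5, 1, 9, 1, 0), "alice", "denied"),
    (datetime(2024, 5, 1, 9, 1, 30), "alice", "timeout"),
    (datetime(2024, 5, 1, 9, 2, 0), "alice", "denied"),
    (datetime(2024, 5, 1, 9, 2, 30), "alice", "denied"),
    (datetime(2024, 5, 1, 9, 3, 0), "alice", "denied"),
    (datetime(2024, 5, 1, 10, 0, 0), "bob", "approved"),
]

def detect_mfa_bombing(events):
    """Yield (user, count, window_start) for bursts of failed/ignored prompts."""
    failures = defaultdict(list)
    for ts, user, result in sorted(events):
        if result in ("denied", "timeout"):
            failures[user].append(ts)
    for user, times in failures.items():
        for i, start in enumerate(times):
            burst = [t for t in times[i:] if t - start <= WINDOW]
            if len(burst) >= THRESHOLD:
                yield user, len(burst), start
                break  # one alert per user is enough for this sketch

for user, count, start in detect_mfa_bombing(events):
    print(f"ALERT: {user} had {count} failed MFA prompts since {start:%H:%M} - possible MFA bombing")
```

The automatic action that follows (blocking sign-ins, revoking sessions, paging the on-call) is the part most orgs are too scared to turn on.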

@nieldk @GossiTheDog @hacks4pancakes Doesn't help when Windows Update periodically MFA bombs you.

@AMS @GossiTheDog @hacks4pancakes that won’t happen in a proper org setup, with solutions like MDM/Intune or similar in place. ;)

@nieldk @GossiTheDog @hacks4pancakes something I really push my teams to do is take the time to not only thank the reporter but, when it's a false positive that isn't reasonably detectable as such without InfoSec tools, give them an attaboy for correctly reading the signals. Sometimes they feel like they're bothering us. We need them to feel great about "bothering us" in those situations.

@GossiTheDog @hacks4pancakes I’m *starting* to see orgs care about the number of people who report (so we can make defenses better!) instead of who just clicks.

But that’s still the minority.

@GossiTheDog Or how much people are being *trained to* use their corporate credentials to log in to all and sundry. Including in random windows that pop up and give no indication of where they are coming from.

@hacks4pancakes

@GossiTheDog @hacks4pancakes I always try to encourage people to report suspicious emails to IT.