
I once worked in an environment where adding users to Active Directory privileged groups was forbidden except via an official request approved by the individual's manager. This was carefully monitored, and on one occasion an email went out to a massive group of people stating that the policy had been violated, directly naming the person who had allegedly updated a group without permission.

SEE: Security incident response policy (TechRepublic Premium)

Several managers admonished the sender for calling out the alleged perpetrator, and one produced the very request that authorized the change, exonerating the individual and causing embarrassment for the accuser, who did apologize. However, that entire email thread should have been a face-to-face, private discussion with the employee and their manager.

This episode illustrates the wrong way to go about cybersecurity. Another example is "gotcha" testing: sending company-originated phishing emails to internal recipients to see whether they can be tricked into clicking links, which then take them to a page scolding them for falling for the content. That approach simply builds a wall between end users and the IT/security departments, making users less likely to respect those groups. Positive reinforcement is the key to encouraging employees to want to comply for their own good and that of the company, rather than out of fear of retribution or embarrassment. Even simple recognition from management for reporting phishing emails or completing training can be enough to build a positive environment that promotes cybersecurity principles across the organization.

Experts in cybersecurity agree. Sai Venkataraman, CEO at SecurityAdvisor, a security awareness training and automation company, said: "Cybersecurity culture is nearly impossible to quantify due to an absence of measurement tools. Many businesses attempt to quantify the human element of their security posture by sending employees simulated attacks to demonstrate how susceptible workers are to phishing, social engineering, spoofing and other types of hacks. The flawed logic security leaders use to justify these tactics is that simulations help identify high-risk users and secure additional budget. However, the negatives may outweigh the benefits, as simulations embarrass workers and position security teams as antagonists rather than allies."

SEE: How to manage passwords: Best practices and security tips (free PDF) (TechRepublic)

Venkataraman said embarrassing people is pointless. “Embarrassment rarely accomplishes anything positive, and from a security perspective, has been thoroughly discredited. Phishing simulations and other ‘Gotcha!’ security training attacks are an example of shame culture. Experience has taught us that attacking our employees doesn’t increase cyber-resilience as much as it positions the internal IT teams negatively in the eyes of the organization’s employees, making it more challenging to get people on board with strategic initiatives. If anything, these boring training sessions make employees less likely to view the IT team as a force for good within the enterprise. The best security leaders implement tactics and technologies that create a frictionless experience for employees.”

Rather than trying to shame and then coach employees, IT and security leaders should create a frictionless security strategy intended to support workers during their greatest time of need, Venkataraman said. “‘Cookie-cutter’ approaches to security training don’t work over a long period of time. This approach often does not target at-risk users when a potential attack is in progress or is executed with enough frequency to remain top of mind for employees.”

SEE: Working at a safe distance, safely: Remote work at industrial sites brings extra cyber risk (TechRepublic)

Johanna Baum, founder and CEO of Strategic Security Solutions, a provider of information security consulting services, agreed. “Shame is always a bad way to motivate an individual or the masses. It doesn’t work for your kids (we’ve all tried), and it doesn’t translate well to any other population. It might trigger some short-term responses, but fosters long-term resentment and a pent-up stockpile of ill will.”

She offered a different way. “The approach should be to increase overall learning and the individual threat intelligence of every user. It’s hard, it requires significant patience, but is way more effective than setting a trap and full-scale mockery of the transgressor. No one wants to publish their internal cybersecurity test results.”

The general security intelligence of average users and executives is fairly low, so it's rare to see anyone airing their dirty laundry, she said. "Openly discussing security initiatives, assisting your team in internalizing the global impact and promoting wide-scale security evangelism as an organizational imperative, rather than an IT mandate, goes a very long way to securing the organization—certainly much further than the fired employee who was the poster child for the failed shame game phishing test."