How to Develop a Human Centric Security Policy

Written by isaac-kohen-teramind | Published 2020/02/13
Tech Story Tags: data-security | rsa-conference | security | cybersecurity | data-privacy | human-centric-security-policy | security-policy

TLDR: Isaac Kohen, VP of R&D at Teramind, looks at the human elements of cybersecurity: privacy, ethics, usability, and responsibility. An IBM study found human error to be a major contributing cause in 95% of cybersecurity breaches. Autonomous systems such as employee monitoring, UEBA, and DLP can secure endpoints, but they must be deployed without inadvertently capturing employees' personal data and exposing the organization to privacy violations. Security, he argues, shouldn't compromise user freedom and creativity.

The Human Elements of Cybersecurity: Privacy, Ethics, Usability, and Responsibility

Along with my team, I have been a passionate supporter of the annual RSA Conference. It's a place where top cybersecurity leaders and community peers come together to exchange the biggest, boldest ideas that help propel the industry forward. I like that RSA conferences feature a key theme predicated on an industry movement, contribution, or idea with the potential to significantly impact or disrupt the status quo. This helps professionals like us focus on the most pressing trends affecting the industry. This year's theme is the "human element," a topic I value highly.
Information security professionals often interpret the human component of IT as "human fallibility," the weakest link in a company's data security apparatus. You can't blame them. In many cases, cybersecurity incidents are enabled by human error, malicious intent, or ignorance. In fact, according to a study by IBM, human error is a major contributing cause in 95% of cybersecurity breaches. It therefore makes sense that the industry is increasingly investing in technologies, strategies, and standards that minimize these human risks. It's one of the primary reasons that behavior monitoring, insider threat detection, and data loss prevention tools are designed to reduce threats from both malicious and accidental human actors.
However, this isn’t a diatribe about the obvious predicament facing today’s data security landscape. Instead, I’ll look from the other side of the human equation: the users we are supposed to guard. Humans aren’t just resources that you can force to comply with security best practices. We have feelings, concerns, and needs. An effective security strategy will need to address these human elements.
For example, if you implement a strong password security policy without addressing the human tendency toward convenience, people will find a way to bypass the rule. They will write their passwords down in plain text, save them in the browser, or reuse the same passwords on unsanctioned or personal sites. You will need to provide an efficient option, such as SSO, a key vault, or another tool that lets them manage their passwords easily.
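To make the convenient path also the secure path, give people tooling that does the heavy lifting for them. Here is a minimal sketch in Python of that idea, assuming the third-party keyring package and using illustrative service and account names: generate a strong random password and keep it in the operating system's credential vault rather than in the browser or on a sticky note.

```python
# A sketch, not a complete solution: generate a strong password and store it
# in the OS credential vault via the third-party `keyring` package
# (pip install keyring). Service and account names are placeholders.
import secrets
import string

import keyring


def generate_password(length: int = 20) -> str:
    """Build a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))


def store_credential(service: str, username: str) -> None:
    """Create a strong password and keep it in the system key vault."""
    keyring.set_password(service, username, generate_password())


if __name__ == "__main__":
    store_credential("internal-crm.example.com", "jane.doe")
    # Later, the password is pulled from the vault instead of being retyped:
    print(keyring.get_password("internal-crm.example.com", "jane.doe"))
```

Paired with SSO where possible, this removes the incentive to reuse or write down passwords in the first place.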
Similarly, let's consider workplace monitoring. Many companies use these services to improve productivity and to reduce insider threats and data leaks. However, if you ignore employees' right to privacy, you risk legal ramifications, not to mention cultural rifts, loss of trust, and many other issues that will outweigh any security benefit you achieve. In other words, you need to adopt solutions and policies that deliver not just functional security but also inclusion. Let's take a look at how this is accomplished.
Privacy
In recent years, data privacy has become the topic of conversation among cybersecurity professionals because of the introduction of GDPR, CCPA, and similar laws. On the one hand, you need to protect your customers' data, your intellectual property, and your business secrets from external and insider threats. At the same time, you have an obligation to uphold your employees' privacy. The solution is to use autonomous systems, such as employee monitoring, UEBA, and DLP, to implement endpoint security, but to do so without inadvertently capturing employees' personal data and exposing yourself to privacy violations. For example, suspend monitoring and keystroke logging when users visit their bank's website or access their personal email accounts, and use anonymization or smart blackout features to redact PII, PFI, PHI, and other private data. This can be a bit tricky and requires modern solutions with such capabilities.
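As a rough illustration of what that looks like in practice, the Python sketch below makes two checks before anything is persisted: it skips capture entirely on personal and financial sites, and it redacts obvious PII from whatever it does keep. The domain list and regular expressions are illustrative assumptions, not any vendor's API.

```python
# Privacy-aware capture, simplified: suspend logging on personal sites and
# redact common PII patterns before anything is stored. The domains and
# regexes below are illustrative, not exhaustive.
import re
from urllib.parse import urlparse

SUSPEND_DOMAINS = {"mybank.example.com", "mail.google.com", "outlook.live.com"}

PII_PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[REDACTED-SSN]"),          # US SSN-style numbers
    (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "[REDACTED-CARD]"),        # card-like digit runs
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[REDACTED-EMAIL]"),  # email addresses
]


def should_suspend_capture(url: str) -> bool:
    """Stop keystroke and screen capture entirely on personal or financial sites."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in SUSPEND_DOMAINS)


def redact(text: str) -> str:
    """Blank out common PII patterns before a capture is persisted."""
    for pattern, placeholder in PII_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text


if __name__ == "__main__":
    print(should_suspend_capture("https://mybank.example.com/login"))  # True
    print(redact("Reach me at jane.doe@example.com, card 4111 1111 1111 1111"))
```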
Ethics
While data security is undoubtedly a good thing, it's also a nuanced issue that can present companies with an ethical dilemma. On the surface, you are protecting your organization, customers, and employees from a devastating data loss event. In reality, things aren't as black and white: it's easy for motivations to get muddled when working to protect customer data.
For instance, employees might wonder why you are implementing specific security measures or monitoring initiatives. Is it because you want to increase workplace productivity? Do you truly need to scan their emails to achieve that? While the goal of data security is ethical, the defensive measures need to be appropriate. Defining the purpose of monitoring and security, and establishing boundaries and transparency protocols, is key to avoiding such ethical pitfalls.
Usability
Security shouldn't compromise usability. Instead, it should enable freedom and creativity. Fortunately, with the introduction of machine learning/AI, NLP, context-based classification, and other software advances, companies can balance security and usability. However, you still need to spend time configuring those solutions or training them with enough data to minimize false positives. In addition, the success of your security initiative will suffer if you block a workflow without offering an alternative. For example, you might think blocking the use of cloud drives is a sensible precaution. However, if you don't provide another channel, such as a private cloud or a 'cloud-like' solution like Transporter or Space Monkey, employees will most likely share those files over email, USB drives, or other less secure methods, ultimately making your security policy even harder to enforce.
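To make the "block, but offer an alternative" idea concrete, here is a minimal policy-as-code sketch in Python. The blocked hostnames and the internal share URL are hypothetical; the point is that a refused upload comes with a pointer to the sanctioned channel rather than a silent denial.

```python
# Policy-as-code sketch: refuse uploads to unsanctioned consumer cloud drives,
# but always point the user to the approved internal share. Hostnames and the
# sanctioned URL are illustrative assumptions.
from dataclasses import dataclass
from urllib.parse import urlparse

BLOCKED_HOSTS = {"drive.google.com", "dropbox.com", "onedrive.live.com"}
SANCTIONED_SHARE = "https://files.internal.example.com"  # hypothetical private cloud


@dataclass
class UploadDecision:
    allowed: bool
    message: str


def evaluate_upload(destination_url: str) -> UploadDecision:
    """Decide whether a file upload destination is allowed by policy."""
    host = urlparse(destination_url).hostname or ""
    if any(host == h or host.endswith("." + h) for h in BLOCKED_HOSTS):
        return UploadDecision(
            allowed=False,
            message=f"Personal cloud drives are blocked. Please use {SANCTIONED_SHARE} instead.",
        )
    return UploadDecision(allowed=True, message="Upload permitted.")


if __name__ == "__main__":
    print(evaluate_upload("https://www.dropbox.com/upload"))
    print(evaluate_upload(SANCTIONED_SHARE + "/team-folder"))
```

The same pattern applies to any DLP rule: a block that explains itself and routes people to the sanctioned path is far less likely to be worked around.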
Responsibility
Data security isn't just the responsibility of security experts. To be successful, data security has to be a collective effort that extends to all levels of the company. Indeed, everything from election hacking and deepfakes to the weaponization of information can't be addressed if we rely solely on security professionals and technologies.
The problem is too big for a single group to handle. So, what can we do as security professionals to drive mass engagement? Most importantly, we can evangelize the importance of data privacy best practices.
Organizations like RSA are doing a great job of spreading the word, but we can all help out, too. Educate and train people whenever you have the chance. Skills like avoiding phishing emails, detecting the signs of social engineering, acting responsibly online, using basic protections, and reporting spam calls are topics we can all share on our social channels. The more we share, the more awareness we create.
Conclusion
It's easy to pass the buck and blame users when they do something wrong, but as security professionals, we are the ones responsible for weighing the hard decisions between security and privacy, ethics and profitability, usability and compliance, responsibility and authority. Developing a human-centric security policy will make security more approachable to our users and, in turn, propel its success. As our friends at RSA say, "it's about people protecting people."
About the Author: Isaac Kohen is VP of R&D of Teramind, a leading, global provider of employee monitoring, insider threat detection, and data loss prevention solutions. He recently authored the e-book: #Privacy2020: Identifying, Managing and Preventing Insider Threats in a Privacy-First World. Follow on Twitter: @teramindco.
Photo credit: Mavo Images stock.adobe.com
