Data ethics aims to improve overall security by making employees view data differently
Just imagine what a difference it could make to your security and privacy goals if your employees instinctively treated your most sensitive data as if it were their own child—zealously protecting it, remaining personally accountable for it, thinking twice before handing it off to others, always fretting about where it is and how it’s doing, and always focusing on what’s in its best interests.
That’s the goal of data ethics: a new and markedly different approach to improving security and privacy, one that reduces data breaches and privacy failures and takes the sting out of those failures when they do occur.
The drumbeat for data ethics training has been growing louder in just the last couple of years. In a landmark report, “The Business of Trust,” Cognizant concluded, “Trust has become the new battleground for digital success. To win, organizations need to master the fundamentals of data ethics. Companies that earn consumer trust will be better suited to weather the inevitable — and yes, they are inevitable — data and policy breaches.”
MIT Sloan added, “A singular focus on security is not enough. While data ethics is a new area for most businesses, it must be a key consideration as organizations evaluate starting or continuing their digital transformation journeys.”
TechCrunch has described data ethics as a strategic business weapon, and IDC has called it the new competitive advantage.
So how does data ethics translate into employee security awareness?
What if employees could be encouraged and empowered to view data differently, to care and account for it better by seeing the data through an ethical and moral lens, to instinctively ask important questions? Could that fundamental change in the way data is viewed and handled improve security, privacy and trust?
What if employees asked questions such as:
- Do I really have permission to collect or use that data, and am I living up to our own privacy promises and the privacy expectations of the people the data describes?
- Do I even know what those privacy promises are?
- Do I really need the data at all for my mission?
- Am I using it for its intended or requested purposes or something else?
- Can I take into account the humans—the people, the families, the lives—that data represents?
- Am I doing all that I reasonably and personally can to take care of that sensitive data?
- Do I really need to share the data, and if I do, what can I do to ensure it continues to be protected and respected?
- Am I doing my best to ensure the data remains accurate and fairly represents the subjects of that data?
If employees are encouraged to ask these questions and think about data this way, there’s much less chance of the kinds of mistakes (or simple data indifference) that could lead to a costly and embarrassing data breach or privacy failure.
And there’s an even bigger and broader business opportunity. Trust has been declining steadily in the U.S. for years, across all sectors and industries. According to the most recent global Trust Barometer from Edelman, the public in China, Mexico, Colombia and nearly a dozen other nations now trust their government, business and media more than the American public does its own.
And that breakdown in trust is largely fueled by a breakdown in ethics. In the case of businesses, that breakdown usually revolves around the relentless and unapologetic collection and use of personal data—an appetite that too often results in embarrassing and costly data breaches and privacy failures.
As Cognizant put it: “Trust is money, and all businesses are losing it.”
And while it’s easy for any organization to claim to be ethical or to have high ethical standards, it’s not so easy to prove. In too many cases, insincere attempts at ethical stances have backfired on businesses amid claims of ethics theater or “ethics washing.”
And that’s where data ethics training can help, by providing a very visible and verifiable commitment to the highest ethical standards around what consumers worry most about, by encouraging employees to always think about the humans the data represents and behave accordingly.