Cloud Security Myth #4 – Security slows down your ability to get insights

Author: protegrity

Data drives today's businesses, industries and economies: browsing and purchase habits, search queries, software logs, social media activity, sensor networks, point-of-sale transactions and more. Leveraging that data for competitive advantage is among the most pressing priorities for organizations today, and some are already using their Big Data for real-time analytics. Because Big Data is often unstructured, there is a good chance NoSQL databases will be involved. But what does that mean for your security? And for your performance?

Let's talk about security first. NoSQL databases offer varying levels of security today. Commercial enterprise versions of popular NoSQL databases now provide advanced security features such as authentication, authorization, auditing and encryption, but not all do. Among those with encryption, some offer granular, field-level protection, while others are limited to blanket encryption of the entire store. Still others provide only authentication and access controls, with providers recommending that the database be run in a "secure environment." In those cases, security for the data inside the NoSQL database must be provided by the application accessing it, or by securing the data itself.

In addition, traditional perimeter-based security approaches are challenged by NoSQL and Hadoop. Hadoop makes it easy to store, retrieve and query Big Data at scale by distributing it across many machines, but to fully protect that data, every node would need to be secured. If you rely on third-party vendors for Big Data resources, the co-location of your data with other tenants' creates additional exposure. And data at rest, accumulated for analytics and business intelligence, represents a valuable target for attackers.

Clearly, protecting the data is essential, and the simplest solution would seem to be to encrypt it all. That presents two challenges. The first is performance: when data lakes are protected wholesale, the constant encryption and decryption of huge chunks of data slows processing down. The second is coverage: encryption protects data only at rest, so it is decrypted as it travels between applications or is used by them, leaving it exposed to breach.

Data security and real-time analytics are not mutually exclusive, but they do require the right technology to coexist without impacting performance. Protegrity suggests putting policy-based controls in place to identify sensitive information, then protecting that information with encryption and tokenization. Sensitive data is de-identified, rendering it useless even if it is breached, while non-sensitive data can remain in the clear. This preserves maximum usability: users and processes can continue to mine the data for transformative decision-making insights without slowing down performance.
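The approach above can be sketched in a few lines. This is a minimal, illustrative example of field-level tokenization using an HMAC as a stand-in for a real tokenization service; `SECRET_KEY`, `SENSITIVE_FIELDS`, `tokenize`, and `deidentify` are hypothetical names, not part of any actual product API.

```python
import hmac
import hashlib

# Illustrative only: a managed tokenization service would hold this key,
# typically in an HSM or key-management system.
SECRET_KEY = b"replace-with-a-managed-key"

# Fields the (hypothetical) policy has classified as sensitive.
SENSITIVE_FIELDS = {"ssn", "email"}

def tokenize(value: str) -> str:
    """Replace a sensitive value with a deterministic, irreversible token."""
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
    return "tok_" + digest[:16]

def deidentify(record: dict) -> dict:
    """Tokenize sensitive fields; leave non-sensitive fields in the clear."""
    return {
        key: tokenize(value) if key in SENSITIVE_FIELDS else value
        for key, value in record.items()
    }

record = {"ssn": "123-45-6789", "email": "a@example.com",
          "region": "EU", "amount": 42}
safe = deidentify(record)
```

Because the tokens are deterministic, joins, group-bys and counts still work on the tokenized columns, so analytics can proceed at full speed while the raw values never leave the secure boundary.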

Don't leave your data unprotected to improve your analytics performance; learn more about The Secrets of Cloud Data Security instead. Choosing a data-first security approach is the first step toward getting the most out of your data without security risks.

*** This is a Security Bloggers Network syndicated blog from Blog – Protegrity authored by protegrity. Read the original post at: