Four Steps to Integrate Zero-Trust

Data is arguably a company’s most valuable asset. While there is no shortage of examples of crippling data breaches caused by attackers outside the company, there’s far less conversation about threats coming from within. Approximately 19% of data breaches are caused by insider threats, yet most organizations spend far less time developing security postures against them, leaving themselves vulnerable to hacks, breaches and potentially devastating loss of business.

So, how do you know who to trust in your company when it comes to protecting your most sensitive data? The answer: No one.

This approach, better known as the “zero-trust” principle, is a security model that eliminates the idea of “trusted” or “untrusted” networks, devices, personas or processes. Instead, it relies on authentication and authorization policies with the goal of providing everyone with “least privileged access,” which maintains that a user or entity should only have access to the specific data, resources and applications needed to complete a required task during a specified time.

While zero-trust isn’t a new concept, Gartner predicts that even among organizations actively pursuing it, only 10% of all enterprises will succeed in implementing a fully mature zero-trust approach by 2026. The threat landscape will continue to evolve in the interim, and those who employ a zero-trust model may be able to avoid or minimize the impact of roughly half of all attacks.

This article will outline four best practices for companies that want to establish data-centric, zero-trust security protocols — both inside and outside of their organizations.

Classify Your Data

Data classification is a critical component of implementing a zero-trust architecture. This process allows organizations to assess their data and implement customized security measures, establishing appropriate access controls, encryption mechanisms and monitoring functions.

It also ensures that sensitive information is adequately protected, reducing the risk of data breaches, regulatory non-compliance and reputational damage. Additionally, data classification facilitates risk management and informed decision-making around data handling, retention policies and data-sharing agreements.

AI-based automated data classification tools are widely available and simplify the process by scanning, analyzing and categorizing data automatically. They can also expedite incident response by enabling organizations to quickly assess the potential impact of a security incident and take prompt action to mitigate risk, allowing organizations to efficiently manage and safeguard their data.
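To make the idea concrete, here is a minimal sketch of the kind of rule-based scan such tools automate. The labels, patterns and sample text are illustrative assumptions only; real AI-based classifiers learn far richer signals than these regexes:

```python
import re

# Illustrative sensitivity rules, ordered most to least sensitive.
# Real classifiers use trained models, not just pattern matching.
RULES = [
    ("restricted", re.compile(r"\b\d{3}-\d{2}-\d{4}\b")),          # SSN-like
    ("confidential", re.compile(r"\b\d{4}(?:[ -]?\d{4}){3}\b")),   # card-like
    ("internal", re.compile(r"(?i)\b(salary|contract|invoice)\b")),
]

def classify(text: str) -> str:
    """Return the highest-sensitivity label whose pattern matches."""
    for label, pattern in RULES:
        if pattern.search(text):
            return label
    return "public"

print(classify("Employee SSN: 123-45-6789"))  # -> restricted
```

Once every document carries a label like this, the access, encryption and monitoring controls described below can key off the label rather than off ad hoc judgments.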

Apply the Principle of Least Privilege to Enforce Access Controls

By understanding and identifying which roles should have access to specific data classifications, organizations can enforce the principle of least privilege. Zero-trust also dictates that administrators and security professionals always assume a breach is being attempted: rather than “trust but verify,” companies should “never trust, always verify.”

With a zero-trust approach, establishing access controls for insiders is vital when authenticating users and preventing any individual from seeing and accessing all data. This inside-out approach requires security leaders to define the microcore and perimeter (MCAP): the set of data, services, applications and assets in a company that must be protected with a comprehensive set of controls.

Partners and solution providers should also be able to verify this authentication and know which segment of the data they need. This ensures individuals are granted access only to the data required for their job roles through associated permissions and access rights. As roles evolve or employees transition, access permissions should be adjusted accordingly.

Additionally, strong authentication mechanisms such as multi-factor authentication (MFA) further strengthen access controls by requiring users to prove their identity in more than one way before their least-privilege permissions are applied, as sketched below.
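The following sketch shows what a least-privilege check combined with an MFA gate might look like. The role names, permission strings and the ROLE_PERMISSIONS map are hypothetical stand-ins for what a real IAM system would provide:

```python
# Hypothetical role-to-permission map; in practice this comes from an IAM
# system and is reviewed as roles evolve or employees transition.
ROLE_PERMISSIONS = {
    "hr_analyst": {"read:internal"},
    "finance_admin": {"read:internal", "read:confidential"},
    "security_officer": {"read:internal", "read:confidential", "read:restricted"},
}

def is_authorized(role: str, action: str, mfa_verified: bool) -> bool:
    """Grant access only if MFA passed and the role holds the exact permission."""
    if not mfa_verified:  # never trust a session without verified MFA
        return False
    return action in ROLE_PERMISSIONS.get(role, set())

assert is_authorized("finance_admin", "read:confidential", mfa_verified=True)
assert not is_authorized("hr_analyst", "read:restricted", mfa_verified=True)
assert not is_authorized("security_officer", "read:restricted", mfa_verified=False)
```

Note that the check defaults to denial: an unknown role, a missing permission or an unverified session all fail closed, which is the “never trust, always verify” posture in miniature.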

Monitor, Encrypt and Log

Monitoring tools provide user analytics that establish an understanding of normal behavior patterns within a given system. To achieve this, companies need to set proactive alerts and examine user behavior analytics, then continually update and reverify those baselines as behavior changes.
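A simplified sketch of this idea follows, assuming a per-user baseline of normal login hours; a real user behavior analytics tool would learn this baseline from historical activity rather than hard-coding it:

```python
from datetime import datetime

# Hypothetical baseline: the hours during which this user normally works,
# which a real UEBA tool would derive from historical activity.
NORMAL_HOURS = {"jdoe": range(7, 19)}  # 07:00-18:59 local time

def flag_if_anomalous(user: str, event_time: datetime) -> bool:
    """Alert when activity falls outside the user's learned pattern."""
    usual = NORMAL_HOURS.get(user)
    if usual is None or event_time.hour not in usual:
        print(f"ALERT: unusual activity by {user} at {event_time:%H:%M}")
        return True
    return False

flag_if_anomalous("jdoe", datetime(2024, 5, 1, 2, 0))   # 2 a.m. -> alert
flag_if_anomalous("jdoe", datetime(2024, 5, 1, 10, 0))  # mid-morning -> normal
```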

Encrypting data at rest and in flight is a crucial aspect of implementing a zero-trust architecture, ensuring the confidentiality and integrity of sensitive information. For data at rest, robust encryption algorithms such as AES (Advanced Encryption Standard) encrypt the data before it is stored in databases or on disk, making it unreadable without the corresponding decryption key.
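As a minimal at-rest example, this sketch uses AES-256-GCM from the widely used Python cryptography package; key storage, rotation and management (typically handled by a KMS or HSM) are assumptions left out of scope here:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Generate a 256-bit key; in production this would live in a KMS or HSM.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

nonce = os.urandom(12)  # must be unique per encryption with the same key
record = b"customer PII to store at rest"
ciphertext = aesgcm.encrypt(nonce, record, None)

# Store the nonce alongside the ciphertext; without the key, both are
# unreadable, and GCM also detects any tampering on decryption.
assert aesgcm.decrypt(nonce, ciphertext, None) == record
```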

As for data in flight, establishing secure communication channels using protocols like TLS (Transport Layer Security) or IPsec (Internet Protocol Security) will safeguard data as it travels between systems, preventing eavesdropping and tampering.
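For the TLS side, here is a minimal client sketch using Python’s standard ssl module; example.com is a placeholder host:

```python
import socket
import ssl

# Default context enables certificate validation and modern protocol versions.
context = ssl.create_default_context()

with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print(tls.version())  # e.g. 'TLSv1.3'
        tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
        print(tls.recv(200))
```

The important design point is that certificate validation stays on; disabling it to “make things work” silently reopens the door to eavesdropping and tampering.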

Finally, implementing end-to-end encryption across all layers of the network infrastructure ensures that data remains protected throughout its journey, even in the event of network compromises.

Set the Ability to Back Up and Recover Data

To ensure these mechanisms are effective, companies must proactively back up their data with tools that continuously log the environment and its activity. Data backup serves as a fail-safe mechanism to ensure that if (and when) an incident happens, data can still be recovered.

While backing up data is key in a zero-trust architecture, the “immutability of data” is equally important. As the “bad guys” get more creative, companies must also protect their backup data. Companies looking to adopt recovery and backup solutions should look for providers whose backups are immutable and cannot be deleted unless more than one approved administrator is involved.
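A hypothetical sketch of that control follows: deleting an immutable snapshot proceeds only when at least two distinct approved administrators sign off. The names and functions are illustrative; real backup solutions enforce this in the storage layer itself:

```python
# Hypothetical "no single admin can delete a backup" rule: deletion requires
# approvals from at least two distinct administrators on the approved list.
APPROVED_ADMINS = {"alice", "bob", "carol"}

def delete_snapshot(snapshot_id: str, approvals: set[str]) -> bool:
    valid = approvals & APPROVED_ADMINS
    if len(valid) < 2:
        print(f"DENIED: snapshot {snapshot_id} requires two admin approvals")
        return False
    print(f"Snapshot {snapshot_id} deleted, approved by {sorted(valid)}")
    return True

delete_snapshot("snap-0142", {"alice"})          # denied: one approval
delete_snapshot("snap-0142", {"alice", "bob"})   # proceeds: two approvals
```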

For example, if someone is active in the system at 2 a.m., the monitoring capabilities would flag the abnormality and log the changes, while the backup functions take a snapshot of the data at that moment in case recovery is required. If the data were then corrupted or deleted, administrators would be able to restore it to its last known good state.

By combining these techniques and adopting the zero-trust principle of “never trust, always verify,” organizations can bolster their data security posture and foster a robust zero-trust architecture.


Jim Cosby

Jim Cosby joined NetApp in November 1999. He has over 25 years of engineering and technical sales experience supporting a variety of federal and commercial customers, and has held various positions during his career including Senior/Enterprise/Consulting Systems Engineer, Technical Account Manager, Manager of Systems Engineering, and Director of Systems Engineering. Jim has focused on storage technology for 17 years and data security technology for 4 years. He led the Civilian US Public Sector team of Systems Engineers for 6 years and currently leads the Partner Ecosystem SE team at NetApp. He has worked at various companies including SMS Data Products, Unisys, Auspex Systems, NeoScale, Decru, and NetApp. Jim holds a Bachelor of Science degree in Management Information Systems from Old Dominion University. He lives in Ashburn, VA with his wife and son.
