The Truth About Serverless Security

Serverless architecture shows an annual growth rate that exceeds 700%, evidence that companies highly value its advantages, including a shorter time to market, lower cost and better scalability.

But what about the disadvantages? Are there any pitfalls people tend to overlook? Yes, there are, and they mostly relate to security measures.

Who is Responsible for What in the Serverless World?

When you build a serverless app, cloud providers secure your project—but only partially. They protect databases, operating systems, virtual machines, the network and other cloud components. However, they are not in charge of the application layer. It’s up to the app’s owner to defend it against cyberattacks.

Unfortunately, many businesses don’t realize their responsibilities. As serverless security platform PureSec noted in its survey, “There’s a huge gap in security knowledge around serverless.” To narrow this gap, let’s get down to the major challenges that come alongside the benefits.

Serverless Security Challenges

More permissions to manage

In the serverless ecosystem, we have lots of independent functions, each with its own set of services and responsibilities, its individual storage and state management system. This results in hundreds of interactions and can leave certain functions with more permissions than they actually need. For example, functions that were developed to make calculations or send emails can get access to database resources.

Precautions:

  • Review each function and determine what it really needs to do.
  • Follow the rule of least privilege: minimize roles and permissions so that each function can do no more than perform its task (a minimal policy sketch follows this list).
  • Continuously scan deployed functions for suspicious activity.
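
To make the least-privilege rule concrete, here is a minimal sketch using the AWS CDK. The function’s purpose, the SES action and the ARN are illustrative assumptions, not details from the article; the point is that a single-purpose function gets a single-purpose policy.

    import * as iam from "aws-cdk-lib/aws-iam";

    // Least-privilege policy for a hypothetical "send email" function: it may
    // call ses:SendEmail against one verified identity and nothing else, with
    // no database or storage permissions. The ARN below is a placeholder.
    export const sendEmailOnly = new iam.PolicyStatement({
      effect: iam.Effect.ALLOW,
      actions: ["ses:SendEmail"],
      resources: ["arn:aws:ses:us-east-1:123456789012:identity/example.com"],
    });

    // Attach it to the function's execution role, e.g.:
    // emailFunction.addToRolePolicy(sendEmailOnly);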

More points of vulnerability

Your serverless application is split into small parts and each of them can be triggered in different ways, not only by API gateway commands but also by cloud storage events, database changes, data streams, IoT telemetry signals and emails, to name a few. This expands the attack surface and makes it more difficult to eliminate malicious event-data injections.

Keep in mind that traditional web application firewalls (WAFs) protect only those functions that are called by an API gateway, leaving all other parts open to attacks. For example, serverless apps can be plagued by SQL (structured query language) injections, which embed malicious code into a request.

Precautions:

  • Alongside WAFs, apply perimeter security to each function to protect it against data breaches.
  • Identify trusted sources and add them to the whitelist; use whitelist validation when possible (see the handler sketch after this list).
  • Remember Doctor House’s credo, “Everybody lies,” and monitor updates to your functions continuously.
  • Apply runtime defense solutions to protect your functions during execution.
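
As a sketch of the whitelist-validation advice above, here is a minimal Node.js (TypeScript) handler. The table, column and status values are assumptions for illustration; the point is that event data is checked against an allowlist and then bound to a parameterized query rather than concatenated into SQL.

    import { Pool } from "pg";

    // Connection settings come from the standard PG* environment variables.
    const pool = new Pool();

    // Allowlist of values we are willing to accept from any event source.
    const ORDER_STATUSES = new Set(["pending", "shipped", "delivered"]);

    export async function handler(event: { status?: string }) {
      const status = event.status ?? "";
      // Reject anything outside the allowlist before it reaches the database.
      if (!ORDER_STATUSES.has(status)) {
        return { statusCode: 400, body: "Unsupported status value" };
      }
      // Parameterized query: event data is passed as a bound value, never
      // spliced into the SQL string.
      const { rows } = await pool.query(
        "SELECT id, total FROM orders WHERE status = $1",
        [status]
      );
      return { statusCode: 200, body: JSON.stringify(rows) };
    }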

More third-party dependencies

Dependencies of functions that rely on third-party software (open-source libraries, packages, etc.) are difficult to monitor regardless of the app architecture. However, with serverless, the task to control them manually becomes extremely challenging.

Precautions:

  • Avoid third-party packages with lots of dependencies.
  • Derive components from reliable official sources via secure links.
  • If you run a Node.js app, use package locks (package-lock.json) or npm shrinkwrap to ensure that no dependency updates reach your code until you have reviewed them.
  • Continuously run automated dependency scanners such as OWASP Dependency-Check to identify and fix vulnerabilities in third-party components (a CI gate sketch follows this list).
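
As one way to keep dependency scanning continuous, the sketch below gates a CI build on npm’s built-in audit. The audit level and the decision to hard-fail the build are assumptions; tune them to your own risk tolerance.

    import { execSync } from "node:child_process";

    // Fail the build when npm audit reports high or critical vulnerabilities
    // in the dependency tree; otherwise let the pipeline continue.
    try {
      execSync("npm audit --audit-level=high", { stdio: "inherit" });
      console.log("Dependency audit passed");
    } catch {
      console.error("Vulnerable dependencies found; fix or update before deploying");
      process.exit(1);
    }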

More data in storage and transit

Regardless of the architecture, sensitive data exposure is one of the major security problems, so most of the practices we use to secure traditional apps also work well for serverless. What we must consider is that attackers can target other data sources, extracting sensitive information from cloud storage and database tables instead of servers.

In the world of serverless, functions interact with each other as well as with third-party services and exchange data more actively than in the classic approach. The more information shared, the greater the risk of data leakage or destruction.

Precautions:

  • Identify at-risk data and reduce its storage to the necessary minimum.
  • All of the credentials within your functions that invoke third-party services should be temporary or encrypted.
  • Provide automatic encryption of sensitive data in transit.
  • Use key management solutions offered by the cloud infrastructure (a KMS encryption sketch follows this list).
  • Set stricter constraints on allowed input and output messages coming through an API gateway.
  • Send information over HTTPS (HyperText Transfer Protocol Secure) endpoints only.
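
For the key-management advice above, here is a minimal sketch using the KMS client from the AWS SDK for JavaScript v3. The key alias and the idea of encrypting a third-party API token are assumptions for illustration.

    import { KMSClient, EncryptCommand } from "@aws-sdk/client-kms";

    const kms = new KMSClient({});

    // Encrypt a small secret (for example, a third-party API token) with a
    // customer-managed KMS key before storing it or passing it between functions.
    export async function encryptSecret(plaintext: string): Promise<Uint8Array> {
      const { CiphertextBlob } = await kms.send(
        new EncryptCommand({
          KeyId: "alias/app-secrets", // placeholder key alias
          Plaintext: Buffer.from(plaintext, "utf8"),
        })
      );
      if (!CiphertextBlob) {
        throw new Error("KMS returned no ciphertext");
      }
      return CiphertextBlob;
    }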

More hassle with authentication

Unlike a traditional server-based application, serverless software is decentralized and has multiple access points including web browsers, mobile apps, etc. To ensure security, you need to authenticate all end users and control what resources they can access.

Precautions:

  • Rather than build a complex authentication system from scratch, use one of the available access management services (a token authorizer sketch follows this list).
  • Keep access privileges within the serverless infrastructure to a minimum by default and increase them manually when needed.
  • If you allow users to edit data, perform additional validation for actions that can destroy or modify data.
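
To illustrate the first point, here is a minimal API Gateway token authorizer sketch. It assumes a shared-secret JWT passed as a bearer token and verified with the jsonwebtoken package; in production, a managed identity service (Cognito, Auth0 and the like) with asymmetric keys is the safer choice.

    import { verify } from "jsonwebtoken";
    import type {
      APIGatewayTokenAuthorizerEvent,
      APIGatewayAuthorizerResult,
    } from "aws-lambda";

    // Validates the incoming JWT and returns an IAM policy that allows the call.
    // Throwing "Unauthorized" makes API Gateway respond with 401.
    export async function handler(
      event: APIGatewayTokenAuthorizerEvent
    ): Promise<APIGatewayAuthorizerResult> {
      const token = (event.authorizationToken ?? "").replace(/^Bearer /, "");
      try {
        const claims = verify(token, process.env.JWT_SECRET as string);
        const principal =
          typeof claims === "string" ? claims : claims.sub ?? "user";
        return allow(principal, event.methodArn);
      } catch {
        throw new Error("Unauthorized");
      }
    }

    function allow(principalId: string, resource: string): APIGatewayAuthorizerResult {
      return {
        principalId,
        policyDocument: {
          Version: "2012-10-17",
          Statement: [
            { Action: "execute-api:Invoke", Effect: "Allow", Resource: resource },
          ],
        },
      };
    }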

More wallet-busting attacks

Autoscaling is one of the killer features offered by serverless. Alas, this technology not only created new opportunities for businesses but also begot a new generation of hacker attacks called Denial of Wallet (DoW).

When a traditional app becomes a victim of a denial-of-service (DoS) attack, a flood of fake requests creates a kind of traffic jam and makes your services unavailable to regular customers. Serverless architecture dictates a different scenario, though. While under attack, the app isn’t blocked. Instead, it responds by scaling up in an attempt to deal with an avalanche of calls.

What happens next? The cost of serverless infrastructure grows dramatically until your budget is exhausted. It’s up to the enterprises, not to the cloud service vendors, to pay the bill for the overrun.

Precautions:

  • Set budget limits and alarms based on your current spending (keep in mind that a hard limit effectively turns a DoW attack into denial of service once the attacker exhausts it).
  • Put limits on the number of API requests in a given time window, for example, allow a client one call per second and block additional calls (a rate-limiting sketch follows this list).
  • Use DDoS protection tools.
  • If API gateways are internal and used only within other components, make them private and thus unapproachable for attackers.
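
As a sketch of the one-call-per-second rule, here is a simple fixed-window counter backed by DynamoDB atomic updates. The table name, key schema and limit are assumptions; API Gateway usage plans or a managed DDoS protection service can achieve the same effect without custom code.

    import { DynamoDBClient, UpdateItemCommand } from "@aws-sdk/client-dynamodb";

    const ddb = new DynamoDBClient({});
    const LIMIT_PER_SECOND = 1; // mirrors the one-call-per-second rule above

    // Returns true while the caller is within its per-second quota.
    // Assumes a table named "rate-limits" with a string partition key "pk".
    export async function withinQuota(clientId: string): Promise<boolean> {
      const windowKey = `${clientId}#${Math.floor(Date.now() / 1000)}`;
      try {
        await ddb.send(
          new UpdateItemCommand({
            TableName: "rate-limits",
            Key: { pk: { S: windowKey } },
            UpdateExpression: "ADD calls :one",
            ConditionExpression: "attribute_not_exists(calls) OR calls < :limit",
            ExpressionAttributeValues: {
              ":one": { N: "1" },
              ":limit": { N: String(LIMIT_PER_SECOND) },
            },
          })
        );
        return true;
      } catch (err: any) {
        if (err.name === "ConditionalCheckFailedException") {
          return false; // over quota for this second
        }
        throw err;
      }
    }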

Bottom Line

By combining a variety of precautions and tactics throughout the life of an application, from idea to deployment to maintenance, you’ll win this game and hit the jackpot.

Roman Sachenko

Roman Sachenko is a fan of backend software development and Node.js. Beyond writing code, he likes writing articles on complex tech topics and making them easy to understand for everyone. Currently, his primary interests are in the field of IT and IoT security, microservices, and serverless.
