Why API Gateways are Critical for Cloud Security

What is an API gateway? APIs are an important part of the information economy, allowing applications to communicate with each other and share functionality and data. An API gateway is middleware that manages communications between API consumers and upstream services.

The primary role of an API gateway is to serve as a single entry point for interactions between an organization’s applications, data and services, for both internal and external customers. API gateways can also perform a variety of other functions to support and manage API usage, including authentication, rate limiting, analytics and API security.

Here are a few examples of API gateways:

● Amazon API Gateway—A fully managed service that allows developers to create, publish, maintain, monitor, and secure APIs at scale.
● Kubernetes Gateway API—Enables Kubernetes service networking via expressive, role-oriented interfaces, which are implemented by a variety of open-source projects and vendors, such as Istio, Cilium and NGINX.
● Apigee—Google Cloud’s cross-cloud API management platform. Apigee provides end-to-end API management with monetization and built-in monitoring.

How Does an API Gateway Work?

An API gateway provides a focal point and standard interface for applications to communicate with each other. It receives requests from internal and external sources, known as “API calls,” packages multiple requests together, routes them to the appropriate API and then receives responses and delivers them to the requesting user or device.
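The routing step described above can be sketched in a few lines of Python. This is an illustrative toy, not a real gateway: the route table and upstream hostnames are hypothetical, and a production gateway would also handle TLS termination, retries, connection pooling and response aggregation.

```python
# Minimal sketch of an API gateway's routing logic: match an incoming
# API call to its upstream service by path prefix and forward the rest
# of the path. Route table and backend names are hypothetical.

ROUTES = {
    "/orders": "http://orders-service.internal",
    "/users": "http://users-service.internal",
}

def route_request(path: str) -> str:
    """Resolve an incoming request path to an upstream URL."""
    for prefix, upstream in ROUTES.items():
        if path.startswith(prefix):
            # Forward the remainder of the path to the upstream service.
            return upstream + path[len(prefix):]
    raise LookupError(f"No upstream configured for {path}")
```

A call such as `route_request("/orders/42")` resolves to the orders service; unknown paths are rejected at the gateway rather than reaching any backend.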

API gateways are also a basic component of a microservices-based architecture. In a microservices application, application components use APIs to communicate with each other, and with external applications and services. The role of the API gateway here is similar—to provide an entry point to a defined set of microservices and apply policies to ensure availability and secure access.

For example, a Kubernetes API gateway can be used to regulate traffic flowing into and out of Kubernetes clusters, as well as manage communications between services within the cluster.

The API Gateway’s Role in Cloud Security

API gateways include monitoring and logging capabilities, which can help secure API communications, collect logs and analyze them for troubleshooting purposes.

The main functions of API gateways include providing an inline proxy for control over APIs, enforcing strong authentication for API requests, setting rules for traffic to backend services, enabling API rate limiting and throttling, logging transactions, and providing last-mile security for backend services.

With all requests flowing through one infrastructure object, it becomes possible to maintain two different versions of the application, the old and the new, and seamlessly switch between them (as in blue-green or canary deployment). This also helps address major challenges in cloud migration projects, in which applications and APIs transition to new versions, often with the old version still operating in parallel.

Because the API gateway is the only access point to the microservices application, you can shift all traffic to a new version at the gateway itself, without manipulating DNS to point to a new entry point or having to change the application. The gateway also provides important metrics about API services, such as traffic, response time and error rate.
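The version-switching idea can be sketched as weighted routing at the gateway. This is a minimal illustration with hypothetical upstream addresses, assuming a canary rollout where a configurable fraction of requests is sent to the new version.

```python
import random

# Hypothetical upstream addresses for the old (blue) and new (green)
# application versions behind the gateway.
UPSTREAMS = {"blue": "http://app-v1.internal", "green": "http://app-v2.internal"}

def pick_upstream(canary_fraction: float, rng=random.random) -> str:
    """Route a fraction of traffic to the new version, the rest to the old."""
    return UPSTREAMS["green"] if rng() < canary_fraction else UPSTREAMS["blue"]
```

Raising `canary_fraction` from 0.0 to 1.0 completes the cutover with no DNS or client changes; setting it back to 0.0 is an instant rollback.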

API Gateways in the Cloud: Critical Security Best Practices

Use HTTPS Communication and Strong Authentication
All communication between the API Gateway and the client should be sent over HTTPS. Reliable and secure authentication methods must be established for content that is restricted to logged-in users, making it difficult for attackers to retrieve usernames and passwords and impersonate valid users.

The above capabilities can be implemented at the API Gateway level rather than within each microservice. This avoids duplication of developer effort and ensures a consistent approach across applications.
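A gateway-level authentication check might look like the sketch below. The bearer-token lookup is a deliberately simplified stand-in for real JWT or OAuth validation, and the token value is hypothetical; the point is that the check runs once at the gateway, not in every microservice.

```python
# Sketch of enforcing authentication at the gateway instead of in each
# microservice. The static token set is a placeholder for real signature
# verification against an identity provider.

VALID_TOKENS = {"secret-token-123"}  # hypothetical; never hard-code real tokens

def authenticate(headers: dict) -> bool:
    """Check for a valid Bearer token in the Authorization header."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return False
    return auth[len("Bearer "):] in VALID_TOKENS

def handle(headers: dict, path: str) -> int:
    """Return 401 for unauthenticated requests; otherwise forward and return 200."""
    if not authenticate(headers):
        return 401
    # ...forward the request to the upstream service over HTTPS...
    return 200
```

Because every microservice sits behind this check, developers do not re-implement authentication per service, and the policy stays consistent across applications.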

Deploy a Centralized Authentication Server
APIs or gateways should not issue Access or Refresh tokens. These tokens should always be issued by a central authentication server. Token issuance involves several complex steps, including server authentication, user authentication, client approval and token validation. These tasks require access to various data sources, such as customer information.

Additionally, when multiple entities issue and sign tokens, it becomes increasingly difficult to manage the signing keys securely. Only one entity, the authentication server, should handle these processes.
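The division of labor described above can be sketched with a signed-token scheme: the central authentication server signs claims with a key it alone holds, while the gateway only verifies. The HMAC construction below is a simplified stand-in for a real JWT library, and the key and claims are hypothetical.

```python
import base64
import hashlib
import hmac
import json

# Only the central authentication server holds SIGNING_KEY and issues
# tokens; the gateway verifies signatures but never signs anything.
SIGNING_KEY = b"demo-key-do-not-use"  # hypothetical; real keys come from a KMS

def issue_token(claims: dict) -> str:
    """Auth server role: encode the claims and sign them."""
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{sig}"

def verify_token(token: str):
    """Gateway role: check the signature; return claims or None if invalid."""
    try:
        payload, sig = token.rsplit(".", 1)
    except ValueError:
        return None
    expected = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None
    return json.loads(base64.urlsafe_b64decode(payload))
```

Keeping issuance in one place means a single key to rotate and a single audit point for every token in circulation.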

Limit API Requests
API rate limiting prevents the overloading of upstream services with excessive API requests (a common scenario for denial-of-service attacks). With rate limiting, an API gateway only accepts a certain number of concurrent client requests within a specified time interval. Other forms of rate-limiting include:

● Throttling—Reduces bandwidth or terminates client sessions when overloaded.
● Size limiting—Blocks client request payloads that are larger than a certain size.
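The per-client limit described above is often implemented as a sliding window. The sketch below is one common approach, with hypothetical limit and window values; a production gateway would track a separate limiter per client and respond with HTTP 429 on rejection.

```python
import time
from collections import deque

# Sliding-window rate limiter as a gateway might apply it to one client.
class RateLimiter:
    def __init__(self, max_requests: int, window_seconds: float):
        self.max_requests = max_requests
        self.window = window_seconds
        self.timestamps = deque()  # arrival times of accepted requests

    def allow(self, now=None) -> bool:
        """Accept the request only if the window is not already full."""
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the window.
        while self.timestamps and now - self.timestamps[0] >= self.window:
            self.timestamps.popleft()
        if len(self.timestamps) >= self.max_requests:
            return False  # reject; the gateway would return HTTP 429 here
        self.timestamps.append(now)
        return True
```

With `RateLimiter(100, 60.0)`, a client gets at most 100 requests per rolling minute; a flood beyond that is absorbed at the gateway instead of reaching upstream services.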

Deploy Monitoring and Analytics

By monitoring APIs, you can keep tabs on the health of each service and understand potential threats and issues services are facing.

An API gateway centralizes metrics and aggregates log data. It can centrally capture metrics related to requests and traffic. Logging also facilitates an audit trail of all client access requests. This aggregated and centralized data can be exported to a security information and event management (SIEM) system for analysis, visualization and alerting.

Equipped with monitoring tools, an API gateway can identify when anomalous activity is occurring and which client IPs are involved.
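The kind of per-client aggregation described above can be sketched as a small metrics collector. Field names and thresholds are illustrative, not taken from any particular gateway product; a real deployment would export these counters to a SIEM.

```python
from collections import defaultdict

# Sketch of request metrics a gateway can aggregate centrally per client IP.
class Metrics:
    def __init__(self):
        self.requests = defaultdict(int)  # total requests per client IP
        self.errors = defaultdict(int)    # 5xx responses per client IP
        self.latency_ms = []              # response times across all clients

    def record(self, client_ip: str, status: int, latency_ms: float):
        """Capture one request's outcome as it passes through the gateway."""
        self.requests[client_ip] += 1
        if status >= 500:
            self.errors[client_ip] += 1
        self.latency_ms.append(latency_ms)

    def error_rate(self, client_ip: str) -> float:
        """Fraction of a client's requests that failed; useful for alerting."""
        total = self.requests[client_ip]
        return self.errors[client_ip] / total if total else 0.0
```

A sudden spike in `error_rate` or request volume from one IP is exactly the anomaly signal the section above describes.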

Leverage Serverless Functions

Cloud-based serverless platforms, such as AWS Lambda, let you run code snippets in a controlled and secure computing environment.

Serverless functions run code in response to events or HTTP requests and, when no longer needed, the temporary computing infrastructure is shut down. From a security point of view, this protects backend servers from potential attacks, because clients can only access the API gateway in front of the serverless function.
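A handler behind Amazon API Gateway might look like the sketch below. The event shape follows the Lambda proxy integration convention; the response content is hypothetical.

```python
import json

# Sketch of an AWS Lambda handler invoked behind Amazon API Gateway.
# Clients never reach this code directly; the gateway forwards the
# request as an event and relays the structured response.
def lambda_handler(event, context):
    """Handle an HTTP request forwarded by the API gateway."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Because the function only exists for the duration of the request and is reachable only through the gateway, there is no long-lived backend server for an attacker to probe.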

Avoid Exposing Internal API Endpoints

If your application is used in different ways—for example, as a mobile application, within IoT devices or accessed by internal systems—it is a good idea to create a separate API gateway for each use case. This avoids exposing central endpoints to the outside world. Each type of user or machine accessing your system has its own endpoint, which serves as a proxy between the outside world and the central API endpoint.
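The per-audience separation described above can be sketched as distinct route sets per gateway. The gateway names and paths below are hypothetical; the point is that a path not listed for a gateway simply does not exist from that audience's perspective.

```python
# Sketch of per-audience gateways: each client type gets its own entry
# point exposing only the routes intended for it. Route sets are hypothetical.
GATEWAY_ROUTES = {
    "mobile": {"/v1/orders", "/v1/profile"},
    "iot": {"/v1/telemetry"},
    "internal": {"/v1/orders", "/v1/profile", "/v1/telemetry", "/v1/admin"},
}

def is_exposed(gateway: str, path: str) -> bool:
    """Only paths explicitly listed for a gateway are reachable through it."""
    return path in GATEWAY_ROUTES.get(gateway, set())
```

Here `/v1/admin` is reachable only through the internal gateway, so mobile and IoT clients cannot even discover it, let alone call it.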

Conclusion

In this article, I explained the basics of API gateways and showed critical best practices that can help you improve cloud security:

● Use HTTPS communication and strong authentication – An API gateway can enforce these practices across all API communication.
● Deploy a centralized authentication server – When you have only one authentication server, API gateways can effectively regulate authentication.
● Limit API requests – Ensure all requests are limited to prevent abuse and DoS.
● Deploy monitoring and analytics – An API gateway can give you a holistic view of your APIs and help identify and resolve security issues.
● Leverage serverless functions – Serverless functions can help provide just-in-time responses to API requests.
● Avoid exposing internal endpoints – API gateways can be used to create separation of concerns between internal and public-facing APIs.

I hope this is useful as you evaluate the use of API gateways in your cloud deployments.

Gilad David Maayan

Gilad David Maayan is a technology writer who has worked with over 150 technology companies including SAP, Oracle, Zend, CheckPoint and Ixia, producing technical and thought leadership content that elucidates technical solutions for developers and IT leadership.