Table of Contents
- Executive Summary
- API Security in the Cloud
- GigaOm API Workload Test Setup
- Test Results
- Conclusion
- Appendix: Recreating the Test
- Disclaimer
- About NGINX
- About William McKnight
- About Jake Dolezal
- About GigaOm
- Copyright
1. Executive Summary
Data, web, and application security have evolved dramatically over the past few years. Just as new threats abound, the architecture of applications—how we build and deploy them—has changed. We’ve traded monolithic applications for microservices running in containers and communicating via application programming interfaces (APIs)—and all of it deployed through automated continuous integration/continuous deployment (CI/CD) pipelines. The frameworks we have established to build and deploy applications are optimized for time to market—yet security remains of utmost importance.
The challenge of securing and innovating is profound and requires a lightweight, integrated security solution that won’t impede performance and delivery. For example, DevOps teams need security controls that work across distributed environments without invasively slowing down or burdening the release cycle. The maturation of these controls and processes ultimately transitions into DevSecOps, where security is built into the CI/CD pipeline.
The multitude of deployed apps, APIs, and microservices produces a constant flow of communication and data among applications, both internal and external, that requires active management. Apps can vary greatly in their protocols, allowed methods, authorization/authentication schemes, and usage patterns. Perhaps most important, IT departments need granular control over the entire application ecosystem to prevent security breaches and attacks, be they man-in-the-middle, distributed denial-of-service, or script/code/SQL injection attacks.
While security is of utmost importance, the pace of modern business demands high performance, and this is especially true in application- and microservice-enabled enterprises. The conventional approach—deploying a perimeter web application firewall (WAF) to protect applications by filtering and monitoring traffic between the app and the internet—is no longer enough. Even internal communication between apps and microservices on the trusted corporate network can be compromised and must be addressed. A defense-in-depth strategy with multiple WAFs is needed.
This report focuses on web application security mechanisms deployed in the cloud and closer to your apps. The cloud enables enterprises to rapidly differentiate and innovate with microservices and allows microservice endpoints to be cloned and scaled in a matter of minutes. The cloud also offers elastic scalability compared to on-premises deployments, enabling faster server deployment and application development as well as less costly compute. However, cloud-hosted APIs and apps are just as vulnerable to attacks and breaches as their on-premises counterparts, if not more so.
Our focus is specifically on approaches to securing apps, APIs, and microservices that are tuned for high performance and availability. We define “high performance” as workloads of more than 1,000 transactions per second (tps) with a required maximum latency below 30 milliseconds across the landscape.
For many organizations, performance is a big deal—they need to ensure secured transactions at rates that keep pace with the speed of their business. A WAF or application security solution cannot be a performance bottleneck. Many companies seek a solution that can load balance across redundant microservices and enable high transaction volumes.
The numbers add up. If a business experiences 1,000 tps, that translates into roughly 2.6 billion API calls monthly. And it is not uncommon for large companies with high-end traffic levels to experience 10 billion or more API calls over 30 days. So, make no mistake, performance is critical when choosing an API security solution.
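As a back-of-the-envelope check (assuming the 1,000 tps rate is sustained around the clock over a 30-day month):

```latex
% Monthly call volume implied by a sustained 1,000 tps rate
\[
1{,}000\ \tfrac{\text{calls}}{\text{s}}
\times 86{,}400\ \tfrac{\text{s}}{\text{day}}
\times 30\ \text{days}
\approx 2.6 \times 10^{9}\ \text{calls per month}
\]
```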
In this report, we performance-tested four WAFs: NGINX App Protect WAF, AWS WAF, Azure Web Application Firewall (WAF), and Cloudflare WAF. The latter three products were tested as fully managed, as-a-service cloud security offerings.
In our benchmarks, NGINX App Protect WAF outperformed all three cloud products at all tested attack rates. NGINX App Protect WAF produced lower latency at the 99th percentile at 500 tps, where either 5% or 10% of requests were blocked as script injection attacks. For example, NGINX App Protect WAF produced 10x lower latency than AWS WAF at 500 tps on the 5% bad request test at the 99th percentile. Also, NGINX App Protect WAF produced 120x lower latency than Azure WAF and 60x lower latency than Cloudflare WAF at the same rates. Since AWS WAF, Azure WAF, and Cloudflare WAF are fully managed, we do not know what underlying compute resources are working behind the scenes, which makes an apples-to-apples performance comparison difficult. Keep in mind, latency differences were minimal until the 90th percentile, with a significant difference witnessed at the 99th percentile and above. Also, proximity can impact network latency, although we made efforts to ensure our WAF, back-end API, and attack resources were all in the same availability zones.
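For readers less familiar with percentile latency reporting, the following minimal sketch shows how a 99th-percentile figure is derived from raw per-request latencies. The synthetic sample data and the use of NumPy here are illustrative assumptions, not the benchmark's actual tooling.

```python
import numpy as np

# Illustrative latencies (in milliseconds); in a real test these would come
# from the load generator's per-request log, not a synthetic distribution.
latencies_ms = np.random.lognormal(mean=1.0, sigma=0.6, size=100_000)

# Tail percentiles: averages hide tail behavior, which is why the WAF
# comparisons in this report diverge mainly at the 99th percentile and above.
for p in (50, 90, 99, 99.9):
    print(f"p{p}: {np.percentile(latencies_ms, p):.2f} ms")
```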
For NGINX App Protect WAF, we used a single small AWS EC2 instance with 2 CPUs and 5.25 GB of RAM; we captured the maximum transaction throughput achieved with 100% success (no 5xx or 429 errors) and less than 30 ms maximum latency. NGINX App Protect WAF achieved about 19,000 requests per second (rps), compared to only 2,000 rps for Azure WAF, 6,000 rps for AWS WAF, and 14,000 rps for Cloudflare WAF.
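As a minimal sketch of the pass/fail criterion described above (no 5xx or 429 responses and all latencies under 30 ms), the following illustrates one way to evaluate a short run. The endpoint URL, run duration, and use of Python's requests library are assumptions for illustration only, not the GigaOm test harness; see the Appendix for the actual setup.

```python
import time
import requests  # assumed HTTP client; the benchmark used a dedicated load generator

TARGET_URL = "http://example.com/api/resource"  # placeholder endpoint, not the test API
DURATION_S = 10                                 # short illustrative run
MAX_LATENCY_MS = 30                             # the report's latency ceiling

latencies_ms = []
failures = 0
deadline = time.monotonic() + DURATION_S
while time.monotonic() < deadline:
    start = time.perf_counter()
    try:
        resp = requests.get(TARGET_URL, timeout=1)
        status_ok = resp.status_code < 500 and resp.status_code != 429
    except requests.RequestException:
        status_ok = False
    latencies_ms.append((time.perf_counter() - start) * 1000)
    if not status_ok:
        failures += 1  # any 5xx, 429, or transport error counts as a failure

# The run "passes" at a given request rate only if every request succeeded and
# the worst-case latency stayed under 30 ms.
passed = failures == 0 and max(latencies_ms) < MAX_LATENCY_MS
print(f"requests={len(latencies_ms)}, failures={failures}, "
      f"max latency={max(latencies_ms):.1f} ms -> {'PASS' if passed else 'FAIL'}")
```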
Testing hardware and software in the cloud is challenging. Configurations may favor one vendor over another in feature availability, virtual machine processor generations, memory amounts, optimal input/output storage configurations, network latencies, software and operating system versions, and the workload itself. Even more challenging is testing the fully managed, as-a-service offerings where the underlying configurations (processing power, memory, networking, and the like) are unknown. Our testing represents a narrow slice of potential configurations and workloads.
As the report’s sponsor, NGINX opted for a default NGINX installation and API gateway configuration out of the box—the solution was not tuned or altered for performance. The three fully managed cloud security mechanisms were used “as-is” since, being fully managed, we have no access, visibility, or control over their infrastructure.
We leave the issue of fairness for the reader to determine. We strongly encourage you to look past marketing messages and discern for yourself what is of value. We hope this report is informative and helpful in uncovering some challenges and nuances of security architecture selection.
We have provided enough information in the report for anyone to reproduce this test. You are encouraged to compile your own representative workloads and test compatible configurations applicable to your requirements.