Key Criteria for Evaluating Cloud Performance Testing Solutions v3.0

An Evaluation Guide for Technology Decision-Makers

Table of Contents

  1. Summary
  2. Cloud Performance Testing Primer
  3. Report Methodology
  4. Decision Criteria Analysis
  5. Evaluation Metrics
  6. Key Criteria: Impact Analysis
  7. Analyst’s Take
  8. About Dana Hernandez

1. Summary

Cloud computing technologies are relatively mature and have achieved high levels of adoption in many organizations. This adoption requires stakeholders to ensure their applications can scale to meet ever-growing demand. Today, these stakeholders include developers, testers, quality assurance (QA) teams, performance engineers, and business analysts. These teams rely on performance testing to confirm that an application performs as expected and delivers an optimal user experience. Confirming this ability to scale, in terms of users, transactions, data volumes, and processing loads, is accomplished with performance testing tools.
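As a simplified illustration of this kind of scalability check, the sketch below ramps concurrent virtual users against a placeholder endpoint and reports latency percentiles at each step. It uses only the Python standard library; the target URL and ramp steps are assumptions for illustration, not any vendor's workflow.

    # Minimal load-test sketch: ramp concurrent virtual users and report
    # latency percentiles at each step.
    # Assumption: TARGET_URL and the ramp profile are placeholders.
    import statistics
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    TARGET_URL = "https://example.com/api/health"  # hypothetical endpoint

    def one_request() -> float:
        """Issue a single GET and return its latency in seconds."""
        start = time.perf_counter()
        with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
            resp.read()
        return time.perf_counter() - start

    def run_step(users: int, requests_per_user: int = 5) -> None:
        """Simulate `users` concurrent virtual users, each sending several requests."""
        with ThreadPoolExecutor(max_workers=users) as pool:
            latencies = list(pool.map(lambda _: one_request(),
                                      range(users * requests_per_user)))
        p95 = statistics.quantiles(latencies, n=20)[18]  # 95th percentile
        print(f"{users:>4} users: median={statistics.median(latencies):.3f}s  p95={p95:.3f}s")

    if __name__ == "__main__":
        for step in (10, 50, 100):  # ramp profile: placeholder values
            run_step(step)

A real cloud testing platform distributes this kind of load across managed infrastructure rather than a single machine, which is what makes the hyperscale economics discussed below possible.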

These performance testing solutions help isolate issues and identify ways to resolve them. Cloud performance tools are evolving and becoming more robust in their ability to surface problems and bottlenecks earlier in the development process, and many can pinpoint both the issue and possible remediations. From a database perspective, these solutions help test and optimize configurations such as cache size, bucket size, and input/output (I/O) settings.
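To make the database-tuning idea concrete, the sketch below times an identical query workload under several cache sizes, using SQLite's real PRAGMA cache_size as a stand-in for the configuration knobs a testing solution would sweep. The table size, query, and candidate values are illustrative assumptions.

    # Configuration-sweep sketch: time the same query workload under
    # different cache sizes. SQLite's PRAGMA cache_size stands in for the
    # database-tuning knobs a performance testing tool would exercise.
    import sqlite3
    import tempfile
    import time

    def timed_workload(db_path: str, cache_pages: int) -> float:
        """Run a fixed query workload with the given cache size and time it."""
        conn = sqlite3.connect(db_path)
        conn.execute(f"PRAGMA cache_size = {cache_pages}")
        start = time.perf_counter()
        for _ in range(50):
            conn.execute("SELECT COUNT(*), AVG(v) FROM t WHERE v % 7 = 0").fetchone()
        conn.close()
        return time.perf_counter() - start

    with tempfile.NamedTemporaryFile(suffix=".db") as f:
        conn = sqlite3.connect(f.name)
        conn.execute("CREATE TABLE t (v INTEGER)")
        conn.executemany("INSERT INTO t VALUES (?)", ((i,) for i in range(200_000)))
        conn.commit()
        conn.close()
        for pages in (100, 2000, 20000):  # candidate cache sizes, in pages
            print(f"cache_size={pages:>6}: {timed_workload(f.name, pages):.3f}s")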

This review focuses on cloud-based load-testing solutions whose ability to scale rests on hyperscale cloud economics and usage-based billing. This approach makes load testing easier to consume and removes the need to schedule projects around limited on-premises testing capacity. Cloud testing enables application programming interface (API)-level interaction between developers and testing capacity, and it gives operations teams automated, low-cost ways to test applications before and after changes to corporate infrastructure. Such testing reduces the need to keep large pools of people online during a change-control event just to verify that their applications still work after the change, or that the change actually fixed the issue that prompted it.
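A minimal sketch of that API-level interaction, as it might appear in a CI pipeline, is shown below. The endpoint, payload fields, and response shape are hypothetical placeholders rather than any specific vendor's API.

    # Hypothetical sketch of API-driven load testing from a CI pipeline.
    # Assumption: the endpoint, payload fields, and response shape below are
    # illustrative, not any specific vendor's API.
    import time
    import requests

    API = "https://loadtest.example.com/v1"       # hypothetical vendor endpoint
    HEADERS = {"Authorization": "Bearer $TOKEN"}  # placeholder credential

    # Kick off a usage-billed test run before (or after) a change-control event.
    run = requests.post(f"{API}/tests", headers=HEADERS, json={
        "target": "https://app.internal.example.com",
        "virtual_users": 500,
        "duration_minutes": 10,
    }).json()

    # Poll until the run finishes, then gate the pipeline on the result.
    while True:
        status = requests.get(f"{API}/tests/{run['id']}", headers=HEADERS).json()
        if status["state"] in ("passed", "failed"):
            break
        time.sleep(30)

    raise SystemExit(0 if status["state"] == "passed" else 1)

Gating a deployment on a test run like this is what lets operations teams validate a change automatically instead of staffing a manual verification call.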

This GigaOm Key Criteria report identifies the common features needed to make a solution viable (table stakes), details the capabilities that differentiate products (key criteria), and highlights important evaluation metrics for selecting an effective cloud performance testing platform. The companion GigaOm Radar report identifies vendors and products that excel in those criteria and factors. Together, these reports provide an overview of the category and its underlying technology, identify leading cloud performance testing offerings, and help decision-makers evaluate these platforms so they can make a more informed investment decision.

How to Read this Report

This GigaOm report is one of a series of documents that helps IT organizations assess competing solutions in the context of well-defined features and criteria. For a fuller understanding, consider reviewing the following reports:

Key Criteria report: A detailed market sector analysis that assesses the impact that key product features and criteria have on top-line solution characteristics—such as scalability, performance, and TCO—that drive purchase decisions.

GigaOm Radar report: A forward-looking analysis that plots the relative value and progression of vendor solutions along multiple axes based on strategy and execution. The Radar report includes a breakdown of each vendor’s offering in the sector.

Solution Profile: An in-depth vendor analysis that builds on the framework developed in the Key Criteria and Radar reports to assess a company’s engagement within a technology sector. This analysis includes forward-looking guidance around both strategy and product.
