1. Executive Summary
Whether you are looking for a voice solution or a complete unified communications (UC) platform, verbal communication is a critical requirement. Unified Communications as a Service (UCaaS) delivers seamless collaboration functionality, blending telephony, messaging, and video capabilities. (Figure 1)
UCaaS platforms are typically easier and more cost-efficient to deploy than on-premises solutions; they enable flexibility for remote working and can trigger workflows in other applications and services across the enterprise. They can improve productivity through more fluid discussion of work items and ease the sharing of assets connected to discussions. High-quality, face-to-face meetings with low latency allow more natural conversations and lead to better business outcomes. Usage analytics helps teams understand the frequency and quality of internal and external interactions and can drive process efficiencies.
This report compares select vendors for the phone services aspects of UCaaS and tests key areas of functionality, ease of use, quality, and uptime performance, supported by service level agreement (SLA) commitments, synthetic testing, and human testing. In addition, SLA terms were assessed for each vendor. The selected vendors are Microsoft, RingCentral, and Zoom.
Overall, all three vendors’ uptime performance was high at 100% for the test period. All three vendors have compelling features that, for our testers, enhanced their call and meeting experience. Conversely, all three vendors have areas of potential improvement in their feature sets, although some are already on the vendors’ roadmaps. All three vendors had varying levels of SLA terms and potential credits for customers.
We found that Microsoft performed ahead of the competition in several key areas. Synthetic testing of the platform illustrated an Application Performance Index (Apdex) score of 100% satisfied transactions over the test period, alongside 100% uptime. Apdex is an open standard for measuring the performance and behavior of software applications. The other tested vendors fell short in this area.
Our testers rated Microsoft as the top vendor for both ease of use and call clarity and quality. They highlighted features such as live captioning and transcriptions of audio, video, and voice mail as excellent. Microsoft's reporting of call quality also more closely reflected the actual tester experiences and the synthetic testing results. Microsoft SLA terms were the most publicly available and accessible online. It commits to a slightly lower call-quality percentage than other vendors; however, it is willing to credit customers with 100% of their bill.
Figure 1. Integration of Solutions in a Unified Communications Platform
Context
Today, Microsoft Teams provides customers with an SLA that offers service credits if Teams falls below system uptime targets. The three SLA metrics are:
- Teams core service: Presence, messaging, and online meetings (VoIP calling not included)
- Calling plans and audio conferencing: Outbound for 1:1 calling, inbound for audio conferencing
- Voice quality: On Ethernet/IP phones only
Most Microsoft products have a 99.9% SLA metric target, except for Teams Phone, which has a 99.99% uptime target excluding planned outages. During Phase 1 of the benchmark, GigaOm conducted research to assess the overall competitiveness of Microsoft's Voice/PSTN SLA targets against those of key competitors in the market.
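To put these uptime tiers in perspective, the arithmetic below converts an SLA percentage into the downtime it allows. This is our own illustration of the math, not language from any vendor's SLA.

```python
# Illustrative arithmetic only: convert an uptime SLA percentage into the
# downtime it allows. These are not vendor-published allowances.
MINUTES_PER_MONTH = 30 * 24 * 60   # 43,200 minutes in a 30-day month
MINUTES_PER_YEAR = 365 * 24 * 60   # 525,600 minutes

def allowed_downtime(sla_percent: float, period_minutes: int) -> float:
    """Minutes of downtime permitted before the uptime target is missed."""
    return period_minutes * (1 - sla_percent / 100)

for sla in (99.9, 99.99, 99.999):
    print(f"{sla}% uptime: "
          f"~{allowed_downtime(sla, MINUTES_PER_MONTH):.1f} min/month, "
          f"~{allowed_downtime(sla, MINUTES_PER_YEAR):.0f} min/year")
# 99.9%   -> ~43.2 min/month, ~526 min/year
# 99.99%  -> ~4.3 min/month,  ~53 min/year
# 99.999% -> ~0.4 min/month,  ~5 min/year
```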
Benchmark Focus and Scope
We applied the following Key Criteria for testing:
- HD audio/video (emphasis on audio and phone features)
- Reporting and analytics
- Meeting controls, collaboration tools, and call routing
- Ease of use
High Definition (HD) Audio/Video (Emphasis on Audio and Phone Features)
The core capability of UCaaS solutions remains their delivery of high-quality calls and video with good availability. Strong providers can deliver HD video at either 720p or 1080p with low latency and minimal lag, while minimum offerings deliver 360p. Audio remains the most important element of the user experience and service; solutions must provide quality audio and maintain a stable, synchronized connection. Enterprises require large numbers of concurrent participants on calls and meetings, with the top vendors supporting more than 200 callers. Note that the last-mile connection between the service provider and the customer must provide sufficient bandwidth and supported hardware so that it does not degrade the vendor's audio and video quality.
Reporting and Analytics
Analyzing call duration, stability, and utilization rates by user group within a unified environment is critical. Reporting and analytics must monitor and provide insight into uptime and call quality and facilitate troubleshooting of performance issues. Strong vendors for this key criterion provide real-time data to participants and admins/managers. These reporting and analytics tools enable enterprises to work with their vendor to evaluate whether uptime SLAs are being met and what actions are triggered if they are not. Reporting and analytics can also reveal the extent of internal collaboration, while data related to external calls can provide insight into how an enterprise interacts with its partners and vendors. The tool's ability to visualize data and monitor the health of a communication system can be key to an enterprise's success and to ensuring users comply with internal and external policies.
Meeting Controls and Collaboration Tools (Emphasis on Phone Service Features)
UCaaS solutions enable a range of methods for sharing information in various formats. Distributed employees need to work on assets collaboratively in ways that go beyond screen sharing. Whiteboards and other design spaces greatly assist meeting productivity. Because information in these meetings may be sensitive, it's important that hosts and admins have sufficient controls to manage interactions. Similarly, controls should exist to restrict channels and team groups to defined audiences, control audiences within meetings, and restrict user groups in organizations subject to stringent local laws or regulatory requirements.
Ease of Use
At the core of any UC solution is how well it promotes user adoption. Solutions must meet the needs of a range of users, from those struggling with technology to those wanting to leverage the advantages of online collaboration. Intuitive tools with strong UX and easy-to-configure local applications on desktop and mobile devices will lead to more successful implementations. A more intuitive solution will make day-to-day management easier. Intuitiveness also facilitates wider adoption of additional features; however, feature-rich but complicated solutions may not yield the desired results within an enterprise. The ease with which other SaaS solutions can be integrated and controlled will assist in getting the most out of a UCaaS solution.
The GigaOm Key Criteria for Evaluating Unified Communications as a Service (UCaaS) Solutions and Radar for Unified Communications as a Service (UCaaS) reports provide an overview of the unified communications and voice over IP market. They identify capabilities (table stakes, key criteria, and emerging technology) and evaluation metrics for selecting a unified communications/telecommunication service, and detail vendors and products that excel. These reports give prospective buyers an overview of the top vendors in this sector and help decision makers evaluate solutions and where to invest.
2. Scenario
One of the primary goals of GigaOm Lab reports is to make the benchmark easy to reproduce by anyone wishing to replicate the tests to perform an independent verification. In this section, we describe all the elements and processes necessary to perform the tests or change the variables according to the end-user’s needs.
For each of the selected UCaaS vendors, we worked with their sales and product teams. Each vendor set up a proof of concept (POC) environment for GigaOm for 45 days. Interestingly, most companies provide a free two-week POC versus a full month or more. We asked each company to provide us with the longer period, and they all agreed. Based on our testing experience, we recommend a window of 60 days to set up testing, conduct the test, and do follow-up reporting.
To prepare for testing, we worked with each vendor to provide a demo/training session for our testers. These training sessions lasted for approximately one hour and covered the core processes for each vendor’s solution. In addition, we trained two administrators to create users, assign calling plans, and run reports. All vendors offered support and additional training as needed throughout the testing process.
Our testers were located in Africa and Asia with a standard remote, work-from-home setup with laptops and cell phones. We leveraged U.S. domestic phone numbers and calling plans for each vendor. The plan was to push each vendor’s system to its limits and ensure we had both successful calls and conferences as well as unsuccessful calls and conferences. We were not looking for happy-path calls and meetings only. The testers documented the experience, features, and internet performance for each test period. The same features were tested across all three vendors during the test period.
Each vendor provided its standard call-quality reports for the testing period. The reports reflect how each vendor rates calls based on several factors and algorithms. Two vendors leverage a calculation method called Mean Opinion Score (MOS); if call quality falls below an MOS of 3.5, the quality is considered poor. One vendor sets targets for key elements of call quality; if statistics fall outside the target range, the calls are considered poor.
In addition to user testing, we leveraged New Relic for synthetic transaction testing and monitoring throughout the period. We also used the Apdex method for evaluating the data. Proponents of the Apdex standard believe it’s a better way to “measure what matters.” The method converts many measurements into one number on a uniform scale of 0 to 1 (0 = no users satisfied, 1 = all users satisfied). The resulting Apdex score is a numerical measure of user satisfaction with the performance of enterprise applications. This metric can be used to report on any source of end-user performance measurements for which a performance objective has been defined.
The Apdex formula is the number of satisfied samples plus half of the tolerating samples plus none of the frustrated samples, divided by all the samples:
Apdex_t = [Satisfied Count + (Tolerated Count × 0.5) + (Frustrated Count × 0)] / Total Samples
The subscript t is the target time, and the tolerable time is assumed to be four times the target time. So, it is easy to see how this ratio is always directly related to users’ perceptions of satisfactory application responsiveness.
Example: Assuming a performance objective of 3 seconds or better, and a tolerable standard of 12 seconds or better, given a dataset with 100 samples where 60 are below 3 seconds, 30 are between 3 and 12 seconds, and the remaining 10 are above 12 seconds, the Apdex score is:
Apdex_t = [60 + (30 × 0.5) + (10 × 0)] / 100 = 0.75
The Apdex formula is equivalent to a weighted average, where a satisfied user is given a score of 1, a tolerating user is given a score of 0.5, and a frustrated user is given a score of 0. In our test results we change this number to a percentage for readability.
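Because the score is simple to reproduce, the following sketch implements the Apdex calculation exactly as described above, using the worked example from the text (a 3-second target and a tolerable ceiling of four times the target).

```python
def apdex(response_times_s, target_s):
    """Apdex: satisfied <= target, tolerating <= 4x target, frustrated beyond that."""
    tolerable_s = 4 * target_s
    satisfied = sum(1 for r in response_times_s if r <= target_s)
    tolerating = sum(1 for r in response_times_s if target_s < r <= tolerable_s)
    # Frustrated samples contribute nothing to the numerator.
    return (satisfied + 0.5 * tolerating) / len(response_times_s)

# Worked example from the text: 100 samples, of which 60 are satisfied,
# 30 are tolerating, and 10 are frustrated against a 3-second target.
samples = [2.0] * 60 + [8.0] * 30 + [15.0] * 10
print(apdex(samples, target_s=3.0))  # 0.75, reported as 75%
```

Multiplying the result by 100 yields the percentage form used in the tables that follow.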
These tests aim to assess the call quality and features of the three vendors, validate the vendors' reports, and provide a third-party assessment of the data by leveraging a synthetic testing tool to monitor transactions.
3. Compared Vendors
We selected three key players in the UCaaS market for the Voice Services benchmark comparison:
Microsoft
Microsoft Teams (Teams) is a UCaaS solution that provides messaging, meetings, and telephony as part of the Microsoft 365 suite of products. Messaging and meetings are the core of the product and drive most of its adoption, but its telephony services are increasingly gaining traction. In addition, a wide range of Microsoft-compatible "certified" products and solutions are available that use Teams as a hub. Indeed, some of the vendors mentioned in this report are strongly aligned with servicing enterprises that prefer building around Teams.
Teams can deliver video at 1080p with up to 1,000 concurrent users. It scored high on the key criterion of transcriptions and captions, with a strong live transcription service that includes controls such as the option to disable speaker attribution. Teams provides background-noise suppression, screen sharing, and breakout rooms. It also allows file sharing and direct collaboration within Microsoft 365 products. Teams includes native whiteboards, polling, and surveys, and integrates with more than 1,000 apps through the Microsoft app store. Similarly, Teams scored well on the key criterion of mobile experience, with a well-designed user experience (UX), features like endpoint switching between mobile and desktop, and the ability to share and work on files in-app.
Microsoft Teams Calling Plans, voice, audio, network, and server uptime have an SLA of 99.99%, with potential customer credits of up to 100% depending on the severity of the impact/outage. Likewise, Teams’ voice quality and latency have an uptime SLA of 99.9%, with potential customer credits of up to 100% depending on the severity of impact. The company documents service levels, calculation of SLA goals, and credit agreements online. For customers to receive credits, they must open a ticket and file a claim. We could not locate information about a customer’s ability to exit the contract without a penalty due to repeated SLA failures.
RingCentral
RingCentral Message Video Phone (MVP) provides telephony, meetings, and messaging with strong capabilities. RingCentral has grown substantially from its origins in telephony services and has evolved its meeting capabilities from those originally underpinned by Zoom to its own video services.
RingCentral MVP supports video up to 720p and up to 200 active participants in most plans, with up to 500 available through its large-meeting add-on. Transcriptions are available at user request at the end of meetings, and live transcription and native whiteboards are roadmapped for 2022. Polling and survey features are available in meetings.
RingCentral analyzes call duration, stability, and utilization to determine adoption and usage, performance, and quality of service (pinpoint ISP, network, endpoints, and so on).
The uptime SLA for meetings, telephony, and messaging is 99.999%, which is among the highest in the UCaaS space. However, RingCentral does not publish any SLA terms online or any information about potential customer credits when SLAs are missed, and online information does not include an "out clause" for repeated missed SLA terms. It prefers to negotiate these terms under a non-disclosure agreement.
We found published RingCentral contracts for comparative purposes; however, we do not know how the terms were negotiated. These contracts were from 2020 forward and included a commitment to uptime of 99.999%. The customer credits ranged from 10% to 30% of the monthly costs. In addition, they committed to a voice quality of 3.8 MOS or above. In both situations, the customer must file for the credits within a specific period. For MOS credits, the customer must use monthly QoS reports from RingCentral.
Zoom
Zoom’s UCaaS platform combines Zoom Meetings, Zoom Chat, Zoom Phone, Zoom Webinars, and Zoom Rooms (a physical presence conferencing solution) to provide a unified user experience. The mobile experience is strong, with a well-designed UX, and useful features such as endpoint switching.
Zoom Meetings scored high on the key criterion of HD audio and video and can deliver video at 1080p on business and enterprise licenses, with 100 active participants by default and up to 1,000 with a large-meeting add-on. Cloud recording up to 1 GB is provided, with options to save recordings locally, and transcription is processed after the call ends. Zoom includes polling, surveying, and integrated whiteboarding capabilities. At present, Zoom lacks call insights and sentiment analysis features.
A variety of dashboards and analytics show usage by time, device, and other data points, in addition to room feedback scores that can highlight quality issues.
Zoom provides an uptime SLA of 99.9%, below other leading vendors in the UCaaS space. Customer credits are offered and calculated based on the percentage of downtime multiplied by the customer’s monthly subscription for the impacted service. Customers must self-report all downtime. Zoom also assigns a MOS score per call, but there is no published SLA for these scores. Zoom does not commit to an SLA for the MOS since it does not control the end-to-end process. It provides public links to systems status worldwide at https://uptime.zoom.us/.
No further details on SLA terms or credits were found online, nor was information published regarding a customer's ability to exit a contract after repeated SLA failures.
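To make the downtime-based credit formula concrete, the short sketch below applies the stated calculation (percentage of downtime multiplied by the monthly subscription for the impacted service). The subscription amount and outage length are hypothetical and chosen only for illustration.

```python
# Hedged illustration of a downtime-percentage credit. The monthly fee and
# outage duration are hypothetical, not Zoom pricing or incident data.
MINUTES_PER_MONTH = 30 * 24 * 60

def downtime_credit(outage_minutes: float, monthly_fee: float) -> float:
    """Credit = share of the month spent down x monthly subscription amount."""
    return monthly_fee * (outage_minutes / MINUTES_PER_MONTH)

# Example: a 90-minute outage against a $1,000/month subscription.
print(round(downtime_credit(90, 1000.00), 2))  # 2.08
```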
4. Environment: Configuration and Setup
One of the primary goals of GigaOm Benchmark reports is to make the tests we develop easy for users to reproduce. In this section we describe the elements and processes necessary to perform the tests or change the variables according to the user’s needs. In the benchmark, we aimed to create environments that were as close to an apples-to-apples comparison as possible.
Applications leveraged:
- Microsoft – Microsoft Teams Calling Plans and Audio Conferencing; Microsoft Teams voice quality application.
- RingCentral – MVP (Message Video Phone): RingCentral's unified communications and collaboration services.
- Zoom – Zoom Phone and Zoom Meetings.
Testing:
- Testers were located in Africa and Asia with a standard remote working setup of a laptop, cell phone, internet, and cell service.
- Testers used the mobile applications, desktop applications, and browser version for each vendor.
- Human testers conducted daily tests of the three providers and documented the test results.
- We correlated the reports of their testing experience with the New Relic synthetic test details to show performance and usability differences.
- Data from New Relic, a synthetic monitoring tool, provided 15-minute point-in-time samples—run 24 hours, 7 days a week—of metrics from the three providers.
Tools used:
- New Relic, a SaaS-based synthetic-monitoring tool, was used to test the three services and generate reports and Apdex scores.
5. Tests
A group of human testers made back-to-back short calls between all three services. They tested the same features across all three vendors for each test period, and tests occurred on different days. They recorded relevant data that was correlated with synthetic testing results. This provided both qualitative and quantitative data to evaluate the three vendors.
- Network Service Uptime
- Synthetic service monitoring collected 15-minute point-in-time samples, 24 hours a day, 7 days a week, for each vendor over the test period. These tests provide machine-level detail of network statistics and service availability. At the end of the measurement period, a report provided the data needed to evaluate the services on technical metrics.
- The three vendors’ uptime was measured during the synthetic tests.
- The New Relic monitoring tool was leveraged to produce synthetic transactions against each vendor's environment.
- The goal was to determine whether our measured uptime statistics were in line with vendor-reported uptime statistics, in particular as they relate to the SLA commitments of 99.99% to 99.999%.
- Voice Quality
- Monitored vendor voice quality during actual voice calls.
- The goal was to determine if voice quality statistics represent solid voice-quality numbers based on industry standards and vendor self-reporting.
- Made voice calls (duration, frequency, participant(s)) with each of the three vendors’ solutions.
- Ran speedtest.net at the start of each call and documented ping, download, and upload measurements.
- Monitored call quality/pulled call statistics from vendors’ tools.
- Leveraged vendors’ testing tools for voice quality comparison.
- Top features tested during benchmark
- Schedule a conference call
- Attend a conference call
- Create audio call and add team members
- Use chat during a call
- Use whiteboards during a call
- Screen share during a call
- Make a regular audio call
- Leave voicemail (reading a specific paragraph)
- Retrieve voicemail
- Listen to voicemail and assess recording quality (3 – easily understood, 2 – mostly understood, 1 – garbled)
- Read voicemail transcription and assess quality (3 – exact verbiage, 2 – most words correctly transcribed, 1 – unable to transcribe)
- Transfer a call:
- From audio to video
- From video to audio
- From browser to mobile device
- From mobile device to browser
- Between desktop, mobile device, and browser
- To another person
- Switch to carrier and back to vendor.
- Park a call, hold a call, and record a call
- Live captioning
- Suppression of background noise in audio settings (auto, high, low)
- Assess call quality
- Assess ease of use
6. Results
The methodology was designed to obtain qualitative data for 30 days from human testers and correlate it with quantitative data from synthetic monitors and with each vendor's quality of service reports. The combination allowed us to better validate the values from the different testing sources: our external synthetic tests and human testing results. In addition, it provides a basis for assessing vendor-reported statistics.
Overall, all three vendors’ uptime performance was high at 100% for the test period. We found that Microsoft performed ahead of the competition in several key areas. Synthetic testing of the platform illustrated an Apdex score of 100% satisfied transactions over the test period, alongside 100% uptime. The other vendors fell short in this area.
In addition, our human testers rated Microsoft as the top vendor for both ease of use and call clarity and quality. Microsoft’s reporting of call quality also more closely reflected the actual tester experiences and the synthetic testing results.
Synthetic Testing Results
We performed synthetic testing, using an independent tool to generate simulated transactions across all three vendor platforms and to test and rate several performance factors of the solutions. Synthetic testing exercised each platform with machine-simulated transactions every 15 minutes, 24×7. Performance factors were monitored for 30 days.
Overall, uptime for all three vendors was 100% during the test window for all transactions performed. All vendors have continued to successfully enhance their platforms with a focus on uptime.
We leveraged the Apdex model for assessment of the transactional data. The calculation is described in Section 2, Scenario, above. Table 1 shows the average results for the test period for all three vendors:
Table 1. Overall Average Apdex Scores for Test Period June 13 to July 14, 2022
| Vendor | Average Score |
|---|---|
| Microsoft | 100% |
| RingCentral | 93% |
| Zoom | 99% |

Source: GigaOm 2022
As reflected in the table, Microsoft scored the highest percentage of satisfied transactions throughout the testing window. It was closely followed by Zoom, which averaged a high score of 99%, even with a few transactions in the tolerated and frustrated ranges. RingCentral scored 93%, with several weeks showing double-digit tolerated percentages.
As described elsewhere, the Apdex formula is equivalent to a weighted average, where a satisfied user is given a score of 1, a tolerating user is given a score of 0.5, and a frustrated user is given a score of 0. We have changed this number to a percentage for readability. In this calculation the target time (t) is set to 7 seconds; therefore, the satisfied range is less than 7 seconds, tolerating is 7 to 28 seconds, and the frustrated range is more than 28 seconds.
Details from the New Relic monitoring tool for each vendor are in Tables 2 through 4.
Table 2. Microsoft Results by Week
| Metric | Week of July 11, 2022 | Week of July 4, 2022 | Week of June 27, 2022 | Week of June 20, 2022 | Week of June 13, 2022 |
|---|---|---|---|---|---|
| Apdex | 100% | 100% | 100% | 100% | 100% |
| Satisfied | 100% | 100% | 100% | 100% | 99.9% |
| Tolerated | 0% | 0% | 0% | 0% | 0.1% |
| Frustrated | 0% | 0% | 0% | 0% | 0% |
| Uptime | 100% | 100% | 100% | 100% | 100% |

Source: GigaOm 2022
As noted in Table 2, Microsoft Teams had an uptime of 100% for the duration of the testing. For Apdex results, Microsoft averaged 100% satisfied transactions, with only a minor dip into the tolerated range in the first week. No other drops into the tolerated or frustrated ranges occurred during the test window.
Table 3. RingCentral Results by Week
| Metric | Week of July 11, 2022 | Week of July 4, 2022 | Week of June 27, 2022 | Week of June 20, 2022 | Week of June 13, 2022 |
|---|---|---|---|---|---|
| Apdex | 89% | 90% | 95% | 95% | 94% |
| Satisfied | 79.0% | 80.4% | 90.6% | 89.7% | 88.4% |
| Tolerated | 21.0% | 19.6% | 9.4% | 10.3% | 11.6% |
| Frustrated | 0% | 0% | 0% | 0% | 0% |
| Uptime | 100% | 100% | 100% | 100% | 100% |

Source: GigaOm 2022
Table 3 shows that RingCentral had an uptime of 100% for the duration of the testing. For Apdex results, RingCentral averaged an overall score of 93%. Approximately 9% to 21% of transactions each week fell into the tolerated range, and none fell into the frustrated range during the testing period.
Table 4. Zoom Results by Week
| Metric | Week of July 11, 2022 | Week of July 4, 2022 | Week of June 27, 2022 | Week of June 20, 2022 | Week of June 13, 2022 |
|---|---|---|---|---|---|
| Apdex | 97% | 98% | 99% | 100% | 100% |
| Satisfied | 94.3% | 96.1% | 98.4% | 99.6% | 99.4% |
| Tolerated | 5.7% | 3.7% | 1.6% | 0.4% | 0.6% |
| Frustrated | 0% | 0.1% | 0% | 0% | 0% |
| Uptime | 100% | 100% | 100% | 100% | 100% |

Source: GigaOm 2022
Looking at Table 4, Zoom had an uptime of 100% for the duration of the testing. For Apdex results, Zoom averaged an overall score of 99%. Approximately 0.4% to 5.7% of transactions each week fell into the tolerated range, and there was one minor dip into the frustrated range during the testing period.
The goal of the synthetic testing was to provide an unbiased analysis of each vendor’s performance and uptime for comparison with the vendors’ internal reports.
Human Testing Results
Overall, our testers ranked Microsoft as the top vendor for overall performance. Both the quality of its calls and video and the top features of Microsoft Teams Phone ranked ahead of the other two vendors. Zoom was the testers' second choice, with call quality ranking close to Microsoft's; they enjoyed the design of the Zoom user interface and how easy it was to use many of the key features. RingCentral placed third, with the lowest-rated call quality of the vendors in our tests, although it has some excellent features that allow users to transfer a meeting or call between mobile phone, desktop, and browser quickly, easily, and seamlessly. All vendors have excellent features but also have areas for improvement, and each has key enhancements on its upcoming roadmap. Note that the features to be added or improved are based on the testers' experience; some may exist in the product but are not intuitive to use or need additional configuration.
Microsoft
Microsoft scored high on the user testing of Microsoft Teams Phone. The testers felt the call quality was excellent, live captioning was available for audio calls, and voicemails were accurately transcribed as spoken. It’s also convenient to chat during an audio call in the same window as the call.
Other top features noted by the testers included:
- Live captioning is available for both video and audio calls.
- It is quite easy to switch from audio to video and from video to audio. It also doesn’t take long to switch.
- The chat and share feature is directly linked to the audio call window. You don’t have to leave the call window to access the chat and share features.
Testers noted that these areas could use improvement:
- When leaving a voicemail, you cannot listen to it after recording. The dialpad cannot be accessed for more options when recording a voicemail and before sending it.
- The transfer call between devices worked, but after the initial transfer, you cannot transfer back to the original device. You also cannot transfer calls to the browser.
Additional features suggested by the testers include:
- Microsoft Teams should introduce the “park call” feature (available, must be configured).
- It should add the ability to switch the voice call to a cellular carrier (on 2022 roadmap).
- Add a record-call option for audio calls (available, but it must be configured). This will need recording-permission or warning options to comply with government regulations.
Reporting: Reporting and analytics for Microsoft are available from either the Teams admin center or the Power BI application. They are not available directly from the application, unlike the other two vendors' solutions. High-level call-quality reports are available from the Teams admin center, while more detailed call-quality reporting is available from Power BI. The Power BI reports have high-level graphics and extensive detail about each leg of a call or meeting. They leverage green, yellow, and red color coding to show elements that fall outside the target ranges for jitter, packet loss, and latency. These reports contain a significant amount of data, allowing users to research how to improve call and meeting quality. They are geared to an engineering user who would leverage extensive call details for troubleshooting. Microsoft would benefit from an easier-to-use graphical report for initial reviews of call quality and performance.
RingCentral
RingCentral scored well in the ease of use of the overall solution. Traditional telephone features also scored high in testing.
Other top features noted by the testers included:
- Switching back and forth between devices (mobile, desktop, and browser) was quite easy and fast during calls.
- The ability to park several calls and return to them by either dialing a provided extension or clicking a link sent to your inbox was simple to use.
- The option to switch from an internet audio call to a GSM carrier allows the user to continue a call should the internet connection drop.
Testers noted that these areas could use improvement:
- The quality of the call was not great. They noticed a significant difference when compared side by side to the other two vendors.
- The transcription of voicemail is poor; most words in the transcript do not match the exact verbiage.
- Switching from audio to video is possible, but it takes longer to completely switch to the video meeting.
Additional features suggested by the testers include:
- Live captioning (not currently available).
- Whiteboard functions (on 2022 roadmap).
Overall, testers found that RingCentral was easy to learn and the UI was intuitive. In addition, switching between mobile, desktop, and browser was the easiest and most seamless of all the vendors. However, the call quality, when compared to the other vendors, caused the product to be rated lower.
Reporting: Reporting for RingCentral was simplified on the main screen with an analytics button. There are six major report types (two are in beta). We leveraged the Quality of Service report, which contained overview and detailed information about audio and video calls. Call and meeting quality scores were broken into three rankings: Good (MOS score of 3.5 and above), Moderate (3.00 to 3.49), and Poor (below 3.00). These are color-coded, and with a click, the details related to the score are displayed. The call listing reports can be downloaded, but the graphics, overall ratings, and per-call scores are not included in the downloads. The RingCentral reporting platform was very intuitive and simple to learn and use. The high-level graphics were easy to interpret, and drill-down capabilities were available to easily access call details for research.
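The Good/Moderate/Poor banding described above is easy to express in code. The sketch below is our own illustration of those thresholds, not RingCentral's implementation.

```python
# Our illustration of the MOS banding in the Quality of Service report;
# not RingCentral code.
def mos_band(mos: float) -> str:
    """Classify a call's Mean Opinion Score into Good, Moderate, or Poor."""
    if mos >= 3.5:
        return "Good"
    if mos >= 3.0:
        return "Moderate"
    return "Poor"

for score in (4.1, 3.2, 2.8):
    print(score, mos_band(score))  # 4.1 Good, 3.2 Moderate, 2.8 Poor
```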
Zoom
Zoom ranked high in the overall UI design and ease of use. It also scored well in call quality.
Other top features noted by the testers included:
- The park call feature was quite easy to use, and it is possible to park several calls that can be accessed by dialing a provided extension.
- Zoom has the option to switch from an internet audio call to a GSM carrier, making it possible to continue the call should the internet connection drop.
- Zoom has a feature to record audio calls.
Testers noted that these areas could use improvement:
- The transfer call option between devices (mobile, desktop, browser) is not available. If you join with mobile, you cannot switch to the desktop client and vice versa.
- Screen sharing pauses the audio call and a new meeting is created.
Additional features suggested by the testers include:
- The ability to make an audio call from the browser.
- Live captioning for audio calls.
Overall, the testers felt the call quality was very good. In addition, the testers rated the user interface (UI) easy to learn.
Reporting: Reporting for Zoom is part of the main application and is available from the dashboard. Zoom Phone contains quality information for audio calls, with call logs classified as good or bad quality; good quality is a MOS score of 3.5 or above. The information is also broken down by device type, ISP, voice codec, and network. Each call is listed with individual details on jitter, packet loss, latency, and bitrate for sending and receiving. One drawback is the lack of download capabilities.
For video conferences, the data sits under the Meetings tab on the dashboard. A setting needs to be turned on to get health data for these calls. It is not automatic, nor is it retroactive. In addition, there is no graphical overview of overall video health. The report provides a textual health status, such as WARNING, and provides potential reasons. In addition, the report provides a rating of good, fair, or poor for the categories of audio, video, and screen sharing. These details can be exported to a CSV file for further analysis. There is also detail for each call and each participant on jitter, packet loss, latency, and bitrate for sending and receiving. There is enough data to adequately research call/video issues in these different reporting areas.
Overall Vendor Reporting Statistics
As shown in Table 5, each vendor uses a different methodology for scoring calls. RingCentral and Zoom leverage a mean opinion score (MOS) to describe call quality. These calculations are based on industry standards and company-specific algorithms, and they are often adjusted by other factors a company deems important, such as increasing a score when a specific codec such as Opus is used.
Table 5. Reporting Statistics
| Vendor | MOS Average Overall Quality | Phone Percent Good | Phone Percent Moderate/Poor | Video Percent Good | Video Percent Moderate/Poor |
|---|---|---|---|---|---|
| Microsoft Teams Phone | NA | 84.57% | 15.44% | 97.30% | 2.70% |
| RingCentral | 4.1 | 92.40% | 7.70% | 76.90% | 23.10% |
| Zoom Phone | 3.1 | 38.94% | 61.06% | * | * |

Source: GigaOm 2022
* Note: The monitoring setting required for Zoom Meetings to collect video health data was not enabled.
Microsoft instead uses a target for each component, such as packet loss, jitter, and latency; if a component falls outside the target range, the call is scored as poor. Based on these reporting differences, it is difficult to get an apples-to-apples comparison of the vendors' reporting on call quality.
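To show how a per-component target model differs from a single MOS number, the sketch below flags a call leg as poor if any monitored component exceeds its ceiling. The threshold values are illustrative placeholders, not Microsoft's published targets.

```python
# Illustrative per-component classification. The ceilings below are
# placeholders for demonstration, not Microsoft's published targets.
TARGET_CEILINGS = {
    "packet_loss_pct": 10.0,  # placeholder
    "jitter_ms": 30.0,        # placeholder
    "round_trip_ms": 500.0,   # placeholder
}

def classify_leg(metrics: dict) -> str:
    """Score a call leg 'poor' if any monitored component exceeds its ceiling."""
    failed = [name for name, ceiling in TARGET_CEILINGS.items()
              if metrics.get(name, 0.0) > ceiling]
    return "poor (" + ", ".join(failed) + ")" if failed else "good"

print(classify_leg({"packet_loss_pct": 1.2, "jitter_ms": 12.0, "round_trip_ms": 180.0}))
print(classify_leg({"packet_loss_pct": 14.0, "jitter_ms": 12.0, "round_trip_ms": 180.0}))
# good
# poor (packet_loss_pct)
```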
Based on the testers' experience, Microsoft's reports most closely resembled the experience on the test calls and meetings. Most calls and meetings were rated as good, that is, within target. The testers rated Microsoft call quality as excellent and the best of all three vendors.
Interestingly, Zoom had very high call quality for our testers. However, Zoom's reporting reflects that most of the calls fell short of the 3.5 MOS threshold and were scored as poor. This did not reflect the user experience, as the testers felt the quality of the calls was high.
The RingCentral reporting showed high scores for most of the calls and meetings. While not many of the calls were dropped or lost, our testers rated the quality of most of the calls as lower than the other two vendors. It looks like RingCentral has chosen metrics and thresholds that show it performing better than our human testers experienced.
Overall, vendor reporting can be used to troubleshoot calls and users' experiences. These reports are also required by the vendors when customers report performance and quality issues covered by SLA terms. A report that rates a call more positively than users experienced it can impact a company's ability to collect credits under those terms. It is therefore important to leverage independent sources for assessment and testing when evaluating vendors and comparing solutions, to determine actual experience versus the vendor's own internal reports.
Service Level Agreement Comparison
Whether selecting a simple voice solution or a more complex unified communications platform, significant risks lie in the contractual terms. The tables that follow show the results of the Voice Services SLA surveys across the vendors. Table 6 compares the SLA commitments for the various services for each vendor. Table 7 focuses on the financially backed service credits for each SLA commitment. Note the lack of public SLA terms from RingCentral, which indicates a more complex contract negotiation process; information from published RingCentral contracts was used for comparative purposes. The asterisk next to RingCentral indicates the uncertainty that exists around SLAs for that provider.
Table 6. Voice Services SLA Targets
| | Microsoft | RingCentral* | Zoom |
|---|---|---|---|
| Network/Server Uptime | 99.99%/yr | 99.999% | 99.90% |
| Support Response Time | Respond in 24 hours | No public financially backed SLA terms; requires NDA/contract negotiation | 1 - Urgent: 1 hr; 2 - High: 4 hrs; 3/4 - Normal/Low: 24 hrs |
| Latency Between Cloud Services | 99.90% | No public financially backed SLA terms; requires NDA/contract negotiation | Requires contract negotiation |
| Calling Plans | 99.99% | No public financially backed SLA terms; requires NDA/contract negotiation | Requires contract negotiation |
| Voice Quality | 99.90% | 3.8 MOS | Requires contract negotiation |

Source: GigaOm 2022
Table 7. Voice Services Financial Backing via Service Credits
| | Microsoft | RingCentral* | Zoom |
|---|---|---|---|
| Customers exit without penalty with repeated SLA failures? | Unable to find | Failure to meet 99.9% for 3 out of 6 months | Requires contract negotiation |
| Network/Server Uptime | 25-100% | 10-30% | Credit equal to downtime percentage x monthly subscription amount |
| Support Response Time | Provided in SLA/online services agreements | No public financially backed SLA terms; requires NDA/contract negotiation | Requires contract negotiation |
| Latency Between Cloud Services | 25-100% | No public financially backed SLA terms; requires NDA/contract negotiation | Requires contract negotiation |
| Calling Plans | 10-100% | No public financially backed SLA terms; requires NDA/contract negotiation | Requires contract negotiation |
| Voice Quality | 25-100% | Based on monthly QoS reports | Requires contract negotiation |
| Features Covered by SLA Terms | Voice, video conferencing, contact center, collaboration, team engagement | No response from vendor | Video and voice conferencing |
| Burden of Proof for Outages/Issues | Customers must submit a claim | Customers must submit a claim | Unable to find terms |

Source: GigaOm 2022
7. Analyst’s Take
The pandemic has pushed UCaaS solutions into the mainstream, with telephony and meetings combined on one platform. The quality of audio and video is key to a successful meeting; a poor meeting can distract attendees and be very stressful for participants. Quality of service for video and audio and the platform's overall uptime availability are central components of successful meetings. The stability of the video feed in terms of latency, jitter, and packet loss is highly perceptible and defines a good versus a bad video experience.
This GigaOm Benchmark report assessed three competitors in the UCaaS space, focusing on phone features and video conferencing. Synthetic testing and human testing were performed across all three platforms, and SLA terms were assessed for each vendor. Overall, all three vendors' uptime performance was high at 100% for the test period. All three vendors have compelling features that, for our testers, enhanced their call and meeting experience. Conversely, all three vendors have areas of potential improvement in their feature sets, although some are already on the vendors' roadmaps. All three vendors offered varying SLA terms and potential credits for customers.
Following our testing, however, we found that Microsoft performed ahead of the competition in several key areas. Synthetic testing of the platform illustrated an Apdex score of 100% satisfied transactions over the test period, alongside 100% uptime. The other vendors fell short in this area.
Our testers rated Microsoft as the top vendor for both ease of use and call clarity and quality. In addition, they highlighted features such as live captioning and transcriptions of audio, video, and voice mail as excellent and very useful. Meeting features such as chat and share are easily accessible during audio calls. Microsoft's reporting of call quality also more closely reflected the actual tester experiences and the synthetic testing results.
Microsoft SLA terms were also the most publicly available and accessible online. It commits to a slightly lower call-quality percentage than RingCentral; however, it is willing to credit customers with 100% of their bill. All three vendors require customers to leverage internal reports to file for call quality and uptime credits. Based on our review of the vendors' reports, it may be difficult to collect credits from RingCentral.
Vendors in this space continue to evolve and enhance the features of their platforms. Most notable are enhancements that remove the impact of external noise and latency; noise removal and noise suppression lead to clearer calls even when there is latency, jitter, or packet loss. In addition, feature enhancements for meetings and collaboration continue to be introduced. We recommend users review upcoming vendor roadmaps for new features and their future direction.
As a buyer of technology in this space, it is important to perform testing and assessments of each vendor's solution. These assessments need to be both quantitative and qualitative: leverage the vendors' own reports and assessments, and use third-party tools to validate them, because differences in reporting and assessment of quality do exist. Requirements should also be matched to the available key features of each platform. SLA contract terms vary by vendor and are always negotiable during the contract phase; it is important to ensure the terms support the user's business.
8. How to Repeat the Test
Our intent was to push the limits of each vendor’s systems to observe the relative variances in call quality and performance that were produced as a result. To repeat these tests, several steps need to be taken.
First, work with each vendor to obtain an evaluation environment for each solution. We suggest obtaining the environment for 60 days to run a 30-day test; this helps account for setup time, training prior to testing, and post-test assessment. Vendors should provide the following:
- Environment (preferably 60 days).
- Basic user training and familiarization with the environment.
- Administrator training for user accounts setup and reporting.
A group of human testers is needed to run tests across all three vendors.
- Use testers in international locations if applicable.
- Determine list of features to test.
- Test at different times of the day.
- Test using a remote, work-from-home setup with a desktop and mobile phone.
- Run an application like Speedtest.net at the start of a call and document ping, download, and upload measurements.
- Make voice calls and meetings with each of the three vendors’ solutions.
- Test with mobile applications, browser, and desktop applications.
- Test vendors at each test point using the same functionality and features.
- Monitor call quality and pull call statistics from each vendor’s tools.
A synthetic testing tool must be deployed to produce machine-level detail of network statistics and service availability.
- Set up transactions to obtain 15-minute point-in-time data samples. Run 24 hours, 7 days a week.
- Review post-measurement-period reports with data collected to evaluate the services on technical metrics.
Perform synthetic and human testing over a 30-day window at different times for variety and covering peak and non-peak times. Synthetic testing should be run every 15 minutes.
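As a starting point for the synthetic leg of the test, the sketch below polls a vendor status endpoint on a 15-minute cadence and accumulates the uptime and Apdex inputs described earlier. It is a generic stand-in that uses only the Python standard library; we used New Relic for the benchmark itself, and the endpoint URL shown here is a placeholder.

```python
# Generic stand-in for the synthetic monitor used in this benchmark; we used
# New Relic, but this sketch relies only on the Python standard library.
# The endpoint URL below is a placeholder, not a real vendor address.
import time
import urllib.request

ENDPOINT = "https://status.example-vendor.com/health"  # placeholder URL
TARGET_S = 7.0        # Apdex target time used in this benchmark
INTERVAL_S = 15 * 60  # 15-minute cadence

def probe(url: str, timeout_s: float = 60.0) -> tuple[bool, float]:
    """Run one synthetic transaction; return (success, response_seconds)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout_s):
            return True, time.monotonic() - start
    except Exception:
        return False, timeout_s

def summarize(samples: list[tuple[bool, float]]) -> tuple[float, float]:
    """Return (uptime_pct, apdex); failed probes count as frustrated samples."""
    successes = [seconds for ok, seconds in samples if ok]
    uptime = 100.0 * len(successes) / len(samples)
    satisfied = sum(1 for s in successes if s <= TARGET_S)
    tolerating = sum(1 for s in successes if TARGET_S < s <= 4 * TARGET_S)
    apdex = (satisfied + 0.5 * tolerating) / len(samples)
    return uptime, apdex

if __name__ == "__main__":
    samples = []
    for _ in range(4 * 24 * 30):  # thirty days of 15-minute samples
        samples.append(probe(ENDPOINT))
        time.sleep(INTERVAL_S)
    print(summarize(samples))
```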
9. Report Methodology
This GigaOm Benchmark report analyzes and compares the performance and features of voice services for three leaders in the UCaaS market.
The report explores voice services technology. It assesses call quality, voice services features, and SLA terms for each vendor. The report is designed to educate readers on voice services technology and help them assess different product offerings from key vendors. Our analysis focuses on highlighting core technology, use cases, and differentiating features.
Our approach focuses on defining the basic features users should expect from products that correctly implement the technology, while exploring characteristics that will serve to differentiate the value of solutions over time. In this regard, readers will find similarities with the GigaOm Key Criteria and Radar reports. Our evaluation method for emerging technology is based on:
- Core technology: Table stakes
- Differentiating features: Potential value and key criteria
The objective of this report is to help buyers evaluate and select a voice provider that may be part of an overall UCaaS solution.
10. About Dana Hernandez
Dana Hernandez is a dynamic, accomplished technology leader focused on the application of technology to business strategy and function. Over the last three decades, she has gained extensive experience with the design and implementation of IT solutions in the areas of Finance, Sales, Marketing, Social Platforms, Revenue Management, Accounting, and all aspects of Airline Cargo, including Warehouse Operations. Most recently, she spearheaded technical teams responsible for implementing and supporting all applications for Global Sales for a major airline, owning the technical and business relationship to help drive strategy to meet business needs.
She has led numerous large, complex transformation efforts, including key system merger efforts consolidating companies onto one platform to benefit both companies, and she’s modernized multiple systems onto large ERP platforms to reduce costs, enhance sustainability, and provide more modern functionality to end users.
Throughout her career, Dana leveraged strong analytical and planning skills, combined with the ability to influence others with the common goal of meeting organizational and business objectives. She focused on being a leader in vendor relationships, contract negotiation and management, and resource optimization.
She is also a champion of agile, leading agile transformation efforts across many diverse organizations. This includes heading up major organizational transformations to product taxonomy to better align business with enterprise technology. She is energized by driving organizational culture shifts that include adopting new mindsets and delivery methodologies.
11. About GigaOm
GigaOm provides technical, operational, and business advice for IT’s strategic digital enterprise and business initiatives. Enterprise business leaders, CIOs, and technology organizations partner with GigaOm for practical, actionable, strategic, and visionary advice for modernizing and transforming their business. GigaOm’s advice empowers enterprises to successfully compete in an increasingly complicated business atmosphere that requires a solid understanding of constantly changing customer demands.
GigaOm works directly with enterprises both inside and outside of the IT organization to apply proven research and methodologies designed to avoid pitfalls and roadblocks while balancing risk and innovation. Research methodologies include but are not limited to adoption and benchmarking surveys, use cases, interviews, ROI/TCO, market landscapes, strategic trends, and technical benchmarks. Our analysts possess 20+ years of experience advising a spectrum of clients from early adopters to mainstream enterprises.
GigaOm’s perspective is that of the unbiased enterprise practitioner. Through this perspective, GigaOm connects with engaged and loyal subscribers on a deep and meaningful level.
12. Copyright
© Knowingly, Inc. 2022 "Competitive Voice Services Reliability Benchmark" is a trademark of Knowingly, Inc. For permission to reproduce this report, please contact sales@gigaom.com.