Table of Contents
- Summary
- Overview
- Considerations for Adoption
- GigaOm Sonar
- Vendor Insights
- Analysts’ Take
- Report Methodology
- About GigaOm
- Copyright
1. Summary
Enterprises want to keep more data, for longer periods, at affordable cost, and in a way that keeps it accessible when needed. However, finding long-term storage for ever-growing data sets that is both cost effective and highly available can be difficult, so organizations often must prioritize one consideration over the others when comparing vendors.
Data storage in an enterprise generally looks like the pyramid in Figure 1, with the bulk of the older data in cold storage, which is typically cheaper and less available.
Figure 1. Storage Tiers
Tape offers multipetabyte capacity and is cost effective, but it can be difficult to manage and access, notably because of retrieval times. Data management is usually performed with backup and archive software, and access takes significantly longer than with other media. Moreover, tape is designed for linear access; even though it offers good throughput, it doesn’t cope well with small files.
Today, object storage is one of the most common options. It works just as its name suggests: it stores objects, each of which consists of data, metadata, and a globally unique identifier that allows the object to be located. The metadata is customizable, which makes object storage extremely flexible.
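To make this concrete, the following minimal sketch (in Python, using the widely used boto3 S3 client) stores an object along with custom metadata; the endpoint, bucket, and key names are hypothetical.

```python
# Minimal sketch: storing an object with custom metadata via the S3 API
# (boto3). The endpoint, bucket, and key below are hypothetical.
import boto3

s3 = boto3.client("s3", endpoint_url="https://objects.example.com")

s3.put_object(
    Bucket="archive-demo",                  # hypothetical bucket
    Key="projects/2023/report-final.pdf",   # unique identifier within the bucket
    Body=open("report-final.pdf", "rb"),    # the data itself
    Metadata={                              # customizable key-value metadata
        "department": "finance",
        "retention-class": "7-years",
    },
)
```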
S3 is now the dominant object storage protocol. Following AWS S3’s success in the public cloud, most object storage vendors competed to capture the private cloud object storage market. These solutions offer an attractive $/GB ratio, but organizations want even better cost ratios for longer-term storage. Many cloud providers now offer cold storage tiers that address this need, such as Amazon S3 Glacier, the Azure Blob Storage archive tier, and the Google Cloud Storage Archive class. However, these tiers come with their own set of challenges: retrieval costs may be quite high, and retrieval times from deep storage can be very long, as much as 48 hours.
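The retrieval delay is visible in the API itself: reads from deep archive tiers are asynchronous restore requests rather than direct downloads. The sketch below, using boto3 against Amazon S3 Glacier Deep Archive, is illustrative only; the bucket and key names are hypothetical.

```python
# Sketch of an asynchronous retrieval from a cold tier (S3 Glacier
# Deep Archive via boto3); bucket and key names are hypothetical.
import boto3

s3 = boto3.client("s3")

# Request a temporary restore; with the Bulk tier, Deep Archive
# retrievals can take up to 48 hours to complete.
s3.restore_object(
    Bucket="archive-demo",
    Key="projects/2023/report-final.pdf",
    RestoreRequest={
        "Days": 7,                                  # keep the restored copy for 7 days
        "GlacierJobParameters": {"Tier": "Bulk"},   # cheapest, slowest retrieval tier
    },
)

# Poll until the restore completes; the Restore header flips to
# ongoing-request="false" once the object is readable again.
status = s3.head_object(Bucket="archive-demo", Key="projects/2023/report-final.pdf")
print(status.get("Restore"))  # e.g. 'ongoing-request="true"'
```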
In addition, data repatriation has become a hot topic: insatiable demand for affordable storage and the year-over-year growth of data sets are forcing organizations to rethink their long-term cold storage strategies. Organizations must choose between the flexibility offered by cloud platforms and the predictability of on-premises infrastructure deployments, and many are reevaluating their cloud strategies intensively while looking for cost savings in an unstable economic environment.
At multipetabyte scale (and with exabyte scale looming), the sooner organizations make a repatriation decision, the easier it is to absorb migration costs (egress fees, for example) and the shorter the time needed to perform the migration. When deemed appropriate, these initiatives should be carried out before the cloud data footprint becomes too large to repatriate.
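A back-of-envelope sketch shows why timing matters: egress fees grow linearly with the data footprint. The $/GB rate used below is purely illustrative, not a quoted price from any provider.

```python
# Back-of-envelope sketch: egress fees scale linearly with footprint,
# so delaying repatriation raises its cost. The $/GB rate is an
# illustrative assumption, not an actual cloud provider price.
EGRESS_RATE_PER_GB = 0.05   # assumed rate, $/GB
GB_PER_PB = 1_000_000       # decimal petabyte

for petabytes in (1, 10, 100):
    cost = petabytes * GB_PER_PB * EGRESS_RATE_PER_GB
    print(f"{petabytes:>3} PB -> ~${cost:,.0f} in egress fees alone")
```

Under these assumptions, 1 PB costs roughly $50,000 to move out, while 100 PB approaches $5 million, before accounting for migration labor and time.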
These ongoing challenges explain why object storage on tape solutions are becoming increasingly popular. These solutions offer the best $/GB ratio, particularly for long-term storage. They have no issues with scalability and offer very long-term retention guarantees, thanks to the Linear Tape-Open (LTO) Consortium’s extended roadmap covering future tape generations and drive compatibility.
The primary challenge with deploying object storage on tape is getting started: installations are usually extensive, starting at the multipetabyte level and potentially growing beyond hundreds of petabytes.
How We Got Here
Leveraging tape for storage has always been complicated. While the technology itself isn’t complex, tape libraries have historically been managed by backup and archive software, which has a limited scope: storing data copies for eventual retrieval in case of data loss. The way backup software writes to tape makes tape effectively unusable for general-purpose storage scenarios.
However, attempts have been made to use tape for data storage: hierarchical storage management (HSM) proposed the use of tape as a storage medium, but it never really worked. HSM solutions were complex to implement and relied on file system stubs (symlinks) to access data on tape, causing issues when data was moved or deleted.
Later, the Linear Tape File System (LTFS) introduced a method to access and present the contents of a tape as a mountable file system. While LTFS was a significant improvement over HSM, it still faced challenges, with latency being the most problematic. The overwhelming majority of existing applications use standard file system calls that expect data to be returned within a reasonably narrow timeframe; they are not designed to tolerate the longer access latencies imposed by the technical limitations of tape and often return timeout errors.
That’s not the case with S3: S3 client software can cope with high latencies (as the sketch below illustrates), and the S3 protocol offers a standard API. S3 adoption continues to grow, and the protocol is increasingly popular with organizations. This Sonar report examines object storage on tape, which seems poised to provide the cold storage capabilities enterprises are looking for.
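The sketch below tunes a boto3 client with generous timeouts and adaptive retries, the kind of configuration a high-latency, tape-backed S3 endpoint benefits from; the endpoint URL and specific values are illustrative assumptions.

```python
# Sketch: S3 clients tolerate high-latency backends by design. Here a
# boto3 client is configured with long timeouts and adaptive retries
# for a hypothetical object-storage-on-tape endpoint.
import boto3
from botocore.config import Config

tape_friendly = Config(
    connect_timeout=30,
    read_timeout=600,                              # allow long reads while a tape is mounted
    retries={"max_attempts": 10, "mode": "adaptive"},
)

s3 = boto3.client(
    "s3",
    endpoint_url="https://tape-archive.example.com",  # hypothetical endpoint
    config=tape_friendly,
)
```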
About the GigaOm Sonar Report
This GigaOm report focuses on emerging technologies and market segments. It helps organizations of all sizes understand a new technology, its strengths and weaknesses, and how it can fit into their overall IT strategy. The report is organized into five sections:
- Overview: An overview of the technology, its major benefits, and possible use cases, as well as an exploration of product implementations already available in the market.
- Considerations for Adoption: An analysis of the potential risks and benefits of introducing products based on this technology in an enterprise IT scenario. We look at table stakes and key differentiating features, as well as considerations for how to integrate the new product into the existing environment.
- GigaOm Sonar Chart: A graphical representation of the market and its most important players, focused on their value proposition and their roadmap for the future.
- Vendor Insights: A breakdown of each vendor’s offering in the sector, scored across key characteristics for enterprise adoption.
- Near-Term Roadmap: A 12- to 18-month forecast of the future development of the technology, its ecosystem, and the major players in this market segment.