Table of Contents
- Summary
- Overview
- Considerations for Adoption
- GigaOm Sonar
- Vendor Insights
- Near-Term Roadmap
- Analysts’ Take
- Report Methodology
- About GigaOm
- Copyright
1. Summary
The public cloud has a tremendous impact on IT agility and flexibility, but it can also lead to increasing costs and potential lock-in. Growing business needs often lead to the adoption of hybrid and multicloud solutions, making it increasingly difficult to consolidate data on a single public cloud or on-premises. Additionally, applications are becoming more mobile and their data should follow along, but this may not be possible because of data gravity and the time and cost of migration. Such situations create inefficiencies that can limit the ability to build effective multicloud strategies.
To maintain flexibility, minimize the risk of lock-in, and keep costs down, users are looking to build cost-effective data repositories with good performance and standard access interfaces that are accessible from everywhere. They also want to consolidate their data, which simplifies data management and many security aspects. In this regard, decentralized cloud storage has the potential to become an alternative to traditional cloud storage providers.
A decentralized storage system is based on a peer-to-peer (P2P) network, a type of architecture that found some success in the past for data distribution and file sharing. Instead of all the data being stored in a centralized system, it is chunked, distributed, and stored across many nodes on a local network or the internet. Figure 1 shows the difference between traditional storage and decentralized storage systems.
Figure 1. Traditional versus Decentralized Storage Systems
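To make the chunk-and-distribute model more concrete, the short Python sketch below splits a payload into fixed-size chunks and assigns each chunk to a peer by hashing its content. The node names, chunk size, and placement rule are illustrative assumptions, not the design of any particular product.

```python
import hashlib

CHUNK_SIZE = 8  # illustrative; real systems use much larger chunks plus erasure coding

def chunk(data: bytes, size: int = CHUNK_SIZE) -> list[bytes]:
    """Split a byte string into fixed-size chunks."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def place(chunks: list[bytes], nodes: list[str]) -> dict[str, list[str]]:
    """Assign each chunk to a node based on the hash of its content."""
    placement: dict[str, list[str]] = {node: [] for node in nodes}
    for c in chunks:
        digest = hashlib.sha256(c).hexdigest()
        placement[nodes[int(digest, 16) % len(nodes)]].append(digest)
    return placement

peers = ["node-a", "node-b", "node-c"]  # hypothetical peers on a local network or the internet
print(place(chunk(b"example payload to distribute across peers"), peers))
```

In practice, each chunk would also be replicated or erasure coded across several nodes so that the loss of any single peer does not make the data unavailable.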
Decentralized, or P2P, storage is not new, and we’ve seen attempts—largely unsuccessful ones—to build this kind of infrastructure over the years. However, the inherent risks are now mitigated by several important factors, such as the enormous amount of unused commodity resources across the internet, better security, and blockchain technology, which can ensure data immutability and consistency. It’s now possible to take advantage of this abundance of unused (and sometimes even unreliable) resources to build strong and secure storage infrastructures.
How We Got Here
Blockchain technology plays a fundamental role in decentralized storage as well as “Web3”—a vision for the third iteration of the world wide web (not to be confused with Web 3.0 as defined by Tim Berners-Lee). In practice, most implementations of Web3 revolve around the use of cryptocurrencies or non-fungible tokens (NFTs) to consume online services or assert ownership over data.
While some consider NFTs to be proof of ownership of a digital asset, there’s no legal basis to back the ownership claim. From a technical point of view, an NFT consists of two components: a smart contract listing the transaction (thereby establishing ownership) and a URL that provides a link to the digital asset. Because these links are usually public, the asset—even if owned—can be downloaded and copied almost without restriction.
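Viewed as data, the arrangement is quite simple. The sketch below is a hypothetical, simplified structure (not any specific token standard) that captures the two components and makes it clear why the linked asset remains freely copyable:

```python
from dataclasses import dataclass

@dataclass
class NFTRecord:
    """Simplified view of the two components of an NFT."""
    contract_address: str  # smart contract recording the ownership transaction
    token_id: int          # entry for this token within that contract
    asset_url: str         # public link to the digital asset itself

token = NFTRecord(
    contract_address="0x0000000000000000000000000000000000000000",  # placeholder address
    token_id=42,
    asset_url="https://example.com/artwork.png",  # anyone with this URL can download the file
)
```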
Enterprises view Web3 and NFTs with mistrust. The hype around the use of NFTs as speculative assets, the scams, and the concerns around design shortcomings—namely, the lack of security measures such as role-based access control or a truly enforceable ownership claim—show that Web3 remains an immature technology, and so it’s too early for mainstream adoption.
Decentralized storage is also based on blockchain technologies, but it doesn’t have the design shortcomings and other issues associated with NFTs. Not only can blockchain be used to create an internal currency that simplifies the trading of raw storage resources and data services, it can also help improve P2P network security, information integrity, and data reliability.
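As a minimal illustration of how hash-based references support information integrity, the sketch below records each chunk under its content hash and treats a chunk as valid only if its content still matches that hash. A real system would record the references on a blockchain and replicate the chunks across nodes; here a plain dictionary stands in for both, purely as an assumption for the example.

```python
import hashlib

ledger: dict[str, bytes] = {}  # stand-in for an on-chain index plus node storage

def store(chunk: bytes) -> str:
    """Record a chunk under its content hash; the hash acts as a tamper-evident reference."""
    digest = hashlib.sha256(chunk).hexdigest()
    ledger[digest] = chunk
    return digest

def verify(digest: str) -> bool:
    """A chunk is valid only if its content still hashes to the recorded reference."""
    return hashlib.sha256(ledger[digest]).hexdigest() == digest

ref = store(b"chunk contents")
assert verify(ref)                   # untampered data verifies
ledger[ref] = b"tampered contents"   # simulate a misbehaving node
assert not verify(ref)               # the change is detected
```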
Decentralized storage and P2P networks are not common in the traditional enterprise and they are usually viewed with much skepticism due to the overall complexity, potential risks, and other challenges. That said, the latest generation of solutions based on this technology can hide most of the complexity while providing a user experience similar to that of traditional public cloud storage, with better cost models and security. And thanks to the cloud and the rise of microservices, enterprises are more familiar with highly distributed applications and more open to evaluating decentralized infrastructure solutions.
Even though the basic concepts underpinning decentralized storage technology are similar across solutions, implementations differ greatly. There are several players already focused on different use cases and market segments, with many of them targeting developers and enterprises looking for relatively inexpensive and highly secure storage.
About the GigaOm Sonar Report
This GigaOm report focuses on emerging technologies and market segments. It helps organizations of all sizes understand a new technology, its strengths and weaknesses, and how it can fit into their overall IT strategy. The report is organized into five sections:
- Overview: An overview of the technology, its major benefits, and possible use cases, as well as an exploration of product implementations already available in the market.
- Considerations for Adoption: An analysis of the potential risks and benefits of introducing products based on this technology in an enterprise IT scenario. We look at table stakes and key differentiating features, as well as considerations for how to integrate the new product into the existing environment.
- GigaOm Sonar Chart: A graphical representation of the market and its most important players, focused on their value proposition and their roadmap for the future.
- Vendor Insights: A breakdown of each vendor’s offering in the sector, scored across key characteristics for enterprise adoption.
- Near-Term Roadmap: A 12- to 18-month forecast of the future development of the technology, its ecosystem, and the major players in this market segment.