John Webster, Author at Gigaom

Unifying petabyte-scale unstructured data to enhance enterprise data value
https://gigaom.com/report/unifying-petabyte-scale-unstructured-data-to-enhance-enterprise-data-value/
Tue, 08 Jul 2014

Implementing a contiguous, enterprise-wide data storage architecture can break down the barriers of data silos and relieve applications of their burden.

The post Unifying petabyte-scale unstructured data to enhance enterprise data value appeared first on Gigaom.

Enterprise IT is well aware of the connection between data storage and information. Yet fragmented data storage continues to hamper the delivery of information to business users: enterprise IT still manages storage, and therefore information, in disparate silos. These silos are barriers to many new enterprise initiatives, such as establishing IT as a service, delivering applications to business users on mobile devices, and running analytics applications that leverage all enterprise data, present and past. They also hamper any clear understanding of data value and ownership.

The connection between data silos and the information silos they produce has yet to be made at the application level. Application developers and managers are well aware of information silos but have tried to integrate them with middleware, adding expense and complexity to the application environment. Now there is an alternative that operates at the data layer rather than the application layer: a contiguous, enterprise-wide data storage architecture that breaks down the barriers and relieves applications of this burden. This data-centric approach is an emerging storage market category, and Tarmin has been an early player here, promoting an approach it calls data-defined storage.

A data-centric approach to storage has implications for multiple levels of IT administration, from the CIO to application managers and storage administrators. It gives the newly emerging chief data officer (CDO), as well as IT administrators, a powerful tool for implementing an enterprise-wide data-management strategy based on an understanding of the true value of data, the consistent application of data-governance policy, and the ability to leverage all enterprise data for analytics applications. Implement a data-centric approach to:

  • Understand the true value of data
  • Achieve storage management sustainability in an accelerating data-growth environment
  • Consistently enforce data-governance policies and processes
  • Search and retrieve data on an enterprise-wide basis
  • Harness all enterprise unstructured data for big data analytics
  • Position the enterprise to respond quickly to new business opportunities
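The core idea behind several of these points, one metadata index spanning every storage silo so that applications can search and retrieve data without knowing where it lives, can be sketched in a few lines. This is a minimal illustration of the concept, not Tarmin's product or API; all names (`Silo`, `UnifiedNamespace`) and the sample data are hypothetical.

```python
# Minimal sketch of a data-centric layer: one metadata index built over
# several storage silos, enabling enterprise-wide search and retrieval.
# All class names and sample data are illustrative, not a real product API.
from dataclasses import dataclass, field

@dataclass
class Silo:
    name: str
    objects: dict = field(default_factory=dict)  # object_id -> (metadata, payload)

@dataclass
class UnifiedNamespace:
    silos: list = field(default_factory=list)
    index: dict = field(default_factory=dict)    # object_id -> (silo, metadata)

    def ingest(self):
        # Crawl every silo once and build a single global metadata index.
        for silo in self.silos:
            for oid, (meta, _) in silo.objects.items():
                self.index[oid] = (silo, meta)

    def search(self, **criteria):
        # Enterprise-wide search: match on metadata regardless of silo.
        return [oid for oid, (_, meta) in self.index.items()
                if all(meta.get(k) == v for k, v in criteria.items())]

    def retrieve(self, oid):
        # Retrieval is routed through the index, not a silo-specific path.
        silo, _ = self.index[oid]
        return silo.objects[oid][1]

nas = Silo("nas", {"doc1": ({"owner": "finance", "type": "report"}, b"q1 numbers")})
obj = Silo("object-store", {"img7": ({"owner": "marketing", "type": "image"}, b"...")})
ns = UnifiedNamespace([nas, obj])
ns.ingest()
print(ns.search(owner="finance"))  # object IDs found across all silos
print(ns.retrieve("doc1"))
```

The point of the sketch is where the integration happens: applications query one namespace, and the silo boundary disappears beneath the data layer instead of being patched over with per-application middleware.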

How Hadoop Passes an IT Audit
https://gigaom.com/report/how-hadoop-passes-an-it-audit/
Thu, 16 Jan 2014

Enterprise IT will become more directly involved with managing and supporting Hadoop, a process that is by no means a given.

The post How Hadoop Passes an IT Audit appeared first on Gigaom.

Hadoop was not originally created for the enterprise; it was built for internet data center environments such as those of Google, Yahoo, Facebook, and Twitter, which are structured, supported, and managed very differently from enterprise IT. As a result, Hadoop currently lacks many of the functions and internal processes that enterprise IT needs in terms of security, availability, data integrity, and data governance.

Hadoop has unquestionably taken hold and is flourishing in several enterprise industry segments, such as financial services, health care, pharmaceuticals, and energy. Most deployments, however, are departmental, with centralized IT involved mainly to provide and integrate infrastructure (servers with embedded storage, networking gear, and so on). In addition, these grassroots Hadoop projects are still mainly secondary efforts and are not yet considered critical, production-level IT.

Hadoop must mature further to be regarded as a viable enterprise platform capable of supporting critical business functions running real-time applications. As Hadoop matures, so will its criticality within the organizations now learning its ins and outs, and enterprise IT will become more directly involved with managing and supporting it, a process that is by no means a given. In essence, Hadoop has to follow the rules of centralized IT: the platform will be subject to production data center security levels, management processes, data protection and data integrity guarantees, data governance policies, and, above all, service-level agreements (SLAs).
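The security gap mentioned here is concrete: out of the box, Hadoop authentication defaults to "simple" mode, which trusts whatever username a client presents. The standard first step toward production-grade security is enabling Kerberos in `core-site.xml`. The property names below are real Hadoop configuration keys; everything else about a secure deployment (keytabs, Kerberos principals, per-service settings) is omitted, so treat this as a minimal sketch rather than a working configuration.

```xml
<!-- core-site.xml fragment: minimal sketch only. A production rollout
     also requires a KDC, keytabs, principal mappings, and per-service
     configuration for HDFS, YARN, and so on. -->
<configuration>
  <property>
    <name>hadoop.security.authentication</name>
    <value>kerberos</value>  <!-- default is "simple": no real authentication -->
  </property>
  <property>
    <name>hadoop.security.authorization</name>
    <value>true</value>      <!-- enable service-level authorization checks -->
  </property>
</configuration>
```

Settings like these are exactly what an IT auditor would look for when deciding whether a Hadoop cluster meets production data center security requirements.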

This report will:

  • Place Hadoop in the context of enterprise IT and help those managing Hadoop platforms make it responsive to the enterprise’s data governance policies and processes
  • Outline these policies using the industry segments and data sources mentioned above
  • Describe ways in which Hadoop can be made responsive to the enterprise’s IT infrastructure, security, auditing, and compliance stakeholders
  • Make the point that by addressing these concerns, Hadoop can advance to full production status, including support for real-time applications

Thumbnail image courtesy of Thinkstock
