Voices in Cloud – Episode 14: A Conversation with Dana Gardner


Host David Linthicum speaks with Dana Gardner about his long career and the evolution of cloud technology: past, present, and future.

Guest

Dana Gardner is president and principal analyst at Interarbor Solutions, an enterprise IT analysis, market research, and consulting firm. Gardner, a leading identifier of software and cloud productivity trends and new IT business growth opportunities, honed his skills and refined his insights as an industry analyst, pundit, and news editor covering the emerging software development and enterprise IT infrastructure arenas for the last 26 years. Gardner tracks and analyzes a critical set of enterprise software technologies and business development issues: Cloud computing, hybrid IT, AI, RPA, next-generation data centers, and IT operations automation. Gardner is a former senior analyst at Yankee Group and Aberdeen Group, and a former editor-at-large at InfoWorld.com. Follow him on Twitter at @Dana_Gardner.

Transcript

David Linthicum: Hey, guys. Welcome to the GigaOm Voices in Cloud podcast. This is your one place where you will hear from industry thought leaders providing no-nonsense advice on how to succeed with cloud computing, IoT, edge computing, and cognitive computing. I’m Dave Linthicum, best-selling author and speaker, executive, and B-list geek. With me today is my special guest, Dana Gardner. Dana, how are you doing today?

Dana Gardner: I’m doing great, Dave. Good to be with you again.

“After 30 years of tracking IT advances and disruption – and 35 years of interviewing some of the smartest people in their fields – the combination means the creation and exposure of compelling digital content on what works best as businesses struggle with transformation. That rich experience also generates process innovation and efficiency to produce a continuous stream of narratives (I love this copy, by the way) – now at 850 plus over 14 years on how-to success stories...” (directly on podcasts, which I have actually been on) “– quickly and affordably.” You wrote that yourself?

No, but I like it.

I clipped this directly out of your LinkedIn page. I thought that was great copy.

(Laughter) I probably wrote it ten years ago, so it slipped my mind.

Anyway, Dana, kind of catch us up. I’ve got a new audience here on GigaOm podcast. Tell us what you’re all about, what you’re working on, and what’s a typical week in the life of Dana Gardner?

Yeah, thanks, Dave. Well, a typical week is trying to keep my head above water. It’s been really busy, but I go to a lot of events. I talk to a lot of users. I talk to a lot of executives. I put together podcasts that summarize that. My life is a lot about interviewing, gathering information, synthesizing it, and then coming up with some narratives, making it easy for people to listen to and consume, whether they like podcasts as the medium or written blogs or PDFs, what have you. It’s really the same kind of task I’ve been doing for a long time: learning, synthesizing, and then trying to make the results of that digestible and interesting for as many people as possible.

What do you think is a better medium these days? Are people into the written word, video, or audio?

You know, it really varies. I even change myself. There are times where I prefer to listen or times I like to read. I think you’ve got to have it all. I think it’s really a continuum.

However you intercept that continuum in a specific context, whether it has to do with the interface or the time of day or the place you’re at, I think we’ve got to give people full multimedia and let them take it in however suits them in the moment. It’s kind of interesting because it’s a parallel to where we are in the cloud continuum with how IT intercepts data and services and deployment of resources. The person is almost mimicking the whole technology universe.

It’s interesting. I know the way I consume information varies over time. It’s more video for me now, but typically it’s a mixture of audio and the written word, things like that. I think as people move on, it will become a little more automatic, and we’ll probably gain the ability to consume the same content in different ways. I think that’s going to be a trend going forward.

You wrote a story about how HCI, hyper-converged infrastructure, forms a simple foundation for hybrid cloud, edge, and composable infrastructure. I thought this was going to be an interesting story for our audience because many of them who are in the cloud don’t understand what hyper-converged infrastructure is, how it relates to cloud computing, how things work together, and the advantages of it. Can you enlighten us there?

A cloud audience should get it because it’s really about ‘software-defined’ as a way of setting up resources, and about how those resources have been integrated as an appliance using software-defined capabilities. Just as you would access a cloud service through an API, perhaps, or use microservices to create and make services of your own, the hyper-converged world has taken the server, the network, and the storage and, instead of making you work with them individually and be the glue, the smarts, and the configuration guru among and between them, automated a lot of that. You’re dealing with those deployments and creating your apps and services as you would in a services environment or a software-driven, software-defined environment. There’s some commonality, and that’s why I wanted to have that discussion.
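To make that ‘software-defined’ parallel concrete, here is a minimal, purely illustrative Python sketch. None of the names or fields map to any real vendor’s API; they simply show the idea of declaring compute, storage, and network once and letting an automation layer do the glue work a person used to do by hand.

```python
# Hypothetical sketch of a software-defined, HCI-style request: one declarative
# spec instead of separate server, storage, and network configuration steps.
from dataclasses import dataclass

@dataclass
class HciSpec:
    vcpus: int            # compute
    memory_gb: int
    storage_tb: float     # software-defined storage pool
    network_vlan: int     # software-defined networking
    replicas: int = 2     # redundancy handled by the platform, not the admin

def provision(spec: HciSpec) -> dict:
    """Pretend control plane: reconciles the declared spec into the
    coordinated compute/storage/network work formerly done manually."""
    return {
        "compute": f"{spec.vcpus} vCPU / {spec.memory_gb} GB allocated",
        "storage": f"{spec.storage_tb} TB mirrored across {spec.replicas} nodes",
        "network": f"attached to VLAN {spec.network_vlan}",
    }

if __name__ == "__main__":
    print(provision(HciSpec(vcpus=8, memory_gb=64, storage_tb=2.0, network_vlan=120)))
```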

I think there’s a lot that they can benefit from each other. The cloud benefits from hyper-converged and edge as well. Then the people who are good at hyper-converged should be able to consume and use cloud all the better. We’re looking, again, at a continuum. It’s not either/or. It’s all of the above.

Where did HCI come from? Where did it evolve from?

I think it started in storage. There was complexity with how people were dealing with storage and arrays, and some automation, intelligence, and standardization were brought to that. We saw convergence around different storage environments so that you could basically just declare your storage preferences and it would be automated as to how that got played out. Then that got extended to the server and to the network. We’ve had a software-defined evolution across those three domains, and then you combine them into a converged and ultimately a hyper-converged offering.

Going forward, how should cloud architects think about HCI? I think you’re right: many of the same patterns applied in cloud computing are there within HCI; they’re derived from the same ideas. These kinds of architectures work and play well together, so they’re not necessarily mutually exclusive. What are the typical use cases for this technology?

HCI had kind of a weird evolution to get started. It became very popular for VDI, or virtual desktop infrastructure, uses. A lot of that had to do with the fact that you were trying to manage some very daunting requirements for storage and compute, and you were doing it all in a virtualized environment, such that HCI was just the right thing at the right time. VDI sort of drove the point of the arrow into the market for HCI.

I think private cloud is really its strong suit. It makes a lot of sense. Again, as you pointed out, the architectural and deployment skills and strategies align between it and cloud. It’s been a very rapidly growing market over the last year, year and a half. Dell has done very well. Nutanix has done quite well. There are several other players, but those are the two biggies.

What about utilization in edge computing?

As we look to bringing essentially the equivalent of a standard data center to the edge, we’re creating micro and sometimes ruggedized data centers. It makes sense to me that you would want to have a hyper-converged environment there. If you’re having trouble setting up a data center on premises, where you have proximity and control, imagine the added complexity of bringing that out to the edge in a factory or a warehouse, or even further into devices themselves, like a 5G transmitter on a telephone pole somewhere. Hyper-converged also makes a great deal of sense to me at the edge.

Now if we have commonality between what we’re doing at the edge, what we’re doing in a hybrid or a private cloud, and also what we’re doing in the public or multi-cloud, suddenly we’re starting to see a commonality the likes of which I don’t think we’ve ever seen before in IT. That’s not only a benefit for skill sets; it really allows for a commonality of services, apps, and microservices. I think it’s a very interesting environment where we can start to see commonality of apps and services from edge to core to cloud, which can be quite powerful.

You talked a lot about leaders in the HCI space and some of the interviews you have up there on BriefingsDirect. I urge everybody listening to this podcast to go check those out. Where do you see this technology going? How are you aggregating the opinions of all these various personalities in the HCI market as to where things are going? What’s truly going to be the case going forward? What are they predicting, and how do you feel about it?

We’re probably going to stop talking about HCI and hyper-convergence, and maybe even hybrid cloud, pretty soon. How it’s deployed won’t really be a factor. We’ll be focusing more, as we should be, on the applications, the processes, the services, and making intelligence part and parcel of that: being able to do data ops and have data available across that continuum from the edge to the cloud, and then factor in whether analytics should take place at the edge, in a cloud, or in a data center, depending on things like latency and access to data sets, as well as cost, security, and data sovereignty issues. Having that continuum and factoring in the use of analytics along the way, that to me is where the new sort of innovation is and where we should be focusing.
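As a rough illustration of the placement decision Dana describes, here is a toy sketch; the sites, numbers, and thresholds are invented, but it shows how latency, cost, and data-sovereignty constraints might be weighed to decide where an analytics workload runs.

```python
# Toy placement model: pick the cheapest location that still meets the
# workload's latency and sovereignty requirements. All values are made up.
SITES = {
    "edge":         {"latency_ms": 5,  "cost_per_hr": 0.90, "sovereign": True},
    "data_center":  {"latency_ms": 25, "cost_per_hr": 0.40, "sovereign": True},
    "public_cloud": {"latency_ms": 80, "cost_per_hr": 0.20, "sovereign": False},
}

def place_workload(max_latency_ms: int, needs_sovereignty: bool) -> str:
    """Return the cheapest site satisfying the latency and sovereignty needs."""
    candidates = [
        (site, attrs["cost_per_hr"])
        for site, attrs in SITES.items()
        if attrs["latency_ms"] <= max_latency_ms
        and (attrs["sovereign"] or not needs_sovereignty)
    ]
    if not candidates:
        raise ValueError("no site meets the constraints")
    return min(candidates, key=lambda c: c[1])[0]

# A factory-floor anomaly detector that must respond within 10 ms stays local.
print(place_workload(max_latency_ms=10, needs_sovereignty=True))    # -> "edge"
# A relaxed nightly batch job lands on the cheapest option.
print(place_workload(max_latency_ms=500, needs_sovereignty=False))  # -> "public_cloud"
```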

We’re going to be able to start doing things with business process and optimization in mind and not be thinking about whether it’s crunching at the core or crunching at the edge, whether it’s hyper-converged or bare metal or standard legacy UNIX deployments. It won’t be that visible, or people won’t even care. Once again we’re abstracting above the plumbing, higher and higher, as we go through the 30- or 40-year progression of IT. The good news is that a lot of what people wanted to do 20 years ago, write once, run anywhere, if you will, is starting to come true. It’s not even going to be about worrying about an application. It’s going to be focused more on accomplishing business, making the best of what people do as people and what machines do as machines, and then letting the people who are on the procurement side, the financial people, decide whether it should run in a public cloud, a private cloud, or at the edge.

It’s not really a technical decision any longer. It’s about: what’s the best cost? What’s the best optimization? What’s the best price-performance equation? Let that algorithm decide, step back, and focus on your business benefits.

Boy, I hope so. That will make my job as an architect much easier. Ultimately, the ability to abstract yourself away from all this complexity that’s coming down the line, and it’s getting worse and worse, is going to be the core challenge.

The other article you wrote, and I really enjoyed this, is How Real-Time Data Streaming and Integration Set the Stage for AI-Driven DataOps; this is an interview, of course, but I thought the topic and the subject matter are spot on. We do have this AI-driven data operations notion that has really emerged in the last couple of years. AI ops people are basically binding these things together.

It seems to me kind of a match made in heaven, since we are talking about dealing with complexity and we have data streaming, data integration, the ability to have common metadata, virtualized databases, things like that. Things are getting so intense and complex out there in dealing with information. We need these learning operational layers in order to improve things in real time, so they can be smarter than we can be. We can’t monitor these things all the time. They can make adjustments for how they need to operate the data effectively. Am I off base?

No, not at all. What’s interesting, though, is that there are vertical industries that get this and that are applying analytics in ways that replace the repeatable, drudge-type affairs that people do. Sometimes that’s in call centers. Sometimes it’s in finance or procurement. What’s odd is that we’re replacing complexity and redundancy in business settings, and the IT people, who have the same issues, are somewhat reluctant.

It’s the case of the cobbler’s children not having any shoes. IT is providing the tools for data scientists and for people who are building bots using robotic process automation. The IT people seem to be stuck still using spreadsheets, and sometimes little yellow sticky notes, to manage the complexity of their IT operations. Increasingly, as you say, the complexity across a continuum of public cloud, private cloud, and multi-public cloud, also factoring in legacy, edge, analytics, and data warehouses, can’t be something that [these] roller-chair network administrator folks can handle, and they shouldn’t have to. Increasingly, they should be looking to write the scripts and the bots that will do these configurations, look for issues, and even start predicting where breakdowns will take place; to optimize and automate and let the machines optimize the machines so that, again, we elevate and abstract what the people do away from maintenance and firefighting and into innovation and business process optimization.
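As a sketch of the scripts-and-bots shift Dana describes, here is a deliberately naive example; the telemetry, hosts, and threshold are made up, and a real AIOps pipeline would pull from actual monitoring systems rather than hard-coded readings.

```python
# Naive 'predict the breakdown before it happens' sketch: flag hosts whose
# recent temperature readings are trending well above their own baseline.
from statistics import mean

def failure_risk(temps_c: list, window: int = 5) -> str:
    """Compare the recent average against the earlier baseline readings."""
    baseline = mean(temps_c[:-window])
    recent = mean(temps_c[-window:])
    if recent > baseline * 1.15:
        return "rising - schedule proactive maintenance"
    return "stable"

# Fabricated fleet telemetry for illustration only.
fleet = {
    "node-01": [61, 62, 60, 61, 63, 64, 70, 73, 75, 78],  # drifting hot
    "node-02": [55, 56, 55, 54, 56, 55, 55, 56, 55, 56],  # healthy
}

for host, readings in fleet.items():
    print(host, "->", failure_risk(readings))
```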

AI ops to me is fascinating. The tools are there. The technologies are there. I’ve talked to lots of people who are building these analytics applications and engines and capabilities, but I’ve yet to see any vendor or any supplier step up and say, “We’re going to be the one name brand that you’re going to recognize for bringing AI and analytics to the IT environment and conquering the complexity issues.” It still seems, for some reason, to be something that people don’t want to just bite off and own.

“People” being the enterprises consuming the technology or the technology vendors providing the technology?

The vendors. A lot of the vendors apparently resist being associated with an IT AI product set. Do you know of any?

No, I don’t. There are a few that are thinking about it. Really, if you think about artificially intelligent systems that are bound into analytical systems and predictive analytical systems, things like that, I don’t think we’ve been able to scratch the surface, because we just don’t have the skill sets out there. You don’t have people who have knowledge of engineering, deep skills in business analytics, deep skills in data analytics, and the ability to bind these things together. We may have silos of people and scientists out there.

When you deal with the vendors, you’re typically not dealing with deep academic scientists. You’re dealing with people who are able to collaborate and build things that they can go out and sell in a couple of months, or six or seven months down the line. This was always the big dilemma in my world: being out on the bleeding edge, getting too far ahead of it, and not necessarily having the impact in the market.

The other thing was finding the people who were able to actually do it. You can’t do everything yourself. The ideas are leading the actual execution. Is that the best way to put it? What are you seeing?

I think there are cultural issues. The skills thing is big too. Maybe we need to find some data scientists who want to do a summer abroad; instead of going overseas, they could actually move into the IT department. I think if you brought in some data scientists and stuck them in the IT department and left them there for three months and told them they couldn’t leave, they would probably find lots of cool, interesting things to do with all the data that’s flowing out of these machines, all of these servers and virtual environments and distributed data centers, whether it’s HCI, on the edge, or what have you. If only they could pool that, and we have lots of great vendors out there helping to make that data available. It’s just that no one, seemingly, is plugging it into bleeding-edge analytics capabilities and then starting to automate from that to run IT better than people can, basically.

Again, from your lips to God’s ears. We’re getting to the point where we actually need this technology to be in place just because of the complexity of the systems that are showing up. We’re layering on multi-cloud. We’re layering on these complex hybrid cloud infrastructures and the ability to leverage some of these cloud peripherals, such as Outposts and Stack, which exist on premises and are kind of analogs for the cloud. Getting into this architectural complexity, and the data complexity on top of it, and the ability, in essence, to make sense of that in some meaningful way and continuously improve it over time, is just something where I don’t see a fire in the belly within most of the enterprises, and even some of the vendors out there.

Maybe it’s the fact that we are thinking a little too far ahead. People aren’t necessarily ready to take their data to that level. I found back in the integration space that they’re even unwilling to integrate the data and get the stuff together. What do you think is going to get them off the dime? Competitive forces? People following the hype, ‘management by magazine’ kinds of things?

If you have the philosophy that you don’t want all of your IT to go to a public cloud, you’d better think about the fact that the hyperscale public cloud people are putting tools like this to work. They’re not productizing them, of course. They’re not bringing them out to the market. But you can bet your sweet bippy that AWS has got some AI going on behind the scenes to help it run its data centers to the optimal degree. Every nickel and dime they save in terms of efficiency, every core that doesn’t have to go down and be replaced, or that can be replaced before it goes down, is money in the bank. That margin is their business.

If the hyperscalers are creating internal means to do AI ops and they are nowhere near, or interested in, bringing it out to the market, they’re just going to be able to beat the pants off of people who are trying to run fast data centers in a private setting and do it at a price point that is acceptable, or even at a complexity level that’s manageable. I think, perhaps, the ultimate push in the market will be: if you’re thinking that you’re in some ways competing with or resisting the public cloud providers, that you don’t want them to make you an offer you can’t refuse, then you’d better start doing AI ops. I’m pretty sure they are.

I couldn’t agree with that more. That’s absolutely what people need to be thinking about right now. Folks out there, if you’re looking to become excellent at dealing with the operational aspects of data, governance, compliance; the list goes on in terms of the things we’re dealing with right now. Typically the pressure lands on operations as things get more complex. Teams are finding they are being set up for failure just because they can’t manage the complexity with the budgets and the resources that they have. This is a place you can look.

Obviously, this is going to be an issue that’s going to hit you in 2021 or 2022, but you should be solving the problem now. That’s how long it takes to get the approaches and the tooling in place to make things happen.

Please pick up a copy of my book, Cloud Computing and SOA Convergence, available on Amazon and other places books are sold. Also make sure to follow me on Twitter @DavidLinthicum, as well as on LinkedIn, where I have several cloud computing courses on LinkedIn Learning, including architecture courses that were just released. Dana, where can we find your stuff on the web?

Yes, indeed. You can find me on Twitter @Dana_Gardner. I’m on LinkedIn. I’m publishing constantly there; BriefingsDirect.com for podcasts and BriefingsDirect blog for the written version, if you’re a reader.

Make sure to follow Dana. He’s a prolific podcaster and blogger, and he’s got probably one of the smartest streams of information in the business right now. He talks to a lot of interesting people who have a lot of interesting things to say. I think right now it’s at the point where you need to listen to lots of different opinions on how this stuff is going to evolve. So until next time, best of luck in building cloud computing solutions. We’ll talk to you next week. Take care. Bye.
