Data Management: The Next Generation
What the data management world needs now is a next-generation, integrated and simplified approach for fast backup and recovery that spans all essential corporate data. The solution means bridging legacy and new data, scaling to handle big data, implementing automation and governance, and integrating the functions of backup protection and disaster recovery.
Businesses clearly need a better approach to their data recovery capabilities -- across both their physical and virtualized environments. The current landscape for data management, backup and disaster recovery (DR) too often ignores the transition from physical to virtualized environments and sidesteps the heightened real-time role that data now plays in the enterprise.
What's more, major trends like virtualization, big data and calls for comprehensive and automated data management are also driving this call for change.
What's needed is a next-generation, integrated and simplified approach: fast backup and recovery that spans all essential corporate data. The solution therefore means bridging legacy and new data, scaling to handle big data, implementing automation and governance, and integrating the functions of backup protection and DR.
To share insights into why data recovery needs a new approach and how that can be accomplished, the next BriefingsDirect discussion joins two experts, John Maxwell, vice president of product management for data protection at Quest Software, and Jerome Wendt, president and lead analyst of DCIG, an independent storage analyst and consulting firm. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.
Listen to the podcast (39:55).
Here are some excerpts:
Dana Gardner: Is data really a different thing than, say, five years ago in terms of how companies view it and value it?
Jerome Wendt: Absolutely. There's no doubt that companies are viewing it much more holistically. It used to be that all the focus was on data in structured databases, or in semi-structured formats such as email. Clearly, in the last few years, we've seen a huge change, where unstructured data is now the fastest growing part of most enterprises and where even a lot of their intellectual property is stored. So I think there is a huge push to protect and mine that data.
But we're also just seeing more of a push to get to edge devices. We talk a lot about PCs and laptops, and there is more of a push to protect data in that area, but all you have to do is look around and see the growth.
When you go to any tech conference, you see iPads everywhere, and people are storing more data in the cloud. That's going to have an impact on how people and organizations manage their data and what they do with it going forward.
Gardner: Now, for more and more companies, data is the business, or at least the analytics that they derive from it.
John Maxwell: It's funny that you mention that, because I've been in the storage business for over 15 years. I remember just 10 years ago, when studies would ask people what percentage of their data was mission critical, it was maybe around 10 percent. That aligns with what you're talking about, the shift and the importance of data.
Recent surveys from multiple analyst groups have now shown that people categorize their mission-critical data at 50 percent. That's pretty profound, in that a company is saying half the data that we have, we can't live without, and if we did lose it, we need it back in less than an hour, or maybe in minutes or seconds.
Gardner: So how is the shift and the change in infrastructure impacting this simultaneous need for access and criticality?
Maxwell: Well, the biggest change from an infrastructure standpoint has been the impact of virtualization. This year, well over 50 percent of all the server images in the world are virtualized images, which is just phenomenal.
Quest has really been at the forefront of this shift in infrastructure. We have been, for example, backing up virtual machines (VMs) for seven years with our Quest vRanger product. We've seen that evolve from when VMs or virtual infrastructure were used mostly for test and development. Today, I've seen studies showing that virtualized shops are running SQL Server, Microsoft Exchange, and other very mission-critical apps.
We have some customers at Quest that are 100 percent virtualized. These are large organizations, not just some mom and pop company. That shift to virtualization has really made companies assess how they manage it, what tools they use, and their approaches. Virtualization has a large impact on storage and how you backup, protect and restore data.
Once you implement and have the proper tools in place, your virtual life is going to be a lot easier than your physical one from an IT infrastructure perspective. A lot of people initially moved to virtualization as a cost savings, because they had under-utilization of hardware. But one of the benefits of virtualization is the freedom, the dynamics. You can create a new VM in seconds. But then, of course, that creates things like VM sprawl, the amount of data continues to grow, and the like.
At Quest we've adapted and exploited a lot of the features that exist in virtual environments, but don't exist in physical environments. It's actually easier to protect and recover virtual environments than it is physical, if you have tools that are exploiting the APIs and the infrastructure that exists in that virtual environment.
Wendt: We talk a lot these days about having different silos of data. One application creates data that stays over here. Then, it's backed up separately. Then, another application or another group creates data back over here.
Virtualization not only means consolidation and cost savings, but it also facilitates a more holistic view into the environment and how data is managed. Organizations are finally able to get their arms around the data that they have.
Before, it was so distributed that they didn't really have a good sense of where it resided or how to even make sense of it. With virtualization, there are initial cost benefits that help bring it all together, and once it's all together, they're able to go to the next stage, and data becomes the business enabler at that point.
Gardner: The key now is to be able to manage, automate, and bring the comprehensive control and governance to this equation, not just the virtualized workloads, but also of course the data that they're creating and bringing back into business processes.
How do we move from sprawl to control and make this flip from being a complexity issue to a virtuous adoption and benefits issue?
Maxwell: Over the years, people had very manual processes. For example, when you brought a new application online or added hardware, a server, or that type of thing, you asked, "Oops, did we back it up? Are we backing that up?"
One thing that's interesting in a virtual environment is that the backup software we have at Quest will automatically see when a new VM is created and start backing it up. So it doesn't matter if you have 20 or 200 or 2,000 VMs. We're going to make sure they're protected.
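The auto-protection behavior described here amounts to reconciling the hypervisor's current VM inventory against the set of VMs already covered by a backup job. The sketch below is purely illustrative, not vRanger's actual implementation; `list_vms()` and `schedule_backup()` are hypothetical stand-ins for real hypervisor and backup APIs.

```python
# Illustrative sketch of backup auto-discovery: diff the current VM
# inventory against the set already protected, and schedule backups
# for any newcomers. list_vms() stands in for a real hypervisor
# inventory query (e.g. via the vSphere API); it is not a real API call.

protected = {"web-01", "db-01"}  # VMs already covered by a backup job


def list_vms():
    # Hypothetical inventory query; "app-02" was just created by an admin.
    return ["web-01", "db-01", "app-02"]


def schedule_backup(vm_name):
    # Hypothetical hook into the backup scheduler.
    print(f"scheduling backup for {vm_name}")
    protected.add(vm_name)


def reconcile():
    """Protect every VM that exists but is not yet backed up."""
    new_vms = [vm for vm in list_vms() if vm not in protected]
    for vm in new_vms:
        schedule_backup(vm)
    return new_vms


reconcile()  # picks up "app-02" automatically
```

Run periodically (or driven by inventory-change events), this loop gives the "it doesn't matter if you have 20 or 2,000 VMs" property: coverage follows the inventory rather than a manually maintained list.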
Where it really gets interesting is that you can protect the data a lot smarter than you can in a physical environment. I'll give you an example.
In a VMware environment, there are services that we can use to do a snapshot backup of a VM. In essence, it's an immediate backup of all the data associated with that machine or those machines. It could be on any generic kind of hardware. You don't need to have proprietary hardware or more expensive software features of high-end disk arrays. That is a feature that we can exploit built within the hypervisor itself.
Even the way that we move data is much more efficient, because we have a process that we pioneered at Quest called "backup once, restore many," where we create what's called an image backup. From that image backup I can restore an entire system, an individual file, or an application. And I've done all of that from that one pass, that one very effective snapshot-based backup.
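The "backup once, restore many" idea can be sketched as follows: a single image-level capture serves as the source for full-system, single-file, and application-level restores. This is an assumption-laden toy model, not Quest's implementation; the file paths and the dict-as-image representation are invented for illustration.

```python
# Toy model of "backup once, restore many": one snapshot-based image
# capture of a VM's files, from which three kinds of restore are served.
# The image and paths are hypothetical illustrations.

image_backup = {
    "/etc/app.conf": b"port=8080",
    "/var/db/data.db": b"\x00\x01rows",
    "/home/user/report.txt": b"Q3 numbers",
}


def restore_full_system(image):
    # Full-system restore: replay every file from the one image.
    return dict(image)


def restore_file(image, path):
    # Granular restore of an individual file from the same image.
    return image[path]


def restore_application(image, prefix):
    # Application-level restore: everything under the app's paths.
    return {p: d for p, d in image.items() if p.startswith(prefix)}


full = restore_full_system(image_backup)
one_file = restore_file(image_backup, "/etc/app.conf")
app = restore_application(image_backup, "/var/db/")
```

The point of the design is that the expensive operation (capturing the image) happens once, while each restore granularity is just a different read pattern over the same artifact.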
If you look at physical environments, there is the concept of doing physical machine backups, file-level backups, and specific application backups, and for some systems you even have to employ hardware-based snapshots or actually bring the applications down.
So from that perspective, we've gotten much more sophisticated in virtual environments. Again, we're moving data by not impacting the applications themselves and not impacting the VMs. The way we move data is very fast and is very effective.
Wendt: One of the things we are really seeing is just a lot more intelligence going into this backup software. These products are moving well beyond just "doing backups." There's much more awareness of what data is included in these data repositories and how they're searched.
And also with more integration with platforms like VMware vSphere Operations, administrators can centrally manage backups, monitor backup jobs, and do recoveries. One person can do so much more than they could even a few years ago.
And really the expectation of organizations is evolving: they don't necessarily want separate backup admins and system admins anymore. They want one team that manages their virtual infrastructure. That all rolls up to your point, where it makes it easy to govern, manage, and execute on corporate objectives.