“In this world, nothing is certain except death and taxes” — in this famous quote from Benjamin Franklin, the founding father highlights two things that are inevitable in a modern society. So what does this have to do with enterprises?
Enterprises today are, for the most part, in a constant struggle to maintain their existing infrastructure while attempting to provide an ever-increasing level of data protection. As they are asked to store more and more data, IT departments dedicate more resources to backing it up. In a virtualized environment, there are plenty of ways to back up a workload or file: backups to separate software or hardware, backups to the cloud, and snapshots and replicas to on- or off-site locations. There are even still trucks that pull up to collect your tape copies.
These methods all lead to copies of data to be called on in case of emergency—a fire, a flood, a data breach, or an accidentally deleted database. But every time we back up that application or workload, we are paying a tax.
What do I mean by paying a “tax”? One definition of a tax is “a burdensome charge, obligation, duty, or demand.”
When that legacy backup kicks off, we are taxing the underlying hardware. We are burning resources on every component: the source system, the network, and the backup (destination) environment. This hasn’t changed since the inception of backup and recovery technologies.
Furthermore, what about the human resource tax? Enterprises usually require dedicated individuals just to manage the backup infrastructure. The cost of continuing to manage a legacy backup environment is often too high, so enterprises cut corners on the frequency or number of backups, or on how long they’re retained. Babysitting backups on the weekends is not a glamorous job, but it is deemed necessary to ensure a sufficient level of protection.
As the modern data center market evolves, there are big differences in how enterprises are managing and protecting their data centers. Hyperconvergence is a revolutionary option that promises simplicity and flexibility. Many converged infrastructure vendors, though, haven’t gone as far as solving the data problem itself, and many consume too many resources just running the platform. That obviously detracts from the promised benefits of the technology.
Life After the Infrastructure Tax
SimpliVity’s hyperconvergence technology grew out of the increasingly evident market need to simplify IT. SimpliVity hyperconverged infrastructure runs on the company’s own file and object store, which has the built-in ability to deduplicate all writes from the moment data comes into the system.
Even better, it does this without degrading performance, and it removes the infrastructure tax. This is one piece of SimpliVity’s Data Virtualization Platform, which also includes the OmniStack Accelerator Card. The card offloads processing so the system isn’t overburdened, and it reduces the CPU and RAM overhead that also taxes traditional infrastructures. All of this makes predictable performance possible, since VMs and storage aren’t competing for resources.
Instead of relying on snapshots, SimpliVity builds native data protection into all running workloads from inception. Snapshots are old technology, born from the growing pains of virtualized infrastructures. Deduplication, too, first emerged to deal with the growing data problem, and much of SimpliVity’s technology development centered on it. Deduplication still comes in many imperfect forms. Done well, though, as in SimpliVity’s solution, it solves problems across the entire data lifecycle, including easing the burden of protecting and backing up virtualized workloads.
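To make the idea concrete, here is a toy sketch of content-addressed deduplication, where each incoming block is hashed and stored only once. This is a minimal illustration of the general technique, not SimpliVity’s actual implementation; the class and block size are hypothetical.

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: each unique block is kept once,
    and every write is recorded as a list of block hashes (metadata)."""

    def __init__(self, block_size=4096):
        self.block_size = block_size
        self.blocks = {}   # hash -> block bytes, stored exactly once
        self.objects = {}  # object name -> ordered list of block hashes

    def write(self, name, data):
        hashes = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            h = hashlib.sha256(block).hexdigest()
            # Store the block only if its content hasn't been seen before.
            self.blocks.setdefault(h, block)
            hashes.append(h)
        self.objects[name] = hashes

    def read(self, name):
        # Reassemble the object from its block hashes.
        return b"".join(self.blocks[h] for h in self.objects[name])

store = DedupStore()
payload = b"A" * 8192            # two identical 4 KiB blocks
store.write("vm-disk", payload)
print(len(store.blocks))         # 1 -- duplicate blocks are stored once
print(store.read("vm-disk") == payload)  # True
```

Because duplicates collapse at write time, identical data written by many VMs occupies the store only once, which is what removes the capacity and IO tax downstream.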
Data backups generate IO, which taxes the entire infrastructure the moment a job starts. But SimpliVity backups don’t generate IO. Yes, I said they do not generate IO – a SimpliVity backup is essentially metadata, which is significantly smaller than the actual data. Furthermore, remote backups copy only the unique blocks to the destination object store. The backups never utilize the host or the virtual machine – and that opens up a world of possibility for extremely efficient resource usage.
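The same toy model shows why a metadata-style backup is so cheap: once data is tracked as block hashes, a “backup” is just a copy of that hash list, and a remote copy only ships blocks the destination doesn’t already hold. Again, this is a hedged sketch of the general principle, not SimpliVity’s actual on-disk format.

```python
import hashlib

BLOCK = 4096

def block_hashes(data):
    """Split data into fixed-size blocks and hash each one."""
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

# The "backup" is just the list of block hashes -- metadata only,
# tiny compared with the data it describes.
disk = b"A" * BLOCK + b"B" * BLOCK
backup = block_hashes(disk)

# A remote backup ships only blocks the destination lacks.
remote_has = {backup[0]}              # destination already holds the "A" block
missing = [h for h in backup if h not in remote_has]
print(len(missing))                   # 1 -- only the unique block travels
```

Because only previously unseen blocks cross the wire, the cost of a remote backup scales with how much data is genuinely new, not with the size of the workload.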
The differences customers see with SimpliVity hyperconvergence are profound. One of my customers has a 1.5TB database that used to take an hour and a half to back up, on a schedule of every four hours. Since implementing SimpliVity, that same database backs up locally in only five seconds, and he now backs up every 15 minutes. That’s what built-in data protection looks like – and what a tax-free infrastructure does for businesses.
The transformational impact SimpliVity has on IT environments is real, and it’s here for enterprise workloads. Don’t believe us, or still skeptical? Reach out – we’d be more than happy to show you a live demo and prove our backups remove this tax.
Daniel Paluszek is a Solutions Architect in Tallahassee, Florida. You can follow him on Twitter at @dpaluszek.