Facing the Backup Window Dilemma

IT managers face the backup window dilemma all too often: data volumes continue to grow by as much as 50 percent annually, while business requirements increasingly dictate around-the-clock operation. This raises the question: when one backup window closes, will another open for you to complete the process?

Today there is more business-critical data to back up and less time available to do it. As a result, backup operations are straining organizations' data protection objectives and demanding more innovative solutions. With more than half of businesses running over their allotted backup window, according to a recent IDG study, what processes can you adopt to ensure the data protection strategies deployed at your organization remain viable?
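A back-of-the-envelope calculation shows how quickly growth alone can break a window. The short Python sketch below is purely illustrative: the starting volume, throughput and window size are assumptions, and only the 50 percent growth rate comes from the figure above. A backup that just fits today overruns its window within a year.

    # Hypothetical illustration: how long until 50 percent annual data
    # growth overruns a fixed backup window? Starting volume, throughput
    # and window size are assumptions, not figures from this article.
    data_tb = 40.0               # data to back up today, in TB
    throughput_tb_per_hr = 4.0   # sustained backup throughput
    window_hr = 10.0             # nightly backup window
    growth = 1.5                 # 50% annual growth, per the figure above

    for year in range(4):
        hours = data_tb / throughput_tb_per_hr
        status = "fits" if hours <= window_hr else "OVERRUNS"
        print(f"Year {year}: {data_tb:6.1f} TB -> {hours:5.1f} h ({status})")
        data_tb *= growth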

Of the available data optimization technologies, data de-duplication is the one with the greatest potential to deliver a substantial, recurring impact on the cost and manageability of data growth. Backup solutions have employed data de-duplication engines for many years to save storage space in the backup data store. Although the cost of storage continues to fall, data growth has outpaced it, so more and more storage is consumed. At the same time, the backup windows mentioned above are increasingly critical, and any performance hit that de-duplication imposes further limits the backup solution's ability to complete within the allocated window.
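As a rough illustration of the underlying idea (a minimal sketch, not any particular vendor's engine; fixed-size chunks and SHA-256 fingerprints are simplifying assumptions), a de-duplicating store keeps each unique chunk of data once and records only references for repeats:

    import hashlib

    CHUNK_SIZE = 4096   # fixed-size chunking; production engines often
                        # use variable-size (content-defined) chunking
    store = {}          # fingerprint -> chunk: the de-duplicated store

    def dedupe_write(data: bytes) -> list:
        """Store each unique chunk once; return the list of
        fingerprints (the 'recipe') needed to rebuild the data."""
        recipe = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            fp = hashlib.sha256(chunk).hexdigest()
            if fp not in store:      # only new chunks consume space
                store[fp] = chunk
            recipe.append(fp)
        return recipe

    def restore(recipe: list) -> bytes:
        """Reassemble the original stream from stored chunks."""
        return b"".join(store[fp] for fp in recipe)

Back up two mostly identical datasets through dedupe_write() and the store grows by little more than one copy's worth of chunks, which is where the space savings come from.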

Some backup deployments run data optimization in post-process mode to shrink the time required for backup and meet their window objectives. To do so, they must stage the data in cache storage before it is optimized, which adds cost and complexity to the IT equation.
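The trade-off between the two modes can be sketched in a few lines (a hypothetical illustration reusing dedupe_write() from the sketch above; neither function is a real product API):

    def backup_post_process(stream: bytes, staging_cache: bytearray) -> list:
        # 1) Land the raw data first -- fast, but the cache must be
        #    sized for the full, un-optimized backup (the extra cost
        #    and complexity noted above).
        staging_cache.extend(stream)
        # 2) Optimize later, after the backup window has closed.
        return dedupe_write(bytes(staging_cache))

    def backup_inline(stream: bytes) -> list:
        # De-duplicate as data arrives: no staging cache at all, but
        # the engine must keep pace with the incoming backup stream.
        return dedupe_write(stream)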

The only thing shrinking faster than today's backup windows is the IT budget. Rampant data growth affects budgets, operating costs, floor space and, of course, capital expenditure through the amount of data created and its associated cost. Efforts to shorten backup times may therefore be further limited by the need for cost containment.

To address these issues and gain competitive market share and revenues, OEMs need to implement data optimization technologies that satisfy the following key requirements:

  • Performance -- Data optimization must be extremely efficient and must not impede overall storage/backup performance. An engine that runs at very high performance levels can run inline, which removes the need for a post-process optimization pass, its data cache, and the cost of the temporary storage that cache requires. Storage vendors have made billion-dollar R&D investments to optimize storage performance as a means of differentiating their offerings; a data optimization engine should never slow the overall backup process.
  • Scalability -- Less than 10 years ago, only a handful of IT organizations had a petabyte of data. Today, thousands of large organizations need more than that. Data optimization solutions must scale to multiple petabytes now and, in the future, to exabyte capacities.
  • Resource efficiency -- Whether backup runs on a standalone appliance or as server-based software, the de-dupe/data optimization process consumes RAM and CPU. Highly efficient resource utilization both improves scalability and lets the backup process run as quickly as possible (see the sketch after this list).
  • Data integrity -- Data optimization technology must not interfere with the storage application software in a way that increases data risk. The OEM's storage software must retain control over writing data to disk, and the data optimization software must not modify the data format in any way. This eliminates the need for complex data reassembly (commonly called 'rehydration') and protects the data against possible corruption.
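On the resource-efficiency point, one widely used technique in the de-duplication literature is to put a compact in-memory Bloom filter in front of the on-disk fingerprint index, so that lookups for brand-new chunks never touch disk at all. The sketch below is illustrative only; the bit-array size and hash count are assumptions:

    import hashlib

    class BloomFilter:
        """Compact in-RAM summary of the fingerprint index: a negative
        answer means a chunk is definitely new, so the engine can skip
        the expensive on-disk index lookup entirely."""

        def __init__(self, bits: int = 1 << 23, hashes: int = 4):
            self.bits, self.hashes = bits, hashes
            self.array = bytearray(bits // 8)

        def _positions(self, fp: bytes):
            for i in range(self.hashes):
                h = hashlib.sha256(bytes([i]) + fp).digest()
                yield int.from_bytes(h[:8], "big") % self.bits

        def add(self, fp: bytes) -> None:
            for pos in self._positions(fp):
                self.array[pos // 8] |= 1 << (pos % 8)

        def might_contain(self, fp: bytes) -> bool:
            return all(self.array[pos // 8] & (1 << (pos % 8))
                       for pos in self._positions(fp))

The filter occupies a fixed, small amount of RAM no matter how large the on-disk index grows, which is the kind of efficiency that keeps de-duplication from starving the backup process of CPU and memory.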

OEMs need to evolve backup solutions that run without latency or performance impact (which would further limit the ability to complete backup within the allocated window) while eliminating the post-process data cache and its cost. Such solutions must also work seamlessly across primary, archive and backup storage, addressing the needs of each of these tiers.

Backup windows are getting smaller and smaller, yet more and more critical. When properly deployed, data optimization technologies help IT complete backup operations within the allotted time and at a lower total cost of ownership than alternative techniques.

Wayne Salpietro is the director of product and social media marketing at data storage and cloud backup services provider Permabit Technology Corp. He has served in this capacity for the past six years, prior to which he held product marketing and managerial roles at CA, HP, and IBM.
