Copy Data Management Could Save Billions On Redundant Data
A recent study has shown that improved copy data management could save government agencies billions of dollars currently being wasted on redundant copy data.
Copy data is everywhere. A 2012 IDC study found that nearly 75 percent of all data generated is copy data; only about 25 percent is unique.
Deadlines for the Federal Data Center Consolidation Initiative (FDCCI) are looming. Data centers are supposed to be shrinking, yet the underlying systems required to ensure government resilience are causing a data explosion. Unchecked copy data growth and management challenges are leaving agencies flailing to meet consolidation mandates.
While federal agencies have prioritized consolidation and transitioned to more efficient and agile cloud-based systems, 72 percent of federal IT managers say their agency has maintained or increased their number of data centers since FDCCI launched in 2010. Only 6 percent gave their agency an “A” for consolidation efforts against FDCCI’s 2015 deadline.
To examine how well agencies are complying with FDCCI, MeriTalk and Actifio commissioned a study, “Consolidation Aggravation: Tip of the Data Management Iceberg,” which found that by 2024 agencies will spend as much as $16.5 billion storing redundant copies of non-production data, in clear opposition to the initiative's goals. The report is based on an online survey of 150 federal IT managers conducted in May 2014.
According to the report, key barriers to consolidation — including overall resistance, data management challenges, and data growth — are preventing data center optimization and actually driving copy data growth, resulting in increased storage costs.
The study revealed that more than 25 percent of agencies use between 50 and 88 percent of their data storage for copy, or non-primary, data. In 2013, 27 percent of the average agency's storage budget went to non-primary data, a share predicted to rise to 31 percent this year. That translates to costs of $2.7 billion in 2013, $3.1 billion in 2014, and as much as $16.5 billion over the next decade.
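The dollar figures above can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below assumes a flat annual federal storage budget of roughly $10 billion; that budget figure is inferred from the report's percentages and costs, not a number stated in the article.

```python
# Back-of-the-envelope check of the report's figures: if 27 percent of a
# roughly $10 billion annual federal storage budget went to non-primary
# (copy) data in 2013, the reported dollar amounts follow directly.
# NOTE: the $10 billion budget is an assumption inferred from the
# article's own numbers, not a figure the report states.

total_storage_budget = 10.0  # billions of dollars, assumed flat year over year

cost_2013 = 0.27 * total_storage_budget  # 27 percent share in 2013
cost_2014 = 0.31 * total_storage_budget  # predicted 31 percent share in 2014

print(f"2013 copy-data cost: ${cost_2013:.1f}B")  # matches the $2.7B figure
print(f"2014 copy-data cost: ${cost_2014:.1f}B")  # matches the $3.1B figure
```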
One in three agencies admits that it does not vary the number of copies it keeps based on a data set's significance or the likelihood that it will be used again. In fact, 40 percent of federal data assets exist in four or more copies.
Ironically, when asked to name the top pain points associated with copy data management, respondents listed regulatory requirements, cultural challenges, and storage shortfalls, all ahead of data growth.
“We’ve seen the dramatic impact of a more holistic approach to copy data management in the private sector for years now,” said Ash Ashutosh, founder and CEO of Actifio. “Frankly I’m not surprised by the magnitude of the potential savings at the federal level, or that this has now come to light as a significant barrier to FDCCI. Copy data virtualization is today where server virtualization was 10 years ago. We’re thrilled it’s now been identified as a strategy that can dramatically accelerate the process of data center consolidation, and get FDCCI back on track.”
Although most respondents agreed that better copy data management would improve their agency's compliance with FDCCI, only 9 percent of agencies have implemented projects to better manage storage and data growth to date. The study shows that as agencies move to comply with the FDCCI deadline and transition to the cloud, they must also shift their focus from server virtualization to enhanced data management and data virtualization.
Steve O’Keeffe, founder of MeriTalk, explained: “With the public flogging that is healthcare.gov, agencies’ IT departments have a siege mentality… Data and application sprawl are the enemies of government IT efficiency. We need leadership to empower Federal IT innovators to change the failing equation. We need a cultural and acquisition shift to enable new models and the shared services that will unlock new efficiencies and real savings.”