U.S. government agencies are struggling to manage the huge amount of data they generate or process, despite the goals of a program designed to operate thousands of data centers more efficiently. The idea behind the Federal Data Center Consolidation Initiative, or FDCCI, was to save space, energy and IT costs by consolidating woefully underutilized electronic data storage centers into fewer sites and servers.
The battle is ongoing, with advances here and retreats there. On the whole, however, federal agencies are far from meeting FDCCI goals and other data management challenges, according to a recent study from MeriTalk.
Vendors that can address these data handling and storage issues should be on the lookout for continuing opportunities in the federal sector.
“As a result of flawed data management practices, federal agencies will spend as much as US$16.5 billion storing redundant copies of non-production data — working directly against the Federal Data Center Consolidation Initiative,” the study predicts, based on a survey of 150 federal IT managers.
The Government Accountability Office has expressed concerns about the FDCCI in a series of reports.
As of May 2013, agencies had closed 484 data centers, with a target of shuttering 571 more — for a total of 1,055 — by September 2014, David Powner, GAO’s director of IT management, reported at a U.S. Senate hearing in June of last year.
It appears agencies will fall well short of that goal, though. As of this May, just 746 data centers had been closed.
Data Center Population Gains
“The Federal IT managers on the front lines are definitely having a different experience than what some White House reports suggest,” Steve O’Keeffe, MeriTalk’s founder, told the E-Commerce Times.
The MeriTalk survey, which was supported by Actifio, painted a more discouraging picture.
“While federal agencies have prioritized consolidation, and transitioned to more efficient and agile cloud-based systems, 72 percent of federal IT managers said their agency has maintained or increased their number of data centers since FDCCI was launched in 2010,” MeriTalk reported.
Taken together, the reports may indicate that most of the data center closings have come within a few agencies, while other agencies have struggled to generate efficiencies.
Meanwhile, the federal chief information officer changed the definition of “data center” to apply it to smaller storage units, thus significantly increasing the total number of centers.
Twenty-two of 24 agencies participating in the initiative collectively reported 6,836 data centers in their inventories as of July 2013 — approximately 3,700 more than a December 2011 estimate.
The Office of Management and Budget (OMB) now puts the total at 9,000, O’Keeffe said.
While GAO remains concerned about the program and has urged the OMB to do a better job of oversight, Powner nevertheless found that the consolidations that have been implemented have generated $1.1 billion in savings.
“The closures to date and associated cost savings are significant. If you consider planned closures and potential future cost savings, the FDCCI initiative — if carried out correctly — could be a great success story,” he told the E-Commerce Times.
Multiple Copies Clog the System
In addition to addressing the closure issue, the MeriTalk study revealed some underlying factors behind inefficient data practices.
Key barriers to consolidation — including overall “cultural” resistance, data management challenges, and data growth — have been preventing data center optimization and actually driving copy data growth, resulting in increased storage costs, according to the report.
The major causes of redundant copies and unnecessary storage include ineffective management tools, difficulty defining who owns which data, the cost of maintaining a separate infrastructure for secondary data copies, and explosive growth in the number of copies.
As important as consolidation may be, the problem goes beyond scorecard variations in how data centers or closures are counted.
Failure to reduce the numbers of centers “is not necessarily because those agencies have too many servers in too many data centers,” O’Keeffe said.
“The bigger issue is that they have too many systems creating redundant copies of data for multiple purposes. About 40 percent of federal data assets exist four or more times, and many agencies do not vary the number of copies they make based on an original copy’s significance,” he explained.
More than one in four agencies use 50 to 88 percent of their available storage resources to hold copies of non-primary data — and storing those copies is costly.
Twenty-seven percent of the average agency’s storage budget went toward non-primary data in 2013, and this year, agencies expect that number to grow to 31 percent, managers reported. That amounts to a cost of $2.7 billion in 2014, and to as much as $16.5 billion over 10 years.
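The arithmetic behind those figures is straightforward: a non-primary share of a storage budget translates directly into dollars spent on copies. As a minimal sketch (the annual budget figure below is hypothetical, not drawn from the survey), a four-point increase in that share looks like this:

```python
# Illustrative only: the budget amount is a hypothetical figure;
# the 27 and 31 percent shares come from the MeriTalk survey.

def non_primary_cost(storage_budget, share):
    """Dollars of a storage budget consumed by non-primary (copy) data."""
    return storage_budget * share

budget = 100_000_000  # hypothetical annual agency storage budget, in dollars

cost_2013 = non_primary_cost(budget, 0.27)  # 27 percent share in 2013
cost_2014 = non_primary_cost(budget, 0.31)  # 31 percent share in 2014

print(f"2013 non-primary spend: ${cost_2013:,.0f}")
print(f"2014 non-primary spend: ${cost_2014:,.0f}")
print(f"Year-over-year increase: ${cost_2014 - cost_2013:,.0f}")
```

At this hypothetical budget, the share growth alone adds $4 million a year to copy-storage costs before any growth in the underlying data is counted.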
The solution isn’t simply the closure of centers — or just moving servers from two rooms into a single space and calling it consolidation.
“It’s about using resources more effectively. The White House chief information officer, along with many senior government executives, and legislation like the recent Senate bill, all should focus on this key point of consolidation efforts,” Andrew Gilman, senior director at Actifio, told the E-Commerce Times.
The opportunities for vendors largely lie in assisting agencies in the management of data.
“In terms of technology directions, it means using more software ‘platforms’ that can combine many capabilities into a single, unified solution versus multiple tools. Federal organizations are looking for opportunities to reduce the overall number of products and vendors in place,” Gilman said.
“For the products they already have, these organizations are looking to increase their utilization and avoid shelfware. Old and expensive maintenance contracts are being reviewed to see if they are really using that product and [to investigate] if there is something better or less costly to replace it,” he noted.
Action Plans and Legislation Spur Market
Market opportunities could develop rapidly. More than 60 percent of respondents planned to implement a data management strategy within the next two years, according to the MeriTalk survey.
That finding signals that “the pressure is on and that agencies need to select a strategy which will ensure they can deliver results,” Gilman said.
“There is a lot of examination now, so each agency is creating plans to begin or complete data center consolidation. If they can’t create or complete a plan, many will look to cloud-enabled solutions,” he suggested.
To prod agencies, as well as the White House, to improve data management performance, Sen. Tom Carper, D-Del., introduced a revised data consolidation bill (S. 1611) this spring.
“The implementation of the data center consolidation initiative is even more important today than when it was launched four years ago, which is why I am working with my colleagues in the Senate and House,” he said.
“This measure builds off of the administration’s efforts and will help agencies focus their efforts on consolidation, better manage their inventories, and ensure that the consolidation initiative is seen through to its conclusion,” Carper told the E-Commerce Times. “While some agencies have fully embraced consolidation, it’s clear that a number of agencies have more work to do on this important initiative.”
The bill requires specified agencies to submit to OMB a comprehensive annual inventory of data centers they own, operate or maintain. Agencies also must report on their multiyear strategies for data consolidation and savings.
OMB is required to set compliance requirements, publicly reveal cost savings goals and results, and provide reports to Congress. The bill also directs the GAO to review and verify the quality and completeness of the asset inventory and strategy of each agency.
“The American people and our budget situation demand robust results,” said Carper, “and this measure is an important part of that effort.”