The Federal government has a massive data center footprint: 6.76 million square feet, to be precise. Of that total portfolio, an astonishing 62 percent of data center floor space goes unused. To add insult to injury, it’s extremely expensive to maintain all that idle space.
“To do right by the American taxpayer, the first thing we should ask is… are we using everything that we have? A lot of individual [data center managers] have done a lot of great things for [individual application owners], but collectively, it creates new deficiencies,” said Dan Pomeroy, director at GSA’s Infrastructure Optimization Center of Excellence, on a GCN webinar this week.
While the Data Center Optimization Initiative aims to aggressively reduce the number of underused data centers, change has come at a crawling pace and savings have been negligible. In 2014, Federal agencies set a collective savings target of $4 billion from closing or consolidating data centers. By 2016, those same agencies had reported just $378 million in savings, a $3.6 billion shortfall from their 2014 targets.
The challenges to boosting that savings total are not insurmountable, said Pomeroy. In fact, some savings can be realized from very basic facility changes, like turning up the thermostat and installing motion sensor lights.
“Turn off the lights,” advised Pomeroy. “Most of the equipment we have is certified by OEM providers to operate at a higher heat tolerance than we currently keep them at. They can handle more heat. Older data centers can benefit from cold aisle containment. Invest in building management software. Finally, you just have to be smart about what goes in the data center. Is it really necessary? Keeping workloads at a data center just because we have the cow-path to do that is probably the wrong kind of thinking for the Federal government to have.”
Pomeroy recommended that data center portfolio managers track performance metrics such as power usage effectiveness (PUE), review them frequently, and verify that they are accurate. Just as important, he said, application owners should shoulder the costs of the resources they use.
“If application owners are on an all-you-can-eat model, that’s going to drive inefficiencies… When it’s time to migrate data centers, it’s going to be very difficult for those people to find the budget to move to an enterprise data center.”
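For readers who want to put numbers to the metric Pomeroy cited, PUE is simply total facility energy divided by the energy consumed by IT equipment, with values closer to 1.0 indicating less overhead. The short Python sketch below illustrates the calculation; the facility names, meter readings, and the 1.5 review threshold are hypothetical examples, not actual agency data or an official DCOI target.

# Minimal sketch: computing power usage effectiveness (PUE) from meter readings.
# Facility names, readings, and the 1.5 flag threshold are illustrative assumptions.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy (lower is better; 1.0 is ideal)."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical monthly readings in kWh: (total facility, IT equipment).
readings = {
    "facility_a": (480_000, 300_000),
    "facility_b": (210_000, 180_000),
    "facility_c": (950_000, 410_000),
}

for name, (total_kwh, it_kwh) in readings.items():
    value = pue(total_kwh, it_kwh)
    flag = "  <-- review" if value > 1.5 else ""
    print(f"{name}: PUE = {value:.2f}{flag}")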
Another key problem, said Kirk Kern, chief technology officer of NetApp’s U.S. Public Sector division, is that managers do not have detailed views of the resources used by applications hosted in their data centers.
“Application owners make way more resource requests than their applications can logically consume,” said Kern. “A lot of data centers are over-provisioned because [owners] are making unreasonable requests and infrastructure managers have no mechanism for assessing what the application requires.”
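Kern’s point lends itself to a simple illustration: compare what owners requested with what monitoring shows their applications actually use. The Python sketch below does just that; every application name, figure, and the 50 percent threshold is a made-up assumption, not NetApp tooling or real agency data.

# Minimal sketch of the gap Kern describes: requested resources vs. observed use.
# All application names and figures below are hypothetical.

requests = {            # vCPUs requested by application owners
    "case_mgmt": 64,
    "grants_portal": 32,
    "hr_reporting": 48,
}
observed_peak = {       # peak vCPUs observed over the last 90 days (hypothetical monitoring data)
    "case_mgmt": 18,
    "grants_portal": 25,
    "hr_reporting": 9,
}

for app, requested in requests.items():
    used = observed_peak.get(app, 0)
    utilization = used / requested
    if utilization < 0.5:  # illustrative threshold for "over-provisioned"
        print(f"{app}: requested {requested} vCPUs, peak use {used} "
              f"({utilization:.0%}) -- candidate for right-sizing")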
When all else fails, Pomeroy said, CIOs should use the authority they already have to close underutilized data centers.
“One of the most efficient ways to optimize your portfolio is to shed data centers,” said Pomeroy. “Under FITARA [the Federal Information Technology Acquisition Reform Act], the CIO has the power to close data centers. If you need to give your CIO teeth to get in there and say, ‘This data center needs to close because it’s too inefficient,’ the law was designed so that he or she can make that a reality.”