This summer, the Department of Defense (DOD) has been under fire for its failing grade on the latest FITARA scorecard.  The DOD received the lowest grade of any of the 24 agencies currently on the scorecard and earned an ‘F’ in three major areas, including data center optimization.  Despite projecting nearly $2 billion in savings from consolidating its data centers and optimizing the utilization and performance of those that remain, the agency has realized only $396 million.

While federal lawmakers have been quick to criticize the agency, NetApp’s U.S. Public Sector Chief Technology Officer for the Department of Defense and Intelligence Community, Scott Rich, made the case for a little leniency in a recent conversation. “The Department of Defense is by far the largest government agency, and it’s very difficult to drive and deliver change in a short period of time,” he shared.  “In contrast, smaller agencies, like USAID, which sat atop the scorecard this quarter, are much more agile and able to demonstrate forward movement on scorecard goals much more quickly.”

Though Rich is sympathetic to the challenges facing acting DOD Chief Information Officer John Zangardi, he identified some strategies that could help the agency demonstrate gains in a relatively short period of time.  For Rich, the keys to both reducing the number of data centers and optimizing the performance of the remaining facilities form a three-point plan.

“First,” he said, “you need to know where your data resides and how you use it.”  A common problem for large agencies operating complex environments is that data is stored in many locations and no one on the IT team has a real understanding of how frequently it is being accessed and used.  By understanding these patterns of usage and access, it’s possible to determine how best to manage data sets both technically and financially.  “This level of insight and planning guides how much capacity is needed and helps determine if data should be stored on-premises, in a private cloud, or in a hybrid environment so that the agency can manage workloads and choose the right storage tier to avoid overspending, or the need to buy extra capacity at a premium,” Rich shared.
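
What does that kind of audit look like in practice? Below is a minimal sketch of an access-pattern survey, assuming a POSIX file system that records access times; the mount point, age thresholds, and bucket labels are illustrative assumptions, not anything Rich or the DOD prescribes.

```python
import os
import time
from collections import Counter

# Age buckets: label plus upper bound in days (None = no upper bound).
BUCKETS = [("hot (<30 days)", 30), ("warm (<180 days)", 180), ("cold (180+ days)", None)]

def access_profile(mount_point: str) -> Counter:
    """Tally bytes per access-age bucket for every file under mount_point."""
    now = time.time()
    profile = Counter()
    for root, _dirs, files in os.walk(mount_point):
        for name in files:
            path = os.path.join(root, name)
            try:
                info = os.stat(path)
            except OSError:
                continue  # skip files that vanish or deny access mid-scan
            age_days = (now - info.st_atime) / 86400
            for label, limit in BUCKETS:
                if limit is None or age_days < limit:
                    profile[label] += info.st_size
                    break
    return profile

if __name__ == "__main__":
    # "/mnt/archive" is a placeholder path for this sketch.
    for label, total in access_profile("/mnt/archive").items():
        print(f"{label}: {total / 1e9:.1f} GB")
```

Even a crude report like this makes the conversation concrete: data that has not been touched in six months is a candidate for a cheaper tier, while the hot fraction sets the capacity target for high-performance storage.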

With an understanding of where data resides and how it is used in hand, Rich’s second target is to evaluate the type of storage media being used.  Rich said that “with recent advances in flash dramatically changing the equation for public sector customers, it’s time to rethink storage media.”  Adding flash to the DOD’s data management environment would not only reduce the amount of physical space needed to house media, but also reduce the amount of power data centers consume on cooling.
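
One rough way to see why flash changes the equation is to compare rack space and power draw at a fixed capacity. The sketch below does exactly that; the capacity-per-rack-unit and watts-per-terabyte figures are placeholder assumptions chosen for illustration, not vendor specifications.

```python
from dataclasses import dataclass

@dataclass
class Media:
    name: str
    tb_per_rack_unit: float  # usable capacity per rack unit (placeholder)
    watts_per_tb: float      # power draw per usable terabyte (placeholder)

def footprint(media: Media, capacity_tb: float) -> tuple[float, float]:
    """Return (rack units, watts) needed to host capacity_tb on this media."""
    return capacity_tb / media.tb_per_rack_unit, capacity_tb * media.watts_per_tb

# Placeholder figures for illustration only.
hdd = Media("spinning-disk array", tb_per_rack_unit=30.0, watts_per_tb=8.0)
ssd = Media("all-flash array", tb_per_rack_unit=180.0, watts_per_tb=2.0)

for media in (hdd, ssd):
    units, watts = footprint(media, capacity_tb=500)
    print(f"{media.name}: {units:.1f}U of rack space, {watts:,.0f} W for 500 TB")
```

Under even these rough assumptions, flash hosts the same capacity in a fraction of the rack space and a fraction of the power, which is what shrinks both the data center footprint and the cooling bill.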

“Finally,” Rich said, “it’s time to move to the cloud.”  Far from being an all-in or all-out proposition, Rich advocated a hybrid approach.  “Parts of the DOD have already moved to the cloud using a hybrid approach, taking advantage of the compute power and elasticity of the cloud while keeping highly sensitive, mission-critical information protected from cyberattackers, and they are reaping the benefits of this highly flexible approach to data management.”
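
In practical terms, a hybrid approach boils down to a placement rule: classify each data set, then route it to the tier that matches. The sketch below illustrates one such rule; the sensitivity labels and tier names are assumptions made for illustration, not an actual DOD policy.

```python
from enum import Enum

class Tier(Enum):
    ON_PREM = "on-premises flash"
    PRIVATE_CLOUD = "private cloud"
    PUBLIC_CLOUD = "FedRAMP-authorized public cloud"

def place(sensitivity: str, mission_critical: bool) -> Tier:
    """Route a data set to a storage tier based on its classification."""
    if sensitivity == "classified" or mission_critical:
        return Tier.ON_PREM        # never leaves agency-controlled facilities
    if sensitivity == "sensitive":
        return Tier.PRIVATE_CLOUD  # elastic, but inside the agency boundary
    return Tier.PUBLIC_CLOUD       # cheapest elastic capacity

print(place("classified", mission_critical=True).value)    # on-premises flash
print(place("unclassified", mission_critical=False).value) # FedRAMP-authorized public cloud
```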

“Obviously, choosing a FedRAMP-certified cloud provider is the first decision, but then, to build a highly efficient data management environment, the DOD will want to ensure that it has tools that can manage and tier data across all storage media and that are reliable, resilient, and scalable,” he concluded.

While the Department of Defense might not be the darling of the data center optimization scorecard this quarter, Rich is confident that the agency’s failing grade can be turned around. “The key to success in data center optimization is developing a plan – not just a high-level strategic plan, but smaller tactical plans to address the key areas that stand in the way of optimized performance and an optimized grade.”