Panelists at the AWS Public Sector Summit discuss the benefits and cautions of migration

After years of discussion, consideration, and delay, government agencies seem to have finally reached consensus. For many, if not most, of them, moving to hybrid clouds – a mix of public and private – appears to offer the best combination of flexibility, agility, cost savings, control, and security, according to a panel of public and private sector executives at the AWS Public Sector Summit.

But all those advantages don't mean it's easy. Making the switch requires a lot of planning, tackling the government IT culture, and starting simple, the panelists said.

“It starts with your strategic planning; before what would be best, [it’s] what would your environment be,” said James Graham, director of enterprise content management (ECM) and enterprise data management (EDM), U.S. Treasury Department. “We’re currently doing our five-year roadmap. It’s our opportunity to look at each of our programs – where do we see that program in five years, what will we need to do, will we even want to be in that business?”

A critical part of the planning process is considering where an agency’s data should be located. The panel discussed the practical aspects of the debate over what can be moved to the cloud and what needs to remain in-house.

Dan Thomas, chief engineer for the Washington, D.C., Health Benefit Exchange Authority, said his agency had to go through some extra layers of regulatory review when it was considering keeping personally identifiable information (PII) in the cloud.

“There is information we receive from other … agencies that we purposely decided not to put in the cloud,” he said. “For us, it was more politics-driven. [For] other agencies, it might be keeping management over particular types of data.”

Phil Brotherton, vice president of NetApp’s data fabric group, offered another perspective. “Rule One – if you can avoid moving the data, avoid moving the data,” he said. An agency’s data should be close to the cloud, “within a few milliseconds, [but] large WAN hauls are very expensive.”

Red River’s vice president for digital and cloud solutions, Paul Krein, observed that IT professionals, not users, are the ones who worry about where data is located. “Our clients don’t care where the data goes as long as it’s always there [and] the bills are predictable,” he said. “They’re more concerned about continuity of operations.”

Treasury’s Graham agreed with that perspective. “My customers should never worry about where anything is being hosted,” he said. “My requirement is a seamless environment for my users. It should just work without having to sign in, sign out … It should work as one unified system.”

Achieving that seamless experience is neither free nor easy, however. “Both IT and business disciplines matter more with cloud, not less,” Krein said.

At Treasury, which offers shared services to federal agency clients, “I have to collect from partners [to cover] salaries, benefits, rent, computers, everything it takes to run the program,” Graham said. “There are plenty of commercial companies agencies can buy from, so I have to have something that’s cost-effective and meets their needs … There are regulatory issues that I’m not going to get a return on investment [for] – I have to do it, [but] I need to find the most cost-effective way to implement so my partners are compliant.”

When it comes to moving applications to the cloud, both government and business speakers advocated for moving easy things first.

“What we did was, a year [or so] ago, we migrated all our development environments out to the cloud,” Graham said. “That was a great place for us to start … Start with something super simple, so if something goes wrong you’re not affecting your mission.”

Brotherton suggested that another good place to start is backup and archiving. “It’s a good ROI and it gets you used to the operations of the cloud … The cloud is a powerful tool; you’ve got to learn how to use powerful tools.”