The volume, velocity and variety of the data pouring into government agencies can become problematic if not managed properly, according to Gus Horn, NetApp Senior Global Consulting Engineer. But, if handled strategically, data can become the “next generation of currency, because it is going to drive decisions and drive how government agencies are able to support constituents to provide programs,” he shared. “It is going to be driven by analytics, not on an ad hoc basis.”

Horn will present at the Public Sector Partners' Government Transformation & Innovation 2017 Summit, A Blueprint for Citizen Centric Government, where he will discuss how data analytics are critical to helping government leaders make the best decisions possible regarding resources and priorities. GovDataDownload spoke with Horn in anticipation of the Summit, and he reminded us that one “attribute that everyone needs to be keenly aware of is that we are all data donors. Willingly or unwillingly, we are donating data into the ether,” and government agencies can tap into that data to help solve problems for all of us.

GovDataDownload (GDD): The theme of this year’s summit is demonstrating that government agencies can be efficient and accomplish the mission even with limited resources. What role does data management play in helping state and local governments deliver on this commitment?

Gus Horn (GH): Our governments have a vast amount of data that is being collected and already being monetized by private companies that are looking at housing and building records, tax records, and other data sources. But agencies can use that same data to become much more efficient at what they are doing and to make better decisions. They should be looking at historical data and going through initiatives to digitize and categorize the vast amounts of information they already have to find patterns, instead of guessing at what has happened.

A great example is the California Department of Water Services, which has become more efficient through effective use of big data. It recently implemented a large data analytics platform based on Hadoop and is using it to create efficiencies with water resources. In California, water is like gold. It is needed for the bread basket of America, as well as for drinking water and other uses. The department is using data analytics to see where water is being used, how much it is sending and to where, to predict usage based on climate conditions, and to be really responsible about what it is doing. Rather than guessing, the department is using an analytics platform to work smarter. On top of that, it can also spot problems that are very difficult to detect, such as wasted water or leaky pipes, before they become catastrophic. It can do prescriptive and predictive maintenance rather than fix on fail, which is probably the most expensive way to do things. There is a tangible effect that can come from analyzing the data we have.

Smarter management of resources and smarter deployment of maintenance are just some of the promises that big data can bring. That’s the low-hanging fruit – the easy stuff. Big data effectively pays for itself in what it can save for the user.
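The leak detection Horn describes amounts to flagging readings that deviate sharply from a recent baseline instead of waiting for a pipe to fail. A minimal sketch of that idea, with purely illustrative sensor values and thresholds (not details of the California deployment):

```python
# Illustrative sketch: flag possible leaks by comparing each day's flow
# reading against a trailing average. Readings, window, and threshold
# are hypothetical, not from the deployment described in the article.
from statistics import mean

def flag_anomalies(readings, window=7, threshold=1.5):
    """Return indices whose reading exceeds threshold x the trailing mean."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = mean(readings[i - window:i])
        if readings[i] > threshold * baseline:
            anomalies.append(i)
    return anomalies

# A week of steady flow, then a spike that could indicate a leak.
flows = [100, 102, 98, 101, 99, 103, 100, 240, 101, 99]
print(flag_anomalies(flows))  # -> [7]
```

A production system would use richer models and per-sensor baselines, but the payoff is the same one Horn names: catching waste before it becomes catastrophic.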

GDD: What are some of the challenges that state and local governments are facing with their data management infrastructure?

GH: I see the challenges from the data management infrastructure side. Platforms need to be designed to help our customers capitalize on the promise of big data, but typically they are built on a commodity approach. We have to be cognizant that our municipalities and government agencies are not big search engines or social media platforms; they aren’t geared for that size or scale, and they can’t lock themselves into any one particular architecture, because then they can’t be agile. The focus should be on providing the most agile and flexible architecture that is still congruent with how these analytics platforms are designed and built. To get there, we loosely decouple the compute from the actual storage and provide on-ramps and off-ramps to cloud infrastructures, because states and municipalities are moving to a hybrid model where they store data on premises but also utilize the cloud and its resources.

In this scenario, the ‘Holy Grail’ is encryption and protection of the data, because we must ensure that personally identifiable information doesn’t get out. You have to design security into the architecture; if you don’t, you are effectively ‘leaving the barn door open.’ The biggest mistake I see is that security is an afterthought rather than the first thought.
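One concrete form of the security-first design Horn advocates is transforming PII before a record ever leaves the premises for cloud analytics. A minimal sketch using keyed pseudonymization; the field names and key handling are assumptions for illustration, and a real deployment would pair this with full encryption and a managed key service:

```python
# Illustrative sketch: pseudonymize PII fields with a keyed hash before a
# record moves to cloud analytics. Field names and the key are hypothetical;
# the key would live in an on-premises key management system, never in code.
import hashlib
import hmac

SECRET_KEY = b"replace-with-managed-key"
PII_FIELDS = {"name", "ssn"}

def pseudonymize(record):
    """Replace PII values with keyed hashes; same input yields the same
    token, so analytics jobs can still join and group on these fields."""
    out = {}
    for field, value in record.items():
        if field in PII_FIELDS:
            out[field] = hmac.new(SECRET_KEY, str(value).encode(),
                                  hashlib.sha256).hexdigest()
        else:
            out[field] = value
    return out

record = {"name": "Jane Doe", "ssn": "123-45-6789", "usage_gallons": 5400}
print(pseudonymize(record)["usage_gallons"])  # non-PII fields pass through
```

Because the transformation happens before data reaches the cloud side of the hybrid model, a breach of the analytics tier exposes tokens rather than identities.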

According to Horn, we are at a tipping point with Big Data and all of the supporting technologies, which are becoming more mainstream. “There is a cadre of students coming out of many schools, not just Harvard, Berkeley or Stanford, who can analyze data, so it is becoming more affordable and a lot more reasonable for agencies to find the talent to implement.” Up until last year it was almost impossible to find a “reasonably priced” individual to do this for you, but today the world of big data and analytics is affordable for government agencies.
