Halloween Spirit: Data Management Shouldn't Be Scary


Big data offers incredible promise. But agencies continue to face numerous hurdles in their efforts to make the most of analytics. Whether agencies are trying to improve citizen services or combat fraud, waste, and abuse, overcoming the challenges posed by managing and understanding big data will be the difference between getting tricked… or treated.


At a fundamental level, agencies continue to grapple with data storage – whether they are accessing existing data sets or ingesting new ones. With data spread across numerous locations – on-prem, off-prem, in private clouds and hybrid clouds – analytics becomes significantly harder to execute. And with agencies shopping for the best deals while juggling ever-growing volumes of data, storage challenges are unlikely to get easier anytime soon.


This compartmentalization isn't inherently bad; it simply adds complexity to big data analysis – you can't analyze what you can't see. Looking ahead, one thing we can do to help agencies manage the data deluge is to help them understand where their data resides and how to store newly captured data. Surprisingly, that's harder than it sounds, but it's an important first step.


Over time, Federal agencies, just like any organization, can lose track of data. Gaining more visibility into where data lives will give agencies more control over their most important asset. That visibility will also allow agencies to conduct analytics where the data rests, which is preferable to moving large data sets around – a sometimes cumbersome process.


In some cases, agencies may want to consolidate their data once they discover it is stored in numerous locations. Data management solutions – and the system architectures behind them – are key to unlocking future big data analytics value, even when data is stored in hybrid cloud environments. That's where our solutions come into play.


In other cases, agencies may decide to standardize their data, which would help in several ways. Because data sets will remain diverse, solutions that increase standardization give agencies opportunities for better big data analysis.


Standardization would allow agencies to make better use of both old and new data for longer-term analysis, enabling valuable research in areas ranging from cybersecurity threat analysis and mitigation to advances in fraud detection.


It would also give Federal and State agencies of every kind important opportunities to share data and salient findings. Cross-agency access to data has long been discussed, but its full potential remains a mystery. That sort of interoperability and integration could help improve cybersecurity and healthcare delivery, to name just two key initiatives.


Data management is the key to getting the most out of big data analytics. We have a clear role in helping agencies achieve that goal – without getting spooked.


Gary Hale, Vice President, Systems Engineering, U.S. Public Sector, NetApp