Mining Big Data for What It’s Worth

December 5, 2014 Chuck Brooks

When it comes to the collection, digitization and organization of data, federal agencies today have their work cut out for them responding to a twofold challenge. First, as we’ve discussed previously, agencies have increasingly responded to the Obama administration’s Digital Government Strategy by evolving from paper-based to electronic systems of document management.

The benefits are clear – improved accessibility to electronic records, reduced costs through faster document capture and recognition, and a more seamless customer experience, among others. We’ve seen the Department of Homeland Security succeed in automating paper-based processes, cutting processing time and costs while speeding the identification of dangerous foreign visitors. The significant challenge for all federal agencies, though, has become how to extract relevant information from decades’ worth of documents without overloading databases with unnecessary information.

We can now circle back to the second part of our twofold challenge in managing data – sifting through the mounds of complex information collected each day from a variety of sources – combining to form the all-encompassing term: big data. As noted in Government Executive, the public sector is in the early stages of capturing large datasets, with big data programs expected to emerge in the vast majority of government agencies in the next few years. Agencies are already collecting much more data than they used to, from a wide variety of sources including blogs, emails, videos, social media and photographs – primarily technologies that were nonexistent twenty years ago. Agencies have been told how important big data is, but according to Nextgov, its true value remains elusive for many because managers find it either unreliable, insufficient or overly complex.

The term big data can be (rightly or wrongly) applied to so many functions and processes that it often feels like an abstract concept. But utilizing big data – again, the sheer volume of data that agencies have access to – has led to significant innovation in federal projects related to healthcare, national defense, energy, financial oversight and fraud detection, and back office operations. From the breadth of data available, agencies can arrive at powerful insights around supply and demand projections, behavioral trends, market risks and externalities, to name a few.

Federal agencies must address the issue of how to make sense of the substantial business data they have accumulated and turn it into smart information; in other words, they need to employ data analytics. To maximize the value of the data in their possession, agencies must extract relevant information and find patterns in order to optimize business processes and improve efficiency, all while reducing overhead and eliminating overlapping deliverables. Realizing this potential will require overcoming many challenges, including digitizing data from legacy systems, accounting for duplicate and unstructured data, and navigating the volume, variety and velocity of data in today’s digital world.
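To make the duplicate-data challenge concrete, here is a minimal sketch of one common first step in a data-cleaning pipeline: normalizing records and dropping near-duplicates before analysis. The field names and sample records are hypothetical, purely for illustration.

```python
# Illustrative sketch only: normalize simple records and drop
# near-duplicates, a typical first pass when consolidating data
# pulled from multiple legacy systems. Field names are hypothetical.

def normalize(record):
    """Lowercase and strip string fields so near-duplicates match."""
    return {k: v.strip().lower() if isinstance(v, str) else v
            for k, v in record.items()}

def deduplicate(records):
    """Keep the first record for each normalized content key."""
    seen = set()
    unique = []
    for record in records:
        key = tuple(sorted(normalize(record).items()))
        if key not in seen:
            seen.add(key)
            unique.append(record)
    return unique

records = [
    {"name": "Jane Doe", "agency": "DHS"},
    {"name": "jane doe ", "agency": "dhs"},   # near-duplicate of the first
    {"name": "John Smith", "agency": "GSA"},
]
print(len(deduplicate(records)))  # 2 unique records remain
```

Real pipelines use fuzzier matching (record linkage on names, addresses, identifiers), but the principle is the same: agree on a canonical form first, then compare.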

For any monumental task like this, an agency needs the right personnel. The key is developing and promoting data-savvy executives and managers, who can then lead training programs for front-line and customer-facing employees so they’ll know which data is relevant and how they can drive value from it. For example, a potential customer’s purchasing habits and history of interaction with the company mean little to a call center associate unless the associate can access that information and glean actionable insights to stimulate the customer relationship. The combination of advanced analytics and proper education has made answering calls predictable and scalable, helping agencies staff call centers appropriately and cut down on inefficiencies.

The President’s Council of Advisors on Science and Technology has recommended that every federal agency develop a big data strategy. Implementing a data analytics solution that meets organizational needs can seem daunting, but by applying the right technical staff and methodologies, agencies can achieve their desired results.

About the Author

Chuck Brooks

Charles (Chuck) Brooks serves as Vice President/Client Executive for DHS at Xerox. Xerox is a global product and services company that serves clients in 160 countries. Chuck served in government at the Department of Homeland Security as the first Director of Legislative Affairs for the Science & Technology Directorate. He also spent six years on Capitol Hill as a Senior Advisor to the late Senator Arlen Specter and was an Adjunct Faculty Member at Johns Hopkins University, where he taught courses on homeland security and Congress. Chuck has an MA in International Relations from the University of Chicago and a BA in Political Science from DePauw University. Chuck is published on the subjects of innovation, public/private partnerships, emerging technologies, and issues of cybersecurity.
