For the 2020 Census, the United States will for the first time conduct digital outreach, a massive collection effort that both relies on and results in big data. The Census Bureau will use a combination of in-office and in-field canvassing and, more importantly, rely on new efforts tied to advanced analytics.
For previous censuses, the Census Bureau depended on manual field collection, with staff walking every block in the United States and driving when necessary: about 6.7 million blocks and 137 million miles, according to the United Nations Statistical Commission.
“[W]e will continue to canvass every block, but we will only conduct in-field, on-the-ground canvassing where it is necessary,” said Lisa M. Blumerman, assistant director, Decennial Census Programs, U.S. Census Bureau. The Census Bureau is working with terabytes’ worth of data from Federal, state, and local government sources to determine where that in-field work is required.
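To make the idea concrete, here is a minimal sketch of how administrative records might be used to triage blocks for in-field canvassing. The field names, threshold, and decision rule are invented for illustration; they are not the Census Bureau’s actual methodology.

```python
# Hypothetical sketch: flag a census block for in-field canvassing when
# administrative address counts diverge noticeably from the address count
# recorded in the last in-field canvass. All names and the 5% threshold
# are illustrative assumptions, not the Bureau's real criteria.

def needs_field_canvass(block, threshold=0.05):
    """Return True when the block's administrative address count differs
    from the prior canvass count by more than `threshold` (relative)."""
    prior = block["prior_address_count"]
    admin = block["admin_address_count"]
    if prior == 0:
        return admin > 0  # new addresses where none were recorded before
    return abs(admin - prior) / prior > threshold

# Toy data: B001 is stable; B002 shows enough change to warrant a visit.
blocks = [
    {"id": "B001", "prior_address_count": 120, "admin_address_count": 121},
    {"id": "B002", "prior_address_count": 80, "admin_address_count": 95},
]
flagged = [b["id"] for b in blocks if needs_field_canvass(b)]
```

In this sketch, only blocks whose records disagree enough are sent to field staff, which is the gist of canvassing every block while visiting only where necessary.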
The use of big data isn’t new in the United States. Administrative data, such as tax data, has helped improve census data collection for decades. Back in March 2012, the Obama Administration announced a $200 million investment in big data projects with the goal of advancing scientific research in areas such as energy, environment, and healthcare. And, within the Federal Government, there are approximately 200,000 data sets available to the public through www.data.gov.
Just as the government is using data to help it work more efficiently for the 2020 Census, public sector organizations must learn how to extract meaning, turning big data into advanced analytics. “McKinsey & Co. estimates that by digitizing information, disseminating public data sets, and applying analytics to improve decision making, government agencies can act as catalysts for more than $3 trillion in economic value,” as reported by Forbes.
It could be argued that public sector organizations stand to make the greatest impact on our society by employing advanced analytics. Earlier this year, the Executive Office of the President tasked the National Security Telecommunications Advisory Committee (NSTAC) with identifying how big data analytics could enhance the government’s national security and preparedness capabilities. “Technologies and analytics can help government protect against, prepare for, respond to and detect emergencies more rapidly and effectively through augmented situational awareness and more accurately projected outcomes,” said Chase Gunter, FCW editorial fellow, in his assessment of NSTAC’s draft report findings.
In Chicago, CIO Brenna Berman is leading an effort to create a predictive analytics platform that will process more than 7 million rows of city-collected data daily, according to Government Technology. Chicago’s SmartData project will collect data, examine it for trends, and offer problem-solving predictions. “I think this city has the ability of putting predictive analytics into the hands of every department in the city and unlocking the value of predictive analytics,” Berman said.
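The collect-examine-predict pattern described above can be sketched in a few lines. The example below is purely illustrative, assuming made-up 311 call counts per ward and an invented threshold; it simply flags areas whose recent activity runs well above their longer-run average, the kind of trend signal a platform like SmartData might surface.

```python
# Hypothetical trend-detection sketch in the spirit of SmartData:
# flag areas whose recent daily 311 call volume is well above their
# overall average. Data, field names, and the 1.3x ratio are invented.

from statistics import mean

def flag_emerging_hotspots(daily_counts, recent_days=7, ratio=1.3):
    """Return area ids whose mean count over the last `recent_days`
    exceeds `ratio` times their overall mean count."""
    hotspots = []
    for area, counts in daily_counts.items():
        overall = mean(counts)
        recent = mean(counts[-recent_days:])
        if overall > 0 and recent > ratio * overall:
            hotspots.append(area)
    return hotspots

# Toy data: ward_1 is stable, ward_2 shows a recent surge.
calls = {
    "ward_1": [4, 5, 4, 6, 5, 4, 5, 5, 4, 6, 5, 5, 4, 5],
    "ward_2": [3, 4, 3, 4, 3, 4, 3, 9, 10, 11, 9, 12, 10, 11],
}
```

A real platform would of course draw on far richer features than raw counts, but the core idea — examine collected data for trends, then direct resources toward the predicted problem — is the same.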
Whether with the NSTAC, Chicago’s SmartData, or the 2020 Census, we’re seeing advanced analytics serve our nation in a big way. Yet, there are still challenges facing public sector organizations trying to make sense of this influx of data. For instance, organizations often lack the tools and training to extract value from big data, as well as the IT capabilities to manage the data consistently. Adding big data to existing architecture is complex and requires detailed preparation to organize data relationships. And, as we’ve discussed, this data often comes from disparate, siloed sources, making it more difficult to correlate. Advanced analytics is also a relatively new concept in government, meaning routes to governance or enforcement can be hazy.
Achieving New Outcomes Leveraging Data Analytics
According to IDC, “By 2017 unified data platform architecture will become the foundation of BDA [big data analytics] strategy.” Oracle uniquely offers organizations a complete and truly integrated solution to address the full spectrum of enterprise big data requirements. Oracle Cloud Infrastructure’s framework enables converged hardware and software integration at every level of the IT stack. By evolving enterprise architecture, public sector organizations can leverage the proven reliability, security, and performance of Oracle systems, including the new Oracle Big Data Management system.
Oracle addresses agencies’ big data requirements through a number of tightly integrated solutions, including: Oracle Big Data Appliance, Oracle Event Processing, Oracle Real-Time Decisions, and Oracle Business Intelligence Enterprise Edition. These solutions can be deployed on-premises or in the cloud.
Oracle’s big data framework is built on strong analytics, helping ensure that organizations can discover insights and make data-related predictions faster than ever. Furthermore, with tighter integration of software and hardware, a true converged infrastructure stack empowers government agencies to make better use of their data.
Oracle’s unique architecture development process guides agencies through each of the following stages: context evaluation, architecture vision, current state assessment, future state definition, road map development, and change management.
To learn more about Oracle Cloud Infrastructure, please contact Amit Sharma at email@example.com. For more news, tips, and information about government technology, check out Oracle Public Sector’s Facebook page and Twitter.