Monday Jul 14, 2014

Data vs. Information in Higher Education

I thought I would share the text of a letter that we've submitted to Senators Tom Harkin and Lamar Alexander, Chairman and Ranking Member of the HELP (Health, Education, Labor and Pensions) Committee, where the reauthorization of the Higher Education Act and higher ed reform more broadly are being discussed and addressed.  While some of what is being discussed around higher ed reform doesn't intersect directly with Oracle's interests or strengths, the issue of leveraging data to make better decisions (i.e., converting data into useful information) is something in which Oracle has not only a keen interest but also considerable expertise from our work across many different industries.



July 10, 2014

The Honorable Tom Harkin
United States Senate
Chairman, Senate Committee on Health, Education, Labor and Pensions
428 Dirksen Senate Office Building
Washington, DC 20510

The Honorable Lamar Alexander
United States Senate
Ranking Member, Senate Committee on Health, Education, Labor and Pensions
835 Hart Senate Office Building
Washington, DC 20510

Dear Chairman Harkin and Senator Alexander:

Oracle has a shared interest in data related to higher education, and we have watched with interest the series of hearings held by the Senate Health, Education, Labor and Pensions Committee (HELP Committee) related to the reauthorization of the Higher Education Act. We appreciate the thoughtful and deliberative approach taken by the Committee related to this reauthorization. While we realize the hearings thus far have not directly dealt with data, they have touched on the collection of better data in a general sense. Now that legislation has been introduced, we are offering our comments on the topic of data as it relates to higher education and look forward to working with the Committee as the bill makes its way through the legislative process.

Oracle is a leader in providing innovative and comprehensive data systems for institutions of higher education in the United States. Our software and hardware systems form the foundation of higher education data platforms throughout the country, with over 1,400 campuses leveraging Oracle technology, 430 of which run the Oracle PeopleSoft Campus Solutions student information system, the core application of the academic enterprise. Our products are used everywhere, from small community colleges to the largest world-class, multi-campus university systems. Over 11 million students have their data stored and processed in Oracle’s systems, more than with any other provider of its kind in the United States.

PeopleSoft Campus Solutions is a comprehensive software suite that provides institutions with support for the full student life-cycle, from prospects and recruiting to enrollment and alumni management. Our products allow institutions to provide services and information to students, as well as prospective students, in an easy-to-use format, both online and in real time.

Industry Strategy Council

Oracle incorporates feedback from its most strategic customers through industry strategy councils. The longest standing of these councils is Oracle’s Education & Research Industry Strategy Council. Meeting semi-annually, this group of 29 higher education institutions (with representation ranging from community colleges to the largest AAU research universities) provides input on the most pressing issues facing higher education where technology can play a pivotal role.


Without question, higher education institutions are “data-rich” organizations that collect information on students at multiple points and for various purposes as they progress through educational systems. In fact, higher education institutions likely possess more raw data and information on their students than any other type of organization in any sector. While some institutions have structures and processes in place to analyze and use the data, few have the ability to quickly turn that knowledge into timely action.

Data Silos
Within most higher education institutions there exists a complex web of disconnected systems such as learning management, library, fund-raising, recruiting, human resources, financial systems, research grants, and more. Although these systems could benefit from sharing information with one another, most institutions do not have a uniform way to collect and compile the data produced. As a result, information technology budgets on most campuses are heavily burdened by high costs to connect and maintain integrations between all these systems and the information they hold, which is money that could be spent on higher-value projects to support the institution’s mission.

Particularly at larger institutions, these divisions lead to numerous data silos. A large research university is not unlike a mid-sized city, with its own police and fire departments, hundreds of buildings in a multitude of locations, thousands of employees and tens of thousands of constituents. The departments within such a system routinely collect information on students for a variety of purposes, and oftentimes this happens without coordination or sharing of information. For example, Student Financial Aid, the Office of the Registrar, and the Student Affairs Department within a university system all generate separate data that could be useful together, but the lack of collaboration and organization results in a missed chance to use data to tell a comprehensive story. Absent are the tools, the data and IT governance structure, and the organizational capacity to turn data into meaningful information to drive student success.

The keepers of the most data within college systems, Institutional Research departments, typically spend much of their time compiling statistics into fact books and meeting basic regulatory reporting requirements, and are rarely focused on the strategic use of data to support one of the core missions of the university: increasing the productivity of teaching and learning. Operational and performance reporting has fallen to the diverse silos of operations within the complex enterprise that is a higher education institution.

Duplicative Data and Data Management
While one might assume that there exists just one record for each student within a system, this often is not the case and can cause problems on many levels. Human error is one contributing factor to duplicative data. A seemingly insignificant mistake entered into a system on day one can result in duplicate data that, over time, contributes to inaccuracies in reporting and, in many cases, a false representation of student success (or lack thereof). For example, if an applicant uses the name “John B. Smith” during his first interaction with a college, and simply “John Smith” during a subsequent interaction, two records have been created for the one student. In many instances, the “John B. Smith” record is never reconciled or deleted, and can actually count against an institution as a non-progressing student or a dropout.
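The failure mode described above can be illustrated with a small sketch. This is not a description of any Oracle product; the field names and the matching rule are invented for illustration. An exact match on the raw name string treats "John B. Smith" and "John Smith" as two students, while even a crude normalization that drops middle initials catches the collision:

```python
def normalize_name(name: str) -> str:
    """Crude normalization: lowercase and drop single-letter middle initials."""
    parts = [p.strip(".").lower() for p in name.split()]
    return " ".join(p for p in parts if len(p) > 1)

def find_potential_duplicates(records):
    """Return pairs of records whose normalized names collide."""
    seen = {}
    duplicates = []
    for rec in records:
        key = normalize_name(rec["name"])
        if key in seen:
            duplicates.append((seen[key], rec))
        else:
            seen[key] = rec
    return duplicates

records = [
    {"id": 1, "name": "John B. Smith"},  # created at application time
    {"id": 2, "name": "John Smith"},     # created during a later interaction
]
dups = find_potential_duplicates(records)  # flags the two records as one student
```

Real data-quality tools use far more sophisticated probabilistic matching across many attributes, but the point stands: without some normalization step, the second record silently becomes a phantom non-completer.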

Duplicative data is often created as a result of discrepancies in “data definitions.” Institutions are routinely asked for data on their student population, from various local, state and federal sources, all of which may ask for the exact same information, but in different ways. For example, there are numerous definitions that vary across federal and state program reporting, such as the definition of a full-time student, ethnicity, dependency status, date of birth vs. age as of a certain date and residency, to name just a few. These definitions vary across programs from the Department of Education to the Department of Labor to the Department of Health and Human Services to other agencies interacting with institutions of higher education. Time wasted reporting a duplicate record, as well as the amount of unnecessary data amassed as a result, are both serious and costly problems for institutions.
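To make the definitional problem concrete, here is a hypothetical sketch; the credit-hour thresholds and the cutoff date are invented for illustration, since the real definitions vary by program and agency. The same enrollment record is "full-time" under one program's rule and not under another's, and the same birth date must be reported once as a date and once as an age:

```python
from datetime import date

student = {"credit_hours": 10, "birth_date": date(1995, 6, 1)}

# Hypothetical rule for Program A: full-time means 12 or more credit hours.
full_time_program_a = student["credit_hours"] >= 12

# Hypothetical rule for Program B: full-time means 9 or more credit hours.
full_time_program_b = student["credit_hours"] >= 9

# One program asks for date of birth; another asks for age as of a cutoff date,
# so the institution must derive and report the same fact in two forms.
cutoff = date(2014, 7, 1)
bd = student["birth_date"]
age_at_cutoff = cutoff.year - bd.year - ((cutoff.month, cutoff.day) < (bd.month, bd.day))
```

One student, one set of facts, yet two conflicting "full-time" answers and two representations of birth date, each of which must be computed, stored, and reconciled separately.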

Definitional confusion frequently becomes a volume management issue. The volume of required compliance reports is vast and made cumbersome by the multiple definitions of data that vary by report. This pulls limited resources from Institutional Research departments to focus on non-strategic activities like compliance instead of valuable strategic analysis such as evaluating student success and risk factors.

Integrated Postsecondary Education Data System (IPEDS)

As required by the Higher Education Act, the Integrated Postsecondary Education Data System (IPEDS) is a system of surveys that collect data from all U.S. postsecondary institutions that participate in Title IV federal financial aid programs. IPEDS collects data within a variety of categories, including institutional characteristics, completions, enrollment, student financial aid, graduation rates, finance, and human resources. While it may have been an appropriate tool for capturing a snapshot of student populations in the past, it is essential that as the Committee considers reauthorization of the Higher Education Act, IPEDS is brought into the twenty-first century.

Students in 1966 experienced higher education very differently than students in 2014. Rather than applying for a specific program, staying in it until completion and subsequently becoming employed in that specific field, a typical student today may start out at a community college, change his or her major two or three times, transfer institutions, take a leave of absence and return to school, or even reduce at some point to half-time status. The availability of online courses from many different institutions contributes to the swirl through an academic system versus a linear path at one institution. Market forces and employment trends push students into different majors and schools, and family circumstances or simply the high cost of a college degree can also impact how, where and why a student attends college. We need to study students, not just institutions. Further, we need to recognize that each institution has a different mission, and meets the needs of its students in a unique way.

Specific problems with IPEDS are well-documented and widely agreed upon. For example, as noted in a 2010 GAO report, “IPEDS graduation rates only measure the outcomes for first-time, full-time, degree/certificate seeking students, which comprise 49 percent of entering students nationwide according to IPEDS data. Students who attend part-time or transfer in are not counted toward a school’s graduation rate. All nongraduates are treated as dropouts, even if they go on to graduate from another institution.”[1] We agree that this is a problem and symptomatic of an outdated system in need of refreshing.

A more longitudinal method that provides a comprehensive, 360-degree view of a student moving through and out of a system can make the same data much more valuable and worth the time and financial resources institutions of higher education invest in complying with IPEDS reporting rules. While a longitudinal view of a student’s progression through higher education is a laudable goal, the endpoint should be inclusive of all learning, K-through-adult continuing education.

Contextualizing Data

It is important to note that the repeated calls to link data to outcomes should be tempered with the reality that defining meaningful outcomes for all interested parties is extremely difficult. Although institutions can influence and support student success, we must remember that the mission of Federal Student Financial Aid programs, as defined by the Department of Education, is to “make college education possible for every dedicated mind.”[2] Although the ultimate goal of every academic institution is to educate students, the way each institution approaches that mission can be almost as varied as the populations they serve. As a result, the outcomes for each institution may be very different, but no less effective.

For example, students often begin a program and acquire sufficient skills and credentials to be hired by a company without completing the entirety of the program or earning a diploma. In such instances, the Department of Education fulfilled its mission by providing education to a dedicated student, and the student achieved his or her goal by gaining the skills necessary for employment in the field of his or her choice. We can all agree that these are not “failure” situations, and we must not allow extremely strict or “one size fits all” definitions of student success.


The reauthorization of the Higher Education Act presents an opportunity for Congress to amend antiquated policies and complex and duplicative reporting requirements that are costly and burdensome to institutions of higher education. In addition, improving data reporting rules can help companies responsible for storing and processing student data, like Oracle, to create products and applications that can help students, faculty and employees of higher education institutions get the most out of their time and efforts.

Creating Additional Resources for Student Services

Our Education & Research Industry Strategy Council members have shared concerns with us regarding the countless hours and resources they must commit to ensuring they are in compliance with federal regulations.

Simplification of data collection requirements across the board would allow institutions to better fulfill their individual missions by utilizing their resources to focus on serving the students themselves. Regardless of type, every single department within an institution of higher education wants its staff to spend less time behind computer screens collecting, analyzing and reporting on data, and more time enriching and improving the experiences of their student population. Academic Advising Departments could spend more time keeping students on the quickest and most efficient pathways to completion, and provide students with additional resources on their career choices or professions. Researchers could spend more time producing research, and less time jumping through administrative hoops.

One Council member recently reported that its Financial Aid Office spends approximately 85% of its resources tracking, monitoring or reporting in order to follow federal regulations. This is clearly a tremendous burden and expense for institutions, which must abide by a number of separate programs, each of which comes with its own rules and regulations that do not coincide with one another. Further, such systems must be set up and maintained separately, but must work together, as each influences eligibility for the others. For instance, each institution has its own refund policies for students who do not complete their enrollment in a term. Financial Aid must independently perform intricate calculations using specific guidelines to determine if funds must be returned to various federal programs.

It is our firm belief that it is possible to simplify reporting requirements and enrich student experiences while at the same time collecting and synthesizing information necessary to ensure the integrity and quality of our nation’s institutions of higher education.

Increased Efficiency through Simplification

Streamlining data collection requirements could also allow companies like Oracle to improve the products we offer to institutions of higher education. If the systems we create, operate and maintain are less complex to develop, we can increase our speed of delivery and deployment to our higher education clients, enabling them to operate more efficiently.

With additional time and resources, Oracle could shift emphasis towards more modern technologies such as predictive analytics, device-aware access (mobility), and embedded social capabilities, which would lead to increased utilization and collaboration among our higher education constituents. As mentioned previously, common data definitions would allow data to move between systems more easily and efficiently. Harmonization of those definitions could also help us make our interfaces more user-friendly and easier to draw conclusions from, ultimately resulting in greater adoption among students, faculty and staff.

Finally, data standardization would be an enabler for common business processes across institutions, which could lead to increases in shared services between and among higher education institutions. Significant efficiency and effectiveness gains could be realized if there were a greater reliance on shared infrastructure (public and/or private cloud technologies). For example, community college systems within states could share a common instance of human resources (human capital management), financial, and student administration systems, among others.

The goal shared among servicers, institutions, and Congress throughout the Higher Education Act reauthorization process is, above all, to improve opportunities for students to gain a quality education. It is our belief that the challenges and suggestions outlined above could help that goal become a reality.


Cole Clark
Global Vice President, Higher Education and Research Industry Business Unit, Oracle
Chair, Higher Education and Research, Industry Strategy Council

[1] GAO Report to Congress – “Institutions’ Reported Data Collection Burden is Higher Than Estimated but Can be Reduced through Increased Coordination” (2010) p22

[2] “Who We Are” United States Department of Education Federal Student Financial Aid website. <>

Thursday Jun 26, 2014

Summer in DC

I just wrapped up a week in DC for our Education & Research Industry Strategy Council (ISC) - the seventh meeting over which I've presided since stepping into my current role.

It's exceptionally gratifying to see how much we've progressed in three short years.  We now have a fairly regular dialog with policy officials in Washington and a robust agenda touching on a variety of issues in focus for higher education executives, all tied together by a technology underpinning.   We had exceptional turnout of the members as well, including new participation from Vanderbilt, Illinois State, Seneca, McMaster, Chicago, and Valdosta State.

The agenda themes for this session included Cybersecurity in Higher Education, Information Discovery, Student Success, and Higher Education Cloud.  Two days was not enough time!  While we did spend a considerable portion of the time discussing and deliberating, I do think we need more time to tee up issues and have more open discussion rather than presentations.  It's a hard balance to strike, given that the mission of the ISC is multifaceted (exposing the ISC to new ideas and technologies, getting input on our strategy in education and research, providing access to Oracle executives, and facilitating dialog with policy officials), but the real value comes from the interactions and we need to have more of that throughout the time we are together.

I was most impressed by the amount of interest we had from the members of Congress who spent time with the council.  We had three Senators (Isakson from GA, Murphy from CT and Casey from PA), and two members of the House (Foxx from NC and Petri from WI).  Further, Undersecretary of Education Jamie Studley joined us for a long conversation about the proposed higher ed rating system and the implications for data and information in driving those rankings.

The real in-depth discussions, however, were reserved for our Higher Ed Cloud session.  It's clear to me that while the broader industry is moving to the cloud aggressively, higher ed is taking a more deliberate approach, and we need to provide guidance and leverage some of the lessons learned and best practices from other industries that have already made this journey.  There is a real opportunity here for higher ed to become more agile and nimble in order to adapt more rapidly to the dynamics in higher education, but it is equally possible that institutions could rush headlong into cloud for cloud's sake without a plan and create more issues than already exist in higher ed IT today.

Overall I was very pleased with the outcome but the real test will be in the feedback we receive from the approx. 30 member institutions.  I am already looking forward to December when we reconvene in Redwood Shores!

Thursday Oct 17, 2013

College Ratings via the Federal Government

A few weeks back you might remember news about a higher education rating system proposal from the Obama administration. As I've discussed previously, political and stakeholder pressures to improve outcomes and increase transparency are stronger than ever before. The executive branch proposal is intended to make progress in this area. Quoting from the proposal itself, "The ratings will be based upon such measures as: Access, such as percentage of students receiving Pell grants; Affordability, such as average tuition, scholarships, and loan debt; and Outcomes, such as graduation and transfer rates, graduate earnings, and advanced degrees of college graduates.”

This is going to be quite complex, to say the least. Most notably, higher ed is not monolithic. From community and other 2-year colleges, to small private 4-year, to professional schools, to large public research institutions…the many walks of higher ed life are, well, many. Designing a ratings system that doesn't wind up with lots of unintended consequences and collateral damage will be difficult. At best you would end up potentially tarnishing the reputation of certain institutions that were actually performing well against the metrics and outcome measures that make sense in their "context" of education. At worst you could spend a lot of time and resources designing a system that would lose credibility with its "customers".

A lot of institutions I work with already have in place systems like the one described above. They are tracking completion rates, completion timeframes, transfers to other institutions, job placement, and salary information. As I talk to these institutions there are several constants worth noting:

• Deciding on which metrics to measure is complicated. While employment and salary data are relatively easy to track, qualitative measures are more difficult. How do you quantify the benefit to someone who studies in a field that may not compensate him or her as well as another but that provides huge personal fulfillment and reward?

• The data is available but the systems to transform the data into actual information that can be used in meaningful ways are not. Too often in higher ed information is siloed. As such, much of the data that need to be a part of a comprehensive system sit in multiple organizations, oftentimes outside the reach of core IT.

• Politics and culture are big barriers. One of the areas that my team and I spend a lot of time talking about with higher ed institutions all over the world is the imperative to optimize for student success. This, like the tracking of the students’ achievement after graduation, requires a level of organizational capacity that does not currently exist. The primary barrier is the culture of "data islands" in higher ed, and the need for leadership to drive out the divisions between departments, schools, colleges, etc. and institute academy-wide analytics and data stewardship initiatives that will enable student success.

• Data quality is a very big issue. So many disparate systems exist (some on-premises, some "in the cloud") that keep data about "persons" using different means to identify them. Establishing a single source of truth about an individual and his or her data is difficult without some type of data quality policy and tools. Good tools actually exist but are seldom leveraged.
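One common approach those tools take can be sketched roughly as follows. This is a simplified illustration, not any particular product's algorithm: records from disparate systems are matched on a composite key built from normalized attributes (here, just email and date of birth) rather than on free-text names, and merged into one "golden" record per person.

```python
def match_key(record):
    """Composite key from normalized email plus date of birth.

    Real data-quality tools use probabilistic matching over many
    attributes; this exact-match key is only a simplified illustration.
    """
    return (record["email"].strip().lower(), record["dob"])

def resolve(systems):
    """Merge person records from several systems into one golden record each."""
    golden = {}
    for system_name, records in systems.items():
        for rec in records:
            entry = golden.setdefault(match_key(rec), {"sources": []})
            entry["sources"].append(system_name)  # keep lineage of the merge
            entry.update(rec)                     # later systems refine fields
    return golden

# Two systems hold the same person under slightly different names and formatting.
systems = {
    "registrar": [{"name": "John B. Smith", "email": "JSmith@example.edu", "dob": "1995-06-01"}],
    "financial_aid": [{"name": "John Smith", "email": " jsmith@example.edu ", "dob": "1995-06-01"}],
}
golden = resolve(systems)  # one golden record, traceable to both sources
```

The lineage list matters as much as the merge itself: a single source of truth is only trustworthy if you can see which systems contributed to it.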

Don't misunderstand - I think it's a great idea to drive additional transparency and accountability into the system of higher education. And not just at home, but globally. Students and parents need access to key data to make informed, responsible choices. The tools exist to not only enable this kind of information to be shared but to capture the very metrics stakeholders care most about and in a way that makes sense in the context of a given institution's "place" in the overall higher ed panoply.


Comments, news, updates and perspectives from Oracle's global vice president of the education and research industry--which includes higher education, research, and primary/secondary education (K-12) organizations worldwide.

