by Aaron Lazenby
Big data has grown beyond buzz. According to PwC’s 18th Annual Global CEO Survey, 80 percent of CEOs now consider data mining and analysis to be strategically important.
“It’s no longer a question of, ‘Do we need big data?’” says Jeff Pollock, vice president of product management at Oracle. “Our customers start with the assumption that there is value in big data, and that they need to make it a part of their business.”
Pollock should know: his job at Oracle brings him close to the IT and business decision-makers looking to expand existing big data experiments to scale for the enterprise. Here, he talks to Profit about how he sees big data changing, and the steps every executive should take to launch smoother, more successful projects—including how to avoid common stumbling blocks when it comes to choosing the right technology.
Profit: What are the biggest challenges you see customers facing when launching big data projects?
Pollock: One of the challenges I see quite a bit is cultural in nature. A lot of big data programs were stood up during early R&D phases, and there’s this idea out there that anything open source is great, and anything vendor-provided is bad. Open source can be appealing from an ethos standpoint for some IT folks, and it also tends to be very appealing to an R&D group in a large organization, because they get to tinker and do a lot of the assembly themselves.
But the whole point of big data is being able to use the data you already have more effectively. So one of the first things business leaders try to tackle when making a big data solution work is bringing in data from existing IT systems. And it turns out the open source tools and technologies for this are limited, because they focus on the data management platform, not on data integration.
What that creates is a need for enterprise-class tools that normal businesspeople can use to be productive. One of the reasons Oracle has been so successful is our engineered systems strategy: systems that can be rolled off the truck, plugged into the data center, and working within a matter of hours. Our big data appliance lets customers bypass the whole process of installation, setup, configuration, and optimization, because the Hadoop infrastructure and integration technologies come preinstalled.
Profit: What is the secret to successful big data projects?
Pollock: The number one characteristic of successful big data proof-of-concept implementations is that the business has been involved early in order to define a concrete objective for the laboratory people to prove. They need to provide very directed guidance: What type of value do they want the R&D team to try to achieve with the big data environment? This could be working with telematics data from a manufacturing company, or it could be driving some type of fraud detection or advanced predictive analytics with financial data for a financial services customer. Or it might be providing a more effective 360-degree view of the customer from a retail perspective, from a marketing perspective, and so on.
Of course, the challenges don’t stop there: when you’re successful with that initial R&D effort, the larger initiative gets funded for rollout. Typically, the next step is to make it operational. All of a sudden, you don’t have a little sandbox in the R&D lab. Now you have to think about what a whole ecosystem looks like, with a development environment, multiple test environments, and production environments operating 24/7/365.
Profit: Big data isn’t a new phrase anymore. How is it evolving—and changing organizations?
Pollock: A couple of years ago, the dominant pattern was that IT played with the technology and looked for business value. Our customers now know the technology works as advertised, so the onus is on the business side to come up with high-value use cases to take to full production and fund at a significant level. And they need to start building large computing infrastructure to handle the data.
Also, I see a lot of our customers, particularly in financial services and insurance, establishing what’s called the chief data officer [CDO] role. One of the functions of that CDO role is to employ all the data people: data analysts, who are close to the business but are experts in data; data stewards, who are experts in data rules, policies, and governance; and data scientists, who are closer to being statisticians or mathematicians and who look for undiscovered patterns in the business data. Five years ago, the people thinking about data were scattered across many groups; now we’re seeing the institutionalization of data science as a role.
Photography by Shutterstock