Monday Feb 03, 2014

Where Next for Big Data? A look at the ways in which big data analysis might shape the future

If we look at the core elements of big data – volume, variety and velocity – the future looks to be headed in one direction only: more volume, greater variety and increased velocity as more devices come online, more transactions are captured, more personal data provided and more organisations learn to capitalize upon the data created within their business ecosystem. New breakthroughs in technology and the adoption of technologies complementary to big data will drive these increases, not least the ‘internet of things’.

The rise of the machines

There is a lot of talk about the ‘internet of things’ – the idea that one day most things will be connected to the internet: from fridges that create replenishment information and talk directly to our online shopping accounts and supermarket loyalty schemes, to the anticipated growth in wearable technology such as smart watches and smart glasses that relay information on location and behaviors. All of this will create valuable data.

The increasingly powerful smartphones in our pockets will have the power to change the world around us: from the offers we see advertised in supermarkets – as our past purchasing history and other behaviors stored on the phone, such as the movies we watch, the apps we use and the places we visit, create a near faultless picture of us as consumers – to the ways in which our banks tailor offers specifically to our lifestyles.

Our cars will transmit ever more information, creating better deals on insurance and reducing instances of expensive repairs, removing cost from maintaining a car. Our home entertainment systems will intuitively learn more about the content we want and when we want it – what we are prepared to pay for now, and what we are willing to wait to get for free.

Our world, only better

The world will become tailored towards our wants and needs. Some changes will be imperceptible to the naked eye or the rational mind but much will be driven and governed by big data. It might just feel that things work a little better or more efficiently but behind the scenes the analysis of big data will be working harder than ever to shape all the moving parts of our physical and experiential environment.

Take transport. Anecdotally many Londoners claimed during the 2012 Olympics that public transport, a much feared weak link in London’s Olympic offering, was faultless for the duration of the games. This is not because issues did not arise but because every scenario was catered for and solutions existed ready to be deployed at the moment of need.

That was thanks to many years of planning, but with the analysis of big data – from passenger flow into transport hubs, via pedestrian routes and terminus points such as major airports, to likely weather conditions and related disruptions, to real-time, location-based monitoring of replacement bus services and traffic conditions on alternative routes – that level of service becomes replicable on a consistent, daily basis.

Even unforeseeable incidents, and the indicators that they might be about to occur – such as data from sensors along water pipes forewarning of a burst water main that could close a road – can be modelled into scenario planning, or allow a fix to be applied before the issue occurs. This will require investment, but as metropolitan areas around the world compete for inward investment in a global economy, it would be a mistake to overlook the long-term benefits.

Big data means safer communities

Law enforcement is one area where major change can happen, and we are already seeing the seeds of unprecedented transformation being sown. Big data analysis can play an important role in identifying trends which allow police forces to better anticipate when and where crimes may be committed.

It is possible to model crimes and predict their outcomes and repercussions, and to identify which crimes may breed other crimes in the neighbourhood or within specific groups in society. Big data can help predict which crimes become part of an unfolding spree and which are most likely to be isolated incidents. This will enable police forces to plan resources and ensure units are in the right place at the right time.

Structured, relational data may tell us that burglaries tend to happen more during public holidays, when many houses are empty as people stay with friends and family, and that burglaries invariably happen during the night. Relational data may also tell us that past victims of burglary are more likely to be victims again. But there are layers upon layers of non-relational data which can be factored into predicting when and where crimes are going to happen – which is obviously preferable to simply developing a better understanding of where and when they have already happened. Similarly, if an incident can be isolated and prevented from developing into a crime spree, that too is a marked improvement.

Big Data and Privacy

Of course, it is impossible to have a discussion of big data without discussing privacy. It is every individual’s right to withhold personal information: we can elect to switch off location-based services on our phones, politely decline the offer of a customer loyalty card from our supermarket, and choose not to use car insurance based on in-car telematics. But at the heart of this is a point of cultural tension.

People will resist the gifting of data to businesses and organisations unless it is a mutually beneficial transaction. Organisations need to help consumers see the benefits in order to enlist them in a willing development of truly powerful big data-based businesses.

There is undoubtedly gold to be found among big data but it must line the pockets of consumers and businesses alike. We must get better banking products, an improved retail experience, better home entertainment options, an improved commute, cheaper insurance, a better seat on the plane and a better glass of wine. We need to all feel that our lives are about to get a lot better. And if organisations can help us to feel that, there is no limit to what big data can do.

Friday Jan 31, 2014

Modernizing Business Intelligence

Business intelligence is no longer a luxury; it’s a crucial component of business today. But what’s changed so much that we need to modernize now? Nigel Youell, Senior Product Marketing Director for Oracle Performance Management Applications, interviewed Barry Mostert, a Director in the Global Business Analytics Product Group for Oracle, about Business Intelligence modernization, what it’s all about, and why it needs to be addressed now.

Nigel got right to the heart of the interview by asking Barry why there was even a need to modernize BI, and what exactly that means. Barry told our listeners that Business Intelligence is continually evolving and, like any technology, is driven by current industry trends. Organizations are now looking to analyze information sourced across vast stores of distributed and varied data types – commonly called Big Data – to extract the business insight they need to stay competitive and keep costs low. Mobile analytics (consuming analytics on personal mobile smart devices) is gaining more and more momentum as our culture begins to see these devices as part of everyday life.

Barry told our listeners that a recent study found that about 80% of business users are already using their smart devices for work. The dropping price of random access memory (RAM) chips has made in-memory analytics widely practical. There is also a strong push for user self-service – what Gartner calls the “consumerization of IT” – which means users can’t wait for legacy tools and traditional processes to deliver their information. Users need to be empowered to take action for themselves. This includes the use of predictive analytics (using historic information to help foretell the future) to enable business people to make the best decisions possible. Just like in life, it’s a case of learning from your past to do better in your future.

Every business is already affected, or soon will be, by these trends – hence the need to move to a modern analytic platform capable of addressing these new data types and analytic methods. Indeed, most older BI platforms were built and implemented before these trends even existed.

Nigel recapped Barry’s full explanation for the listeners this way. The benefits of modernizing your BI system are:

High performance analytics with no limits – more granular data for more accurate decisions without sacrificing performance

Robust and reliable backend architecture – to ensure availability on what is a critical business system

Employees who are empowered to quickly discover insight and then take action – with less reliance on IT

Business activity that matches the corporate strategy – maintaining organizational focus and direction

Improved user productivity and efficiency – enabling business users to work smarter

Nigel and Barry discussed the fact that modernizing a platform appears to be a big undertaking. Nigel asked Barry to break it down for the listeners and outline the main components of a modern BI platform.


Barry explained that it can be broken down into five pillars of value – potential gains where users stand to get significant and tangible benefits:

Mobility. A hot new analytic trend affecting users at all levels in the modern enterprise. Oracle has made significant investments in this area.

Advanced User Experience. How users interact with the analytic platform. New analytic techniques, not previously possible.

The Connected Enterprise. Connecting business strategy with operational execution. Better navigate the business using all relevant data available and modeling the future to reduce risk.

Engineered Systems. The promise of hardware and software working better together: blindingly fast performance with in-memory analytics, and scalability without the bottlenecks.

Ironclad Architecture. A reliable, robust backend to support business-critical systems while reducing Total Cost of Ownership. Components are completely integrated, allowing administrators to do more in less time.

As always, it’s the bottom line that will excite executives to pursue the move to a modern BI system. Barry explained to our listeners that it’s all about being able to do more with less. Empowering business users to satisfy their own analytic needs will lead to a more agile business. At the same time, this ability will free up IT resources. By optimizing and simplifying your business analytic platform, not only do you gain a better Total Cost of Ownership, but you can begin to innovate. Innovation leads to doing business in new and better ways than your competition; being more competitive in this economy ultimately leads to a more successful business; and successful businesses are generally better off financially.

To listen to the entire podcast, click here

To learn more about Business Intelligence at Oracle, click here

Friday Dec 20, 2013

Year in Review – Oracle Business Analytics in 2013

2013 was a busy year for Oracle Business Analytics, and as it comes to an end we wanted to take a moment to thank all of our customers and partners for another great year together. At Oracle, we enjoy a good year-end recap, so here is a look at Oracle Business Analytics’ top 10 moments of 2013 (in no particular order). Relax and take a stroll down memory lane with us!

10. The Release of Oracle Endeca Information Discovery 3.1 – The latest release of Oracle Endeca Information Discovery 3.1 incorporates new enterprise self-service discovery capabilities for business users, allowing them to easily make information-based business decisions with greater success, safety and confidence. Learn more about Oracle Endeca

9. Big Data at Work Webcast Series – Five webcasts and thousands of attendees, with featured guest speakers from Dell, Passoker, Cloudera and Delphi, as well as MIT’s Andrew McAfee. View on-demand

8. Oracle Exalytics T5-8 Scales Up to Deliver Analytic Insights to Customers – Oracle Exalytics In-Memory Machine T5-8, the new engineered system with 4TB of memory per machine, delivers extreme performance for business intelligence (BI) and enterprise performance management (EPM) applications, helping organizations drive better efficiency by speeding answers to complex business scenarios. Learn more about Oracle Exalytics

7. Mark Hurd - Oracle OpenWorld 2013 Keynote – Oracle President, Mark Hurd, gives his keynote on transforming businesses with Big Data and Analytics at Oracle OpenWorld 2013. Watch the Video

6. Oracle Exalytics Strong Customer Adoption – Sodexo, SoftBank, Cablemas, WorleyParsons, Santos, Zagrebacka banka, Cablevisón, Avago Technologies, United Supermarkets, Immonet GmbH, Nilson Group AB, Siemens Healthcare, Pinellas County, Ministero del Lavoro were among the many organizations recognized at Oracle OpenWorld 2013 for leveraging Oracle Exalytics to deliver extreme performance for their mission-critical BI and enterprise performance management (EPM) applications. Learn more about Oracle Exalytics

5. The Release of Oracle BI Mobile App Designer – A new design tool with which business users can easily create stunning and interactive analytical applications for use on any major mobile device. With this release, Oracle adds major innovations to Oracle Business Intelligence, extending the capabilities of the Oracle BI Mobile solution, and reinforcing Oracle’s commitment to empowering organizations to stay connected to their businesses with real-time insights while on the go. Learn more about Oracle BI Mobile App Designer

4. Oracle Exalytics X3-4 Powers Real-Time Analytical Insights – The new system features significant software enhancements and hardware updates, dramatically expanding the capabilities of the industry’s first high-speed engineered system for business analytics. Learn more about Oracle Exalytics

3. The Release of Oracle Business Intelligence Applications 11.1.1.7.1 – Completely redesigned to increase implementation productivity, the new release incorporates significant enhancements across the entire BI Applications product line and introduces new in-memory analytic applications. Learn more about Oracle BI Applications

2. The Oracle Business Intelligence Foundation Suite Release 11.1.1.7 – Delivers significant enhancements to usability, mobility, user experience and Big Data integration, enabling organizations to analyze critical information and get the intelligence they need to optimize their business. Learn more about Oracle BI Foundation Suite

1. Oracle Positioned in Leaders Quadrant for BI and Analytics Platforms by Gartner – Oracle has been named a Leader in Business Intelligence for the seventh consecutive year. Read the report. Plus, Oracle customer Land O Lakes wins the Gartner BI and Analytics Excellence Award for its innovative use of Oracle Endeca Information Discovery. Read the story

Historic Moment – Oracle Team USA puts Big Data and Analytics to work and fuels the most dramatic victory in the history of the America's Cup. Watch the Video

Tuesday Apr 09, 2013

Big Data Analytics - Advanced Analytics in Oracle Database

That's the title of a new white paper we've just posted. From the executive summary:

Big data doesn’t only bring new data types and storage mechanisms, but new types of analysis as well. In the following pages we discuss the various ways to analyze big data to find patterns and relationships, make informed predictions, deliver actionable intelligence, and gain business insight from this steady influx of information. 

You can check it out here.

Tuesday Mar 05, 2013

Reducing Hadoop TCO

I've been to a number of big data trade shows over the last year, and without fail I have the same conversation with many different people. It goes something like this.

 We discuss Oracle's Big Data Platform and I mention the Big Data Appliance (BDA). "Oh, yes" they say. "That's a great looking machine, but we can build a Hadoop cluster much cheaper than that, so we're not interested." 

 The first thing I do is ask them what kind of cluster they are building. They always say something like "I can get 40 $5K servers in a rack for $200K".

"But that's not an equivalent cluster," I will say. The most important number in Hadoop clusters is the amount of storage. When was the last time you heard somebody talk about a 400 core Hadoop cluster? They always say how many terabytes (or even petabytes) their cluster can store. Those smaller servers often only have a few TB of storage, compared with 36TB on each BDA node. So we quickly establish that their equivalent cluster is no such thing. Often it would actually take 2 or 3 such racks to match the capacity of the Big Data Appliance and their "equivalent" system is much more expensive than they thought.

But it's not just about buying servers. When you buy an engineered system you're also getting the rack, the cables, the switches, pre-installed software, tuning, optimization, integrated support and so on. Add those into the picture, and the Big Data Appliance is much lower cost.  Take a look at this ESG white paper that goes through all the numbers in detail. Here's the key segment from the executive summary:

"Based on ESG's modeling of a medium-sized Hadoop-oriented big data project, the preconfigured Oracle Big Data Appliance is 39% less costly than a “build” equivalent do-it-yourself infrastructure. And using Oracle Big Data Appliance will cut the project length by about one-third."

If you're building a Hadoop cluster, or looking to expand an existing one, you should keep Oracle Big Data Appliance on your shortlist and give it a closer look. 

Thursday Jan 31, 2013

Using In-Database Analytics to Predict Fraud

Your data warehouse stores critical data telling you what is happening in your business and, sometimes, why it’s happening. But you can go beyond understanding why something went wrong: you can use past data to predict the future, correcting problems before they happen. In a recent survey that Oracle did of over 300 C-level executives, 93% of them thought that their companies were losing an average of 14% of their total revenue because they couldn’t fully leverage the information they had already collected. One key way to do this (and you’ll hear more about this in a future survey) is to use predictive analytics. Let’s take a quick look at why and how.

Turkcell is a leading mobile phone provider in Turkey, with over 34 million subscribers. And, as with most mobile providers, a majority of those subscribers use pre-paid accounts and pre-paid cards. Money launderers take advantage of this, and losses for this business are of the order of $5 for every $10,000. This may not seem like much, but with billions of transactions it adds up to millions of dollars a year.
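
To see how a loss rate of $5 per $10,000 turns into millions, here is a quick, hedged calculation: only the loss rate comes from the figures above, while the transaction count and average top-up value are invented for illustration.

# Rough scale check on the fraud-loss figure above (assumed volumes, quoted ratio).
loss_rate = 5 / 10_000                    # from the post: $5 lost per $10,000
transactions_per_year = 2_000_000_000     # assumed: "billions of transactions"
avg_transaction_value = 10.0              # assumed average pre-paid top-up (USD)

annual_volume = transactions_per_year * avg_transaction_value   # $20 billion
annual_loss = annual_volume * loss_rate                         # $10 million
print(f"Estimated annual fraud loss: ${annual_loss:,.0f}")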

Like other companies, Turkcell examines huge quantities of data and builds models that help it identify and, ultimately, predict and prevent fraudulent transactions. Unlike many other companies, Turkcell does this analysis in its data warehouse. With 100 TB of compressed data – representing over a petabyte uncompressed – it would take a long time to move that data out of the warehouse and keep it up to date as new data arrived. And the window to stop the next fraudulent transaction might have already closed.

Oracle Advanced Analytics enables you to perform sophisticated predictive analytics inside a data warehouse. You can mine your data directly while it is inside the Oracle Database using either the SQL or R language APIs or the Oracle Data Miner SQL Developer “work flow” GUI extension, depending on your needs and existing skills. You build models of past behavior and use them to predict future behavior, improving your accuracy over time. And best of all, there’s no need to move the data around, which takes time you might not have and also leaves you exposed to security risks. As Turkcell said, “...we can analyze large volumes of customer data and call-data records easier and faster than with any other tool.”
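
The post doesn't include code, but the modeling pattern it describes (build a model on labeled past behavior, then score new activity) can be sketched generically. The snippet below uses scikit-learn purely as a stand-in and is not the Oracle Advanced Analytics API; the column names and data are invented. With Oracle Advanced Analytics, the same pattern would run inside the database through its SQL or R interfaces.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Hypothetical labeled history: each row is a past transaction with a known
# fraud / not-fraud outcome. Column names are invented for illustration.
history = pd.DataFrame({
    "amount":      [10, 500, 20, 15, 900, 25],
    "top_ups_24h": [1, 30, 2, 1, 45, 2],
    "new_account": [0, 1, 0, 0, 1, 0],
    "is_fraud":    [0, 1, 0, 0, 1, 0],
})

features = ["amount", "top_ups_24h", "new_account"]
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(history[features], history["is_fraud"])   # learn from past behavior

# Score a new incoming transaction; a high probability triggers intervention
# before the money moves, which is why keeping the data in place matters.
incoming = pd.DataFrame({"amount": [800], "top_ups_24h": [40], "new_account": [1]})
fraud_probability = model.predict_proba(incoming[features])[0, 1]
print(f"Fraud probability: {fraud_probability:.2f}")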

Wednesday May 02, 2012

Analytics And Agility – Why It Is So Important Today

There is no question that many IT professionals are feeling the pressure to rapidly deliver analytic solutions that respond to the continuous demands of the business user. Primary research shows that analytics in the context of Big Data continues to be top of mind for executive teams at global enterprises. Executives clearly understand the value of Big Data and analytics, and many are vocal proponents of the value it can bring to bear on the business. What executive is not interested in understanding sales trends, KPIs, social sentiment about the business and critical metrics, even predictive views of the business?

All these requirements, and many more, have created a tremendous backlog of analytic application requests. This backlog grows significantly as IT professionals succeed in delivering highly tuned internal analytic solutions that quickly deliver value. Everyone begins to say, “I want one of those!”

We are no longer operating in the past, when these analytic projects took years to complete and a “boil the ocean” approach was the norm. Today, we see a changing landscape where agility is what matters most when it comes to delivering rapid returns on key Big Data and analytic investments. In fact, just about every IT professional is pursuing the Agile model for development, in which the drawn-out product development efforts of the past are streamlined using methodologies that favor delivering high-value features in short bursts of time. This is clearly the emerging method for rapidly delivering analytic solutions to the end user, with value that is immediately evident.

Agility in analytics doesn’t mean the product has to be a desktop business intelligence solution delivered in silos. In fact, Agile analytics requires that IT and business users work collaboratively and quickly, ensuring strong IT governance while also providing powerful analytic solutions to the end user, rather than having the user take on the whole effort.

New Information Discovery solutions such as Oracle Endeca Information Discovery have clearly embraced the notion of delivering high-value, agile analytic applications – some of them situational analytic apps – in short bursts of time. We have seen this play out frequently, with POCs (proofs of concept) going into production in a very short period of time because of the compelling value delivered in the POC.

Agility also means developing in short product iterations: this enables one to deliver an analytic application, get immediate feedback from end users and rapidly iterate to deliver new or expanded requirements that add further value. We are talking weeks, not months. Some projects have gone from 8-week iterations down to 2-week iterations. Because Agile analytic solutions are typically delivered through a browser, in a private or public cloud, rather than as a siloed desktop-only tool, new enhancements are readily available for test and validation by the end user. This capability, combined with agility, gives the analytic application developer a chance to quickly show that they are meeting the key requirements of the end user.

With increased agility and business user success, we will eventually turn the corner and agility will be the new norm in analytics.

New Oracle Endeca Information Discovery YouTube Channel

The Oracle Endeca Information Discovery Product Management team has been busy building a new YouTube channel to showcase the capabilities of the Endeca Information Discovery product. The team has started to release a new screencast series, "Getting Started With Endeca Information Discovery," highlighting the strong capabilities of the product. It will give you a sense of what the business-user experience is like and show you how innovative this solution is for building highly interactive, search-driven analytic applications on a variety of data – structured, multi-structured and unstructured – especially Big Data.

We encourage you to check it out at http://www.youtube.com/user/OracleEID/


Tuesday Feb 07, 2012

Big Data Analytics – The Journey from Transactions to Interactions

Big Data Defined

Enterprise systems have long been designed around capturing, managing and analyzing business transactions – marketing, sales, support activities and so on. Lately, however, with the evolution of automation and Web 2.0 technologies such as blogs, status updates and tweets, there has been explosive growth in machine- and consumer-generated data. Defined as “Big Data”, this data is characterized by attributes like volume, variety, velocity and complexity, and essentially represents machine and consumer interactions.

Case for Big Data Analysis

Machine and consumer interaction data is forward-looking in nature. This data, available from sensors, web logs, chats, status updates, tweets and the like, is a leading indicator of system and consumer behavior. It is therefore the best indicator of the consumer’s decision process, intent and sentiment, and of system performance. Transactions, on the other hand, are lagging indicators of system or consumer behavior. By definition, leading indicators are more speculative and less reliable than lagging indicators; however, to predict the future with any confidence, a combination of both is required. That is where the value of big data analysis comes in: by combining system and consumer interactions with transactions, organizations can better predict the consumer decision process, intent and sentiment, and future system performance, leading to revenue growth, lower costs, better profitability and better-designed systems.

So, which business areas will benefit from big data analysis? Think of areas where decision-making under uncertainty is required. Areas like new product introduction, risk assessment, fraud detection, advertising and promotional campaigns, demand forecasting, inventory management and capital investments will particularly benefit from having a better read on the future.

Big Data Analytics Lifecycle

The big data analytics lifecycle consists of three steps: acquire, organize and analyze. Big data – machine and consumer interaction data from sources such as sensors, web logs, status updates and tweets – is characterized by volume, velocity and variety. The process starts with data acquisition. The structure and content of big data can’t be known up front and are subject to change in flight, so acquisition systems have to be designed for flexibility and variability: rather than predefined data structures, dynamic structures are the norm. The organization step entails moving the data into well-defined structures so that relationships can be established and data across sources can be combined to form a complete picture. Finally, the analysis step completes the lifecycle by providing rich business insight for revenue growth, lower costs and better profitability. With flexibility the norm, the analysis systems should be discovery-oriented and explorative rather than prescriptive.
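
As a minimal illustration of that acquire-organize-analyze flow, the sketch below ingests a few schema-flexible interaction events, organizes them into a defined structure joined with transactional data, and runs a simple exploratory analysis. The event records, column names and join are invented for illustration, and pandas stands in for whatever acquisition and analysis tooling is actually used.

import json
import pandas as pd

# 1. Acquire: semi-structured interaction events whose fields vary per record.
raw_events = [
    '{"user": "u1", "type": "tweet",  "sentiment": 0.8}',
    '{"user": "u2", "type": "weblog", "page": "/pricing"}',
    '{"user": "u1", "type": "weblog", "page": "/cancel"}',
]
events = pd.DataFrame([json.loads(line) for line in raw_events])  # dynamic columns

# 2. Organize: move the data into a well-defined structure and relate it to
#    existing transactional data so the two sources can be combined.
transactions = pd.DataFrame({"user": ["u1", "u2"], "monthly_spend": [40.0, 25.0]})
combined = events.merge(transactions, on="user", how="left")

# 3. Analyze: explore the combined picture, e.g. average spend by interaction type.
print(combined.groupby("type")["monthly_spend"].mean())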

Getting Started

Oracle offers the broadest and most integrated portfolio of products to help you acquire and organize these diverse data sources and analyze them alongside your existing data to find new insights and capitalize on hidden relationships. Learn how Oracle helps you acquire, organize, and analyze your big data by clicking here.

About

We're taking the pulse of the Business Intelligence and Analytics market based on our insights and our experiences with colleagues, customers, and partners.
