Tuesday Jun 16, 2015

Big Data Spatial and Graph Analytics for Hadoop

We have just released Oracle Big Data Spatial and Graph, bringing spatial and graph analytics to Hadoop and NoSQL database technologies.  For over a decade, Oracle has offered leading spatial and graph analytic technology for the Oracle Database; we have now applied this expertise to working with social network data and to exploiting Big Data architectures.

Oracle Big Data Spatial and Graph includes two main components:

  1. A distributed property graph database with 35 built-in graph analytics to discover graph patterns in big data, such as communities and influencers within a social graph
  2. A wide range of spatial analysis functions and services to evaluate data based on how near or far things are to one another, or whether something falls within a boundary or region

Property Graph Data Management and Analysis

Property graphs are commonly used to model and analyze relationships, such as communities, influencers and recommendations, and other patterns found in social networks, cyber security, utilities and telecommunications, life sciences and clinical data, and knowledge networks.  

Property graphs model the real world as networks of linked data comprising vertices (entities), edges (relationships), and properties (attributes) on both. Property graphs are flexible and easy to evolve; metadata is stored as part of the graph, and new relationships are added simply by adding an edge.
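To make the model concrete, here is a minimal Java sketch, using hand-rolled classes rather than the product's own APIs, that builds a three-person social graph and picks the most-followed vertex as a crude "influencer"; all class names and property values are illustrative.

import java.util.*;

// Minimal, hypothetical property graph classes for illustration only --
// the real product exposes this model through its Java and Groovy APIs.
class Vertex {
    final long id;
    final Map<String, Object> properties = new HashMap<>();
    Vertex(long id) { this.id = id; }
}

class Edge {
    final Vertex from, to;
    final String label;                      // relationship type, e.g. "follows"
    Edge(Vertex from, Vertex to, String label) { this.from = from; this.to = to; this.label = label; }
}

public class TinyPropertyGraph {
    public static void main(String[] args) {
        List<Edge> edges = new ArrayList<>();

        // Vertices (entities) with properties (attributes)
        Vertex alice = new Vertex(1); alice.properties.put("name", "Alice");
        Vertex bob   = new Vertex(2); bob.properties.put("name", "Bob");
        Vertex carol = new Vertex(3); carol.properties.put("name", "Carol");

        // Edges (relationships) -- adding a new relationship is just adding an edge
        edges.add(new Edge(bob, alice, "follows"));
        edges.add(new Edge(carol, alice, "follows"));
        edges.add(new Edge(alice, carol, "follows"));

        // Crude "influencer" measure: highest in-degree
        Map<Vertex, Integer> inDegree = new HashMap<>();
        for (Edge e : edges) inDegree.merge(e.to, 1, Integer::sum);
        Vertex top = Collections.max(inDegree.entrySet(), Map.Entry.comparingByValue()).getKey();
        System.out.println("Most-followed vertex: " + top.properties.get("name"));
    }
}

In practice the graph would live in Apache HBase or Oracle NoSQL Database and be analysed with the built-in algorithms rather than hand-written loops; the point here is simply how entities, relationships and attributes map onto vertices, edges and properties.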

Oracle Big Data Spatial and Graph provides an industry-leading property graph capability on Apache HBase and Oracle NoSQL Database, including:

  • a Groovy-based console
  • parallel bulk load from common graph file formats
  • text indexing and search
  • querying of graphs in the database and in memory
  • ease of development with open source Java APIs and popular scripting languages
  • an in-memory, parallel, multi-user graph analytics engine with 35 standard graph analytics

Spatial Analysis and Services – Enrich and Categorize Your Big Data with Location

With the spatial capabilities, users can take any data that includes location information, enrich it, and use it to harmonize their data.  For example, Oracle Big Data Spatial can look at datasets such as Twitter feeds that include a zip code or street address, and add or update city, state, and country information. These results can be visualized on a map with the included HTML5-based web mapping tool.  Location can then be used as a universal key across the disparate data commonly found in Hadoop-based analytic solutions.
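As a deliberately simplified illustration of that enrichment step (not the Oracle Big Data Spatial API itself), the Java sketch below tags a record with a city when its coordinates fall inside a hypothetical bounding box; the region names and coordinates are rough, made-up values.

import java.util.*;

// Hypothetical, simplified location enrichment: point-in-rectangle only.
// The real product works with proper geometries, administrative hierarchies
// and Hadoop-based spatial services; names and coordinates here are illustrative.
public class LocationEnrichment {

    static class Region {
        final String city;
        final double minLon, minLat, maxLon, maxLat;
        Region(String city, double minLon, double minLat, double maxLon, double maxLat) {
            this.city = city; this.minLon = minLon; this.minLat = minLat;
            this.maxLon = maxLon; this.maxLat = maxLat;
        }
        boolean contains(double lon, double lat) {
            return lon >= minLon && lon <= maxLon && lat >= minLat && lat <= maxLat;
        }
    }

    public static void main(String[] args) {
        // Toy reference regions (bounding boxes are rough approximations)
        List<Region> regions = Arrays.asList(
            new Region("London", -0.6, 51.3, 0.3, 51.7),
            new Region("Paris",   2.2, 48.8, 2.5, 48.95));

        // A "tweet-like" record carrying only raw coordinates
        double lon = -0.12, lat = 51.5;

        String city = "unknown";
        for (Region r : regions) {
            if (r.contains(lon, lat)) { city = r.city; break; }
        }
        System.out.println("Enriched record: city=" + city);   // prints city=London
    }
}

Once each record carries a normalised city or region attribute, that attribute can act as the universal key described above for joining otherwise disparate Hadoop datasets.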

“Big Data systems are increasingly being used to process large volumes of data from a wide variety of sources. With the introduction of Oracle Big Data Spatial and Graph, Hadoop users will be able to enrich data based on location and use this to harmonize data for further correlation, categorization and analysis. For traditional geospatial workloads, it will provide value-added spatial processing and allow us to support customers with large vector and raster data sets on Hadoop systems.” - Steve Pierce, CEO, Think Huddle

Your Spatial & Graph specialist contact in EMEA is Hans Viehmann (hans.viehmann@oracle.com).

You can attend a live web-conference on Spatial & Graph on Tuesday, July 21st at 6:00 PM UK / 7:00 PM CET

Tuesday May 19, 2015

OBI 11g Release 11.1.1.9 Now Available

This new release of Oracle Business Intelligence, v11.1.1.9, includes a number of new features and a focus on overall quality improvement. Significant new features and enhancements include:

Expanded Data Source Support, including support for Cloudera Impala and Hive2.  Users can also access information directly from their Hyperion Planning applications to build and deliver analytical content with OBI. Integration with the Oracle 12c database has been enhanced with support for compression, Exadata Hybrid Columnar Compression (EHCC), and in-memory Oracle database features.
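As a side note on what Hive2 connectivity looks like under the covers, here is a minimal, generic Java check against a HiveServer2 endpoint using the standard Apache Hive JDBC driver; the host, port, database and credentials are placeholders, and OBI itself is of course configured through its own data source setup rather than through code.

import java.sql.*;

// Minimal HiveServer2 connectivity check using the standard Apache Hive JDBC driver.
// Host, port, database and credentials are placeholders -- adjust for your cluster.
public class Hive2ConnectionCheck {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String url = "jdbc:hive2://bigdatalite.example.com:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "hiveuser", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println("table: " + rs.getString(1));
            }
        }
    }
}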

Improved User Experience, including a new Tree Map visualization. Subject area search makes it easier to locate elements to add to an analysis. Users can now save and re-use custom column definitions.  More options are available to users when exporting content.  Customers can now take advantage of HTML5 for rendering charts.

New Capabilities for Exalytics, including support for count distinct aggregations.  Aggregations for levels with non-unique level keys, ragged and skip level hierarchies, and time levels with no chronological keys are also supported.  A new Summary Advisor command line utility is available to assist with scripting maintenance tasks.

New Capabilities for Administration, including Mobile App Designer now installed by default.  A database policy store is available for security administration.

BI Publisher Integration with WebCenter Content: This integration will enable customers to deliver BI Publisher output directly to WebCenter Content.

More information:

Monday May 18, 2015

Cloudera is Hadoop Market Leader

Have you connected with Cloudera yet? Are you one of their partners? This is a win-win for us all.

Join us for a live Oracle & Cloudera webcast on Friday, May 29 – 10:00 am UK / 11:00 am CET – Rapidly Unlock the Value of Big Data for your Organisation

Oracle has a close partnership with Cloudera: we resell their Hadoop distribution on our engineered platform, the Big Data Appliance. Naturally, we also target our big data software stack (e.g. Big Data Discovery, Big Data SQL, Connectors, ODI, ...) to run on Cloudera, as well as on Apache Hadoop and on distributions from some of the other vendors.

If you have not noticed yet, Hadoop is growing really fast, and Cloudera has the lion’s share of this market. For example, this analyst report says:

“According to Cloudera’s CEO, Tom Reilly, Cloudera’s strategic decision to provide proprietary solutions in an open-source market paid off when Cloudera earned more than $100 million in revenues in 2014, which is more than double the revenues of Hortonworks and MapR.”

You may also have noticed that Intel announced a substantial equity investment in Cloudera ($740 million): and Intel works closely with Oracle on our engineered platforms, including the Big Data Appliance.

Training your people in Hadoop is critical to our combined success in this market, and Cloudera leads the market in high-quality Hadoop courses; see also the Cloudera Product Webinars, tutorials and Training Webinars. Oracle will of course invest in training our partners on our specific product offerings for Big Data, but for a general understanding of the base platform I suggest you plug into Cloudera’s education programmes. They also have a certification programme, so you can Become a Cloudera Certified Big Data Professional.

If you need an introduction, please contact me (Mike.Hallett@Oracle.com ) or Cloudera’s EMEA Partner Director, Jonathan Cooper @ jcooper@cloudera.com.

Friday Feb 20, 2015

Oracle Big Data Discovery Now Available - The Visual Face of Hadoop


There has been a lot of excitement about Oracle Big Data Discovery, and it is now generally available to customers and partners: it can be sold and installed today.

This is a stunningly visual, intuitive tool that enables you to leverage the power of Hadoop and turn raw data into business insight in minutes — without learning complex products or relying only on specialists.

From a partner perspective it is perfect for proof-of-concept work with your clients, revealing the value they can find in their large information reservoirs by combining a variety of data and text sources, and potentially leading to much deeper analysis as part of an overall big data information architecture.

To find out more see these videos:

Check out the Capabilities of Big Data Discovery to

  1. Easily find relevant data by browsing a rich, interactive catalog of all data in Hadoop using familiar keyword search and guided navigation.
  2. Explore data to understand its potential by visualizing the shape and quality of unfamiliar data and combining attributes to uncover interesting relationships.
  3. Transform and enrich to make data better with intuitive, user-driven data wrangling.
  4. See rich, interactive visualizations that reveal new patterns by blending diverse data sets for deeper perspectives.

More Resources:

Wednesday Nov 26, 2014

Oracle Enterprise Metadata Management 12c for Big Data and Analytics

One of the best reasons for introducing Hadoop to your clients is as a lower-cost ETL and staging platform for any data warehouse... the so-called “Data Lake” concept. But, as Gartner discusses in “Beware of the Data Lake Fallacy”, this still needs governance: “Without descriptive metadata and a mechanism to maintain it, the data lake risks turning into a data swamp.”

So, as a follow-on to my last blog (Partners’ Get-Started-Kit with Oracle Big Data and Analytics), let’s now explore how you can help your clients better govern and manage a cost-effective data lake that delivers additional value to their data warehouse.

Oracle has expanded its Oracle Data Integration portfolio with the addition of Oracle Enterprise Metadata Management 12c, a comprehensive platform that helps reduce compliance risks and ensure the success of governance programs within organizations by providing much-needed business and data transparency (read the Press Release). Together with Oracle Enterprise Data Quality, Oracle Big Data SQL, and Oracle Database security you can manage and control all aspects of big data stewardship, lifecycle management, data protection, auditing, security, and compliance.

To find out more, start with these two webcasts:

Data Quality and Metadata Management for Big Data Governance webcast:
  • Understand how data governance can be applied to big data
  • Explore new Oracle Enterprise Metadata Management technology
  • Learn how data quality is integral to any data governance initiative

Big Data Integration for the Big Data Reservoir webcast:

  • Keep big data reservoirs accurate and real-time using Oracle Data Integrator and Oracle GoldenGate
  • Leverage Hadoop and Data Integration technologies across heterogeneous environments
  • Implement best practices using big data reservoirs to unlock the most value from your enterprise big data

Then try the Tutorial @ Tame Big Data with Oracle Data Integration

And Download a full version of Oracle Enterprise Metadata Management 12c software and documentation.

Tuesday Jul 01, 2014

ODI12c ETL on Hadoop and Oracle Big Data Appliance

One of the best reasons to start using Hadoop is to off-load ETL processing from a potentially higher-cost “Data Warehouse staging system” and deploy it onto a platform with a better performance-to-cost ratio for this ETL load.

If you do this, you will likely still want high-productivity ETL tools such as Oracle Data Integrator (ODI12c), and if you are handling large volumes of data in a limited batch window, you need fast processing and, most importantly, high-speed loading into the Data Warehouse.

ODI12c on Hadoop gives you this when combined with the Oracle Big Data Connectors. This works especially well on our engineered systems (Big Data Appliance to Exadata), but it is also the best solution for any ETL work from Hadoop to an Oracle database, even on so-called “commodity hardware”.
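For contrast, here is a rough Java sketch of the kind of plain row-by-row JDBC batch load that the Big Data Connectors are designed to improve upon with parallel, direct-path loading; the connection URL, table and columns are hypothetical.

import java.sql.*;

// Baseline illustration only: a plain JDBC batch insert into an Oracle staging table.
// At data warehouse volumes this row-by-row path becomes the bottleneck, which is
// why high-speed, parallel loading from Hadoop matters. Connection details are made up.
public class NaiveWarehouseLoad {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@//dwhost.example.com:1521/DWPDB";
        try (Connection conn = DriverManager.getConnection(url, "etl_user", "etl_pwd");
             PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO sales_staging (sale_id, amount) VALUES (?, ?)")) {
            conn.setAutoCommit(false);
            for (int i = 1; i <= 10_000; i++) {          // pretend these rows came from Hadoop
                ps.setInt(1, i);
                ps.setDouble(2, i * 1.5);
                ps.addBatch();
                if (i % 1_000 == 0) ps.executeBatch();   // flush in chunks
            }
            ps.executeBatch();
            conn.commit();
        }
    }
}

At scale, replacing this insert path with the connectors' parallel load is where most of the batch-window savings come from.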

Mark installed all the software elements directly, but if you need to get going quickly you may be able to use our downloadable VM (Demonstration VM “BigDataLite 2.4.1” Available on OTN, although this has now been updated to version 3.0), which works on non-BDA hardware.

Monday Feb 03, 2014

Demonstration VM “BigDataLite 2.4.1” Available on OTN

The Demonstration VM “BigDataLite 2.4.1” is now available for download from OTN.  Now, customers and partners can have easy access to many of our big data software products - all configured in an integrated VirtualBox environment.

“BigDataLite” is an Oracle VM VirtualBox image that contains many key components of Oracle's big data platform, including:

  • Cloudera Distribution including Apache Hadoop
  • Oracle Database 12c Enterprise Edition
  • Oracle Advanced Analytics and "R"
  • Oracle Data Integrator 12c, Oracle Big Data Connectors
  • Oracle NoSQL Database, and more....

It has been configured to run on at least two cores and about 5 GB of memory (which means your computer should have at least 8 GB of total memory). With BigDataLite, you can develop your big data applications and then deploy them to any compatible hardware, including the Oracle Big Data Appliance.

To expand this demonstration platform to include OBI and Endeca, you can also download those VMs and interconnect them. For this you will likely need 16 GB or more of RAM and at least 4 cores to get them all running. This is targeted at “BIG data analytics”, so giving this integrated platform 32 GB or more of RAM and more cores will help you show it in its best light.

Tuesday Nov 26, 2013

Small Steps to Big Data BI&EPM Partner Community Forum January 2014

Open to all OPN partners in EMEA, we are running the Business Analytics Partner Community Forum over two days in London, on 16th and 17th January 2014 - Register Now Here.

This forum, entitled “Small Steps to Big Data”, will focus on discussing with partners how best to exploit the tremendous interest in “Big Data Analytics”, on clarifying under what circumstances “R” and “Hadoop” are best deployed, and on how these co-exist with, integrate with, and extend the capabilities of tools you are already familiar with, such as Oracle BI, ODI, Endeca and the Oracle Database. We will consider guidelines for hardware deployments, but the main focus of the forum will be how the software inter-operates, with guest speakers from Cloudera and other partners who have experience in this field.

On the one hand, I do not think “Big Data = Hadoop”. On the other, Oracle whole-heartedly embraces useful open source innovations such as “R” and “Hadoop”: we are, after all, a big player in open source with, for example, Java, NoSQL and MySQL.

You can download the agenda here.  We will seek to answer questions such as:

  • How big is “Big”? ... at what size is Hadoop’s MPP approach beneficial?
  • What about “Variety”? ... how do we digest “Any Data”?
  • Who uses these analytics? ... a few “Data Scientists” or hundreds of “Business Users”?
  • How do you spot a “Big Data Analytics” opportunity?
  • If someone is already using Hadoop, do they want to talk to Oracle?
  • What is NEW, and what is Business-as-Usual?

Audience: this forum will appeal to CTOs, solution architects and consultants in Oracle partners who are familiar with Oracle’s Business Analytics solutions. We will examine the economics and business cases driving “Big Data Analytics” projects, and dive into the pros and cons of the technology options available to your customers.

  • Day 1 – Thursday 16th Jan. 2014: starts 11:00 am – Sales and executive briefing, with a networking dinner in the evening.
  • Day 2 – Friday 17th Jan. 2014: ends 4:00 pm – Deeper-dive technical discussions.

Register Now Here
