Tuesday Apr 01, 2014

Limited Edition Exadata X4-2C - Brighten Up Your Data Center

Oracle has always been at the forefront of efforts to revolutionize your data center. To date, for obvious reasons, the focus has been on optimizing energy and space efficiency. As of today we are moving into an exciting new phase in the look and feel of your data center. Oracle recently added a new fashion design team to its engineered systems group to help us re-imagine the next-generation data center, and the first exciting fruits of this partnership of technology and fashion are now available for our customers to order…

For a short period only, Oracle is offering its data warehouse customers the chance to buy a limited edition Exadata X4-2C. This new Exadata configuration is going to brighten up your data center with its exciting range of color-coordinated racks! Now you can enjoy running those really sophisticated business queries in glorious technicolor. Best of all, we are not charging you anything extra for this fabulous new technicolor data warehouse experience:

[Image: the range of color-coordinated Exadata X4-2C racks]

HARDWARE, SOFTWARE AND COLOR, ENGINEERED TO WORK TOGETHER

Each color-coded rack comes with its own color-matched version of Enterprise Manager to add more color, brightness and joy to all those day-to-day tasks, as you can see below on these specially designed monitoring screens:

[Image: color-matched Enterprise Manager monitoring screens]

 Your Exadata DBA is really going to thank you!

So what happens if you buy a half rack and then slowly add more Exadata nodes? Great question. Well, while stocks last you can actually create your own multi-colored Exadata rack. As always, we are ahead of the game because we know what our customers want. SO WHY NOT HAVE A TECHNICOLOR DATA WAREHOUSE in your data center! Go on, you know it makes sense…

[Image: a multi-colored Exadata X4-2C rack]

BUT YOU GOTTA HURRY - this new Exadata X4-2C range is a limited edition, special-order-only model, and stocks are limited. To brighten up your data center, make sure you contact your Oracle Sales Representative right now, because you do not want to miss out on this exciting opportunity to put one of these gorgeous, color-coded dudes in your data center. And don't forget, only Oracle gives you HARDWARE, SOFTWARE AND COLOR, ENGINEERED TO WORK TOGETHER.

Oracle 1-800-633-0738


Thursday Dec 12, 2013

Oracle releases Exadata X4 with optimizations for data warehousing


Supporting Quote

“Oracle Exadata Database Machine is the best platform on which to run the Oracle Database and the X4 release extends that value proposition,” said Oracle President Mark Hurd. “As private database clouds grow in popularity, the strengths of Oracle Exadata around performance, availability and quality of service set it apart from all alternatives.”

We have just announced the release of the fifth generation of our flagship database machine: Oracle Exadata Database Machine X4. This latest release introduces new hardware and software to accelerate performance, increase capacity, and improve efficiency and quality of service for enterprise data warehouse deployments.

Performance of data warehousing workloads is accelerated by new flash caching algorithms that focus on the table and partition scans common in data warehouses. Tables that are larger than flash are now automatically partially cached in flash and read concurrently from both flash and disk to boost throughput.
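
If you want a rough sense of how much of this scan activity is actually being satisfied from flash on your own system, a minimal sketch (assuming you can query V$SYSSTAT on an Exadata-enabled database) is to compare the flash cache hit statistic with overall physical read requests:

-- Rough gauge of flash cache usage at the system level
SELECT name, value
FROM   v$sysstat
WHERE  name IN ('cell flash cache read hits',
                'physical read total IO requests');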

Other key highlights are:

1) Improved workload management
Exadata X4-2 includes new workload management features that improve the handling of data warehouse workloads. Exadata now has the unique ability to transparently prioritize requests as they flow from the database servers, through network adapters and network switches, to storage, and back.

We are using a new generation of InfiniBand network protocols to ensure that network-intensive workloads such as reporting, batch and backups do not delay response-time-sensitive interactive workloads. This is great news for IT teams that have to define and manage service level agreements.
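
As a rough, illustrative sketch of how these priorities can be expressed on the database side, here is a Database Resource Manager plan that favors interactive queries over reporting and batch work. The plan name, consumer group names and percentages are assumptions for the example, not X4 defaults, and on Exadata the same plan would typically be paired with an IORM plan on the storage cells.

BEGIN
  DBMS_RESOURCE_MANAGER.CREATE_PENDING_AREA();

  DBMS_RESOURCE_MANAGER.CREATE_PLAN(
    plan    => 'DW_PLAN',
    comment => 'Favor interactive queries over reporting and batch');

  DBMS_RESOURCE_MANAGER.CREATE_CONSUMER_GROUP(
    consumer_group => 'INTERACTIVE',
    comment        => 'Response-time sensitive queries');
  DBMS_RESOURCE_MANAGER.CREATE_CONSUMER_GROUP(
    consumer_group => 'REPORTING_BATCH',
    comment        => 'Reports, batch jobs and backups');

  -- Interactive work gets first claim on resources.
  DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
    plan             => 'DW_PLAN',
    group_or_subplan => 'INTERACTIVE',
    comment          => 'Highest priority',
    mgmt_p1          => 70);
  DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
    plan             => 'DW_PLAN',
    group_or_subplan => 'REPORTING_BATCH',
    comment          => 'Runs at the next priority level',
    mgmt_p2          => 100);
  -- Every plan needs a directive for OTHER_GROUPS.
  DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
    plan             => 'DW_PLAN',
    group_or_subplan => 'OTHER_GROUPS',
    comment          => 'Everything else',
    mgmt_p3          => 100);

  DBMS_RESOURCE_MANAGER.VALIDATE_PENDING_AREA();
  DBMS_RESOURCE_MANAGER.SUBMIT_PENDING_AREA();
END;
/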

2) Bigger flash cache for even faster performance
We have increased the amount of physical flash to 44 TB per full rack, and thanks to the flash cache compression described below, the capacity of the logical flash cache has increased by 100% to 88 TB per full rack.

3) Hardware-driven compression/decompression
A feature that is unique to Exadata is Flash Cache Compression, which transparently compresses database data as it is written into flash, using hardware acceleration to compress and decompress the data with zero performance overhead.

4) In-memory processing
For in-memory workloads we increased the maximum memory capacity by 100% to 4 TB per full rack (using memory expansion kits), which means more workloads will be able to run in memory with extremely fast response times.
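
As a small, hedged sketch of how a session might take advantage of this larger memory footprint (the table name is hypothetical): with Auto DOP enabled, in-memory parallel execution can cache suitable tables in the aggregate buffer cache instead of reading them from storage.

-- Let Auto DOP choose the degree and allow in-memory parallel execution
ALTER SESSION SET parallel_degree_policy = AUTO;

-- Hypothetical query; the scan may now be satisfied from memory
SELECT /*+ PARALLEL */ prod_id, SUM(amount_sold)
FROM   sales
GROUP  BY prod_id;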

5) Increased support for big data
To support big data projects, we increased the capacity of the high-performance disks to over 200 TB per full rack, and with high-capacity disks the storage capacity is now 672 TB per full rack. Once you factor in Oracle Exadata's compression technologies, a full rack is capable of storing petabytes of user data.
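
The compression technologies referred to here are primarily Hybrid Columnar Compression on Exadata storage. A minimal sketch (the table name and predicate are made up for illustration):

-- QUERY HIGH is the warehouse-oriented HCC level; ARCHIVE HIGH trades
-- more CPU for the maximum compression ratio.
CREATE TABLE sales_history
  COMPRESS FOR QUERY HIGH
  AS SELECT * FROM sales WHERE sale_date < DATE '2013-01-01';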

The full press release is here: http://www.oracle.com/us/corporate/press/2079925

Wednesday Dec 11, 2013

dunnhumby increases customer loyalty with Oracle Big Data

dunnhumby presented at this year's OpenWorld, where they outlined the how and why of data warehousing on Exadata. Our engineered system delivered a performance improvement of more than 24x. dunnhumby pushes its data warehouse platform really hard: with more than 280 billion fact rows and 250 million dimension rows for one large retailer client alone, its massive data requires the best performance the industry has to offer.

In Oracle Exadata, dunnhumby has found that solution. Using Oracle Exadata’s advanced Smart Scan technology and robust Oracle Database features, this new environment has empowered its analysts to perform complex ad hoc queries across billions of fact rows and hundreds of millions of dimension rows in minutes or seconds, compared with hours or even days on other platforms.
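
If you want to verify how much of a given query's I/O is being offloaded by Smart Scan on your own system, one simple, hedged check (the SQL ID is a placeholder you substitute) is to look at the cell offload columns in V$SQL:

-- Bytes eligible for Smart Scan offload versus bytes actually shipped
-- back over the interconnect for a given statement.
SELECT sql_id,
       io_cell_offload_eligible_bytes,
       io_interconnect_bytes
FROM   v$sql
WHERE  sql_id = '&sql_id';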

You can download the presentation by Philip Moore - Exadata Datawarehouse Architect, Dunnhumby USA LLC -  from the OpenWorld site, see here: https://oracleus.activeevents.com/2013/connect/sessionDetail.ww?SESSION_ID=3412.

If you missed Philip's session at OpenWorld then we have just released a new video interview with Chris Wones, Director of Data Solutions at dunnhumby. During the interview Chris outlines some of the challenges his team faced when trying to do joined-up analytics across disparate and disconnected data sets, and how Exadata allowed them to bring everything together so that they could run advanced analytical queries that were simply not possible before, which in turn meant being able to bid on completely new types of contracts. The combination of Exadata and Oracle Advanced Analytics is delivering real business benefit to dunnhumby and its customers.

For more information about Oracle's Advanced Analytics option, check out Charlie Berger's advanced analytics blog: http://blogs.oracle.com/datamining and Charlie's Twitter feed: https://twitter.com/CharlieDataMine

To watch the video click on the image: 

[Video: interview with Chris Wones, Director of Data Solutions at dunnhumby]

If the video does not start follow this link: http://medianetwork.oracle.com/video/player/2889835899001

Friday Nov 22, 2013

Using Oracle Exadata to improve crop yields

It is not often that you read about how the agricultural industry uses data warehousing, so this article in the latest edition of Oracle Magazine, along with the related video from OOW 2013, on how Land O'Lakes is using Exadata caught my attention: The Business of Growing, by Marta Bright: http://www.oracle.com/technetwork/issue-archive/2013/13-nov/o63lol-2034253.html

A little background on Land O'Lakes: Land O’Lakes is a US company that has grown far beyond its roots as a small cooperative of dairy farmers with forward-thinking ideas about producing and packaging butter. It is a Fortune 500 company and is now the second-largest cooperative in the United States, with annual sales of more than US$14 billion. Over the years, Land O’Lakes has expanded its operations into a variety of subsidiaries, including WinField Solutions (WinField), which provides farmers with a wide variety of crop seeds and crop protection products.

This implementation on our engineered systems highlights one of Oracle's key differentiators: the ability to run a truly mixed operational + data warehouse workload on the same platform, in the same rack. Land O'Lakes uses a lot of Oracle applications, which push data into its data warehouse. In many cases we talk about the need for the data warehouse to support small windows of opportunity, and for WinField Solutions this is exactly why they invested in Oracle.

What makes seed sales unique and challenging is that they are directly tied to seasonal purchasing. “There’s somewhat of a Black Friday in the seed business,” explains Tony Taylor, director of technology services at Land O’Lakes. “WinField is a US$5 billion company that sells all of its seed during about a six-week period of time.” With that sort of compressed sales cycle you need up-to-date information at your fingertips so you can make all the right decisions at the right time. Speed and efficiency are the key factors. If you watch the video, Chris Malott, Manager of the DBA team at Land O'Lakes, explains the real benefits that her team and the business teams have gained from moving their systems (operational and data warehouse) onto our engineered systems platform.

[Video: Land O'Lakes discusses Exadata at Oracle OpenWorld 2013]

Click on the image above to link to the video. If the link does not work then click here: http://www.youtube.com/watch?v=-MDYI7IakR8

WinField is maximizing the use of Oracle's analytical capabilities by incorporating huge volumes of imaging data, which it then uses to help farmers make smarter agronomic decisions and ultimately achieve higher yields. How many people would expect Oracle's in-database analytics to be driving improved crop yields? Now that is a great use case!

WinField also drills down and across all the information it collects to help identify new opportunities and what in the telco space we would call "churn". For instance, in January they run reports to identify farmers who typically purchase seed and products in November but have not yet ordered. This gives WinField the information it needs to reach out proactively to those farmers and find out if they simply haven’t had time to place an order.

“We’re able to help farmers and our co-op members, even in cases where we’re not sure whether it’s going to directly benefit Land O’Lakes or WinField. Because this is truly a cooperative system, these are the people we work for, and we’re willing to invest in them.” -  Mike Macrie, vice president and CIO at Land O’Lakes. 

As a next step, Land O'Lakes is looking to move to Oracle Database 12c, and I am sure that will open up even more opportunities for its business users to help farmers. Hopefully, they will be able to make use of new analytical features such as SQL Pattern Matching.
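
To give a flavor of what SQL Pattern Matching looks like, here is a purely illustrative 12c MATCH_RECOGNIZE sketch (the table and columns are invented) that finds customers whose weekly order value has fallen for three or more consecutive weeks, the kind of early churn signal discussed above:

SELECT customer_id, first_week, last_week
FROM   weekly_orders
MATCH_RECOGNIZE (
  PARTITION BY customer_id
  ORDER BY week_start
  MEASURES FIRST(down.week_start) AS first_week,
           LAST(down.week_start)  AS last_week
  ONE ROW PER MATCH
  PATTERN (strt down{3,})
  DEFINE down AS down.order_value < PREV(down.order_value)
);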


Wednesday Nov 13, 2013

ADNOC talks about 50x increase in performance

In this new video Awad Ahmen Ali El Sidddig, Senior DBA at ADNOC, talks about the impact that Exadata has had on his team and the whole business. ADNOC is using our engineered systems to drive and manage all of its workloads: from transaction systems to payment systems to the data warehouse and BI environment. A true disk-to-dashboard revolution using engineered systems. This engineered approach is delivering a 50x improvement in performance, with some queries running 100x faster! The IT team has even revolutionized some of its data warehouse processes with the help of Exadata: jobs that used to take over 4 hours now run in a few minutes.


Wednesday Nov 06, 2013

Swiss Re increases data warehouse performance and deploys in record time

Great information on yet another data warehouse deployment on Exadata.

A little background on Swiss Re:

In 2002, Swiss Re established a data warehouse for its client markets and products to gather reinsurance information across all organizational units into an integrated structure. The data warehouse provided the basis for reporting at the group level with drill-down capability to individual contracts, while facilitating application integration and data exchange by using common data standards. Initially focusing on property and casualty reinsurance information only, it now includes life and health reinsurance, insurance, and nonlife insurance information.

Key highlights of the benefits that Swiss Re achieved by using Exadata:

  • Reduced the time to feed the data warehouse and generate data marts by 58%
  • Reduced average runtime by 24% for standard reports
  • Comfortably loaded two data warehouse refreshes per day with incremental feeds
  • Freed up technical experts by significantly minimizing time spent on tuning activities

Most importantly, this was one of the fastest project deployments in Swiss Re's history: they went from installation to production in just four months! What is truly impressive is that it took only two weeks from power-on to testing the machine with full data volumes! Business teams at Swiss Re are now able to fully exploit up-to-date analytics across property, casualty, life, health insurance, and reinsurance lines to identify successful products.

These points are highlighted in the following quotes from Dr. Stephan Gutzwiller, Head of Data Warehouse Services at Swiss Re: 

"We were operating a complete Oracle stack, including servers, storage area network, operating systems, and databases that was well optimized and delivered very good performance over an extended period of time. When a hardware replacement was scheduled for 2012, Oracle Exadata was a natural choice—and the performance increase was impressive. It enabled us to deliver analytics to our internal customers faster, without hiring more IT staff"

“The high quality data that is readily available with Oracle Exadata gives us the insight and agility we need to cater to client needs. We also can continue re-engineering to keep up with the increasing demand without having to grow the organization. This combination creates excellent business value.”

Our full press release is available here: http://www.oracle.com/us/corporate/customers/customersearch/swiss-re-1-exadata-ss-2050409.html. If you want more information about how Exadata can increase the performance of your data warehouse visit our home page: http://www.oracle.com/us/products/database/exadata-database-machine/overview/index.html


Tuesday Oct 25, 2011

Read Up on the Overall Big Data Solution

On top of the NoSQL Database release, I wanted to share the new paper on big data with all of you. It gives you an overview of the end-to-end solution as presented at OpenWorld and places it in the context of the importance of big data for our customers.

Here is a quick look at the Executive Summary and the Introduction (or click here for the paper):

Executive Summary

Today the term big data draws a lot of attention, but behind the hype there's a simple story. For decades, companies have been making business decisions based on transactional data stored in relational databases. Beyond that critical data, however, is a potential treasure trove of non-traditional, less structured data: weblogs, social media, email, sensors, and photographs that can be mined for useful information. Decreases in the cost of both storage and compute power have made it feasible to collect this data - which would have been thrown away only a few years ago.  As a result, more and more companies are looking to include non-traditional yet potentially very valuable data with their traditional enterprise data in their business intelligence analysis.

To derive real business value from big data, you need the right tools to capture and organize a wide variety of data types from different sources, and to be able to easily analyze it within the context of all your enterprise data. Oracle offers the broadest and most integrated portfolio of products to help you acquire and organize these diverse data types and analyze them alongside your existing data to find new insights and capitalize on hidden relationships.

Introduction

With the recent introduction of Oracle Big Data Appliance, Oracle is the first vendor to offer a complete and integrated solution to address the full spectrum of enterprise big data requirements. Oracle's big data strategy is centered on the idea that you can evolve your current enterprise data architecture to incorporate big data and deliver business value. By evolving your current enterprise architecture, you can leverage the proven reliability, flexibility and performance of your Oracle systems to address your big data requirements.

Defining Big Data

Big data typically refers to the following types of data:

  • Traditional enterprise data - includes customer information from CRM systems, transactional ERP data, web store transactions, general ledger data.
  • Machine-generated /sensor data - includes Call Detail Records ("CDR"), weblogs, smart meters, manufacturing sensors, equipment logs (often referred to as digital exhaust), trading systems data.
  • Social data - includes customer feedback streams, micro-blogging sites like Twitter, social media platforms like Facebook

The McKinsey Global Institute estimates that data volume is growing 40% per year, and will grow 44x between 2009 and 2020. But while it's often the most visible parameter, volume of data is not the only characteristic that matters. In fact, there are four key characteristics that define big data:

  • Volume. Machine-generated data is produced in much larger quantities than non-traditional data. For instance, a single jet engine can generate 10 TB of data in 30 minutes. With more than 25,000 airline flights per day, the daily volume of just this single data source runs into the petabytes. Smart meters and heavy industrial equipment like oil refineries and drilling rigs generate similar data volumes, compounding the problem.
  • Velocity. Social media data streams - while not as massive as machine-generated data - produce a large influx of opinions and relationships valuable to customer relationship management. Even at 140 characters per tweet, the high velocity (or frequency) of Twitter data ensures large volumes (over 8 TB per day).
  • Variety. Traditional data formats  tend to be relatively well described and change slowly. In contrast, non-traditional data formats exhibit a dizzying rate of change. As new services are added, new sensors deployed, or new marketing campaigns executed, new data types are needed to capture the resultant information.
  • Value. The economic value of different data varies significantly. Typically there is good information hidden amongst a larger body of non-traditional data; the challenge is identifying what is valuable and then transforming and extracting that data for analysis.

To make the most of big data, enterprises must evolve their IT infrastructures to handle the rapid rate of delivery of extreme volumes of data, with varying data types, which can then be integrated with an organization's other enterprise data to be analyzed. 

The Importance of Big Data

When big data is distilled and analyzed in combination with traditional enterprise data, enterprises can develop a more thorough and insightful  understanding of their business, which can lead to enhanced productivity, a stronger competitive position and greater  innovation - all of which can have a significant impact on the bottom line.
For example, in the delivery of healthcare services, management of chronic or long-term conditions is expensive. Use of in-home monitoring devices to measure vital signs, and monitor progress is just one way that sensor data can be used to improve patient health and reduce both office visits and hospital admittance.
Manufacturing companies deploy sensors in their products to return a stream of telemetry. Sometimes this is used to deliver services such as OnStar, which provides communications, security and navigation services. Perhaps more importantly, this telemetry also reveals usage patterns, failure rates and other opportunities for product improvement that can reduce development and assembly costs.

The proliferation of smart phones and other GPS devices offers advertisers an opportunity to target consumers when they are in close proximity to a store, a coffee shop or a restaurant. This opens up new revenue for service providers and offers many businesses a chance to target new customers.
Retailers usually know who buys their products.  Use of social media and web log files from their ecommerce sites can help them understand who didn't buy and why they chose not to, information not available to them today.  This can enable much more effective micro customer segmentation and targeted marketing campaigns, as well as improve supply chain efficiencies.

Finally, social media sites like Facebook and LinkedIn simply wouldn't exist without big data. Their business model requires a personalized experience on the web, which can only be delivered by capturing and using all the available data about a user or member.
-------

The full paper is linked here. Happy reading...

Monday Jun 13, 2011

Parallel Execution – Precedence of Hints and other Factors

The following table reflects the precedence of hints and other factors, such as ALTER SESSION ENABLE PARALLEL DML, when using Auto DOP. It is also important to understand how the DML and query parts of a statement work together and how they influence each other.

All of the below is based on a simple statement:

insert into t3 select * from t1, t2 where t1.c1 = t2.c1;

[Table image: precedence of Auto DOP, session settings and hints; the rows are referred to as Line 1 through Line 9 below]

Some explanatory words based on the lines in the picture above:

Line 1 => The cleanest way to run PX statements, where Auto DOP gets to do its work and we will use the computed DOP of the statement

Line 4 => Because a FORCE parallel is used, we must ensure that the DOP > 1

Line 9 => The statement-level hint overrides all other means and we run the statement with the DOP specified in the hint

A word on the internal degree limit. This is NOT (repeat, NOT) a parameter you can find, set, or locate an underscore for. It is the built-in limit on DOPs (set to the default DOP, or parallel_degree_limit = CPU), an internal boundary that ensures we do not blast through the upper bound of available CPU power. Also note that for any non-computed degree these boundaries and limits do not apply, which in itself is a reason to go look at and understand Auto DOP.
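
To make the rows above concrete, here is a hedged sketch of the three scenarios using the example statement (the DOP values are arbitrary):

-- Line 1: Auto DOP computes the degree for both the query and the DML part
ALTER SESSION SET parallel_degree_policy = AUTO;
ALTER SESSION ENABLE PARALLEL DML;
INSERT INTO t3 SELECT * FROM t1, t2 WHERE t1.c1 = t2.c1;

-- Line 4: a session-level FORCE must specify a DOP greater than 1
ALTER SESSION FORCE PARALLEL DML PARALLEL 8;
INSERT INTO t3 SELECT * FROM t1, t2 WHERE t1.c1 = t2.c1;

-- Line 9: a statement-level hint overrides the other settings
INSERT /*+ PARALLEL(16) */ INTO t3 SELECT * FROM t1, t2 WHERE t1.c1 = t2.c1;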

About

The Data Warehouse Insider is written by the Oracle product management team and sheds light on all things data warehousing and big data.
