Upcoming events: INOUG Oracle Training Day!

The Indiana Oracle User Group (INOUG) has been kind enough to ask me to present an all-day Optimizer workshop at its annual training day on July 26th at Eli Lilly & Co. The workshop will consist of four 90-minute sessions and will explain the fundamentals of the cost-based Optimizer and the statistics that feed it, as well as providing a methodology for diagnosing and resolving SQL execution performance problems.

The workshop begins with an in-depth look at statistics and statistics maintenance, a challenge all DBAs must face in order to prevent suboptimal execution plans. It continues with a detailed explanation of how to interpret an execution plan. Each aspect of an execution plan will be examined, from cardinality estimates to parallel execution, and you will learn what information you should glean from the plan and how it affects the execution performance of a SQL statement. After lunch, the workshop focuses on the process of analyzing and resolving the most common SQL execution performance problems, including poor cardinality estimates, bind peeking issues, selecting the wrong access method and much, much more.
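
If you would like a head start on reading execution plans before the workshop, the simplest way to generate one is with EXPLAIN PLAN and DBMS_XPLAN. A minimal example (the query and table are purely illustrative):

    -- Generate a plan for a sample query; SALES is a placeholder table.
    EXPLAIN PLAN FOR
      SELECT * FROM sales WHERE cust_id = :b1;

    -- Display the plan, including the cardinality (Rows) estimates
    -- we will be discussing in the workshop.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);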

You can register for the event here. I hope you can join us!

+Maria Colgan

Comments:

The workshop yesterday was excellent! Thank you so much for presenting.

Posted by guest on July 27, 2012 at 08:13 AM PDT #

THANK YOU for a great presentation. You sent me back to my office sufficiently abashed for using hints in SQL and started me on a quest for extended statistics.

On that quest I noticed some information in the dba_stat_extensions view that I wasn't expecting. Since I have not created any extended statistics, I was surprised to find rows for an application schema with droppable=NO. It looks like the system is automatically creating extended statistics for the columns in function-based indices.
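
For example, a query like this shows the rows I'm talking about (the schema name is a placeholder):

    -- List extended statistics, including any the system created itself.
    SELECT table_name, extension_name, extension, creator, droppable
      FROM dba_stat_extensions
     WHERE owner = 'MY_APP';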

I started playing with the dbms_stats.report_col_usage function and saw columns that were used in RANGE and LIKE predicates. You told us that extended statistics wouldn't help for these predicates; would regular column statistics be used instead?
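
In case it is useful to others, this is roughly how I am pulling the report (owner and table name are placeholders):

    SET LONG 100000
    -- Returns a CLOB describing how each column has been used in predicates.
    SELECT dbms_stats.report_col_usage(ownname => 'MY_APP',
                                       tabname => 'SALES')
      FROM dual;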

We have historically always computed stats with a METHOD of 'FOR ALL INDEXED COLUMNS SIZE 254', knowing that the database will only create as many buckets as necessary. The thinking was that at least we would get the indexed columns right and let the optimizer guess at the other columns. So far this has worked for us. Would we benefit from a METHOD of 'FOR ALL COLUMNS SIZE 254', or a METHOD that specified the columns identified by dbms_stats.report_col_usage?
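
To make the question concrete, we currently gather with something like the first call below and are considering the second (schema, table, and column names are illustrative):

    -- Current approach: histograms on indexed columns only.
    BEGIN
      dbms_stats.gather_table_stats(
        ownname    => 'MY_APP',
        tabname    => 'SALES',
        method_opt => 'FOR ALL INDEXED COLUMNS SIZE 254');
    END;
    /

    -- Candidate approach: histograms only on the columns that
    -- dbms_stats.report_col_usage identified.
    BEGIN
      dbms_stats.gather_table_stats(
        ownname    => 'MY_APP',
        tabname    => 'SALES',
        method_opt => 'FOR ALL COLUMNS SIZE 1 FOR COLUMNS SIZE 254 CUST_ID PROD_ID');
    END;
    /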

We compute statistics on some specific tables immediately after the application has loaded a significant amount of data. What is the appropriate way to compute those statistics so that the gather uses the global (or table-level) preferences? I currently have a script that dynamically builds the DBMS_STATS.gather_table_stats job from the appropriate table or global preferences. Is there an easier way to do this?
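
For reference, a stripped-down sketch of what my script generates today (names are placeholders):

    DECLARE
      -- get_prefs returns the table-level preference if one is set,
      -- otherwise the global preference.
      l_method_opt VARCHAR2(4000) :=
        dbms_stats.get_prefs('METHOD_OPT', 'MY_APP', 'SALES');
    BEGIN
      dbms_stats.gather_table_stats(
        ownname    => 'MY_APP',
        tabname    => 'SALES',
        method_opt => l_method_opt);
    END;
    /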

Posted by guest on July 27, 2012 at 12:17 PM PDT #

Maria,
Thank you for the presentation last Thursday. It was my first opportunity to listen to a discussion on how to effectively work with, rather than against, the optimizer. Your approach was both informative and entertaining. If I ever get the opportunity to listen to you again, I wouldn't hesitate to grab a seat for the entire day.
Regards,

Posted by Tony on July 30, 2012 at 04:54 AM PDT #

