Tuesday Dec 23, 2014

Solaris Studio : C/C++ Dynamic Analysis

First, a reminder - Oracle Solaris Studio 12.4 is now generally available. Check the Solaris Studio 12.4 Data Sheet before downloading the software from Oracle Technology Network.

Dynamic Memory Usage Analysis

The Code Analyzer tool in the Oracle Solaris Studio compiler suite can analyze static data, dynamic memory access data, and code coverage data collected from binaries that were compiled with the C/C++ compilers in Solaris Studio 12.3 or later. Code Analyzer is supported on Solaris and Oracle Enterprise Linux.

Refer to the static code analysis blog entry for a quick summary of the steps involved in performing static analysis. The focus of this blog entry is the dynamic portion of the analysis. In this context, dynamic analysis is the evaluation of an application during runtime for memory-related errors. The main objective is to find and debug memory management errors; improved robustness and security assurance are nice side effects, however limited their extent may be.

Code Analyzer relies on another Solaris Studio tool, discover, to find runtime errors that are often caused by memory mismanagement. discover looks for potential errors such as accesses outside the bounds of the stack or an array, unallocated memory reads and writes, NULL pointer dereferences, memory leaks, and double frees. The full list of memory management issues analyzed by Code Analyzer/discover is at: Dynamic Memory Access Issues

discover performs the dynamic analysis by instrumenting the code so that it can keep track of memory operations while the binary is running. At runtime, discover monitors the application's use of memory by interposing on standard memory allocation calls such as malloc(), calloc(), memalign(), valloc() and free(). Fatal memory access errors are detected and reported the instant the incident occurs, so it is easy to correlate a failure with the actual source. This somewhat eases the detection and fixing of memory management problems in large applications. However, the effectiveness of this kind of analysis depends heavily on the flow of control and data during the execution of the target code -- hence it is important to exercise the application with a variety of test inputs that maximize code coverage.
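
As a simple illustration, here is a hypothetical C snippet containing several of the defect classes listed above; a tool like discover is designed to flag each of these at the point the offending operation executes:

    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        char *buf = (char *) malloc(8);    /* 8-byte heap block */
        strcpy(buf, "too long!");          /* copies 10 bytes: out-of-bounds write */
        free(buf);
        free(buf);                         /* double free of the same block */

        char *leak = (char *) malloc(32);  /* never freed: reported as a leak */
        leak[0] = 'x';
        return 0;
    }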

High-level steps in using Code Analyzer for Dynamic Analysis

Given the enhancements and incremental improvements in its analysis tools, Solaris Studio 12.4 is recommended for this exercise.

  1. Build the application with debug flags

    The -g (C) or -g0 (C++) option generates debug information. This enables Code Analyzer to display source code and line number information for errors and warnings.

    • Linux users: specify the -xannotate option on the compile/link line in addition to -g and other options
  2. Instrument the binary with discover

    % discover -a -H <filename>.%p.html -o <instrumented_binary> <original_binary>

    where:

    • -a : write the error data to the binary-name.analyze/dynamic directory for use by Code Analyzer
    • -H : write the analysis report to <filename>.<pid>.html when the instrumented binary is executed; %p expands to the process id of the application. If you prefer the analysis report as plain text, use -w <filename>.%p.txt instead
    • -o : write the instrumented binary to <instrumented_binary>

    Check the Command-Line Options page for the full list of options supported by discover.

  3. Run the instrumented binary

    .. to collect the dynamic memory access data.

    % ./<instrumented_binary> <args>

  4. Finally, examine the analysis report for errors and warnings

Example

The following example demonstrates the above steps using the Solaris Studio 12.4 C compiler and the discover command-line tool. The same code was used to demonstrate the static analysis steps as well.
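
A minimal sketch of such a session follows; the source file leak.c (containing errors similar to the snippet shown earlier) and the binary names are illustrative:

    % cc -g -o leak leak.c              # use -g0 for C++; Linux users add -xannotate
    % discover -a -w leak.%p.txt -o leak.disc leak
    % ./leak.disc

Each detected error is reported in leak.<pid>.txt with a stack trace and, since the binary was built with -g, the corresponding source file and line number. The data written under the .analyze directory can be examined later with the Code Analyzer GUI.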

A few things to be aware of:

  • If the target application uses the LD_PRELOAD environment variable to preload one or more functions that the discover tool needs to interpose on for dynamic analysis, the resulting analysis may not be accurate.
  • If the target application uses runtime auditing via the LD_AUDIT environment variable, that auditing will conflict with discover's own use of auditing and may result in undefined behavior.
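
For instance, if the instrumented binary is launched along the following lines (the library name here is purely illustrative), allocation calls may be routed through the preloaded library rather than discover's interposers, and heap errors may go undetected or be misreported:

    % LD_PRELOAD=/path/to/custom_malloc.so ./<instrumented_binary> <args>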

Reference & Recommended Reading:

  1. Oracle Solaris Studio 12.4 : Code Analyzer User's Guide
  2. Oracle Solaris Studio 12.4 : Discover and Uncover User's Guide

Saturday Apr 03, 2010

Oracle 11g R1: Poor Data Pump Performance when Exporting a Partitioned Table

Symptom(s)

The Data Pump Export utility, expdp, performs well with non-partitioned tables, but exhibits extremely poor performance when exporting objects from a partitioned table of similar size. In some cases the degradation can be as high as 3X or worse.

SQL traces may show that much of the time is being spent in a SQL statement that is similar to:


	UPDATE "schema"."TABLE" mtu 
        SET mtu.base_process_order = NVL((SELECT mts1.process_order FROM "schema"."TABLE" mts1 
        WHERE ..

Here is an example data export session:


Export: Release 11.1.0.7.0 - 64bit Production on Wednesday, 31 March, 2010 6:56:50

Copyright (c) 2003, 2007, Oracle.  All rights reserved.
;;; 
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SCHMA"."SYS_EXPORT_TABLE_01":  SCHMA/\*\*\*\*\*\*\*\* DIRECTORY=exp_dir DUMPFILE=SOME_DUMMY_PART_FULL.DMP TABLES=SOME_DUMMY_PART
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 20.56 GB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "SCHMA"."SOME_DUMMY_PART":"DUMMY_PART_P01"  1.143 GB 13788224 rows
. . exported "SCHMA"."SOME_DUMMY_PART":"DUMMY_PART_P02"  1.143 GB 13788224 rows
. . exported "SCHMA"."SOME_DUMMY_PART":"DUMMY_PART_P03"  1.143 GB 13788224 rows
...
. . exported "SCHMA"."SOME_DUMMY_PART":"DUMMY_PART_P32"  151.1 MB 1789216 rows
. . exported "SCHMA"."SOME_DUMMY_PART":"DUMMY_PART_P33"  11.37 MB  136046 rows
. . exported "SCHMA"."SOME_DUMMY_PART":"DUMMY_PART_P00"      0 KB       0 rows
Master table "SCHMA"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SCHMA.SYS_EXPORT_TABLE_01 is:
  /DBDUMP/SOME_DUMMY_PART_FULL.DMP
Job "SCHMA"."SYS_EXPORT_TABLE_01" successfully completed at 11:22:36

Solution(s) / Workaround

This is a known issue (that is, a bug) and a solution is readily available. Try any of the following to resolve the issue:

  • Apply the 11g database patch 8845859,
  • Upgrade to the 11.2.0.2 patchset when it becomes available, or
  • Specify the "VERSION=10.2.0.3" expdp option as a workaround

I ran into this issue and chose the workaround to make some quick progress. With "VERSION=10.2.0.3" appended to the expdp command, the export time went down from 265 minutes to 60+ minutes.
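
Using the export session shown above as a template, the command with the workaround would look something like:

    % expdp SCHMA/******** DIRECTORY=exp_dir DUMPFILE=SOME_DUMMY_PART_FULL.DMP \
          TABLES=SOME_DUMMY_PART VERSION=10.2.0.3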

Tuesday Feb 02, 2010

Extracting DDL Statements from a PeopleSoft Data Mover exported DAT file

Case in hand: given a PeopleSoft Data Mover exported data file (a db or dat file), how does one extract the DDL statements [from that data file] that get executed as part of Data Mover's data import process?

Here is a quick way to do it:

  1. Insert the SET EXTRACT statements in the Data Mover script (DMS) before the IMPORT .. statement.

    eg.,
    
    % cat /tmp/retrieveddl.dms
    
    ..
    SET EXTRACT OUTPUT /tmp/ddl_stmts.log;
    SET EXTRACT DDL;
    ..
    
    IMPORT *;
    
    

    The SET EXTRACT OUTPUT statement must appear before any other SET EXTRACT statements; see the complete script sketch after these steps.

  2. Run the Data Mover utility with the modified DMS script as an argument.

    eg., OS: Solaris

    
    % psdmtx -CT ORACLE -CD NAP11 -CO NAP11 -CP NAP11 -CI people -CW peop1e -FP /tmp/retrieveddl.dms
    
    

    On successful completion, you will find the DDL statements in the /tmp/ddl_stmts.log file.
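
Putting the two steps together, a complete script might look like the following sketch; the SET LOG and SET INPUT lines are assumptions modeled on a typical Data Mover import script, and all paths are illustrative:

    % cat /tmp/retrieveddl.dms
    SET LOG /tmp/retrieveddl.log;
    SET INPUT /tmp/exported_data.dat;
    SET EXTRACT OUTPUT /tmp/ddl_stmts.log;
    SET EXTRACT DDL;
    IMPORT *;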

Check chapter 2, "Using PeopleSoft Data Mover," in the Enterprise PeopleTools x.xx PeopleBook: Data Management document for more ideas.

---

Updated 07/16/2010:

It appears PeopleSoft introduced a bug in the Data Mover functionality on *NIX platforms somewhere in the PeopleTools 8.49 releases. If Data Mover repeatedly fails with "Error: Unable to open OUTPUT: " when extracting statements or actions using "SET EXTRACT", run the same DMS script on any Windows system as a workaround. For more information, check Oracle Support Document "E-DM: 'Error: Unable to open OUTPUT:' when attempting to extract DDL with Data Mover (Doc ID 887792.1)". From the same document: the fix for this bug on *NIX platforms is targeted for a release after PeopleTools 8.51.
