When Conventional Thinking Fails: A Performance Case Study in Order Management Workflow Customization


*Small things put together can achieve great things.* Think about it...

Order management workflow interacts with a variety of components, such as core workflow, order management, and the pricing application call interfaces. Customizing it without understanding the implications can be a double-edged sword, and the nature of the customization may unwittingly worsen the situation. An unfortunate combination of these aspects can produce a problem that is very difficult to diagnose and troubleshoot, as this case study demonstrates.


This article is essentially the distilled wisdom of a severity 1 situation at a client site that was unable to ship ordered items to its customers because the OM order line workflow background process had a severe performance issue. The business impact was tremendous: $1.5 to $2 million worth of shipping and invoicing was being held up.

Many a time, performance troubleshooting requires some functional insight as well. TKPROF and the 10046 trace are not a panacea for all performance issues. In this case, some insight into the nature of the Order Management APIs was also required to get traction on the performance problem. A painful discovery path followed, riddled with bumps and insightful findings. The troubleshooting approach was genuinely out of the box: it involved breaking down the code being executed and applying logical deduction.


The generic learnings from the ordeal are presented in this case study. It is hoped that they will help the Oracle Applications user community be conscious of the hidden implications of customizing OM workflow.

Summary of learnings from the case study:



  • Don't define expensive custom activities after the START_FULFILLMENT step in OM workflow. Doing so magnifies the performance hit many times over, especially when the Oracle Configurator and Order Management modules are used in tandem. (Better still, avoid expensive custom workflow activities altogether.)
  • Prefer batch processing over piecemeal processing, especially when an API provides for it. This eliminates most of the repetitive per-call processing.

  • The whitebox (drill-down) approach works for taking apart a baffling performance problem. Simulating the problem with a test case usually leads to the heart of the performance issue more reliably.

  • Gathering extensive debug output and traces is great, but only when it is known what to look for. Asking the right probing questions is crucial. Question, question, question.

  • A well-thought-out plan with minimal tracing and a drill-down approach can bring better results than a shotgun (or blunderbuss) approach.

  • Sometimes, high-level functional knowledge of the processing being done can be very useful in understanding the nature of the problem. A balance between strictly technical and purely functional knowledge can be fruitful in solving performance issues.

Here's how it was done...

