Saturday Dec 15, 2012

B2B Agreement Life-Cycle Management, Part 1 - Best Practices for High Volume CPA Import Operations with ebXML

Introduction

This is Part 1 of the B2B Life-Cycle Management Series. The series will discuss best practices around various aspects of the complete B2B agreement life cycle. This post, the first part, covers the import of data artifacts into the repository. The specific use case discussed here is CPAs for ebXML, but the concepts are fairly generic and the process can be extended to any document or protocol.

Background

B2B 11g supports the ebXML messaging protocol, and multiple CPAs can be imported via command-line utilities.

This note highlights one aspect of the best practices for CPA import, when a large number of CPAs, in excess of several hundred, needs to be maintained within the B2B repository.

Symptoms

Importing a CPA is usually a 2-step process: first, create a soa.zip file with the b2bcpaimport utility based on a CPA properties file, and then use b2bimport to import that file into the B2B repository. The commands are provided below:

  1. ant -f ant-b2b-util.xml b2bcpaimport -Dpropfile="<Path to cpp_cpa.properties>" -Dstandard=true
  2. ant -f ant-b2b-util.xml b2bimport -Dlocalfile=true -Dexportfile="<Path to soa.zip>" -Doverwrite=true

Usually the first command completes fairly quickly regardless of the number of CPAs in the repository. However, as the number of trading partners within the repository grows, the time to complete the second command can go up to ~30 secs per operation. This adds up to a significant amount of time if hundreds of CPAs need to be imported into a production system within a limited downtime or maintenance window.

Remedy

In situations where there is a large number of entries to be imported, it is best to set up a staging environment and run the import operation for each individual CPA against an empty repository. Since this is done in an empty repository, the time taken for completion should be reasonable.

After all the partner profiles have been imported, a full repository export can be taken to capture the metadata for all the entries in one file. 

When this single file with all the partner entries is imported into a loaded repository, the total time taken to import all the CPAs drops dramatically.
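
To make the end-to-end flow concrete, here is a minimal sketch of the staging approach using the same ant-b2b-util.xml targets shown above. The loop, the file paths, and the b2bexport flags are illustrative assumptions rather than commands taken from this post; check the B2B command-line documentation for the exact options in your release.

  # On the empty staging instance: import each CPA individually
  # (paths are hypothetical; the soa.zip location depends on the CPA properties file)
  for PROPFILE in /stage/cpa_props/*.properties; do
      ant -f ant-b2b-util.xml b2bcpaimport -Dpropfile="$PROPFILE" -Dstandard=true
      ant -f ant-b2b-util.xml b2bimport -Dlocalfile=true -Dexportfile="/tmp/soa.zip" -Doverwrite=true
  done

  # Export the full staging repository into a single file
  ant -f ant-b2b-util.xml b2bexport -Dexportfile="/stage/all_partners.zip" -Dlocalfile=true

  # During the maintenance window, import that single file into the loaded production repository
  ant -f ant-b2b-util.xml b2bimport -Dlocalfile=true -Dexportfile="/stage/all_partners.zip" -Doverwrite=true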

Results

Let us look at the numbers to see the benefit of this approach. With a pre-loaded repository of ~400 partners, each individual import takes ~30 secs. So, importing another 100 partners one by one would take ~50 minutes (100 times ~30 secs). On the other hand, if we prepare a repository export file of the same 100 partners in a staging environment beforehand, importing that single file takes about ~5 mins.

The total processing time for loading the metadata, especially in a production environment, can thus be shortened by almost a factor of 10.

Summary

The following diagram summarizes the entire approach and process.

Acknowledgements

The material posted here has been compiled with help from the B2B Engineering and Product Management teams.

Friday Jun 29, 2012

BAM design pointers

While working recently with a large Oracle customer on SOA and BAM, I discovered that some BAM best practices are not quite as well known as I had always assumed! There is a doc bug filed to formally incorporate those learnings, but here are a few notes.

EMS-DO parity

When using an EMS (Enterprise Message Source) as a BAM feed, the best practice is to use one EMS to write to one Data Object. There is a possibility of collisions and duplicates when multiple EMS write to the same row of a DO at the same time. This customer had 17 EMS writing to one DO at the same time. Every sensor in their BPEL process wrote to one JMS topic, but the topic was read by 17 EMS, one per sensor. They then used XSL within BAM to transform the payload into the BAM DO format. So, for a given BPEL instance, 17 sensors fired, populated 1 JMS topic, which was consumed by 17 EMS, which in turn wrote to 1 Data Object. (You can imagine what would happen in later versions of the application that need to send more information to BAM!)

We modified their design to use one master XSL, keyed on the sensor name, for all sensors relating to a given DO (say, Data Object 'Orders'), and were thus able to reduce the 17 EMS to 1.

For those of you wondering how squeaky clean this design is, you are right! It is indeed not squeaky clean, and that brings us to yet another 'inferred' best practice. (I try very hard not to state the obvious in my blogs, with the hope that every time I blog it is very useful, but this one is an exception.)

Transformations and Calculations

It is optimal to do transformations within an engine like BPEL. Not only does this provide modelling ease with a nice GUI XSL mapper in JDeveloper, but the XSL engine in BPEL is also quite efficient at runtime. Doing XSL transformations in BAM, therefore, is not prudent.

The same is true for any non-trivial calculations. It is best to do all transformations and calculations, and to sanitize the data, in a BPEL or similar layer, and then send the result to BAM (via JMS, WS, etc.). This leaves Oracle BAM with just the function it is best suited for: report rendering and the mechanics of real-time reporting.

All nulls are not created equal

Here is yet another possibly well-known fact, but it is worth reiterating.

For an EMS with an Upsert operation:

a) If empty tags or tags with no value are sent, like <Tag1/> or <Tag1></Tag1>, the DO field will be overwritten with --null--.
b) If empty tags are suppressed, i.e. not generated at all, the corresponding DO field will NOT be overwritten. The field will keep whatever value existed previously.

For an EMS with an Insert operation, both empty tags and missing tags result in --null-- being written to the DO.
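
As a quick illustration, consider a hypothetical Upsert EMS payload mapped to a DO with an OrderId and a Status field (the element names here are made up, not from any real interface):

  <Order><OrderId>123</OrderId><Status/></Order>    -> Status field is overwritten with --null--
  <Order><OrderId>123</OrderId></Order>             -> Status field keeps its previous value

With an Insert operation, both payloads would produce a new row with --null-- in the Status field.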

Hope this helps ..

Happy 4th!

About


This is the blog for the Oracle FMW Architects team, fondly known as the A-Team. The A-Team is the central, technical, outbound team within the FMW Development organization, working with Oracle's largest and most important customers. We support Oracle Sales, Consulting and Support when deep technical and architectural help is needed from Oracle Development.
Primarily this blog is tailored to SOA issues (BPEL, OSB, BPM, Adapters, CEP, B2B, JCAP) that are encountered by our team. Expect real solutions to customer problems encountered during customer engagements.
We will highlight best practices, workarounds, architectural discussions, and discuss topics that are relevant in the SOA technical space today.
