Monday Dec 30, 2013

Going Native with JCA Adapters

Formatting JCA Adapter Binary Contents

Sometimes you just need to go native and play with binary data rather than XML.  This commonly occurs when using JCA adapters where the file to be written is in binary format, or where the TCP messages written by the Socket Adapter are in binary format.  Although the adapter has no problem converting base-64 data into raw binary, it is a little tricky to get the data into base-64 format in the first place, so this blog entry will explain how.

Adapter Creation

When creating most adapters (application & DB being the exceptions) you have the option of choosing the message format.  By making the message format “opaque” you are telling the adapter wizard that the message data will be provided as a base-64 encoded string; the adapter will convert this to binary and deliver it.

This results in a WSDL message defined as shown below:

<wsdl:types>
<schema targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/opaque/"
        xmlns="http://www.w3.org/2001/XMLSchema" >
  <element name="opaqueElement" type="base64Binary" />
</schema>
</wsdl:types>
<wsdl:message name="Write_msg">
    <wsdl:part name="opaque" element="opaque:opaqueElement"/>
</wsdl:message>

The Challenge

The challenge now is to convert our data into a base-64 encoded string.  For this we have to turn to the service bus and MFL.

Within the service bus we use the MFL editor to define the format of the binary data.  In our example we have variable-length strings that start with a 1-byte length field, as well as 32-bit integers and 64-bit floating point numbers.

The example below shows a sample MFL file to describe the above data structure:

<?xml version='1.0' encoding='windows-1252'?>
<!DOCTYPE MessageFormat SYSTEM 'mfl.dtd'>
<!--   Enter description of the message format here.   -->
<MessageFormat name='BinaryMessageFormat' version='2.02'>
    <FieldFormat name='stringField1' type='String' delimOptional='y' codepage='UTF-8'>
        <LenField type='UTinyInt'/>
    </FieldFormat>
    <FieldFormat name='intField' type='LittleEndian4' delimOptional='y'/>
    <FieldFormat name='doubleField' type='LittleEndianDouble' delimOptional='y'/>
    <FieldFormat name='stringField2' type='String' delimOptional='y' codepage='UTF-8'>
        <LenField type='UTinyInt'/>
    </FieldFormat>
</MessageFormat>

Note that we can define the endianness of the multi-byte numbers; in this case they are specified as little-endian (Intel format).
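As a quick illustration of what little-endian means for the wire format (plain Java, not part of the sample projects), the 32-bit value 0x01020304 is written least significant byte first:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Illustration only: little-endian layout of a 32-bit integer.
public class EndianDemo {
    public static void main(String[] args) {
        byte[] bytes = ByteBuffer.allocate(4)
                                 .order(ByteOrder.LITTLE_ENDIAN)
                                 .putInt(0x01020304)   // decimal 16909060
                                 .array();
        for (byte b : bytes) {
            System.out.printf("%02x ", b);             // prints: 04 03 02 01
        }
    }
}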

I also created an XML version of the MFL that can be used in interfaces.

The XML version can then be imported into a WSDL document to create a web service.

Full Steam Ahead

We now have all the pieces we need to convert XML to binary and deliver it via an adapter using the process shown below:

  • We receive the XML request; in the sample code this is delivered as a web service.
  • We then convert the request data into MFL format XML using an XQuery and store the result in a variable (mflVar).
  • We then convert the MFL formatted XML into binary data (internally this is held as a java byte array) and store the result in a variable (binVar).
  • We then convert the byte array to a base-64 string using javax.xml.bind.DatatypeConverter.printBase64Binary and store the result in a variable (base64Var).
  • Finally we replace the original $body contents with the output of an XQuery that matches the XML format expected by the adapter.

The diagram below shows the OSB pipeline that implements the above.
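The pipeline itself is assembled in the OSB graphical editor, but the transformation it performs can be sketched in standalone Java.  The class below is purely illustrative (it is not part of the sample code): it builds the length-prefixed strings and little-endian numbers described by the MFL and then Base64-encodes the byte array, which is the combined job of the XQueries, the MFL transform and the Java callout in the pipeline.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;
import javax.xml.bind.DatatypeConverter;

// Illustrative sketch of the record layout defined in the MFL file:
// 1-byte length + UTF-8 string, little-endian int, little-endian double,
// 1-byte length + UTF-8 string, followed by Base64 encoding for the opaque element.
public class OpaquePayloadSketch {
    public static void main(String[] args) {
        byte[] s1 = "First String".getBytes(StandardCharsets.UTF_8);
        byte[] s2 = "Second String".getBytes(StandardCharsets.UTF_8);
        ByteBuffer buf = ByteBuffer.allocate(1 + s1.length + 4 + 8 + 1 + s2.length)
                                   .order(ByteOrder.LITTLE_ENDIAN);
        buf.put((byte) s1.length).put(s1);   // stringField1 with 1-byte length prefix
        buf.putInt(16909060);                // intField, 0x01020304
        buf.putDouble(1.5);                  // doubleField, 8 bytes little-endian
        buf.put((byte) s2.length).put(s2);   // stringField2 with 1-byte length prefix
        // The same conversion the Java callout performs before the adapter is invoked.
        System.out.println(DatatypeConverter.printBase64Binary(buf.array()));
    }
}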

A Wrinkle

Unfortunately we can only call static Java methods that reside in a JAR file imported into the service bus, so we have to provide a wrapper for the printBase64Binary call.  The Java code below provides this wrapper:

package antony.blog;

import javax.xml.bind.DatatypeConverter;

public class Base64Encoder {
    public static String base64encode(byte[] content) {
        return DatatypeConverter.printBase64Binary(content);
    }
    public static byte[] base64decode(String content) {
        return DatatypeConverter.parseBase64Binary(content);
    }
}
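
Before importing the JAR into the service bus the wrapper can be checked from a simple main method; the class name below is purely illustrative:

import antony.blog.Base64Encoder;

// Standalone check of the wrapper; not part of the OSBUtils project.
public class Base64EncoderCheck {
    public static void main(String[] args) {
        byte[] data = {0x01, 0x02, 0x03, 0x04};
        String encoded = Base64Encoder.base64encode(data);
        System.out.println(encoded);                                     // prints AQIDBA==
        System.out.println(Base64Encoder.base64decode(encoded).length);  // prints 4
    }
}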

Wrapping Up

Sample code is available here and consists of the following projects:

  • BinaryAdapter – JDeveloper SOA Project that defines the JCA File Adapter
  • OSBUtils – JDeveloper Java Project that defines the Java wrapper for DatatypeConverter
  • BinaryFileWriter – Eclipse OSB Project that includes everything needed to try out the steps in this blog.

The OSB project needs to be customized to have the logical directory name point to something sensible.  The project can be tested using the normal OSB console test screen.

The following sample input (note 16909060 is 0x01020304)

<bin:OutputMessage xmlns:bin="http://www.example.org/BinarySchema">
    <bin:stringField1>First String</bin:stringField1>
    <bin:intField>16909060</bin:intField>
    <bin:doubleField>1.5</bin:doubleField>
    <bin:stringField2>Second String</bin:stringField2>
</bin:OutputMessage>

Generates the following binary data file, displayed using “hexdump -C”.  In the dump you can pick out the little-endian int (04 03 02 01), the little-endian double (00 00 00 00 00 00 f8 3f) and the two strings, each preceded by its 1-byte length (0c and 0d).

$ hexdump -C 2.bin
00000000  0c 46 69 72 73 74 20 53  74 72 69 6e 67 04 03 02  |.First String...|
00000010  01 00 00 00 00 00 00 f8  3f 0d 53 65 63 6f 6e 64  |........?.Second|
00000020  20 53 74 72 69 6e 67                              | String|

Although we used a web service writing through to a file adapter, we could equally well have used the socket adapter to send the data to a TCP endpoint.  Similarly the source of the data could be anything.  The same principle can be applied to decode binary data: just reverse the steps and use the parseBase64Binary method instead of printBase64Binary.
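As an illustration of that reverse direction (again outside OSB, where the MFL transform would do this work for you), the decoded bytes can be read back in the order the MFL defines them:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;
import javax.xml.bind.DatatypeConverter;

// Illustrative only: decode the Base64 payload and read the fields back.
public class OpaquePayloadReader {
    public static void main(String[] args) {
        byte[] raw = DatatypeConverter.parseBase64Binary(args[0]);
        ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
        String stringField1 = readString(buf);
        int intField = buf.getInt();
        double doubleField = buf.getDouble();
        String stringField2 = readString(buf);
        System.out.println(stringField1 + ", " + intField + ", " + doubleField + ", " + stringField2);
    }

    // Reads a string preceded by a 1-byte unsigned length field.
    private static String readString(ByteBuffer buf) {
        byte[] bytes = new byte[buf.get() & 0xFF];
        buf.get(bytes);
        return new String(bytes, StandardCharsets.UTF_8);
    }
}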

Thursday Dec 26, 2013

List Manipulation in Rules

Generating Lists from Rules

Recently I was working with a customer that wanted to use rules to do validation.  The idea was to pass a document in to the rules engine and get back a list of violations, or an empty list if there were no violations.  It turns out that there were a couple more steps required than I expected, so I thought I would share my solution in case anyone else is wondering how to return lists from the rules engine.

The Scenario

For the purposes of this blog I modeled a very simple shipping company document that has two main parts.   The Package element contains information about the actual item to be shipped, its weight, type of package and destination details.  The Billing element details the charges applied.

For the purpose of this blog I want to validate the following:

  • A residential surcharge is applied to residential addresses.
  • A residential surcharge is not applied to non-residential addresses.
  • The package is of the correct weight for the type of package.

The Shipment element is sent to the rules engine and the rules engine replies with a ViolationList element that has all the rule violations that were found.

Creating the Return List

We need to create a new ViolationList within rules so that we can return it from within the decision function.  To do this we create a new global variable – I called it ViolationList – and initialize it.  Note that I also had some globals that I used to allow changing the weight limits for different package types.

When the rules session is created it will initialize the global variables and assert the input document – the Shipment element.  However within rules our ViolationList variable has an uninitialized internal List that is used to hold the actual List of Violation elements.  We need to initialize this to an empty RL.list in the Decision Function’s “Initial Actions” section.

We can then assert the global variable as a fact to make it available to be the return value of the decision function.  After this we can now create the rules.

Adding a Violation to the List

If a rule fires because of a violation then we need to add a Violation element to the list.  The easiest way to do this, without having the rule check the ViolationList directly, is to create a function that adds the Violation to the global variable ViolationList.

The function creates a new Violation and initializes it with the appropriate values before appending it to the list within the ViolationList.

When a rule fires it is then just necessary to call the function to add the violation to the list.

In the example above, if the address is a residential address and the surcharge has not been applied, then the function is called with an appropriate error code and message.
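Rules functions are defined in the dictionary editor rather than in source files, but the pattern is easy to see as plain Java.  The sketch below is only an analogue of the global variable and function described above (all names are invented for illustration, not the generated fact types):

import java.util.ArrayList;
import java.util.List;

// Plain-Java analogue of the pattern: a global list fact plus a helper
// that appends a new Violation each time a rule fires.
public class ViolationListAnalogue {
    public static class Violation {
        public final String errorCode;
        public final String message;
        public Violation(String errorCode, String message) {
            this.errorCode = errorCode;
            this.message = message;
        }
    }

    // Corresponds to the ViolationList global variable, initialized to an empty list.
    private final List<Violation> violations = new ArrayList<Violation>();

    // Corresponds to the function called from a rule's action.
    public void addViolation(String errorCode, String message) {
        violations.add(new Violation(errorCode, message));
    }

    public List<Violation> getViolations() {
        return violations;
    }
}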

How it Works

Each time a rule fires we can add the violation to the list by calling the function.  If multiple rules fire then we will get multiple violations in the list.  We can access the list from a function because it is a global variable.  Because we asserted the global variable as a fact in the decision function’s initial actions it is picked up by the decision function as a return value.  When all possible rules have fired the decision function will return all asserted ViolationList elements, which in this case will always be exactly one because we only assert it in the initial actions.

What Doesn’t Work

A return from a decision function is always a list of the element you specify, so you may be tempted to just assert individual Violation elements and get those back as a list.  That works if there is at least one element in the list, but the decision function must always return at least one element, so if there are no violations an error is thrown.

Alternative

Instead of having a completely separate return element you could make the ViolationList part of the input element and then return the input element from the decision function.  This would work, but you would then be copying most of the input variables back into the output variable.  I prefer a cleaner, more function-like interface that makes it easier to handle the response.

Download

Hope this helps someone.  A sample composite project is available for download here.  The composite includes some unit tests.  You can run these from the EM console and then look at the inputs and outputs to see how things work.

Tuesday Dec 24, 2013

Cleaning Up After Yourself

Maintaining a Clean SOA Suite Test Environment

A fun blog entry with Fantasia animated gifs got me thinking, like Mickey, about how nice it would be to automate clean-up tasks.

I don’t have a sorcerer’s castle to clean up, but I often have a test environment which I use to run tests; after fixing the problems that the tests uncovered I want to run them again.  The problem is that all the data from my previous test runs is still there.

Now in the past I used VirtualBox snapshots to roll back to a clean state, but this has the problem that it not only loses the environment changes I want to get rid of, such as data inserted into tables, it also gets rid of changes I want to keep, such as WebLogic configuration changes and new shell scripts.  So like Mickey I went in search of some magic to help me.

Cleaning Up SOA Environment

My first task was to clean up the SOA environment by deleting all instance data from the tables.  Now I could use the purge scripts to do this, but that would still leave me with running instances, for example 800 Human Workflow Tasks that I don’t want to deal with.  So I used the new truncate script to take care of this.  Basically this removes all instance data from your SOA Infrastructure, whether or not the data is live.  This can be run without taking down the SOA Infrastructure (although if you do get strange behavior you may want to restart SOA).  Some statistics, such as service and reference statistics, are kept since server startup, so you may want to restart your server to clear that data.  A sample script to run the truncate SQL is shown below.

#!/bin/sh
# Truncate the SOA schemas, does not truncate BAM.
# Use only in development and test, not production.

# Properties to be set before running script
# SOAInfra Database SID
DB_SID=orcl
# SOA DB Prefix
SOA_PREFIX=DEV
# SOAInfra DB password
SOAINFRA_PASSWORD=welcome1
# SOA Home Directory
SOA_HOME=/u01/app/fmw/Oracle_SOA1

# Set DB Environment
. oraenv << EOF
${DB_SID}
EOF

# Run Truncate script from directory it lives in
cd ${SOA_HOME}/rcu/integration/soainfra/sql/truncate

# Run the truncate script
sqlplus ${SOA_PREFIX}_soainfra/${SOAINFRA_PASSWORD} @truncate_soa_oracle.sql << EOF
exit
EOF

After running this script all your SOA composite instances and associated workflow instances will be gone.

Cleaning Up BAM

The above example shows how easy it is to get rid of all the runtime data in your SOA repository, however if you are using BAM you still have all the contents of your BAM objects from previous runs.  To get rid of that data we need to use BAM ICommand’s clear command as shown in the sample script below:

#!/bin/sh
# Set software locations
FMW_HOME=/home/oracle/fmw
export JAVA_HOME=${FMW_HOME}/jdk1.7.0_17
BAM_CMD=${FMW_HOME}/Oracle_SOA1/bam/bin/icommand
# Set objects to purge
BAM_OBJECTS="/path/RevenueEvent /path/RevenueViolation"

# Clean up BAM
for name in ${BAM_OBJECTS}
do
  ${BAM_CMD} -cmd clear -name ${name} -type dataobject
done

After running this script all the rows of the listed objects will be gone.

Ready for Inspection

Unlike the hapless Mickey, our clean up scripts work reliably and do what we want without unexpected consequences, like flooding the castle.

Friday Dec 20, 2013

Supporting the Team

SOA Support Team Blog

Some of my former colleagues in support have created a blog to help answer common problems for customers.  One way they are doing this is by creating better landing zones within My Oracle Support (MOS).  I just used the blog to locate the landing zone for database related issues in SOA Suite.  I needed to get the purge scripts working on 11.1.1.7 and I couldn’t find the patches needed to do that.  A quick look on the blog and I found a suitable entry that directed me to the Oracle Fusion Middleware (FMW) SOA 11g Infrastructure Database: Installation, Maintenance and Administration Guide (Doc ID 1384379.1) in MOS.  Lots of other useful stuff on the blog so stop by and check it out, great job Shawn, Antonella, Maria & JB.

Thursday Dec 19, 2013

$5 eBook Bonanza

Packt eBooks $5 Offer

Packt Publishing just told me about their Christmas offer, get eBooks for $5.

From December 19th, customers will be able to get any eBook or Video from Packt for just $5. This offer covers a myriad of titles in the 1700+ range where customers will be able to grab as many as they like until January 3rd 2014 – more information is available at http://bit.ly/1jdCr2W

If you haven’t bought the SOA Developers Cookbook then now is a great time to do so!

About

Musings on Fusion Middleware and SOA.  Antony works with customers across the US and Canada in implementing SOA and other Fusion Middleware solutions.  Antony is the co-author of the SOA Suite 11g Developers Cookbook, the SOA Suite 11g Developers Guide and the SOA Suite Developers Guide.
