Tuesday Oct 02, 2012

OpenWorld Day 1

A Day in the Life of an OpenWorld Attendee Part I

Lots of people are blogging insightfully about OpenWorld so I thought I would provide some non-insightful remarks to buck the trend!

With 50,000 attendees I didn’t expect to bump into too many people I knew. Boy, was I wrong!  I walked into the registration area and was immediately hailed by a couple of customers I had worked with a few months ago.  Moving to the employee registration area in a different hall I bumped into a colleague from the UK who was also registering.  As soon as I got my badge I bumped into a friend from Ireland!  So maybe OpenWorld isn’t so big after all!

First port of call was Larry’s keynote.  As always Larry was provocative and thought-provoking.  His key points were announcing the Oracle cloud offering in IaaS, PaaS and SaaS, pointing out that Fusion Apps are cloud enabled, and finally announcing the 12c Database, making a big play of its new multi-tenancy features.  His contention was that multi-tenancy will simplify cloud development and provide better security by providing DB level isolation for applications and customers.

Next day, Monday, was my first full day at OpenWorld.  The first session I attended was on monitoring of OSB, a very interesting presentation on the benefits achieved by an Illinois-area telco, US Cellular.  Great discussion of why they bought the SOA Management Packs and the benefits they are already seeing from their investment in terms of improved provisioning and time to market, as well as better performance insight and assistance with capacity planning.

Craig Blitz provided a nice walkthrough of where Coherence has been and where it is going.

Last night I attended the BOF on Managed File Transfer where Dave Berry replayed Oracle’s thoughts on providing dedicated Managed File Transfer as part of the 12c SOA release.  Dave laid out the perceived requirements and solicited feedback from the audience on what, if anything, was missing.  He also demoed an early version of the functionality that would simplify setting up MFT in SOA Suite and make tracking activity much easier.

So much for Day 1.  I also ran into scores of old friends and colleagues and had a pleasant dinner with my friend from Ireland where I caught up on the latest news from Oracle UK.  Not bad for Day 1!

Friday Jul 13, 2012

Deploying Fusion Order Demo on 11.1.1.6

How to Deploy Fusion Order Demo on SOA Suite 11.1.1.6

We needed to build a demo for a customer, so why not use the Fusion Order Demo (FOD) and modify it to do some extra things?  Great idea, I said, let me install it on one of my Linux servers…

Turns out there are a few gotchas, so here is how I installed it on a Linux server with JDeveloper on my Windows desktop.

Task 1: Install Oracle JDeveloper Studio

I already had JDeveloper 11.1.1.6 with SOA extensions installed so this was easy.

Task 2: Install the Fusion Order Demo Application

The first thing to do is to obtain the latest version of the demo from OTN; I obtained the R1 PS5 release.

Gotcha #1 – my WinZip wouldn’t unzip the file, so I had to use 7-Zip.

Task 3: Install Oracle SOA Suite

On the domain modify the setDomainEnv script by adding “-Djps.app.credential.overwrite.allowed=true” to JAVA_PROPERTIES and restarting the Admin Server.

Also set the JAVA_HOME variable and add Ant to the path; a sketch of both changes is shown below.
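
For reference, here is a rough sketch of those changes.  The exact layout of setDomainEnv.sh varies between releases, and the JDK and Ant locations shown are placeholders for wherever they live on your server:

# In $DOMAIN_HOME/bin/setDomainEnv.sh - append the property to the existing JAVA_PROPERTIES
JAVA_PROPERTIES="${JAVA_PROPERTIES} -Djps.app.credential.overwrite.allowed=true"
export JAVA_PROPERTIES

# In the shell used to run the FOD ant scripts (placeholder paths)
export JAVA_HOME=/usr/java/jdk1.6.0_24
export ANT_HOME=/opt/apache-ant
export PATH=$JAVA_HOME/bin:$ANT_HOME/bin:$PATH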

I created a domain with separate SOA and BAM servers and also set up the Node Manager to make it easier to stop and start components.

Taking a Look at the WebLogic Fusion Order Demo Application

Note that when opening the composite you will get warnings because the components are not yet deployed to MDS.

Deploying Fusion Order Demo

Task 1: Create a Connection to an Oracle WebLogic Server

If some tests complete when you test the connection to the WebLogic domain but other tests fail, for example the JSR-88 tests, then you may need to go into the console and, under each server’s Configuration->General->Advanced settings, set the “External Listen Address” to be the name that JDeveloper uses to access the managed server.

Task 2: Create a Connection to the Oracle BAM Server

I can’t understand why customers wouldn’t want to use BAM.  Monitor Express makes it a matter of a few clicks to provide real time process status information to the business.

Oh yes!  I remember now, several customers’ IT staff have told me they don’t want the business seeing this data because they will hassle the IT department if something goes wrong, and BAM lets them see it going wrong in real time…

Task 3: Install the Schema for the Fusion Order Demo Application

When editing the Infrastructure->MasterBuildScript->Resources build.properties make sure that you set jdeveloper.home to the jdeveloper directory underneath the directory that you installed JDeveloper into.  I installed JDeveloper Studio into “C:\JDev11gPS5” so my jdeveloper.home is “C:\JDev11gPS5\jdeveloper”.

Gotcha #2 – the ant script throws an error partway through but does not report it at the end, so check carefully for the following error:

oracle.jbo.JboException: JBO-29000: Unexpected exception caught: java.lang.NoClassDefFoundError, msg=oracle/jdbc/OracleClob

This occurs because the build.xml does not include the ojdbc6dms library, which has CLOB support, so in JDeveloper add it to the “oracle.jdbc.path” path in the Infrastructure->DatabaseSchema->Resources build.xml file:

<path id="oracle.jdbc.path">
  <fileset dir="${jdeveloper.home}/../wlserver_10.3/server/lib">
    <include name="ojdbc6.jar"/>
  </fileset>
  <fileset dir="${jdeveloper.home}/../oracle_common/modules/oracle.jdbc_11.1.1">
    <include name="ojdbc6dms.jar"/>
  </fileset>
</path>

Rerun the ant script from Infrastructure/Ant with a target of “buildAll” and it should now complete without errors.
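
If you prefer the command line to running the target from inside JDeveloper, the rough equivalent is shown below, assuming Ant is on your path and that you change to the directory containing the Ant project’s build.xml (<FOD_HOME> is a placeholder for wherever you unzipped the demo):

cd <FOD_HOME>/Infrastructure/Ant
ant buildAll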

Task 4: Set the Configuration Property for the Store Front Module

Nothing to watch out for here.

Task 5: Edit the Database Connection

Nothing to watch out for here.

Task 6: Deploy the Store Front Module

There is an additional step when deploying: you will be asked for an MDS repository to use.  It is best to use the MDS-SOA repository and put the content in its own partition.

Gotcha #3 – when prompted, select the mds-soa MDS repository and choose the soa partition.  Note that this is an MDS partition, not a SOA partition.

Note that when you deploy the StoreFrontServiceSDO_Services application it will populate the local WebLogic LDAP with the demo users.  If this step fails it will be because you forgot to set the “-Djps.app.credential.overwrite.allowed=true” parameter and restart the Admin Server.

Task 7: Deploy the WebLogic Fusion Order Demo Application

Set up the environment for BAM.

When editing the WebLogicFusionOrderDemo->bin->Resources build.properties make sure that you set oracle.home to the jdeveloper directory underneath the directory that you installed JDeveloper into.  I installed JDeveloper Studio into “C:\JDev11gPS5” so my oracle.home is “C:\JDev11gPS5\jdeveloper”.

Gotcha #5a – Make sure that you create directories on the server for the FileAdapter to use as a file directory and a separate control directory and make sure you set the corresponding properties in the build.properties file:

  • orderbooking.file.adapter.dir
  • orderbooking.file.adapter.control.dir

Gotcha #5b – Also make sure you set the following properties in the build.properties file:

  • soa.domain.name
  • managed.server.host
  • managed.server.rmi.port
  • soa.db.username
  • soa.db.password
  • soa.db.connectstring

Note that the soa.server.oracle.home property must be set to the ORACLE_SOA_HOME (usually Oracle_SOA1 under the MW_HOME) on the server.

Gotcha #6 – I found that unless I went into the console to each server’s Configuration->Protocols->IIOP->Advanced settings and set the “Default IIOP Username” and “Default IIOP Password” to be the weblogic user, the deployment failed.

Gotcha #7 – when deploying BAM objects in seedBAMServerObjects activity I got an exception “java.lang.NoClassDefFoundError: org/apache/commons/codec/binary/Base64” which is caused because the BAM installation under JDeveloper does not have all the required libraries.  To fix this copy the commons-codec-1.3.jar file from the server machine ORACLE_SOA_HOME/bam/modules/oracle.bam.third.party_11.1.1 to the JDev machine ORACLE_JDEV_HOME/bam/modules/oracle.bam.third.party_11.1.1.
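
The copy itself can be scripted from the JDeveloper machine if you have an scp client available (pscp or Cygwin scp, for example); a hedged sketch, where soahost and the oracle account are placeholders and the two home directories are the ones referred to above:

scp oracle@soahost:<ORACLE_SOA_HOME>/bam/modules/oracle.bam.third.party_11.1.1/commons-codec-1.3.jar <ORACLE_JDEV_HOME>/bam/modules/oracle.bam.third.party_11.1.1/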

Gotcha #8 – when deploying BAM objects in the seedBAMServerObjects activity I got an error “BAM-02440: ICommand is unable to connect to the Oracle BAM server because user credentials (username/password) have not been specified.”.  The quick way to fix this is to change to the directory where the import script was created on the JDeveloper machine (ORACLE_JDEV_HOME\bam\dataObjects\load) and run the load script after setting the JAVA_HOME:

..\..\bin\icommand -CMDFILE ImportFODBamObjects.xml

I am sure if I spent more time in the ant scripts I could have found what was wrong with the script for deploying this.

Running Fusion Order Demo

You are now ready to place an order through the frontend app at http://soahost:soaport/StoreFrontModule/faces/home.jspx.  The BAM dashboard is available for you to monitor the progress of your order and EM is all set to let you monitor the health of the processes.  Enjoy studying a relatively complex example that demonstrates many best practices such as use of MDS.

Thursday May 24, 2012

Whose Port Is It?

Who Owns What Port?

It is not uncommon to be unable to start a server process because some other process is holding onto a network port that is required by the server.  The question is how do you find the offending process?  I thought I would identify some of the commands I use to track down wayward port usage.

Identify the Conflict

The first thing to do is to identify the port that is being used.  Hopefully your log file will indicate which port the server process was unable to obtain.  Even if it did not identify the port, as long as you know which ports the server requires you can use the first of my helpful commands:

Windows:  netstat -anop tcp
Linux:    netstat -lnt --program

The Windows version lists all “-a” network sockets using TCP/IP v4 “-p tcp”.  To make it easier to find the listening port I had the addresses listed in numeric format “-n” rather than using abbreviations.  Other possible protocols are TCP/IP v6 “tcpv6”, UDP “udp” and UDP v6 “udpv6”.  Finally I had the netstat command print out the process ID “-o” of the process using the port.

The Linux version is slightly different in that it lists only listening ports “-l” in numeric form “-n” for the TCP protocol “-t”, both V4 and V6.  For the UDP protocol use “-u”.  The process ID and program name are also displayed “--program”.  Note that this is best run as root because you need root privileges to have netstat show you the pid of a process you don’t own.

Find the Culprit

Now that we know which process is holding our port in use, the next thing to do is to find out more about that process.  The second of our helpful commands shows us the command line used to launch our mischievous process.

Windows:  tasklist /FI "PID eq <PID>"
Linux:    ps -p <PID> -o args

Unfortunately I haven’t found a good way to find out the actual command line on a Windows machine.  Tasklist allows you to filter “/FI” the list of tasks to see the process name associated with a PID “PID eq <PID>”, but if that process is a service then the process name will show as “svchost.exe”.  You may be able to see more information by using Windows Process Explorer, but even that doesn’t always tell you what you need to know.

On Linux we can use the trusty ps command to find a given pid “-p <PID>” and output the command and associated command line arguments “-o args”.  From this we know exactly who is using our PID.
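
Putting the two Linux commands together, here is a small sketch of a script that, given a port number, prints the command line of whatever is listening on it (run it as root, and it assumes the net-tools netstat used above):

#!/bin/sh
# Usage: ./whoseport.sh 7001
PORT=$1
# Pull the PID out of the "PID/Program name" column for the matching local address
PID=`netstat -lnt --program 2>/dev/null | awk -v p=":$PORT" '$4 ~ p"$" { split($NF, a, "/"); print a[1]; exit }'`
if [ -n "$PID" ]; then
  ps -p "$PID" -o args
else
  echo "Nothing is listening on port $PORT"
fi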

Armed with this information we can reconfigure the errant process, shut it down or decide that we need to change the port number for our server process instead.

Wednesday Apr 25, 2012

Scripting WebLogic Admin Server Startup

How to Script WebLogic Admin Server Startup

My first car was a 14 year old Vauxhall Viva.  It is the only one of my cars that has ever been stolen, and to this day how they stole it is a mystery to me as I could never get it to start.  I always parked it pointing down a steep hill so that I was ready to jump start it!  Of course its ability to start was dramatically improved when I replaced the carburetor butterfly valve!

Getting SOA Suite or other WebLogic based systems to start can sometimes be a problem because the default WebLogic start scripts require you to stay logged on to the computer where you started the script.  Obviously this is awkward and a better approach is to run the script in the background.  This problem can be avoided by using a WLST script to start the AdminServer but that is more work, so I never bother with it.

If you just run the startup script in the background the standard output and standard error still go to the session where you started the script, not helpful if you log off and later want to see what is happening.  So the next thing to do is to redirect standard out and standard error from the script.

Finally it would be nice to have a record of the output of the last few runs of the Admin Server, but these should be purged to avoid filling up the directory.

Doing the above three tasks is the job of the script I use to start WebLogic.  The script is shown below:

Startup Script

#!/bin/sh

# SET VARIABLES

SCRIPT_HOME=`dirname $0`

MW_HOME=/home/oracle/app/Middleware

DOMAIN_HOME=$MW_HOME/user_projects/domains/dev_domain

LOG_FILE=$DOMAIN_HOME/servers/AdminServer/logs/AdminServer.out

# MOVE EXISTING LOG FILE

logrotate -f -s $SCRIPT_HOME/logrotate.status $SCRIPT_HOME/AdminServerLogRotation.cfg

#RUN ADMIN SERVER

touch $LOG_FILE

nohup $DOMAIN_HOME/startWebLogic.sh > $LOG_FILE 2>&1 &

tail -f $LOG_FILE

Explanation

Let’s walk through each section of the script.

SET VARIABLES

The first few lines of the script just set the environment.  Note that I put the output of the start script into the same location and same filename that it would go to if I used the Node Manager to start the server.  This keeps it consistent with other servers that are started by the node manager.

MOVE EXISTING LOG FILE

The next section keeps a copy of the previous output file by using the logrotate command.  This reads its configuration from the “AdminServerLogRotation.cfg” file shown below:

/home/oracle/app/Middleware/user_projects/domains/dev_domain/servers/AdminServer/logs/AdminServer.out {
  rotate 10
  missingok
}

This tells the logrotate command to keep 10 copies (rotate 10) of the log file and if there is no previous copy of the log file that is not an error condition (missingok).

The logrotate.status file is used by logrotate to keep track of what it has done.  It is ignored when the –f flag is used, causing the log file to be rotated every time the command is invoked.
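
If you want to check the rotation configuration without actually touching the log files, logrotate’s debug mode gives you a dry run (substitute your script directory for $SCRIPT_HOME when running it by hand):

logrotate -d -s $SCRIPT_HOME/logrotate.status $SCRIPT_HOME/AdminServerLogRotation.cfg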

RUN ADMIN SERVER

UPDATE: Sometimes the tail command starts before the shell has created the log file for the startWebLogic.sh command. To avoid an error in the tail command I "touch" the log file to make sure that it is there.

The final section actually invokes the standard command to start an admin server (startWebLogic.sh) and redirects the standard out and standard error to the log file.  Note that I run the command in the background and set it to ignore the death of the parent shell.

Finally I tail the log file so that the user experience is the same as running the start command directly.  However in this case if I Ctrl-C the command, only the tail will be terminated; the Admin Server will continue to run as a background process.

This approach allows me to watch the output of the AdminServer but not to shut it down if I accidentally hit Ctrl-C or close the shell window.
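
For completeness, this is how I use it.  I keep the start script, the logrotate config and the status file together in one directory, and the script name matches the startAdminServer.sh that the restart script below expects:

chmod +x startAdminServer.sh
./startAdminServer.sh
# Ctrl-C here only stops the tail; the Admin Server keeps running in the background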

Restart Script

I also have a restart script shown below:

#!/bin/sh
# SET VARIABLES
SCRIPT_HOME=`dirname $0`
MW_HOME=/home/oracle/app/Middleware
DOMAIN_HOME=$MW_HOME/user_projects/domains/dev_domain

# STOP ADMIN SERVER
$DOMAIN_HOME/bin/stopWebLogic.sh

# RUN ADMIN SERVER
$SCRIPT_HOME/startAdminServer.sh

This simply runs the stopWebLogic command followed by my start script.

Summary

The above scripts are quick and easy to put in place for the Admin Server and make the stdout and stderr logging consistent with other servers that are started from the node manager.  Now can someone help me push start my car!

Friday Mar 16, 2012

Memory Efficient Windows SOA Server

Installing a Memory Efficient SOA Suite 11.1.1.6 on Windows Server

Well 11.1.1.6 is now available for download so I thought I would build a Windows Server environment to run it.  I will minimize the memory footprint of the installation by putting all functionality into the Admin Server of the SOA Suite domain.

Required Software

  • 64-bit JDK
  • SOA Suite
    • If you want 64-bit then choose “Generic” rather than “Microsoft Windows 32bit JVM” or “Linux 32bit JVM”
    • This has links to all the required software.
    • If you choose “Generic” then the Repository Creation Utility link does not show, you still need this so change the platform to “Microsoft Windows 32bit JVM” or “Linux 32bit JVM” to get the software.
    • Similarly if you need a database then you need to change the platform to get the link to XE for Windows or Linux.

If possible I recommend installing a 64-bit JDK as this allows you to assign more memory to individual JVMs.

Oracle XE on Windows will work, but it is better if you can use a full Oracle database because the limitations of XE sometimes cause it to run out of space with large or multiple SOA deployments.

Installation Steps

The following flow chart outlines the steps required in installing and configuring SOA Suite.

The steps in the diagram are explained below.

64-bit?

Is a 64-bit installation required?  The Windows & Linux installers will install 32-bit versions of the Sun JDK and JRockit.  A separate JDK must be installed for 64-bit.

Install 64-bit JDK

The 64-bit JDK can be either Hotspot or JRockit.  You can choose either JDK 1.7 or 1.6.

Install WebLogic

If you are using 64-bit then install WebLogic using “java -jar wls1036_generic.jar”.  Make sure you include Coherence in the installation; the easiest way to do this is to accept the “Typical” installation.

SOA Suite Required?

If you are not installing SOA Suite then you can jump straight ahead and create a WebLogic domain.

Install SOA Suite

Run the SOA Suite installer and point it at the existing Middleware Home created for WebLogic.  Note to run the SOA installer on Windows the user must have admin privileges.  I also found that on Windows Server 2008R2 I had to start the installer from a command prompt with administrative privileges, granting it privileges when it ran caused it to ignore the jreLoc parameter.

Database Available?

Do you have access to a database into which you can install the SOA schema?  SOA Suite requires access to an Oracle database (it is supported on other databases but I would always use an Oracle database).

Install Database

I use an 11gR2 Oracle database to avoid XE limitations.  Make sure that you set the database character set to be unicode (AL32UTF8).  I also disabled the new security settings because they get in the way for a developer database.  Don’t forget to check that number of processes is at least 150 and number of sessions is not set, or is set to at least 200 (in the DB init parameters).

Run RCU

The SOA Suite database schemas are created by running the Repository Creation Utility.  Install the “SOA and BPM Infrastructure” component to support SOA Suite.  If you keep the schema prefix as “DEV” then the config wizard is easier to complete.

Run Config Wizard

The Config wizard creates the domain which hosts the WebLogic server instances.  To get a minimum footprint SOA installation choose the “Oracle Enterprise Manager” and “Oracle SOA Suite for developers” products.  All other required products will be automatically selected.

The “for developers” installs target the appropriate components at the AdminServer rather than creating a separate managed server to house them.  This reduces the number of JVMs required to run the system and hence the amount of memory required.  This is not suitable for anything other than a developer environment as it mixes the admin and runtime functions together in a single server.  It also takes a long time to load all the required modules, making start up a slow process.

If it exists I would recommend running the config wizard found in the “oracle_common/common/bin” directory under the middleware home.  This should have access to all the templates, including SOA.

If you also want to run BAM in the same JVM as everything else then you need to “Select Optional Configuration” for “Managed Servers, Clusters and Machines”.

To target BAM at the AdminServer delete the “bam_server1” managed server that is created by default.  This will result in BAM being targeted at the AdminServer.

Installation Issues

I had a few problems when I came to test everything in my mega-JVM.

  • The following applications were not targeted, so I needed to target them at the AdminServer:
    • b2bui
    • composer
    • Healthcare UI
    • FMW Welcome Page Application (11.1.0.0.0)

How Memory Efficient is It?

On a Windows 2008R2 Server running under VirtualBox I was able to bring up both the 11gR2 database and SOA/BPM/BAM in 3G of memory.  I allocated a minimum of 512M to the PermGen and a minimum of 1.5G for the heap.  The settings from setSOADomainEnv are shown below:

set DEFAULT_MEM_ARGS=-Xms1536m -Xmx2048m
set PORT_MEM_ARGS=-Xms1536m -Xmx2048m

set DEFAULT_MEM_ARGS=%DEFAULT_MEM_ARGS% -XX:PermSize=512m -XX:MaxPermSize=768m
set PORT_MEM_ARGS=%PORT_MEM_ARGS% -XX:PermSize=512m -XX:MaxPermSize=768m

I arrived at these numbers by monitoring JVM memory usage in JConsole.

Task Manager showed total system memory usage at 2.9G – just below the 3G I allocated to the VM.

Performance is not stellar but it runs and I could run JDeveloper alongside it on my 8G laptop, so in that sense it was a result!

Tuesday Mar 13, 2012

Packt Oracle Discount Month

Packt Publishing are celebrating the publication of their 50th Oracle book by offering an exclusive discount on all Oracle books throughout March. Here is the link which explains it in detail: Packt's Oracle Campaign, https://www.packtpub.com/news/hit-the-oracle-packtpot. If you haven’t bought mine and Matt’s book, the Oracle SOA Suite 11g R1 Developer's Guide, then now is a great time to get it from Packt.  Packt have recently published a number of new Oracle SOA Suite books, including an Oracle Service Bus 11g Development Cookbook by those smart guys in the Netherlands and Switzerland, and an Oracle BAM 11gR1 Handbook by my friend Pete Wang.

Monday Mar 12, 2012

My Hiring Approach

Hiring Engineers

I recently had the privilege of performing the technical interviews to evaluate potential new hires into Oracle’s support organization.  As my approach is different from many interview processes I thought I would share it with you.  It is basically a three-step process.

Step 1 – What Do You Know?

We ask them technical questions about what they said they have done on their resume.  It is very common to get responses like “oh, I didn't do very much with that”.  In that case we mark them down: if you put it on the resume then we will ask you detailed questions.  For example, if they have "worked" with enterprise Java then we ask about the meaning of EJB transaction settings ("what is the difference between Required and Mandatory") and get them to explain the JSP->Servlet lifecycle ("a JSP is deployed as a mixture of HTML and Java source, how does it become executable code?").  This is really just an honesty and level-setting phase where we see whether what they said on the resume is accurate and they understand what they said they understand.

Step 2 – Can You Extrapolate Your Knowledge?

After testing whether they know what they said they know, we ask them questions a little outside their area of knowledge to see if they can extrapolate from what they know.  We encourage them to guess an answer; we want to see if they understand principles and can come up with a reasonable response.  The response doesn't have to be correct, we are looking for a plausible, but possibly wrong, solution.  If they won't guess we mark them down; if they guess wrong but have good reasons we mark them the same as if they got it right.  This puts them into the typical support situation of trying to solve a customer's problem, where you have to make informed guesses and be able to justify to the customer why you want them to do this.

Step 3 – Can You Solve Problems?

Then we ask them to troubleshoot a specific problem.  For example: "yesterday you deployed a new application and it worked fine for users, today they are reporting that they get errors saying the database is unavailable".  We want to see if they can take the big picture and narrow down the problem area in a sensible way.  We are really looking for them to grasp the big picture of which components are being used and then to describe how they would isolate the problem (test if the database is running, test network connectivity, check that they can log in to the db from the app server, etc.).  Again this is directly relevant to the support job and we want them to demonstrate that they know how to troubleshoot - just saying "I would look in the logs" will get them marked down.

Result

Sadly it seems a lot of people are better resume writers than engineers, but this process tends to weed out those individuals.  We were able to hire some excellent engineers based on the above process and shortly after joining Oracle they were making great contributions to the company so it seemed to work.  Of course a technical interview is only part of the process.  It is also important that engineers fit into the culture of the company.  So an engineer might pass the technical interview but still fail because the interviewer felt they wouldn’t fit into the Oracle culture.  So the above is a process to help in evaluating technical skills but there is more to hiring than just that.

Friday Mar 02, 2012

Using Coherence with JDeveloper

Configuring JDeveloper for use with Coherence

I am doing some work with Coherence again, so I needed to create some Java code calling the Coherence API and edit some Coherence configuration files in JDeveloper.  The easiest way to do this is to register the Coherence jar file and the Coherence Schemas with JDeveloper; once that is done you can use JDeveloper’s XML insight features to help you create the XML documents.

Register the Coherence Library

To register the Coherence jar file in JDeveloper go to “Tools->Manage Libraries…”, select “New…” and then use “Add Entry…” to add the following entries:

  • Class Path
    • <COHERENCE_HOME>\lib\coherence.jar
  • Doc Path
    • <COHERENCE_HOME>\doc\api

COHERENCE_HOME is the location where you unzipped the Coherence product.

This lets us use the Coherence API in our Java code by adding the library to our project.

Register the Schemas

To register the Coherence XML Schemas with JDeveloper go to “Tools->Preferences…”, select “XML Schemas” and choose “Add…”.

Browse to the <COHERENCE_HOME>\lib\coherence.jar file and add the following schemas:

  • coherence-cache-config.xsd
  • coherence-operational-config.xsd
  • coherence-pof-config.xsd
  • coherence-report-config.xsd
  • coherence-report-group-config.xsd
  • coherence-rest-config.xsd

Now when you create an XML file for use with Coherence you can choose “XML Document from XML Schema” and choose “Use Registered Schemas” to show you suitable schemas to use for your Coherence config.

Wednesday Dec 28, 2011

Too Much Debug

Too Much Debug

Remains of a Roast Turkey

Well it is Christmas and as is traditional, in England at least, we had roast turkey dinner.  And of course no matter how big your family, turkeys come in only two sizes; massively too big or enormously too big!  So by the third day of Christmas you are ready never to eat turkey again until thanksgiving.  Your trousers no longer fit around the waist, your sweater is snug around the midriff, and your children start talking about the return of the Blob.

And my point?  Well just like the food world, sometimes in the SOA world too much of a good thing is bad for you.  I had just extended my BPM domain with OSB only to discover that I could no longer start the BAM server, or the newly configured OSB server.  The error message I was getting was:

starting weblogic with Java version:
FATAL ERROR in native method: JDWP No transports initialized, jvmtiError=AGENT_ERROR_TRANSPORT_INIT(197)
ERROR: transport error 202: bind failed: Address already in use
ERROR: JDWP Transport dt_socket failed to initialize, TRANSPORT_INIT(510)
JDWP exit error AGENT_ERROR_TRANSPORT_INIT(197): No transports initialized [../../../src/share/back/debugInit.c:690]
Starting WLS with line:
C:\app\oracle\product\FMW\JDK160~2\bin\java -client -Xdebug -Xnoagent -Xrunjdwp:transport=dt_socket,address=8453,server=y,suspend=n

The mention of JDWP points to a problem with debug settings and sure enough in a development domain the setDomainEnv script is set up to enable debugging of OSB.  The problem is that the settings apply to all servers started with settings in the setDomainEnv script and should really only apply to the OSB servers.  There is a blog entry by Jiji Sasidharan that explains this and provides a fix.  However the fix disables the debug flag for all servers, not just the non-OSB servers.  So I offer my extension to this fix which modifies the setDomainEnv script as follows from:

set debugFlag=true

to:

rem Added so that only OSB server starts in debug mode
if "%SERVER_NAME%"=="osb_server1" (
    set debugFlag=true
)

This enables debugging to occur on managed server osb_server1 (this should match the name of one of your OSB servers to enable debugging).  It does not enable the debug flag for any other server, including other OSB servers in a cluster.  After making this change it may be necessary to restart the Admin Server because it is probably bound to the debug port.
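
The domain above was on Windows, hence the cmd syntax; on a Linux domain the equivalent edit to setDomainEnv.sh would be something along these lines (a sketch only, the surrounding script layout varies between releases):

if [ "${SERVER_NAME}" = "osb_server1" ] ; then
    # Added so that only the OSB server starts in debug mode
    debugFlag="true"
fi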

So the moral of this tale is don’t eat too much turkey, don’t abuse the debug flag, but make sure you can get the benefits of debugging.

Have a great new year!

Thursday Dec 22, 2011

SOA in a Windows World

Installing BPM Suite on Windows Server 2008 Domain Controller under VirtualBox

It seems I am working with a number of customers for whom Windows is an important part of their infrastructure.  Security is tied in with Windows Active Directory and many services are hosted using Windows Communication Framework.  To better understand these customers’ environments I got myself a Windows 2008 Server license and decided to install BPM Suite on a Windows 2008 Server running as a domain controller.  This entry outlines the process I used to get it to work.

The Environment

I didn’t want to dedicate a physical server to running Windows Server so I installed it under Oracle Virtual Box.

My target environment was Windows 2008 Server with Active Directory and DNS roles.  This would give me access to the Microsoft security infrastructure and I could use this to make sure I understood how to properly integrate WebLogic and SOA Suite security with Windows security.  I wanted to run Oracle under a non-Administrator account, as this is often the way I have to operate on customer sites.  This was my first challenge.  For very good security reasons the only accounts allowed to log on to a Windows Domain controller are domain administrator accounts.  Now I only had resources (and licenses) for a single Windows server so I had to persuade Windows to let me log on with a non-Domain Admin account.

Logging On with a non-Domain Admin Account

I found this very helpful blog entry on how to log on using a non-domain account - Allow Interactive Logon to Domain Controllers in Windows Server 2008.  The key steps from this post are as follows:

  • Create a non-Admin user – I created one called “oracle”.
  • Edit the “Default Domain Controllers” group policy to add the user “oracle” to the “Allow log on locally” policy in Computer Configuration > Policies > Windows Settings >Security Settings > Local Policies > User Rights Assignment:
  • Force a policy update.

If you didn’t get it right then you will get the following error when trying to logon "You cannot log on because the logon method you are using is not allowed on this computer".  This means that you correctly created the user but the policy has not been modified correctly.

Acquiring Software

The best way to acquire the software needed is to go to the BPM download page.  If you choose the Microsoft Windows 32bit JVM option you can get a list of all the required components and a link to download them directly from OTN.  The only download link I didn’t use was the database download because I opted for an 11.2 database rather than the XE link that is given.  The only additional software I added was the 11.1.1.5 BPM feature pack (obtain from Oracle Support as patch #12413651: 11.1.1.5.0 BPM FEATURES PACK) and the OSB software.  The BPM feature pack patch is applied with OPatch so I also downloaded the latest OPatch from Oracle support (patch 6880880 for 11.1.0.x releases on Windows 32-bit).

Installing Oracle Database

I began by setting the system environment variable ORACLE_HOSTNAME to be the hostname of my Windows machine.  I also added this hostname to the hosts file, mapping it to 127.0.0.1.  When launching the installer as a non-Administrator account you will be asked for Administrator credentials in order to install.

Gotcha with Virtual Box Paths

I mounted the install software as a VirtualBox shared Folder and told it to auto-mount.  Unfortunately this auto-mount in Windows only applied to the current user, so when the software tried to run as administrator it couldn’t find the path.  The solution to this was to launch the installer using a UNC path “\\vboxsrv\<SHARE_NAME>\<PATH_TO_INSTALL_FILES>” because the mount point is available to all users, but the auto-mapping is only done at login time for the current user.

Database Install Options

When installing the database I made the following choices to make life easier later, in particular I made sure that I had a UTF-8 character set as recommended for SOA Suite.

  • Declined Security Updates (this is not a production machine)
  • Created and Configured a Database during install
  • Chose Server Class database to get character set options later
  • Chose single instance database installation
  • Chose advanced install to get character set options later
  • Chose English as my language
  • Chose Enterprise Edition
    • Included Oracle Data Extensions for .NET
  • Set Oracle Base as C:\app\oracle
  • Selected General Purpose/Transaction Processing
  • Changed default database name
  • Configuration Options
    • Accepted default memory management settings
    • Change character set to be AL32UTF8 as required by SOA Suite
    • Unselected “Assert all new security settings” to relax security as this is not a production system
    • Chose to create sample schemas
  • Used database control for database management
  • Use file system for database storage
  • Didn’t enable automated backups (that is what Virtual Box snapshots are for)
  • Used same non-compliant password for all accounts

I set up the environment variable ORACLE_UNQNAME to be the database name, this is provided on the last screen of the Oracle database Configuration Assistant.

Configuring for Virtual Box

Because Virtual Box port forwarding settings are global I changed the DB console listen port (from 1158 using emca) and the database listener port (from 1521 using EM console) before setting up port forwarding for virtual box to the new ports.  This required me to re-register the database with the listener and to reconfigure EM.
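
For the record, the DB Console port change can be scripted with emca; a hedged sketch of the sort of command involved (the exact flag name is from memory, 5500 is just an example port, and emca prompts for the database SID and passwords):

emca -reconfig ports -DBCONTROL_HTTP_PORT 5500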

Firewall Restrictions

After changing my ports I had a final task to do before snapshotting my image: I had to add a new Windows Firewall rule to open up the database ports (EM & listener).

Installing WebLogic Server

With a working database I was now able to install WebLogic Server.  I decided to do a 32-bit install to simplify the process (no need for a separate JDK install).  As this was intended to be an all in one machine (developer and server) I accepted the Coherence (needed for SOA Suite) and OEPE (needed for OSB design time tooling) options.  After installing I set the oracle user to have full access permissions on the Middleware home I created in C:\app\oracle\product\FMW.

Installing SOA/BPM Suite

Because I was using a 32-bit JVM I had to provide the “-jreLoc” option to the setup.exe command in order to run the SOA Suite installer (see release notes).  The installer correctly found my Middleware Home and installed the SOA/BPM Suite.  After installing I set the oracle user to have full access to the new SOA home created in C:\app\oracle\product\FMW\Oracle_SOA and the Oracle common directory (C:\app\oracle\product\FMW\oracle_common).

Running Repository Creation Utility

I ran the RCU from my host OS rather than from within the Windows guest OS.  This helps avoid any unnecessary temporary files being created in the virtual machine.  I selected the SOA and BPM Infrastructure component and left the prefix at the default DEV.  Using DEV makes life easier when you come to create a SOA/BPM domain because you don’t need to change the username in the domain config wizard.  Because this isn’t a production environment I also set all the passwords to be the same; again this will simplify things in the config wizard.

Adding BPM Feature Pack

With SOA installed I updated it to include the BPM feature pack.

Installing OPatch Update

First I needed to apply patch 6880880 to get the latest OPatch.  The patch can be applied to any Oracle home and I chose to apply it to the oracle_common home, it seemed to make more sense there rather than the Oracle_SOA home.  To apply the patch I moved the original OPatch directory to OPatch.orig and then unzipped the patch in the oracle_common directory which created a new OPatch directory for me.  Before applying the feature set patch I opened a command prompt and set the ORACLE_HOME environment variable to the Oracle_SOA home and added the new OPatch directory to the path.  I then tested the new OPatch by running the command “opatch lsinventory” which showed me the SOA Suite install version.
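
In command-prompt form that environment set-up looks roughly like this, using my install locations from earlier (adjust the paths to match your own Middleware home):

set ORACLE_HOME=C:\app\oracle\product\FMW\Oracle_SOA
set PATH=C:\app\oracle\product\FMW\oracle_common\OPatch;%PATH%
opatch lsinventory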

Fixing a Path Problem

OPatch uses setupCCR.exe, which has a dependency on msvcr71.dll.  Unfortunately this DLL is not on the path, so by default the call to setupCCR fails with an error “This application failed to start because MSVCR71.dll was not found”.  To fix this I found a helpful blog entry that led me to create a new key in the registry at “HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\App Paths\setupCCR.exe” with the default value set to “<MW_HOME>\utils\ccr\bin\setupCCR.exe”.  I added a String value to this key with a name of “Path” and a value of “<Oracle_Common_Home>\oui\lib\win32”.  This registers the setupCCR application with Windows and adds a custom path entry for this application so that it can find the MSVCR71 DLL.

Patching oracle_common Home

I then applied the BPM feature pack patch to oracle_common by

  • Setting ORACLE_HOME environment variable to the oracle_common directory
  • Creating a temporary directory “PATCH_TOP”
  • Unzipping the following files from the patch into PATCH_TOP
    • p12413651_ORACOMMON_111150_Generic.zip
    • p12319055_111150_Generic.zip
    • p12614083_111150_Generic.zip
  • From the PATCH_TOP directory run the command “<Oracle_Common_Home>\OPatch\opatch napply”
    • Note I didn’t provide the inventory pointer parameter (invPtrLoc) because I had a global inventory that was found just fine by OPatch and I didn’t have a local inventory as the patch readme seems to expect.
  • Deleting the PATCH_TOP directory

After successful completion of this “opatch lsinventory” showed that 3 patches had been applied to the oracle_common home.

Patching Oracle_SOA Home

I applied the BPM feature pack patch to Oracle_SOA by

  • Setting ORACLE_HOME environment variable to the Oracle_SOA directory
  • Creating a temporary directory “PATCH_TOP”
  • Unzipping the following file from the patch into PATCH_TOP
    • p12413651_SOA_111150_Generic.zip
  • From the PATCH_TOP directory run the command “<Oracle_Common_Home>\OPatch\opatch napply”
    • Note again I didn’t provide the inventory pointer parameter (invPtrLoc) because I had a global inventory that was found just fine by OPatch and I didn’t have a local inventory as the patch readme seems to expect.
  • Deleting the PATCH_TOP directory

After successful completion of this “opatch lsinventory” showed that 1 patch had been applied to the Oracle_SOA home.

Updating the Database Schemas

    Having updated the software I needed to update the database schemas which I did as follows:

    • Setting ORACLE_HOME environment variable to the Oracle_SOA directory
    • Setting JAVA_HOME to <MW_HOME>\jdk160_24
    • Running “psa -dbType Oracle -dbConnectString //<DBHostname>:<ListenerPort>/<DBServiceName> -dbaUserName sys -schemaUserName DEV_SOAINFRA”
      • Note that again I elided the invPtrLoc parameter

    Because I had not yet created a domain I didn’t have to follow the post installation steps outlined in the Post-Installation Instructions.

    Creating a BPM Development Domain

    I wanted to create a development domain.  So I ran config from <Oracle_Common_Home>\common\bin selecting the following:

    • Create a New Domain
    • Domain Sources
      • Oracle BPM Suite for developers – 11.1.1.0
        • This will give me an Admin server with BPM deployed in it.
      • Oracle Enterprise Manager – 11.1.1.0
      • Oracle Business Activity Monitoring – 11.1.1.0
        • Adds a managed BAM server.
    • I changed the domain name and set the location of the domains and applications directories to be under C:\app\oracle\MWConfig
      • This removes the domain config from the user_projects directory and keeps it separate from the installed software.
    • Chose Development Mode and Sun JDK rather than JRockit
    • Selected all Schema and set password, service name, host name and port.
      • Note when testing the test for SOA Infra will fail because it is looking for version 11.1.1.5.0 but the BPM feature pack incremented it to 11.1.1.5.1.  If the reason for the failure is “no rows were returned from the test SQL statement” then you can continue and select OK when warned that “The JDBC configuration test did not fully complete”.  This is covered in Oracle Support note 1386179.1.
    • Selected Optional Configuration for Administration Server and Managed Servers so that I could change the listening ports.
      • Set Admin Server Listen port to 7011 to avoid clashes with other Admin Servers in other guest OS.
      • Set bam_server Listen port to 9011 to avoid clashes with other managed servers in other guest OS.
      • Changed the name of the LocalMachine to reflect hostname of machine I was installing on.
      • Changed the node manager listen port to 5566 to avoid clashes with other Node Managers in other guest OS.

    Having created my domain I then created a boot.properties file for the bam_server.

    Configuring Node Manager

    With the domain created I set up Node Manager to use start scripts by running setNMProps.cmd from <oracle_common>\common\bin.

    I then edited the <MW_Home>\wlserver_10.3\common\nodemanager\nodemanager.properties file and added the following property:

    • ListenPort=5566

    Firewall Policy Updates

    I had to add the Admin Server, BAM Server and Node Manager ports to the Windows firewall policy to allow access to those ports from outside the Windows server.

    Set Node Manager to Start as a Windows Service

    I wanted node manager to automatically run on the machine as a Windows service so I first edited the <MW_HOME>\wlserver_10.3\server\bin\installNodeMgrSvc.cmd and changed the port to 5566.  Then I ran the command as Administrator to register the service.  The service is automatically registered for automatic startup.

    Set Admin Server to Start as a Windows Service

    I also wanted the Admin Server to run as a Windows service.  There is a blog entry about how to do this using the installSvc command but I found it much easier to use NSSM. To use this I did the following:

    • Downloaded NSSM and put the 64-bit version in my MWConfig directory.
      • Once you start using NSSM the Services you create will point to the location from which you ran NSSM so don’t move it after installing a service!
    • Created a simple script to start the admin server and redirect its standard out and standard error to a log file (I redirected to “%DOMAIN_HOME%\servers\AdminServer\logs\AdminServer.out” because this is the location that would be used if the AdminServer were started by the node manager):

        @REM Point to Domain Directory
        set DOMAIN_HOME=C:\app\oracle\MWConfig\domains\bp_domain
        @REM Point to Admin Server logs directory
        set LOGS_DIR=%DOMAIN_HOME%\servers\AdminServer\logs
        @REM Redirect WebLogic stdout and stderr
        set JAVA_OPTIONS=-Dweblogic.Stdout="%LOGS_DIR%\AdminServer.out" -Dweblogic.Stderr="%LOGS_DIR%\AdminServer.out"
        @REM Start Admin Server
        call %DOMAIN_HOME%\startWebLogic.cmd

    • Registered the script as a Windows service using NSSM
      • nssm install “Oracle WebLogic AdminServer” “C:\app\oracle\MWConfig\startAdminServer.cmd”

    Note that when you redirect WebLogic stdout and stderr as I have done it does not get the first few lines of output, so test your script from the command line before registering it as a service.

    By default the AdminServer will be restarted if it fails, allowing you to bounce the Admin Server without having to log on to the Windows machine.
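
    If you do need to bounce the Admin Server by hand from an elevated command prompt, the standard Windows service commands work against the service name registered above:

        net stop "Oracle WebLogic AdminServer"
        net start "Oracle WebLogic AdminServer"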

    Configuring for Virtual Box

    Having created the domain and configured Node Manager I enabled port forwarding in VirtualBox to expose the Admin Server (port 7011), BAM Server (port 9011) and the Node Manager (port 5566).

    Testing It

    All that is left is to start the node manager as a service, start the Admin server as a service, start the BAM server from the WebLogic console and make sure that things work as expected.  In this case all seemed fine.  When I shut down the machine and then restarted everything came up as expected!

    Conclusion

    The steps above create a SOA/BPM installation running under Windows Server 2008 that is automatically started when Windows Server starts.  The log files can be accessed and read by a non-admin user so the status of the environment can be checked.  Additional managed servers can be started from the Admin console because we have node manager installed.  The database, database listener, database control, node manager and Admin Server all start up as Windows services when the server is started avoiding the need for an Administrator to start them.

    Wednesday Oct 19, 2011

    Structure in a Flat World

    Adding Structure to Flat XML Documents

    A friend recently was wondering how to convert a flat document structure to a more structured form.

    The type of flat structure is shown in the diagram below:

    The deptNo and deptName fields repeat for each employee in the department.

    This would be better represented as a structured format like the one shown below:

     

    Note that the department details are now represented once per department and employees appear in a sequence called emp.  This is a more natural representation and easier to manipulate elsewhere.

    So the question is, how do I get from the flat schema to the structured schema?

    The answer lies in the preceding-sibling and following-sibling XPath axis.

    To get just the first time a department appears we select all the entries that do not have the same deptNo earlier in the document using this XPath expression:

    <xsl:for-each select="/ns1:collection/ns1:entry[not(ns1:deptNo = preceding-sibling::ns1:entry/ns1:deptNo)]">

    Within the first occurrence of a department we then set a variable to hold the department number:

    <xsl:variable name="DeptNo" select="ns1:deptNo"/>

    Within the department we first put in the employee details from the current node.  We then select all the other entries that have the same department number and add their employee details by using the following XPath expression:

    <xsl:for-each select="following-sibling::ns1:entry[ns1:deptNo = $DeptNo]">

    A sample JDeveloper project to test this is available here.

    Monday Oct 10, 2011

    Fixing OEL 6 & VirtualBox

    Fixing OEL 6 & VirtualBox

    Just upgraded the kernel on my VirtualBox image of OEL6 and the VirtualBox Additions failed to build.  The problem was that the latest OEL6 kernel is now kernel-uek.

    $ uname -a

    Linux soavbox.oracle.com 2.6.32-200.20.1.el6uek.x86_64 #1 SMP Fri Oct 7 01:50:00 EDT 2011 x86_64 x86_64 x86_64 GNU/Linux

    This seems to confuse the dkms rebuild of VBox Additions.  See Virtual Box Ticket #9332 clipboard support broken – OEL 5.6, OEL 6.1.

    To fix it I followed the suggestions in the ticket above:

    • Choose "Install Guest Additions"
    • Run a root shell and execute following commands:
      • export MAKE='/usr/bin/gmake -i'
      • cd /media/VBOX*
        • Current latest VBox Additions is VBOXADDITIONS_4.1.4_74291
      • ./VBoxLinuxAdditions.run
    • Restart guest

    Voila now my OEL 6.1 works great with VirtualBox again!

    Thursday Sep 29, 2011

    Mapping the Java World Part I

    How to Customise Java/XML Mapping in SOA Suite
    Part I Setting Up EclipseLink MOXy

    The Challenge

    During a recent POC, the customer asked us to integrate with several of their backend EJBs, which hosted legacy code talking to backend systems.  The EJB interfaces could not be changed, and had to be called a certain way to be successful.  The customer was looking to phase out another set of "integration EJBs", written over the last few years, that orchestrated their core backend EJBs.  SOA Suite will allow them to drastically reduce their development time, and provide much better visibility into the processes as they run.  Given their past experiences (writing their integration EJBs was painful and long winded), one of the key requirements was to support their legacy EJBs without writing code in the SOA environment.  At first glance this seemed impossible, because we couldn't change anything on the EJB side.  In addition many of the parameters to the interfaces contained either incomplete annotations or methods that didn't follow the JavaBeans spec.  This was not previously a problem for them because all their EJBs ran in the same JVM and were making local Java calls.

    We decided to use a powerful yet obscure feature of SOA Suite to do the mapping.  Chapter 49.7 of the SOA Suite Developer's guide mentions this marriage between EclipseLink MOXy and SOA Suite, but it does more than advertised, and works outside of the Spring Framework components.  We decided to use this functionality to "fix" some of the things we had mapping issues with in the customer code.  Additionally, we used the framework to do other helpful tasks, such as changing namespaces, fixing arrays, and removing unnecessary mappings.

    In this article we'll cover the theory behind the use of this functionality, basic setup and usage, and several examples to get you started.

    Background

    When we use an EJB Reference or a Spring component in SOA Suite we usually want to wire it to a non-Java resource.  When we do this JDeveloper uses JAXB to create an XML representation of the parameters and return values of the methods in the Java interface we are using.  In this article we will show how to override those mappings.  Overriding the default generation of mappings allows us to specify target namespaces, rationalize the structure of the data and remove unneeded properties in the Java classes.  Some things we may want to customize include:

    • Specifying concrete implementations for abstract classes and interfaces in the interface
      • This allows us to map to Java objects in the interface which cannot be instantiated directly.  For example, we often have lists of abstract classes or interfaces; by specifying the possible concrete implementations of these classes we can generate an XML schema that includes additional properties available only through the concrete classes.
    • Hiding unwanted properties
      • This allows us to remove properties that are not needed for our implementation, or not needed because they are convenience properties such as the length of an array or collection which can easily be derived from the underlying array or collection.
    • Providing wrappers for arrays and collections
      • The default mapping for an array or collection is to provide a list of repeating elements.  We can modify the mapping to provide a wrapper element that represents the whole array or collection, with the repeating elements appearing a level down inside this.
    • Changing WSDL namespaces
      • It is often necessary to change the namespaces in a generated WSDL to match a corporate standard or to avoid conflicts with other components that are being used.

    Approach

    SOA Suite allows us to describe in XML how we want a Java interface to be mapped from Java objects into XML.  The file that does this is called an “Extended Mapping” (EXM) file.  When generating a WSDL and its associated XML Schema from a Java interface SOA Suite looks for an EXM file corresponding to the Java Interface being generated from.  Without this file the mapping will be the “default” generation, which simply attempts to take each field and method in the Java code and map it to an XML type in the resulting WSDL.  The EXM file is used to describe or clarify the mappings to XML and uses EclipseLink MOXy to provide an XML version of Java annotations.  This means that we can apply the equivalent of Java annotations to Java classes referenced from the interface, giving us complete control over how the XML is generated.  This is illustrated in the diagram which shows how the WSDL interface mapping depends on the JavaInterface of the EJB reference or Spring component being wired (obviously), but is modified by the EXM file which in turn may embed or reference an XML version of JAXB annotations (using EclipseLink MOXy).

    The mapping will automatically take advantage of any class annotations in the Java classes being mapped, but the XML descriptions can override or add to these annotations, allowing us fine grained control over our XML interface.  This allows for changes to be made without touching the underlying Java code.

    Setup

    Using the mapper out of the box is fairly simple.  Suppose you set up an EJB Reference or Spring component inside your composite.  You'd like to call this from a Mediator or BPEL Process, which expects to operate on a WSDL.  Simply drag a wire from the BPEL process or Mediator to the EJB or Spring component and you should see a WSDL generated for you, which contains the equivalent of the Java component's business interface and all required types.  This happens for you as the tool goes through the classes in your target.

    But what if you get a "Schema Generation Error", or the generated WSDL isn't correct?  As discussed earlier there may be a number of changes we need or want to make to the mapping.  In order to use an "Extended Mapping", or EXM file, we need to do the following:

    1. We need to register the EclipseLink MOXy schema with JDeveloper.  Under JDeveloper Tools->Preferences->XML Schemas we click Add… to register the schema as an XML extension.

      The schema is found in a jar file located at <JDEV_HOME>/modules/org.eclipse.persistence_1.1.0.0_2-1.jar and the schema is inside this jar at /xsd/eclipselink_oxm_2_1.xsd, so the location we register is jar:file:/<JDEV_HOME>/modules/org.eclipse.persistence_1.1.0.0_2-1.jar!/xsd/eclipselink_oxm_2_1.xsd, where <JDEV_HOME> is the location where you installed JDeveloper.

      NOTE: This will also work with an .oxm extension instead of .xml.  But if you use an .oxm extension then in each project that uses it you must add a rule to copy .oxm files from the source to the output directory when compiling.
    2. Change the order of the source paths in your SOA Project to have SCA-INF/src first.  This is done using the Project Properties…->Project Source Paths dialog.  All files related to the mapping will go here.  For example, <Project>/SCA-INF/src/com/customer/EXM_Mapping_EJB.exm, where com.customer is the associated java package.
    3. We will now use a wizard to generate a base mapping file.
      1. Launch the wizard New XML Document from XML Schema (File->New->All Technologies->General->XML Document from XML Schema).
      2. Specify a file with the name of the Java interface and an .exm extension in a directory corresponding to the Java package of the interface under SCA-INF/src.  For example, if your EJB adapter defined soa.cookbook.QuoteInterface as the remote interface, then the directory should be <Project>/SCA-INF/src/soa/cookbook and so the full file path would be <Project>/SCA-INF/src/soa/cookbook/QuoteInterface.exm.  By using the .exm extension we are able to choose Use Registered Schema, which automatically maps to the correct schema so that future steps in the wizard understand what we are doing.
      3. The weblogic-wsee-databinding schema should already be selected; select a root element of java-wsdl-mapping and generate to a Depth of 3.  This will give us a basic file to start working with.
    4. As recommended by Oracle, separate out the mappings per package by using the toplink-oxm-file element.  This allows you to define re-usable, per-package mapping files outside of the EXM file.  Since the EXM file and the embedded mappings have different XML root elements, defining them separately allows JDeveloper to provide validation and completion.  A sample include is shown below:

      <?xml version="1.0" encoding="UTF-8" ?>
      <java-wsdl-mapping xmlns="http://xmlns.oracle.com/weblogic/weblogic-wsee-databinding">
        <xml-schema-mapping>
          <toplink-oxm-file file-path="./mappings.xml" java-package="soa.cookbook"/>
        </xml-schema-mapping>
      </java-wsdl-mapping>

    5. Create an "OXM Mapping" file to store custom mappings.  As mentioned, these files are per package, separate from the EXM files, and re-usable.  We can use the New XML Document from XML Schema wizard to create these as well.  In this case they will have an .xml or .oxm extension, use the registered persistence schema (http://www.eclipse.org/eclipselink/xsds/persistence/oxm), and be stored relative to the EXM file.  That is, they can go in the same directory, or in other directories, as long as you refer to them by relative path from the EXM file.

      <?xml version="1.0" encoding="UTF-8" ?>
      <xml-bindings xmlns="http://www.eclipse.org/eclipselink/xsds/persistence/oxm">
        <!-- Set the target namespace via the namespace attribute -->
        <xml-schema namespace="http://cookbook.soa.mapping/javatypes"
                    element-form-default="QUALIFIED"/>
      </xml-bindings>

    6. In the newly created OXM Mapping file, we can use completion and validation to ensure we follow the EclipseLink MOXy documentation.  For example, to declare that a field is transient, so that it does not show up in the WSDL mapping, do this (a sketch of the kind of Java class such a mapping might apply to is shown after these steps):

      <?xml version="1.0" encoding="UTF-8" ?>
      <xml-bindings xmlns="http://www.eclipse.org/eclipselink/xsds/persistence/oxm">
        <!-- Set the target namespace via the namespace attribute -->
        <xml-schema namespace="http://cookbook.soa.mapping/javatypes"
                    element-form-default="QUALIFIED"/>
        <java-types>
          <java-type name="soa.cookbook.QuoteRequest">
            <java-attributes>
              <!-- Remove mappings by making them transient via the xml-transient element -->
              <xml-transient java-attribute="product"/>
            </java-attributes>
          </java-type>
        </java-types>
      </xml-bindings>

    7. Once complete, delete any existing wires to the Java components and rewire.  You should notice the dialog box change to indicate that an extended mapping file was used.

    8. As an iterative process, changing a mapping is quite easy.  Simply delete the wire, and JDeveloper will offer to delete the WSDL file, which you should do.  Make updates to the EXM and OXM Mapping files as required, and rewire.  Often this occurs several times during development, as there are multiple reasons to make changes.  We will cover some of these in our next entry.
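
    As mentioned in step 6, here is a sketch of the kind of Java class such a mapping might apply to.  The fields are inferred from the mapping snippet above and the sample request message later in this post, so treat them as illustrative rather than the actual sample source; the point is that the class carries no JAXB annotations at all, because the external OXM file decides, for example, that the product field never appears in the generated schema.

      package soa.cookbook;

      import java.io.Serializable;
      import java.util.Date;
      import java.util.List;

      // Illustrative, annotation-free version of the request payload.  All of
      // the XML mapping decisions live in the external EXM/OXM files, so this
      // class never needs to change when the XML contract does.
      public class QuoteRequest implements Serializable {

          private List<String> products;   // mapped normally into the schema
          private Date requiredDate;       // mapped normally into the schema
          private String provider;         // "EJB" or "Spring", used for routing
          private String product;          // marked transient by the OXM file in
                                           // step 6, so omitted from the WSDL types

          public List<String> getProducts() { return products; }
          public void setProducts(List<String> products) { this.products = products; }

          public Date getRequiredDate() { return requiredDate; }
          public void setRequiredDate(Date requiredDate) { this.requiredDate = requiredDate; }

          public String getProvider() { return provider; }
          public void setProvider(String provider) { this.provider = provider; }

          public String getProduct() { return product; }
          public void setProduct(String product) { this.product = product; }
      }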

    Summary

    The extended mapping file puts the SOA composite developer in control of his own destiny when it comes to mapping Java into XML.  It frees him from the tyranny of Java developer specified annotations embedded in Java source files and allows the SOA developer to customize the mapping for his own needs.  In this blog entry we have shown how to set up and use extended mapping in SOA Suite composites.  In the next entry we will show some of the power of this mapping.

    Patches

    To use this effectively you need to download the following patch from MetaLink:

    • 12984003 - SOA SUITE 11.1.1.5 - EJB ADAPTER NONBLOCKINGINVOKE FAILS WITH COMPLEX OBJECTS (Patch)
      • Note that this patch includes a number of fixes, including a performance fix for the Java/XML mapping in SOA Suite.

    Sample Code

    There is a sample application uploaded as EXMdemo.zip.  Unzip the provided file and open the EXMMappingApplication.jws in JDeveloper.  The application consists of two projects:

    • EXMEJB
      • This project contains an EJB and needs to be deployed before the other project.  The EJB provides a wrapper around a POJO used in the Spring component in the EXMMapping project (a minimal sketch of such a wrapper appears after this list).
    • EXMMapping
      • This project contains a SOA composite with a Spring component that is called from a Mediator; there is also an EJB reference.  The Mediator calls the Spring component or the EJB based on an input value.  Deploy this project after deploying the EJB project.
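
    The EXMEJB bullet above mentions that the EJB is just a wrapper around the same POJO used by the Spring component.  A minimal sketch of what such a wrapper might look like is shown below; the bean and delegate names and the String-based operation are assumptions for illustration, and only standard javax.ejb annotations are used.

      package soa.cookbook;

      import javax.ejb.Remote;
      import javax.ejb.Stateless;

      // Hypothetical remote view exposed by the EJB; in the sample this role
      // is played by soa.cookbook.QuoteInterface.
      @Remote
      interface QuoteRemote {
          String getQuote(String productName);
      }

      // Plain POJO holding the business logic.  The same class could be
      // declared as a Spring bean in the EXMMapping project and wrapped by the
      // EJB below.
      class QuotePojo {
          String getQuote(String productName) {
              return "Quote for " + productName;
          }
      }

      // Stateless session bean that simply delegates to the POJO, giving the
      // composite an EJB reference to wire against.
      @Stateless
      public class QuoteEjb implements QuoteRemote {

          private final QuotePojo delegate = new QuotePojo();

          @Override
          public String getQuote(String productName) {
              return delegate.getQuote(productName);
          }
      }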

    Key files to examine are listed below:

    • QuoteInterface.java in package soa.cookbook
      • This is the same interface implemented by the Spring component and the EJB.
      • It takes a QuoteRequest object as an input parameter and returns a QuoteResponse object.
    • Quote.java_diagram in package soa.cookbook
      • A UML class diagram showing the structure of the QuoteRequest and QuoteResponse objects.
    • EXM_Mapping_EJB.exm in package soa.cookbook
      • EXM mapping file for EJB.
      • This is used to generate the EXM_Mapping_EJB.wsdl file.
    • QuoteInterface.exm in package soa.cookbook
      • EXM mapping file for Spring component.
      • This is used to generate the QuoteInterface.wsdl file.
    • mappings.xml in package soa.cookbook
      • Contains mappings for QuoteRequest and QuoteResponse objects.
      • Used by both EXM files (they both include it, showing how we can re-use mappings).
      • We will cover the contents of this file in the next installment of this series.

    Sample request message

    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body xmlns:ns1="http://cookbook.soa.mapping/types">
        <ns1:quoteRequest>
          <ns1:products>
            <ns1:product>Product Number 1</ns1:product>
            <ns1:product>Product Number 2</ns1:product>
            <ns1:product>Product Number 3</ns1:product>
          </ns1:products>
          <ns1:requiredDate>2011-09-30T18:00:00.000-06:00</ns1:requiredDate>
          <!-- provider should be “EJB” or “Spring” to select the appropriate target -->
          <ns1:provider>EJB</ns1:provider>
        </ns1:quoteRequest>
      </soap:Body>
    </soap:Envelope>

    Co-Author

    This blog article was co-written with my colleague Andrew Gregory.  If this becomes a habit I will have to change the title of my blog!

    Acknowledgements

    Blaise Doughan, the Eclipse MOXy lead was extremely patient and helpful as we worked our way through the different mappings.  We also had a lot of support from David Twelves, Chen Shih-Chang, Brian Volpi and Gigi Lee.  Finally thanks to Simone Geib and Naomi Klamen for helping to co-ordinate the different people involved in researching this article.

    Friday Sep 23, 2011

    Oracle & JBoss Comparison

    It’s All About TCO!

    Crimson Research has just published a paper comparing the total cost of ownership (TCO) of WebLogic versus JBoss.

    You can download the paper here.  The key point it makes is that acquisition of an application server platform is only a small part of the total cost of ownership over a 5-year period.  What I found surprising was how quickly the report suggests the lower TCO of WebLogic becomes noticeable; it indicates that the break-even point is about 18 months into the deployment.

    The study was sponsored by Oracle, but Crimson conducted it using their own methodology, and it certainly sets out a good case for the benefits of “-ility” features in bringing down the implementation costs of an application server infrastructure.  This gels with my own experience: the more I work with operations staff, the more I learn how important things like script recording and WLST are to them.

    Thursday Sep 22, 2011

    Coping with Failure

    Handling Endpoint Failure in OSB

    Recently I was working on a POC where we had demonstrated stellar performance with OSB fronting a BPEL composite calling back-end EJBs.  The final test was a failover test: killing an OSB server and bringing it back online, then killing a SOA (BPEL) server and bringing it back online, and finally killing a back-end EJB server and bringing it back online.  All was going well until the BPEL failover test, when for some reason OSB refused to mark the BPEL server as down.  It turns out we had forgotten a very important setting, so this entry outlines how to handle endpoint failure in OSB.

    Step 1 – Add Multiple End Points to Business Service

    The first thing to do is create multiple endpoints for the business service, pointing to all available back ends.  This is required for HTTP/SOAP bindings.  In theory, if using the T3 protocol, a single cluster address is sufficient and load balancing will be taken care of by T3 smart proxies.  In this scenario, though, we will focus on HTTP/SOAP endpoints.

    Navigate to the Business Service->Configuration Details->Transport Configuration and add all your endpoint URIs.  Make sure that Retry Count is greater than 0 if you don’t want to pass failures back to the client.  In the example below I have set up links to three back-end web service instances.  Go to Last and Save the changes.

    [Screenshot: business service transport configuration with multiple endpoint URIs]

    Step 2 – Enable Offlining & Recovery of Endpoint URIs

    When a back end service instance fails we want to take it offline, meaning we want to remove it from the pool of instances to which OSB will route requests.  We do this by navigating to the Business Service->Operational Settings and selecting the Enable check box for Offline Endpoint URIs in the General Configuration section.  This causes OSB to stop routing requests to a backend that returns errors (if the transport setting Retry Application Errors is set) or fails to respond at all.

    Offlining the service is good because we won’t send any more requests to a broken endpoint, but we also want to add the endpoint back when it becomes available.  We do this by setting Enable with Retry Interval in General Configuration to some non-zero value, such as 30 seconds.  Then every 30 seconds OSB will add the failed service endpoint back into the list of endpoints.  If the endpoint is still not ready to accept requests then it will error again and be removed from the list again.  In the example below I have set up a 30-second retry interval.  Remember to hit Update and then commit all the session changes.

    [Screenshot: operational settings showing endpoint offlining with a 30-second retry interval]

    Considerations on Retry Count

    A couple of things to be aware of on retry count.

    If you set the retry count to greater than zero then endpoint failures will be transparent to OSB clients, apart from the additional delay they experience.  However, if the request is mutative (it changes the back end) then it is possible that the request was actually executed and the endpoint failed before returning the result, in which case the retry submits the mutative operation twice.  If your back-end service can’t cope with this then don’t set retries.

    If your back-end service can’t cope with retries then you can still get the benefit of transparent retries for non-mutative operations by creating two business services, one with retry enabled that handles non-mutative requests, and the other with retry set to zero that handles mutative requests.

    Considerations on Retry Interval for Offline Endpoints

    If you set the retry interval to too small a value then it is very likely that your failed endpoint will not yet have recovered, so you will waste time on a request failing to contact that endpoint before failing over to a healthy one, which increases the client response time.  Work out a typical unplanned outage time for a node (such as that caused by a JVM failure and subsequent restart) and set the retry interval to, say, half of this as a compromise between causing additional client response time delays and adding the endpoint back into the mix as soon as possible.  For example, if a crashed managed server typically takes two minutes to restart, a retry interval of around 60 seconds is a reasonable starting point.

    Conclusion

    Always remember to enable offlining of endpoint URIs in the Operational Settings and then you won’t be surprised in a failover test!

    About

    Musings on Fusion Middleware and SOA.  Antony works with customers across the US and Canada implementing SOA and other Fusion Middleware solutions.  Antony is the co-author of the SOA Suite 11g Developers Cookbook, the SOA Suite 11g Developers Guide and the SOA Suite Developers Guide.
