Tuesday Dec 24, 2013

Cleaning Up After Yourself

Maintaining a Clean SOA Suite Test Environment

A fun blog entry with Fantasia animated GIFs got me thinking, like Mickey, about how nice it would be to automate clean-up tasks.

I don’t have a sorcerer’s castle to clean up, but I often have a test environment which I use to run tests; then, after fixing the problems the tests uncovered, I want to run them again.  The problem is that all the data from my previous test run is still there.

Now in the past I used VirtualBox snapshots to roll back to a clean state, but this has the problem that it not only discards the changes I want to get rid of, such as data inserted into tables, it also discards the changes I want to keep, such as WebLogic configuration changes and new shell scripts.  So like Mickey I went in search of some magic to help me.

Cleaning Up the SOA Environment

My first task was to clean up the SOA environment by deleting all instance data from the tables.  Now I could use the purge scripts to do this, but that would still leave me with running instances, for example 800 Human Workflow tasks that I don’t want to deal with.  So I used the new truncate script to take care of this.  Basically this removes all instance data from your SOA Infrastructure, whether or not the instances are live.  It can be run without taking down the SOA Infrastructure (although if you do see strange behavior afterwards you may want to restart SOA).  Some statistics, such as service and reference statistics, are kept since server startup, so you may want to restart your server to clear that data too.  A sample script to run the truncate SQL is shown below.

#!/bin/sh
# Truncate the SOA schemas, does not truncate BAM.
# Use only in development and test, not production.

# Properties to be set before running script
# SOAInfra Database SID
DB_SID=orcl
# SOA DB Prefix
SOA_PREFIX=DEV
# SOAInfra DB password
SOAINFRA_PASSWORD=welcome1
# SOA Home Directory
SOA_HOME=/u01/app/fmw/Oracle_SOA1

# Set DB Environment
. oraenv << EOF
${DB_SID}
EOF

# Run Truncate script from directory it lives in
cd ${SOA_HOME}/rcu/integration/soainfra/sql/truncate

# Run the truncate script
sqlplus ${SOA_PREFIX}_soainfra/${SOAINFRA_PASSWORD} @truncate_soa_oracle.sql << EOF
exit
EOF

After running this script all your SOA composite instances and associated workflow instances will be gone.
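
If you want to reassure yourself that the truncate really did its job, a quick count of a couple of the instance tables should come back as zero.  Below is a minimal check, reusing the connection details from the script above (COMPOSITE_INSTANCE and WFTASK are two of the 11g instance tables):

#!/bin/sh
# Sanity check - count remaining composite instances and workflow tasks.
# Uses the same connection details as the truncate script above.
DB_SID=orcl
SOA_PREFIX=DEV
SOAINFRA_PASSWORD=welcome1

# Set DB Environment
. oraenv << EOF
${DB_SID}
EOF

sqlplus -s ${SOA_PREFIX}_soainfra/${SOAINFRA_PASSWORD} << EOF
select count(*) as composite_instances from composite_instance;
select count(*) as workflow_tasks from wftask;
exit
EOF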

Cleaning Up BAM

The example above shows how easy it is to get rid of all the runtime data in your SOA repository; however, if you are using BAM you still have the contents of your BAM data objects from previous runs.  To get rid of that data we need to use BAM ICommand’s clear command, as shown in the sample script below:

#!/bin/sh
# Set software locations
FMW_HOME=/home/oracle/fmw
export JAVA_HOME=${FMW_HOME}/jdk1.7.0_17
BAM_CMD=${FMW_HOME}/Oracle_SOA1/bam/bin/icommand
# Set objects to purge
BAM_OBJECTS="/path/RevenueEvent /path/RevenueViolation"

# Clean up BAM
for name in ${BAM_OBJECTS}
do
  ${BAM_CMD} -cmd clear -name ${name} -type dataobject
done

After running this script all the rows of the listed objects will be gone.
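
If the list of data objects grows, it can be easier to keep it in a text file alongside the script rather than editing the script itself.  A small variation on the script above, assuming a file called bam_objects.txt with one data object path per line:

#!/bin/sh
# Clear every BAM data object listed in bam_objects.txt (one /path/Name per line).
FMW_HOME=/home/oracle/fmw
export JAVA_HOME=${FMW_HOME}/jdk1.7.0_17
BAM_CMD=${FMW_HOME}/Oracle_SOA1/bam/bin/icommand

while read name
do
  # Skip blank lines and comment lines
  case "${name}" in
    ""|"#"*) continue ;;
  esac
  ${BAM_CMD} -cmd clear -name "${name}" -type dataobject
done < bam_objects.txt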

Ready for Inspection

Unlike the hapless Mickey, our clean up scripts work reliably and do what we want without unexpected consequences, like flooding the castle.

Saturday Oct 12, 2013

Share & Enjoy : Using a JDeveloper Project as an MDS Store

Share & Enjoy : Sharing Resources through MDS

One of my favorite radio shows was The Hitchhiker’s Guide to the Galaxy by the sadly departed Douglas Adams.  One of the characters, Marvin the Paranoid Android, was created by the Sirius Cybernetics Corporation, whose corporate song was entitled Share and Enjoy!  Just like using the products of the Sirius Cybernetics Corporation, reusing resources through MDS is not fun, but at least it is useful and avoids some problems in SOA deployments.  So in this blog post I am going to show you how to reuse SOA resources stored in MDS, using JDeveloper as a development tool.

The Plan

We would like to have some SOA resources such as WSDLs, XSDs, Schematron files and DVMs stored in a shared location.  This gives us the following benefits:

  • A single source of truth for artifacts
  • No cross-composite dependencies, which can cause deployment and startup problems
  • Resources are easier to find and reuse when stored in a single location

So we will store a WSDL and an XSD in MDS, using a JDeveloper project to maintain the shared artifacts, file-based MDS to access them at design time and database-based MDS to access them at runtime.  We will create the shared resources in a JDeveloper project and deploy them to MDS.  We will then deploy a project that exposes a service based on the WSDL.  Finally we will deploy a client project for the previous service that uses the same MDS resources.

Creating Shared Resources in a JDeveloper Project

First let’s create a JDeveloper project and put our shared resources into that project.  To do this:

  1. In a JDeveloper Application create a New Generic Project (File->New->All Technologies->General->Generic Project)
  2. In that project create a New Folder called apps (File->New->All Technologies->General->Folder).  It must be called apps for local file-based MDS to work correctly.
  3. In the project properties delete the existing Java Source Paths (Project Properties->Project Source Paths->Java Source Paths->Remove)
  4. In the project properties add a new Java Source Path pointing to the just-created apps directory (Project Properties->Project Source Paths->Java Source Paths->Add)
    [Screenshot: JavaSourcePaths]

Having created the project we can now put our resources into that project, either copying them from other projects or creating them from scratch.

Create a SOA Bundle to Deploy to a SOA Instance

Having created our resources we now want to package them up for deployment to a SOA instance.  To do this we take the following steps.

  1. Create a new JAR deployment profile (Project Properties->Deployment->New->Jar File)
  2. In JAR Options uncheck the Include Manifest File
  3. In File Groups->Project Output->Contributors uncheck all existing contributors and check the Project Source Path
  4. Create a new SOA Bundle deployment profile (Application Properties->Deployment->New->SOA Bundle)
  5. In Dependencies select the project jar file from the previous steps.
    [Screenshot: SOABundle]
  6. On Application Properties->Deployment unselect all options.
    [Screenshot: SOABundle2]

The bundle can now be deployed to the server by selecting Deploy from the Application Menu.

Create a Database Based MDS Connection in JDeveloper

Having deployed our shared resources it would be good to check that they are where we expect them to be, so let’s create a database-based MDS connection in JDeveloper to let us browse the deployed resources.

  1. Create a new MDS Connection (File->All Technologies->General->Connections->SOA-MDS Connection)
  2. Make the Connection Type DB Based MDS and choose the database connection and partition.  The username of the connection will be the <PREFIX>_mds user and the MDS partition will be soa-infra.

Browse the repository to make sure that your resources deployed correctly under the apps folder.  Note that you can also use this browser to look at deployed composites.  You may find it interesting to look at the /deployed-composites/deployed-composites.xml file, which lists all deployed composites.

[Screenshot: DbMDSbrowse]
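
If you prefer scripting to the connection browser, WLST’s exportMetadata command can also pull individual documents out of the soa-infra partition for inspection.  The sketch below shows the idea; the credentials, host names, managed server name and export location are assumptions that you would change for your own environment:

#!/bin/sh
# Export a single document from the soa-infra MDS partition for inspection.
# Credentials, host names and locations below are assumptions - adjust to suit.
FMW_HOME=/u01/app/fmw
${FMW_HOME}/Oracle_SOA1/common/bin/wlst.sh << EOF
connect('weblogic', 'welcome1', 't3://adminhost:7001')
exportMetadata(application='soa-infra', server='soa_server1',
               toLocation='/tmp/mds-export',
               docs='/deployed-composites/deployed-composites.xml')
exit()
EOF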

Create a File Based MDS Connection in JDeveloper

We can now create a file-based MDS connection to the project we just created.  A file-based MDS connection allows us to work offline without a database or SOA server; this one will reference the project we created earlier.

  1. Create a new MDS Connection (File->All Technologies->General->Connections->SOA-MDS Connection)
  2. Make the Connection Type File Based MDS and choose the MDS Root Folder to be the location of the JDeveloper project previously created (not the source directory, the top level project directory).
    [Screenshot: FileMDS]

We can browse the file based MDS using the IDE Connections Window in JDeveloper.  This lets us check that we can see the contents of the repository.

Using File Based MDS

Now that we have MDS set up both in the database and locally in the file system we can try using some resources in a composite.  To use a WSDL from the file based repository:

  1. Insert a new Web Service Reference or Service onto your composite.xml.
  2. Browse the Resource Palette for the WSDL in the File Based MDS connection and import it.
    [Screenshot: BrowseRepository]
  3. Do not copy the resource into the project.
  4. If you are creating a reference, don’t worry about the warning message; it can be fixed later.  Just say Yes, you do want to continue and create the reference.
    [Screenshot: ConcreteWSDLWarning]

Note that when you import a resource from an MDS connection it automatically adds a reference to that MDS store into the application’s adf-config.xml.  SOA applications do not deploy their adf-config.xml; they use it purely to help resolve oramds protocol references in SOA composites at design time.  At runtime the soa-infra application’s adf-config.xml is used to resolve oramds protocol references.

The reason we point the file-based MDS at the project directory, rather than the apps directory underneath it, is that when we deploy SOA resources to MDS as a SOA bundle the resources are all placed under the apps MDS namespace.  To make sure that our file-based MDS includes an apps namespace we have to rename the src directory to apps and then make sure that our file-based MDS points to the directory above the new source directory.

Patching Up References

When we use an abstract WSDL as a service, the SOA infrastructure automatically adds binding and service information at run time.  An abstract WSDL used as a reference needs to have binding and service information added in order to compile successfully.  By default the imported MDS reference for an abstract WSDL will look like this:

<reference name="Service3"
   ui:wsdlLocation="oramds:/apps/shared/WriteFileProcess.wsdl">
  <interface.wsdl interface="
http://xmlns.oracle.com/Test/SyncWriteFile/WriteFileProcess# wsdl.interface(WriteFileProcess)"/>
  <binding.ws port="" location=""/>
</reference>

Note that the port and location properties of the binding are empty.  We need to replace the location with a runtime WSDL location that includes binding information; this can be obtained by getting the WSDL URL from the soa-infra application or from EM.  Be sure to remove any MDS instance strings from the URL.

[Screenshot: EndpointInfo]

The port information is a little more complicated.  The first part of the string should be the target namespace of the service, usually the same as the first part of the interface attribute of the interface.wsdl element.  This is followed by #wsdl.endpoint and then, in parentheses, the service name from the runtime WSDL and the port name from the WSDL, separated by a /.  The format should look like this:

{Service Namespace}#wsdl.endpoint({Service Name}/{Port Name})

So if we have a WSDL like this:

<wsdl:definitions
   …
   targetNamespace="http://xmlns.oracle.com/Test/SyncWriteFile/WriteFileProcess">
   …
   <wsdl:service name="writefileprocess_client_ep">
      <wsdl:port name="WriteFileProcess_pt"
                 binding="client:WriteFileProcessBinding">
         <soap:address location=… />
      </wsdl:port>
   </wsdl:service>
</wsdl:definitions>

Then we get a binding.ws port like this:

http://xmlns.oracle.com/Test/SyncWriteFile/WriteFileProcess#wsdl.endpoint(writefileprocess_client_ep/WriteFileProcess_pt)

Note that you don’t have to set the actual values until deployment time.  The following binding information will allow the composite to compile in JDeveloper, although it will not run at runtime:

<binding.ws port="dummy#wsdl.endpoint(dummy/dummy)" location=""/>

The binding information can be supplied in the configuration plan.  Deferring it like this means that you must have a configuration plan in order to invoke the reference, which reduces the risk of deploying composites with references pointing to the wrong environment.
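
For completeness, the sketch below shows the sort of configuration plan fragment involved; the reference name and endpoint are taken from the example above, while the composite match and the URL are assumptions for illustration:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch of a configuration plan fragment supplying the binding information
     at deployment time.  The reference name and endpoint come from the example
     above; the composite match and URL are assumptions for illustration. -->
<SOAConfigPlan xmlns="http://xmlns.oracle.com/soa/configplan">
  <composite name="*">
    <reference name="Service3">
      <binding type="ws">
        <attribute name="port">
          <replace>http://xmlns.oracle.com/Test/SyncWriteFile/WriteFileProcess#wsdl.endpoint(writefileprocess_client_ep/WriteFileProcess_pt)</replace>
        </attribute>
        <attribute name="location">
          <replace>http://soahost:8001/soa-infra/services/default/SyncWriteFile/writefileprocess_client_ep?WSDL</replace>
        </attribute>
      </binding>
    </reference>
  </composite>
</SOAConfigPlan>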

Summary

In this blog post I have shown how to store resources in MDS so that they can be shared between composites.  The resources can be created in a JDeveloper project that doubles as an MDS file repository.  The MDS resources can be reused in composites.  If using an abstract WSDL from MDS I have also shown how to fix up the binding information so that at runtime the correct endpoint can be invoked.  Maybe it is more fun than dealing with the Sirius Cybernetics Corporation!

Wednesday Oct 09, 2013

Multiple SOA Developers Using a Single Install

Running Multiple SOA Developers from a Single Install

A question just came up about how to run multiple developers from a single software install.  The objective is to have a single software installation on a shared server and then provide different OS users with the ability to create their own domains.  This is not a supported configuration but it is attractive for a development environment.

Out of the Box

Before we do anything special let’s review the basic installation.

  • Oracle WebLogic Server 10.3.6 installed using oracle user in a Middleware Home
  • Oracle SOA Suite 11.1.1.7 installed using oracle user
  • Software installed with group oinstall
  • Developer users dev1, dev2 etc
    • Each developer user is a member of oinstall group and has access to the Middleware Home.

Customizations

To get this to work I did the following customizations:

  • In the Middleware Home make all user readable files/directories group readable and make all user executable files/directories group executable.
    • find $MW_HOME -perm /u+r ! -perm /g+r | xargs -Iargs chmod g+r args
    • find $MW_HOME -perm /u+x ! -perm /g+x | xargs -Iargs chmod g+x args

Domain Creation

When creating a domain for a developer note the following:

  • Each developer will need their own FMW repository, perhaps prefixed by their username, e.g. dev1, dev2 etc.
  • Each developer needs to use a unique port number for all WebLogic channels
  • Any use of Coherence should use Well Known Addresses to avoid cross-talk between developer clusters (note SOA and OSB both use Coherence!); see the sketch after this list
  • If using Node Manager each developer will need their own instance, using their own configuration.
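
For the Coherence isolation, the settings can be added to each developer’s setDomainEnv.sh as extra JVM arguments.  The sketch below shows the kind of thing I mean; the cluster name, host and port are examples and each developer needs their own unique values:

# Per-developer Coherence settings to keep developer clusters separate.
# Cluster name, host and port are examples - each developer picks unique values.
EXTRA_JAVA_PROPERTIES="${EXTRA_JAVA_PROPERTIES} \
 -Dtangosol.coherence.cluster=dev1_soa \
 -Dtangosol.coherence.wka1=devhost.example.com \
 -Dtangosol.coherence.localhost=devhost.example.com \
 -Dtangosol.coherence.localport=9091"
export EXTRA_JAVA_PROPERTIES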

Tuesday Aug 13, 2013

WebLogic Admin Cookbook Review

Review of Oracle WebLogic Server 12c Advanced Administration Cookbook

Like all of Packt’s cookbook titles, the book follows a standard format: a recipe, followed by an explanation of how it works, and then a discussion of additional recipe-related features and extensions.

When reading this book I tried out some of the recipes on an internal beta of 12.1.2 and they seemed to work fine on that future release.

The book starts with basic installation instructions that belie its title.  The author is keen to use console mode, which is often needed for servers that have no X11 client libraries; however, for all but the simplest of domains I find console mode very slow and difficult to use, and would suggest that where possible you persuade the OS admin to make the X11 client libraries available, at least for the duration of the domain configuration.

Another pet peeve of mine is using nohup to start servers/services and not redirecting output, with the result that you are left with nohup.out files scattered all over your disk.  The book falls into this trap.
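
For the record, the clutter is easy to avoid by redirecting the output yourself; something along these lines, where the domain home, server name and log location are just examples:

# Start a managed server in the background without leaving nohup.out files behind.
# Domain home, server name and log location are examples - adjust for your domain.
DOMAIN_HOME=/u01/app/fmw/user_projects/domains/soa_domain
nohup ${DOMAIN_HOME}/bin/startManagedWebLogic.sh soa_server1 \
      > ${DOMAIN_HOME}/servers/soa_server1/logs/soa_server1.out 2>&1 &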

However we soon sweep into some features of WebLogic that I believe are less well understood, such as using the pack/unpack commands and customizing the console screen.  The “Protecting changes in the Administration Console” recipe is particularly useful.

The next chapter covers HA configuration.  One of the nice things about this book is that most recipes are illustrated not only using the console but also using WLST.  The coverage of multiple NICs and dedicated network channels is very useful in the Exalogic world as well as for regular WLS clusters.  One point I would quibble with is the setting up of HA for the AdminServer.  I would always do this with a shared file system rather than copying files around, and I would also prefer a floating IP address to avoid having to update the DNS.

Other chapters cover JDBC & JMS, Monitoring, Stability, Performance and Security.

Overall the recipes are useful; I certainly learned some new ways of doing things.  The WLST example code is a real plus.  Well worth adding to your WebLogic admin library.

The book is available on the Packt website.

Thursday May 24, 2012

Whose Port Is It?

Who Owns What Port?

It is not uncommon to be unable to start a server process because some other process is holding onto a network port that is required by the server.  The question is how do you find the offending process?  I thought I would identify some of the commands I use to track down wayward port usage.

Identify the Conflict

The first thing to do is to identify the port that is being used.  Hopefully your log file will indicate which port the server process was unable to obtain.  Even if it does not identify the port, as long as you know which ports the server requires, you can use the first of my helpful commands:

Windows: netstat -anop tcp
Linux:   netstat -lnt --program

The Windows version lists all sockets “-a” using TCP/IP v4 “-p tcp”.  To make it easier to find the listening port I had it list addresses in numeric format “-n” rather than resolving names.  Other possible protocols are TCP/IP v6 “tcpv6”, UDP “udp” and UDP v6 “udpv6”.  Finally I had the netstat command print out the process ID “-o” of the process using the port.

The Linux version is slightly different in that it lists only listening ports “-l” in numeric form “-n” for the TCP protocol “-t”, both V4 and V6.  For the UDP protocol use “-u”.  The process ID and program name are also displayed “--program”.  Note that this is best run as root because you need root privileges to have netstat show you the PID of a process you don’t own.

Find the Culprit

Now that we know which process is holding our port, the next thing to do is find out more about that process.  The second of our helpful commands shows us the command line used to launch the mischievous process.

Windows: tasklist /FI "PID eq <PID>"
Linux:   ps -p <PID> -o args

Unfortunately I haven’t found a good way to find out the actual command line on a Windows machine.  Tasklist allows you to filter the list of tasks “/FI” to show the process associated with a PID “PID eq <PID>”, but if that process is a service then the process name will show as “svchost.exe”.  You may be able to see more information by using Windows Process Explorer, but even that doesn’t always tell you what you need to know.

On Linux we can use the trusty ps command to find a given PID “-p <PID>” and output the command with its associated command line arguments “-o args”.  From this we know exactly what is holding onto our port.
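
Putting the two Linux commands together, a small helper script can go from a port number to the owning command line in one step.  This is only a sketch and assumes the net-tools version of netstat; run it as root to see processes owned by other users:

#!/bin/sh
# Show the command line of the process listening on a given port (Linux).
# Assumes the net-tools netstat; run as root to see processes you do not own.
PORT=${1:-7001}
PID=`netstat -lnt --program 2>/dev/null | awk -v p=":${PORT}\$" '$4 ~ p {split($NF,a,"/"); print a[1]; exit}'`
if [ -n "${PID}" ]
then
  ps -p ${PID} -o args
else
  echo "Nothing found listening on port ${PORT}"
fi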

Armed with this information we can reconfigure the errant process, shut it down or decide that we need to change the port number for our server process instead.

Friday Mar 02, 2012

Using Coherence with JDeveloper

Configuring JDeveloper for use with Coherence

I am doing some work with Coherence again, so I needed to create some Java code calling the Coherence API and edit some Coherence configuration files in JDeveloper.  The easiest way to do this is to register the Coherence jar file and the Coherence schemas with JDeveloper; once that is done you can use JDeveloper’s XML insight features to help you create the XML documents.

Register the Coherence Library

To register the Coherence jar file in JDeveloper go to “Tools->Manage Libraries…”, select “New…” and then use “Add Entry…” to add the following entries:

  • Class Path
    • <COHERENCE_HOME>\lib\coherence.jar
  • Doc Path
    • <COHERENCE_HOME>\doc\api

COHERENCE_HOME is the location where you unzipped the Coherence product.

This lets us use the Coherence API in our Java code by adding the library to our project.

Register the Schemas

To register the Coherence XML Schemas with JDeveloper go to “Tools->Preferences…”, select “XML Schemas” and choose “Add…”.

Browse to the <COHERENCE_HOME>\lib\coherence.jar file and add the following schemas:

  • coherence-cache-config.xsd
  • coherence-operational-config.xsd
  • coherence-pof-config.xsd
  • coherence-report-config.xsd
  • coherence-report-group-config.xsd
  • coherence-rest-config.xsd

Now when you create an XML file for use with Coherence you can choose “XML Document from XML Schema” and choose “Use Registered Schemas” to show you suitable schemas to use for your Coherence config.
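
As a quick test of the schema registration, the fragment below is the sort of minimal cache configuration you might create this way; the cache name and scheme name are purely illustrative:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal illustrative cache configuration - cache and scheme names are examples. -->
<cache-config xmlns="http://xmlns.oracle.com/coherence/coherence-cache-config"
              xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
              xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-cache-config coherence-cache-config.xsd">
  <caching-scheme-mapping>
    <cache-mapping>
      <cache-name>test-*</cache-name>
      <scheme-name>test-distributed</scheme-name>
    </cache-mapping>
  </caching-scheme-mapping>
  <caching-schemes>
    <distributed-scheme>
      <scheme-name>test-distributed</scheme-name>
      <backing-map-scheme>
        <local-scheme/>
      </backing-map-scheme>
      <autostart>true</autostart>
    </distributed-scheme>
  </caching-schemes>
</cache-config>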

Wednesday Oct 19, 2011

Structure in a Flat World

Adding Structure to Flat XML Documents

A friend was recently wondering how to convert a flat document structure into a more structured form.

The type of flat structure is shown in the diagram below:

The deptNo and deptName fields repeat for each employee in the department.

This would be better represented as a structured format like the one shown below:

 

Note that the department details are now represented once per department and employees appear in a sequence called emp.  This is a more natural representation and easier to manipulate elsewhere.

So the question is, how do I get from the flat schema to the structured schema?

The answer lies in the preceding-sibling and following-sibling XPath axes.

To get just the first time a department appears we select all the entries that do not have the same deptNo earlier in the document using this XPath expression:

<xsl:for-each select="/ns1:collection/ns1:entry[not(ns1:deptNo = preceding-sibling::ns1:entry/ns1:deptNo)]">

Within the first occurrence of a department we then set a variable to hold the department number:

<xsl:variable name="DeptNo" select="ns1:deptNo"/>

Within the first occurrence of a department we then output the employee details from the current entry.  We then select all the other entries that have the same department number and add their employee details by using the following XPath expression:

<xsl:for-each select="following-sibling::ns1:entry[ns1:deptNo = $DeptNo]">

A sample JDeveloper project to test this is available here.

Thursday Jun 02, 2011

Building Your Own Path

How to create custom XPath functions in SOA Suite 11g.

Wednesday May 11, 2011

A Database Detour

A quick guide to installing Oracle Database 11gR2 on Oracle Enterprise Linux 6 (OEL6).

Tuesday Jan 06, 2009

Which Coherence Edition is Right for Me?


Wednesday Oct 08, 2008

Which JDev is Right for me?

About

Musings on Fusion Middleware and SOA.

Antony works with customers across the US and Canada in implementing SOA and other Fusion Middleware solutions.  Antony is the co-author of the SOA Suite 11g Developer's Cookbook, the SOA Suite 11g Developer's Guide and the SOA Suite Developer's Guide.
