Friday Jun 20, 2014

Application Management Pack for Utilities Self Running Demonstration

A self running demonstration of the Application Management Pack for Oracle Utilities is now available from My Oracle Support at Doc Id: 1474435.1. The demonstration, in SWF (Flash) format, covers the features of the pack available for Oracle Enterprise Manager and is annotated for ease of use.

The demonstration covers the following topics:

  • Discovery of the Oracle Utilities Targets
  • Registration of Oracle Utilities Targets
  • Starting and Stopping Oracle Utilities Targets
  • Patching Oracle Utilities Targets
  • Migrating patches across Oracle Utilities Targets
  • Cloning Environments including basic and advanced cloning
  • Miscellaneous functions

The demonstration requires a browser with Adobe Flash installed. It can be streamed from My Oracle Support or downloaded for offline replay.

Wednesday Jun 18, 2014

New ILM Whitepapers available

In Oracle Utilities Application Framework V4. and above, a new archiving and data management capability was introduced to help customers design a cost effective storage solution and data management strategy for their implementations of Oracle Utilities products. The facility allows customers to retain data according to their policies (legal or otherwise) whilst offering strategies for cost effective storage of that data.

To help with the implementation of this new facility two new whitepapers have been published to My Oracle Support for download:

Additional whitepapers will be published as other products release their ILM based components.

Tuesday May 20, 2014

Audit On Inquiry Example - Zones

One of the features of Oracle Utilities Application Framework V4.x is the ability to audit inquiries from zones and pages. This allows you to track information that is read rather than what is updated (which is the prime focus of the internal audit facility).

The example shown below is a sample only; it just illustrates the process. The script is invoked upon broadcast (the sample does not include the global context, but that can be added as normal).

To use this facility, here is the basic design pattern (in the order you would perform it):

  • Decide where you want to store the inquiry data first. You cannot store the inquiry data in the same audit tables/objects where updates and deletes are recorded, as the Audit Object Maintenance Object is read only (it is only used internally). You have three options here:
    • If the Maintenance Object has a child log table then this is ideal for recording when that object is read or viewed by an end user. The advantage of this option is that there is more than likely a user interface for viewing those records.
    • If the Maintenance Object does not have a child log table then you can use the generic Business Event Log object (F1-BUSEVTLOG). This can be used to store such audit information. You may want to create a UI to view that information in a format you want to expose, as well as adding records to this table. If you use this option, remember to set up a message to hold the audit message you want to display on the screen; this is needed in the Business Object definition. This approach is used in the sample below.
    • You can create a custom Maintenance Object to store this information. This is the least desirable as you need to build Java objects to maintain the Maintenance Object but it is still possible. For the rest of this article I will ignore this alternative.
  • Create a Business Object with the data you want to store the audit within using the Business Object Maintenance Schema Editor. You can structure the information as you see fit including adding flattened fields for the collections if you wish.

For example, for Business Event Log I created a BO called CM-BusinessAuditLog like below:

    <logId mapField="BUS_EVT_LOG_ID"/>
    <logDateTime mapField="LOG_DTTM" default="%CurrentDateTime"/>
    <user mapField="USER_ID" default="%CurrentUser"/>
    <maintenanceObject mapField="MAINT_OBJ_CD" default="F1-BUSEVTLOG"/>
    <businessObject mapField="BUS_OBJ_CD" default="CM-BusinessAuditLog"/>
    <primaryKeyValue1 mapField="PK_VALUE1" default="001"/>
    <messageCategory mapField="MESSAGE_CAT_NBR" default="90000"/>
    <messageNumber mapField="MESSAGE_NBR" default="1000"/>
    <version mapField="VERSION" suppress="true"/>
    <parmUser mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="1"/>
        </row>
    </parmUser>
    <parmPortal mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="2"/>
        </row>
    </parmPortal>
    <parmZone mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="3"/>
        </row>
    </parmZone>
    <parmF1 mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="4"/>
        </row>
    </parmF1>
    <parmH1 mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="5"/>
        </row>
    </parmH1>
    <parmXML1 mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="6"/>
        </row>
    </parmXML1>
    <parmGC1 mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="7"/>
        </row>
    </parmGC1>
    <parmF1Label mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="8"/>
        </row>
    </parmF1Label>
    <parmH1Label mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="9"/>
        </row>
    </parmH1Label>
  • Note: I set up a basic message (message category 90000 and message number 1000) to hold the desired message:

User %1 has read value %6 on Portal %2 within Zone %3

  • Create a Service Script (say CM-AuditZone) to populate the fields on the Business Object according to your site standards. Remember to add the Business Object as a Data Area. For example:
     move "parm/userId" to "CM-BusinessAuditLog/parmUser";
     move "parm/portalName" to "CM-BusinessAuditLog/parmPortal";
     move "parm/zoneCd" to "CM-BusinessAuditLog/parmZone";
     move "parm/pk1" to "CM-BusinessAuditLog/parmXML1";
     invokeBO 'CM-BusinessAuditLog' using "CM-BusinessAuditLog" for add;

  • To reduce the performance impact of creating audit records (and to allow audit records to be added when the change mode on the prime object is read only), it is recommended to create another Service Script (say CM-ZoneAuditing) and use F1-ExecuteScriptInNewSession to execute it in a new session. Remember to add the script as a Data Area. For example:

    move "parm/input" to "CM-AuditZone";
    move 'CM-AuditZone' to "F1-ExecuteScriptInNewSession/scriptName";
    move "CM-AuditZone" to "F1-ExecuteScriptInNewSession/scriptData";
    invokeBS 'F1-ExecuteScriptInNewSession' using "F1-ExecuteScriptInNewSession";
    • Add the schema to the script to accept the input from the zone parameters, as per the Help entry for the Audit Service Script. For example (one element per zone parameter passed in; align the names with the attachment string below):
        <input type="group">
            <zone/>
            <portal/>
            <user/>
            <pk1/>
            <pk2/>
            <pk3/>
            <pk4/>
        </input>
  • Attach the Audit Service Script (CM-ZoneAuditing) to the zone. For example:

ss='CM-ZoneAuditing' input=[zone=zone portal=portal user=userId pk1=F1Label pk2=F1 
pk3=F3Label pk4=F3]

This example is just a sample with some basic processing; it can be extended to capture additional information. It is recommended to use log tables on the object if they are available.
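
The broadcast-to-audit-record flow above can be simulated outside the product. The sketch below is plain Python (not OUAF script) and only illustrates how the MSG_PARM_VAL rows, keyed by PARM_SEQ, are substituted into the audit message; the helper function and the data values are invented for illustration.

```python
# Hypothetical simulation of how the nine MSG_PARM_VAL rows feed the
# audit message: each row carries a PARM_SEQ (1..9) and the message
# template refers to them as %1, %2, and so on.

def expand_message(template, parms):
    """Replace each %n placeholder with the parameter at sequence n."""
    for seq in sorted(parms, reverse=True):   # substitute %10 before %1
        template = template.replace(f"%{seq}", parms[seq])
    return template

# Values as the sample service script would populate them (invented data)
parms = {1: "FRED01", 2: "CM-CUSTPORTAL", 3: "CM-CUSTZONE", 6: "1234567890"}
print(expand_message(
    "User %1 has read value %6 on Portal %2 within Zone %3", parms))
# -> User FRED01 has read value 1234567890 on Portal CM-CUSTPORTAL within Zone CM-CUSTZONE
```

In the product itself this substitution is performed by the message framework; the sketch is only meant to show why a message and its parameter sequence numbers must line up with the Business Object schema.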

Monday May 12, 2014

SSO Integration Patterns

Single Sign On Support is one of the common questions I get asked from customers, partners and sales people.

Single Sign On is basically an implementation mechanism or technology that allows users of multiple browser applications to specify credentials once (typically at login) and have those credentials reused for subsequent applications in that session. This avoids logging on more than once and aids cross product navigation: a user logs onto one application and, when transferring to another, avoids logging into that other product.

Single Sign On is not a product requirement; it is an infrastructure requirement. Therefore there are infrastructure solutions available.

Typically there are two main styles of Single Sign On with different approaches for implementation.

The first style is best described as "Desktop" Single Sign-On. This is where you log on to your client machine (usually a Windows based PC) and the credentials you used to log on to that machine are reused for ANY product used after authentication. Typically this is implemented using the Kerberos protocol and the Simple and Protected GSSAPI Negotiation Mechanism (SPNEGO). This is restricted to operating systems (typically Windows) where you perform the following:

  • Setup the client machine browsers to accept and pass the credentials to the server. This sets the browser to read the kerberos credentials and pass them to the server.
  • Setup the Microsoft Active Directory Services Network Domain Controller to accept Kerberos and pass onto the subsequent applications.
  • Create a keytab file for Oracle WebLogic to use.
  • Configure the Oracle WebLogic Identity Assertion Provider to specify that the keytab is to be used and that Kerberos is to be used for the identity.
  • Configure Oracle WebLogic to startup using the provider and Kerberos.
  • Set the login preferences within OUAF to CLIENT-CERT to indicate the login is passed from somewhere else. This turns off our login screen.

As you can see the majority of the work is in Oracle WebLogic and is documented in Configuring Single Sign-On with Microsoft Clients.

The second style is best described as "Browser" Single Sign-On. This typically means you log in to the machine and then open the browser to log on. From that point, as long as the browser is open, any subsequent application will reuse the credentials specified for the browser session. This is the style implemented by SSO products such as Oracle Access Manager, Oracle Enterprise Single Sign-On and other SSO products (including third party ones). Typically implementing this involves the following:

  • Setting Up Oracle Access Manager or the SSO product to your requirements. Oracle Access Manager supports lots of variations for SSO including Single Network Domain SSO, Multiple Network Domains, Application SSO, etc. This is all outlined in Introduction to Single Sign-On with Access Manager.
  • Setting up Oracle WebLogic with Oracle Access Manager (this allows Oracle WebLogic to get the credentials from Oracle Access Manager). This is outlined in Configuring Single Sign-On with Oracle Access Manager 11g.
  • Set the login preferences within OUAF to CLIENT-CERT to indicate the login is passed from somewhere else. This turns off our login screen.

Again, as you can see the majority of the work is in Oracle WebLogic and Oracle Access Manager.

Information about implementing Single Sign-On with our products (both styles) is contained in:

  • Single Sign On Integration for Oracle Utilities Application Framework based products (Doc Id: 799912.1) available from My Oracle Support.
  • Oracle Identity Management Suite Integration with Oracle Utilities Application Framework based products (Doc Id: 1375600.1) available from My Oracle Support.

While the first style is typically lower cost, it is restricted to specific platforms that support Kerberos and SPNEGO. It is also restricted in flexibility: it passes the credentials from the client all the way to the server, so they must match. Oracle Access Manager, on the other hand, is far more flexible, supporting a wide range of architectures as well as including access control, password control and user tracking features within WebGate. These allow additional capabilities to be implemented:

  • Access Control - This allows for additional security rules to be implemented. For example, turning off part of a product during time periods. I have heard of customers using Oracle Access Manager to stop online payments from being accessible after business hours from a call center, due to customer specific payment processes being implemented. This augments the inbuilt security model available from Oracle Utilities Application Framework.
  • User Tracking - Oracle Utilities Application Framework is stateless, therefore you can only see active users when they are actively running transactions, not when they are idle. WebGate has information about idle users as well as active users allowing for enhanced user tracking.

Whatever style you choose to adopt, we have a flexible set of solutions to implement SSO. The only common element, and the only step within Oracle Utilities Application Framework, is to change the J2EE login preference from the default FORM based setting to CLIENT-CERT.
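
For reference, the FORM to CLIENT-CERT change corresponds to the standard J2EE login-config element in the application's deployment descriptor. The fragment below is an illustrative sketch only; the exact file location and surrounding elements vary by product version, so follow the product documentation for the supported way of making this change.

```xml
<!-- web.xml (illustrative): delegate authentication to the container/SSO layer -->
<login-config>
    <auth-method>CLIENT-CERT</auth-method>
</login-config>
```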

Friday May 09, 2014

Archiving/ILM Part 2 - ILM Date And ILM Archive Flag

As part of the new data management capabilities of Oracle Utilities Application Framework V4., two new columns have been added to the products to be managed by this capability.

  • ILM Date (ILM_DT) - This is a field populated with the system date at record creation time. This date (plus the retention period) sets the starting point from which the ILM solution will evaluate the eligibility of the record for archiving via the ILM Crawler. The date is set by the Maintenance Object at object creation time but, like ANY other column on the object, it can be altered by algorithms, batch processes etc. Manipulating the date can delay (or bring forward) ILM activities on a particular object. For example, it is possible to set this date in an appropriate algorithm (according to your business practices) to control when a particular object is considered for ILM.
  • ILM Archive Flag (ILM_ARCH_FLG) - This is a flag, set to N by default, that determines whether the record is eligible for archiving (removal) or any other ILM activities. This column is maintained by the ILM Crawler assigned to the object, which will assess the rules for eligibility after the ILM_DT has passed. If the record is deemed eligible for archiving then the value will be set to Y to indicate that other ILM activities can be safely performed on this object.

The ILM Crawler uses these columns and the associated ILM Eligibility algorithm on the Maintenance Object to determine the eligibility of objects. These values are managed for you automatically. If the basic setup is not sufficient for your data retention needs, the ILM Eligibility algorithm can be altered to suit your needs, or other algorithms can be extended to help you set these values.
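
The interplay of the two columns can be sketched as follows. This is an illustrative Python model, not product code, and the simple date-based rule stands in for whatever the ILM Eligibility algorithm actually implements at your site.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class IlmRecord:
    """Minimal stand-in for an ILM-enabled object (illustration only)."""
    ilm_dt: date              # populated at record creation time
    ilm_arch_flg: str = "N"   # maintained by the ILM Crawler

def crawl(records, retention_days, today):
    """Mimic the ILM Crawler: once ILM_DT plus the retention period has
    passed, mark the record as eligible. A real ILM Eligibility algorithm
    can apply further business rules before setting the flag."""
    for record in records:
        if record.ilm_dt + timedelta(days=retention_days) <= today:
            record.ilm_arch_flg = "Y"

records = [IlmRecord(date(2012, 1, 1)), IlmRecord(date(2014, 1, 1))]
crawl(records, retention_days=365, today=date(2014, 5, 9))
print([r.ilm_arch_flg for r in records])   # -> ['Y', 'N']
```

The key point the sketch makes is that ILM_ARCH_FLG is derived: the crawler sets it only after the date condition (and any extra rules) are satisfied, so downstream ILM activities can trust a Y value.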

Monday May 05, 2014

Archiving/ILM Introduction - Part 1

As part of Oracle Utilities Application Framework and Oracle Utilities Customer Care And Billing, a new Archiving/Data Management engine has been introduced, based around the Information Lifecycle Management (ILM) capabilities within the Oracle Database (with options).

The first part of the solution is actually built into the Oracle Utilities Application Framework to define the business definition of active transaction data. Active transaction data is transactional data that is regularly added, changed or deleted as part of a business process. Transaction data that is read regularly is not necessarily active from an ILM point of view. Data that is only read can be compressed, for example, with little impact on the performance of accessing that data.

Note: The ILM solution only applies to the objects that are transactional and that ILM has been enabled against. Refer to the DBA Guide shipped with the product for a list of objects that are shipped with each product.

The business definition of active transaction data is set using a master configuration record for ILM. For example:

Master Configuration

It is possible to define the data retention period, in days, for individual objects that are covered by the data management capability. These settings are set on the Maintenance Object Options shipped with the ILM solution. For example:

Maintenance Object Options

Essentially the configuration allows for the following:

  • A global retention period can be defined, in days. Objects that are covered by ILM can inherit this setting if you do not want to manage at the Maintenance Object level.
  • Each Maintenance Object that is enabled for ILM, has a number of Maintenance Object options to define the following:
    • ILM Retention Period In Days - Sets the retention period for the object at creation time.
    • ILM Crawler Batch Control - The batch control for the crawler which will traverse the objects and set the ILM dates and ILM flags.
    • ILM Eligibility Algorithm - The algorithm containing the business rules to assess the eligibility of individual objects for data management. This algorithm can be altered to implement additional business rules or additional criteria to implement object specific rules. For example, it is possible to implement specific rules for specific object definitions (say, different rules for residential customers than for industrial/commercial customers).
  • An ILM crawler has been provided for each object to set ILM dates and assess eligibility for objects. This batch process can be run whenever the business rules need to be implemented for data management and also used for when the business rules need to be changed, due to business changes.
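
The retention inheritance described above can be modelled as a simple lookup: an MO-level "ILM Retention Period In Days" option wins, otherwise the global master configuration value applies. The Maintenance Object codes, option names and day counts below are all invented for illustration.

```python
# Illustrative model of ILM retention resolution. A Maintenance Object
# level retention option overrides the global value from the ILM Master
# Configuration. All codes and values here are invented examples.

GLOBAL_RETENTION_DAYS = 730   # assumed global setting (example only)

MO_OPTIONS = {
    "TD-ENTRY": {"ilm_retention_days": 365},   # MO-level override
    "FACT": {},                                # inherits the global value
}

def retention_days(mo_code):
    """Return the effective retention period for a Maintenance Object."""
    return MO_OPTIONS.get(mo_code, {}).get("ilm_retention_days",
                                           GLOBAL_RETENTION_DAYS)

print(retention_days("TD-ENTRY"))   # -> 365
print(retention_days("FACT"))       # -> 730
```

This two-level scheme is what lets you manage most objects with a single global policy while giving specific objects their own retention period.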

At the end of this stage, a number of ILM specific fields on those objects have been set ready for the technical implementation of the ILM facilities in the database (which will be a subject of a future post).

The date that is set by this configuration does not mean that this data will disappear, it just defines the line where the business hands the data over to the technical database setup.

As you can see from this post, the data management capability from the business perspective is simple and flexible. You can take the default eligibility rules and setup as provided, or customize this first stage to implement more complex rules that match your data retention policies.

Friday May 02, 2014

Information Lifecycle Management or Archiving V2

Oracle Utilities Application Framework V4. has been released with Oracle Utilities Customer Care and Billing V2. In this major release, a new data management facility has been released to replace the original Archiving facility that was provided with Oracle Utilities Customer Care and Billing V2.1/V2.2/V2.3.1.

This new facility has a number of major advantages for effective data management:

  • A new set of fields have been added to objects specifically to allow implementations to control the data lifecycle for those objects. This includes dates and a flag to determine what the lifecycle for objects is as well as the eligibility of individual objects for data management.
  • The facility allows the customer to define how long key objects are active for across an implementation. This allows a new date within these objects to be specifically set for data management activities independent of when they are actually active. This allows flexible data retention policies to be implemented.
  • A crawler batch job per object, implements business and integrity rules to trigger data management activities. This allows data retention policies to be adjusted for changes to business needs.
  • The data management capability uses Oracle's Information Lifecycle Management capabilities to implement the storage and data retention policies based upon the data management dates and flags. This allows IT personnel to define the physical data retention policies to perform the following types of activities:
    • Allows use of compression, including basic compression in Oracle, Advanced Compression or Hybrid Columnar Compression in Oracle Exadata. Externalized compression in SAN hardware is also supported.
    • Allows the ability to, optionally, use lower cost storage to manage groups of data using Oracle Partitioning. This saves on the cost of storing less active data in your implementation.
    • Allows specification and simulation of data management policies using ILM Assistant including estimating expected storage cost savings.
    • Allows data management manually or automatically using Automatic Storage Management (ASM), Automatic Data Optimization (ADO) and/or Heat Maps. The latter are available in Oracle 12c to provide additional facilities.
    • Allows use of transportable tablespaces via Oracle Partitioning to quickly remove data that has been identified as archived.
  • The definition of the lifecycle can be simple or as complex as your individual data retention policies dictate, with the business and IT together defining the business and technical implementations of the rules within the product and the Information Lifecycle Management components within the database.
  • The Oracle Utilities Application Framework has been altered to recognize data management policies. This means that once a policy has been implemented, access to that data will conform to that policy. For example, data that is removed via transportable tablespaces will be recognized as archived and the online/batch process will take this into account.

Data Management documentation is provided with the products to allow implementations to take advantage of this new capability. This allows data management retention policies to be flexible and use the data management capabilities within the database to efficiently manage the lifecycle of critical data in Oracle Utilities Applications.

Over the next few weeks a number of blog entries will be published to walk through the various aspects of the solution.

Thursday Apr 10, 2014

New Web Services Capabilities available

As part of Oracle Utilities Application Framework V4., a new set of Web Services capabilities is now available to replace the Multi-Purpose Listener (MPL) and also XAI Servlet completely with more exciting capabilities.

Here is a summary of the facilities:

  • There is a new Inbound Web Services (IWS) capability to replace the XAI Inbound Services and XAI Servlet (which will be deprecated in a future release). This capability combines the meta data within the Oracle Utilities Application Framework with the power of the native Web Services capability within the J2EE Web Application Server to give the following advantages:
    • It is possible to define individual Web Services to be deployed on the J2EE Web Application Server. Web based and command line utilities have been provided to allow developers to design, deploy and manage individual Inbound Web Services.
    • It is now possible to define multiple operations per Web Service. XAI was restricted to a single operation with multiple transaction types. IWS supports multiple operations separated by transaction type. Operations can even extend to different objects within the same Web Service. This will aid in rationalizing Web Services.
    • IWS makes it possible to monitor and manage individual Web Services from the J2EE Web Application Server console (or Oracle Enterprise Manager). These metrics are also available from Oracle Enterprise Manager to provide SLA and trend tracking capabilities. These metrics can also be fine grained to the operation level within a Web Service.
    • IWS allows greater flexibility in security. Individual Services can now support standards such as WS-Policy, WS-ReliableMessaging etc as dictated by the capabilities of the J2EE Web Application Server. This includes message and transport based security, such as SAML, X.509 etc and data encryption.
    • For customers lucky enough to be on Oracle WebLogic and/or Oracle SOA Suite, IWS now allows full support for Oracle Web Services Manager (OWSM) on individual Web Services. This also allows the Web Services to enjoy additional WS-Policy support, as well as, for the first time, Web Service access rules. These access rules allow you to control when and who can run the individual service using simple or complex criteria ranging from system settings (such as dates and times), security (the user and roles) or individual data elements in the payload.
    • Customers migrating from XAI to IWS will be able to reuse a vast majority of their existing definitions. The only change is that each IWS service has to be registered and redeployed to the server, using the provided tools, and the URL for invoking the service will be altered. XAI can be used in parallel to allow for flexibility in migration.
  • The IWS capability and the migration path for customers using XAI Inbound Services is available in a new whitepaper Migrating from XAI to IWS (Doc Id: 1644914.1) available from My Oracle Support.
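
Once deployed, an IWS service is invoked like any other SOAP web service. The sketch below builds a minimal SOAP 1.1 envelope with the Python standard library; the service namespace, operation and element names are assumptions for illustration, and the generated WSDL of your deployed service is the authoritative source for all of them.

```python
# Sketch: building a SOAP 1.1 request for a deployed IWS service.
# Namespace, operation and element names below are hypothetical.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/CM-PersonInfo"   # hypothetical service namespace

def build_request(person_id):
    """Assemble a minimal SOAP 1.1 envelope for a hypothetical 'read' operation."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    ET.SubElement(envelope, f"{{{SOAP_NS}}}Header")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    operation = ET.SubElement(body, f"{{{SVC_NS}}}read")
    ET.SubElement(operation, f"{{{SVC_NS}}}personId").text = person_id
    return ET.tostring(envelope, xml_declaration=True, encoding="utf-8")

# The request would then be POSTed to the service endpoint URL from the
# WSDL, with any WS-Policy/security headers the service demands.
print(build_request("1234567890").decode())
```

In practice a SOAP toolkit or the integration layer would generate this envelope from the WSDL; the sketch simply shows that nothing product-specific is needed on the caller's side.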

Over the next few weeks I will be publishing articles highlighting capabilities for both IWS and the OSB to help implementations upgrade to these new capabilities.

Tuesday Apr 08, 2014

Oracle Service Bus transports available

As outlined in the whitepaper Oracle Service Bus Integration with Oracle Utilities Application Framework (Doc Id: 1558279.1) available from My Oracle Support, the Multi-Purpose Listener (MPL) is being replaced by Oracle Service Bus (OSB). Whilst transactions inbound to the Oracle Utilities Application Framework based products are handled natively using Web Services, transactions outbound from the products are handled by Oracle Service Bus. Consequently a number of protocol adapters have been developed that are installed in Oracle Service Bus to allow Oracle Service Bus to initiate the following outbound communications:

  • Outbound Messages
  • Notification Download Staging (Oracle Utilities Customer Care and Billing and Oracle Public Service Revenue Management only)

The transports are now available from My Oracle Support as a patch for any product on OUAF V4. and above, as Patch 18512327: OUAF Transports for OSB 1.0.0.

Installation instructions for Oracle Service Bus and Oracle Enterprise Pack for Eclipse are included in the patch as well as the whitepaper.

Monday Apr 07, 2014

Password Change Sample

In the Technical Best Practices whitepaper (Doc Id: 560367.1), available from My Oracle Support, there is a section (Password Management Solution for Oracle WebLogic) that mentions a sample password change JSP that used to be provided by BEA for WebLogic. That site is no longer available, but the sample code is now available on this blog.

Now, this is an example only and is very generic. It is not a drop-in feature that you can place in your installation, but it is sufficient to give an idea of the Oracle WebLogic API available for changing your password. It is meant to allow you to develop a CM JSP if you require this feature.

There is NO support for this as it is sample code only; it is merely an example of the API available. The link to the code is here. Examine it to get ideas for your own solutions.

The API used will most probably work for any security system that is configured as an authentication security provider.

Private Cloud Planning Guide available for Oracle Utilities

Oracle Utilities Application Framework based applications can be housed in private cloud infrastructure, either onsite or as a partner offering. Oracle provides a Private Cloud foundation set of software that can be used to house Oracle Utilities software. To aid in planning for installing Oracle Utilities Application Framework based products on a private cloud, a whitepaper has been developed and published.

The Private Cloud Planning Guide (Doc Id: 1308165.1), which is available from My Oracle Support, provides an architecture and software manifest for implementing a fully functional private cloud offering onsite or via a partner. It refers to other documentation to install and configure specific components of a private cloud solution.

Thursday Mar 20, 2014

Updated SSO Integration Whitepaper

The Single Sign-On integration whitepaper has been updated with the latest information to assist implementations configure Single Sign-On solutions with Oracle Utilities Application Framework based products.

The changes can be summarized as follows:

  • Updated instructions on setting login-config for multiple Oracle Utilities Application Framework versions.
  • Added example configuration sections to illustrate various options.
  • Added an appendix on linking to Kerberos based solutions.
  • Added an appendix on linking to Oracle Access Manager and Oracle Adaptive Access Manager based solutions.

The whitepaper is available from My Oracle Support at Single Sign On Integration for Oracle Utilities Application Framework based products (Doc Id: 799912.1).

Monday Mar 03, 2014

Overview and Guidelines for Managing Business Exceptions and Errors Whitepaper

The Oracle Utilities Customer Care and Billing team have released a new whitepaper detailing an overview and guidelines for managing business exceptions and errors (To Do Entries) in Oracle Utilities Customer Care and Billing implementations. This whitepaper is part of a project to improve the use of facilities within the product lines to help optimize implementations.

The whitepaper is available from My Oracle Support under Overview and Guidelines for Managing Business Exceptions and Errors (KB Id: 1628358.1).

Friday Feb 21, 2014

Implementing Multiple Products in a single domain

By default, the Oracle Utilities applications are installed in embedded mode for Oracle WebLogic. Basically, the product reuses an existing Oracle WebLogic installation and points the WebLogic runtime to the Oracle Utilities application runtime to run the product. It is called embedded because we are not using the Oracle WebLogic installation to house the product; the product uses files embedded within itself to run Oracle WebLogic. For instance, we generate the security setup, config.xml etc. and the command utilities to start/stop Oracle WebLogic, and these are embedded within our product.

Whilst the embedded installation is ideal for most environments, as it is simple, it has a number of disadvantages:
  • Advanced facilities such as clustering and high availability cannot be easily implemented in embedded mode.
  • Most of the configuration is defaulted such as the domain name and server names.
  • The administration server is automatically included in each environment.
  • You need to use text file based user exits to augment the embedded configuration for advanced configurations. This requires manual effort to maintain XML files in some cases.

To offer an alternative to the embedded installation, we introduced the ability to use a native installation method which houses the product inside Oracle WebLogic. This allows the site to take full advantage of Oracle WebLogic features and also manage the configuration from the Oracle WebLogic console or Oracle Enterprise Manager. For details of the features of the Native installation refer to the previous blog posts (Installation, Overview) on that subject.

Now, one of the interesting abilities when using native mode is that it is possible to run multiple products or environments within the same domain. Basically this means you can reduce the number of administration consoles needed to manage your environments.

To use this facility the following process should be used:

  • Install Oracle WebLogic as per the Oracle WebLogic Installation documentation and Native Installation Oracle Utilities Application Framework (Doc Id: 1544969.1).
  • Create a domain with an administration server using the Configuration Wizard shipped with Oracle WebLogic.
  • Logon to the administration console with the user you specified when you created the domain.
  • Within the console, create individual servers (naming is up to your site standards) for each product or environment you want to house. You should use Machines with Node Manager as well, to allow for expansion and remote management if necessary. With native mode, the administration console does not have to be on the same machine as the target environments. Ensure each server listens on a different port.
  • Install the products as outlined the in the Native Installation Oracle Utilities Application Framework (Doc Id: 1544969.1) whitepaper and the product installation documentation with the additional advice:
  • Deployments in Oracle WebLogic need to be unique across a domain. By default, the product creates a common set of names for each component. It is necessary to change these names during the installation to avoid confusion in deployment. There are two settings that need to be changed:
Setting                            Default      Recommendation
Business Server Application Name   SPLService   Add an environment or product identifier as a prefix or suffix
Web Server Application Name        SPLWeb       Add an environment or product identifier as a prefix or suffix
  • Ensure the deployment name is unique for every single deployment (even across products/environments).
  • For example, I run an FW22 environment and an FW42 environment in the same domain. I set up SPLServiceFW22 and SPLWebFW22 as the FW22 deployment names, and SPLServiceFW42 and SPLWebFW42 for my FW42 environment. These are just examples I use locally.
  • Ensure the paths in the Server Setup for the individual servers point to the classes in the relevant environment installations. Ensure the SPLEBASE is set correctly in the server setup.
  • Ensure the port numbers allocated to the Servers match the port numbers you specified in the product installation for each server.
  • The most important part of this is that you MUST alter setDomain for the domain to set the SPLEBASE variable appropriately for each SERVERNAME. If you forget this, the product will not start up. In my example:

if [ "${SERVERNAME}" = "ouaf22server" ]; then
   SPLEBASE=/oracle/FW22
   export SPLEBASE
fi

  • Deploy the applications to the relevant server. To save time, deploy SPLService (or whatever you called it) first and then SPLWeb (or whatever you called it), as per the Native Installation Oracle Utilities Application Framework (Doc Id: 1544969.1) whitepaper.
  • Start/Stop the server to start/stop the environment/product using the Administration console.
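With several products or environments in one domain, the single test in setDomain grows one branch per managed server. A case statement keeps the mapping readable. This is a sketch only; the server names (ouaf22server, ouaf42server) and paths (/oracle/FW22, /oracle/FW42) are the hypothetical values from the example above, not product defaults:

```shell
# Sketch: per-server SPLEBASE mapping for setDomain (hypothetical server
# names and installation paths based on the FW22/FW42 example).
resolve_splebase() {
  case "$1" in
    ouaf22server) echo "/oracle/FW22" ;;
    ouaf42server) echo "/oracle/FW42" ;;
    *)            echo "" ;;   # unknown server: leave SPLEBASE empty
  esac
}

SPLEBASE="$(resolve_splebase "${SERVERNAME}")"
export SPLEBASE
```

Adding an environment then means adding one line to the case statement rather than another nested if block.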

Now a couple of additional things to think about when using this technique:

  • All servers in the domain share the same authentication security setup. Just be aware of this.
  • By default, all the J2EE resources are controlled by a common role/credential, cisusers. If you want to separate the servers using different roles/credentials, you need to change the cisusers setting using configureEnv -a, adjusting the Web Security Role/Web Principal Name/Application Viewer Security Role/Application Viewer Principal Name settings to an appropriate value for each product/environment.
  • When using native mode, any change to the EAR files needs a redeployment (it is an update deployment, which is far quicker). You can use the autodeploy feature of Oracle WebLogic to minimize this effort (just note that overall CPU consumption will be higher, as Oracle WebLogic checks regularly for changes to deploy). Just remember: if you ever run initialSetup, an update redeployment is required.
  • Changes to properties files do not necessarily require redeployment at runtime, as the environment identified by SPLEBASE uses the versions stored in the etc/conf directory. If you want to keep the EAR versions in synchronization, run an update redeployment after running initialSetup.
  • Embedded installations can be converted to this facility and retain the embedded installation as a fallback. The embedded installation and native installation cannot be running at the same time as they share port numbers. This is outlined in the Native Installation Oracle Utilities Application Framework (Doc Id: 1544969.1) whitepaper.
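The update redeployment mentioned above can be scripted with the weblogic.Deployer command line tool rather than performed through the console. A minimal sketch, assuming the FW22 deployment name from the earlier example and a hypothetical administration URL; the command is echoed rather than executed here so it can be reviewed before running:

```shell
# Sketch: build a weblogic.Deployer command to refresh a deployment after
# initialSetup rebuilds the EAR files. The admin URL, credentials and names
# are illustrative; substitute your own values.
redeploy_cmd() {
  # $1 = deployment name, $2 = target managed server
  echo "java weblogic.Deployer -adminurl t3://adminhost:7001" \
       "-username weblogic -name $1 -targets $2 -redeploy"
}

redeploy_cmd SPLServiceFW22 ouaf22server
```

Run the generated command for each deployment (business then web) after every initialSetup.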

Once this is done you can manage the deployments from the console including security and monitoring.

Note: Customers using Oracle Enterprise Manager to manage the products or Oracle WebLogic will not necessarily need to use this facility, as Oracle Enterprise Manager already provides this capability.

Tuesday Feb 18, 2014

Using Oracle Test Data Manager with OUAF

The Oracle Test Data Management Pack allows the quick and safe copying of a subset of data from a production database to a non-production database. The pack can be used standalone or in association with the Oracle Data Masking Pack to comply with data privacy and data protection rules mandated by regulation or policy that restrict the use of actual customer data for non-production purposes.

Oracle Utilities Application Framework based products can utilize this pack using the following technique:

  • A copy of the production schema with no data should be created on the production database. It is important not to load any data, as an empty schema speeds up the creation of the Application Data Model. A copy of the schema can be built using Oracle SQL Developer or using tools included in Oracle Database Control/Oracle Database 12c EM Express.

Note: Oracle highly recommends not using the live production schema for the definition process.

  • Create an Application Data Model on the copied and prepared schema using the instructions in the Data Discovery And Modeling documentation.
  • Optionally, remove any tables or objects you do not want managed with the Oracle Test Data Management Pack Application Data Model you just loaded. For example, you might want to remove administration tables to optimize the time for the extract. This can be done within the Oracle Test Data Management Pack interface available within Oracle Enterprise Manager.
  • The Application Data Model can now be used against any production schema (as the source) at execution time.
  • Define the data subset you wish to extract as outlined in the Data Subsetting documentation. This can be a fixed subset, a percentage, or a complex SQL condition that determines the active subset to extract.
  • Optionally, identify the sensitive data you want to mask and associate the formatting to be used for handling the masked data. This will automatically mask the data in the extract as outlined in the Masking Sensitive Data documentation.

It is recommended that the Oracle Test Data Management Pack only be used on production environments to minimize licensing requirements.

Note: If there is a need to comply with local privacy and data protection laws, it is recommended that the Oracle Data Masking Pack also be used with the Oracle Test Data Management Pack.

Note: This technique can be used with any release of the products or any release of the Oracle Utilities Application Framework.


Anthony Shorten
Hi, I am Anthony Shorten, I am the Principal Product Manager for the Oracle Utilities Application Framework. I have been working for over 20+ years in the IT Business and am the author of many a technical whitepaper, manual and training material. I am one of the product managers working on strategy and designs for the next generation of the technology used for the Utilities and Tax markets. This blog is provided to announce new features, document tips and techniques and also outline features of the Oracle Utilities Application Framework based products. These products include Oracle Utilities Customer Care and Billing, Oracle Utilities Meter Data Management, Oracle Utilities Mobile Workforce Management and Oracle Public Service Revenue Management. I am the product manager for the Management Pack for these products.

