Tuesday May 20, 2014

Audit On Inquiry Example - Zones

One of the features of Oracle Utilities Application Framework V4.x is the ability to audit inquiries from zones and pages. This allows you to track information that is read rather than what is updated (which is the prime focus of the internal audit facility).

This example shown below is a sample only. It just illustrates the process. The Script is invoked upon broadcast (the sample does not include the global context but that can be added as normal).

To use this facility, here is the basic design pattern (in the order you would perform the steps):

  • Decide where you want to store the inquiry data first. You cannot store the inquiry data in the same audit tables/objects where updates or deletes are recorded, as the Audit Object Maintenance Object is read only (it is only used internally). You have three options here:
    • If the Maintenance Object has a child log table then this is ideal for recording when that object is read or viewed by an end user. The advantage of this option is that there is more than likely a user interface for viewing those records.
    • If the Maintenance Object does not have a child log table then you can use the generic Business Event Log object (F1-BUSEVTLOG). This can be used to store such audit information. You may want to create a UI to view that information in a format you want to expose, as well as adding records to this table. If you use this option, remember to set up a message to hold the audit message you want to display on the screen; this message is referenced in the Business Object definition. This approach is used in the sample.
    • You can create a custom Maintenance Object to store this information. This is the least desirable as you need to build Java objects to maintain the Maintenance Object but it is still possible. For the rest of this article I will ignore this alternative.
  • Create a Business Object, using the Business Object Maintenance Schema Editor, containing the data you want to store in the audit record. You can structure the information as you see fit, including adding flattened fields for the collections if you wish.

For example, for Business Event Log I created a BO called CM-BusinessAuditLog with the schema below:

    <logId mapField="BUS_EVT_LOG_ID"/>
    <logDateTime mapField="LOG_DTTM" default="%CurrentDateTime"/>
    <user mapField="USER_ID" default="%CurrentUser"/>
    <maintenanceObject mapField="MAINT_OBJ_CD" default="F1-BUSEVTLOG"/>
    <businessObject mapField="BUS_OBJ_CD" default="CM-BusinessAuditLog"/>
    <primaryKeyValue1 mapField="PK_VALUE1" default="001"/>
    <messageCategory mapField="MESSAGE_CAT_NBR" default="90000"/>
    <messageNumber mapField="MESSAGE_NBR" default="1000"/>
    <version mapField="VERSION" suppress="true"/>
    <parmUser mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="1"/>
        </row>
    </parmUser>
    <parmPortal mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="2"/>
        </row>
    </parmPortal>
    <parmZone mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="3"/>
        </row>
    </parmZone>
    <parmF1 mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="4"/>
        </row>
    </parmF1>
    <parmH1 mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="5"/>
        </row>
    </parmH1>
    <parmXML1 mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="6"/>
        </row>
    </parmXML1>
    <parmGC1 mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="7"/>
        </row>
    </parmGC1>
    <parmF1Label mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="8"/>
        </row>
    </parmF1Label>
    <parmH1Label mapField="MSG_PARM_VAL">
        <row mapChild="F1_BUS_EVT_LOG_MSG_PARM">
            <PARM_SEQ is="9"/>
        </row>
    </parmH1Label>
  • Note: I set up a basic message (message category 90000 and message number 1000) to hold the desired message:

User %1 has read value %6 on Portal %2 within Zone %3
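As a rough illustration, outside the product, the %n placeholders in a message are filled from the message parameter collection in PARM_SEQ order. The Python sketch below is not product code and the sample parameter values are invented; it just shows how the parameters mapped in the schema above would render:

```python
# Hypothetical illustration of how %n placeholders in an OUAF message
# are filled from the message parameter collection (by PARM_SEQ).
def render_audit_message(template: str, parms: dict) -> str:
    """Replace %1..%9 with the parameter values keyed by sequence number."""
    # Substitute higher sequence numbers first so %1 cannot clobber a
    # longer placeholder (harmless with nine parameters, but safe).
    for seq in sorted(parms, reverse=True):
        template = template.replace(f"%{seq}", str(parms[seq]))
    return template

# Invented sample values matching the PARM_SEQ mapping in the schema:
# 1 = user, 2 = portal, 3 = zone, 6 = first XML/PK value.
parms = {1: "OPERATOR", 2: "accountPortal", 3: "acctZone", 6: "1234567890"}
print(render_audit_message(
    "User %1 has read value %6 on Portal %2 within Zone %3", parms))
# → User OPERATOR has read value 1234567890 on Portal accountPortal within Zone acctZone
```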

  • Create a Service Script (say CM-AuditZone) to populate the fields on the Business Object according to your site standards. Remember to add the Business Object as a Data Area. For example:
     move "parm/userId" to "CM-BusinessAuditLog/parmUser";
     move "parm/portalName" to "CM-BusinessAuditLog/parmPortal";
     move "parm/zoneCd" to "CM-BusinessAuditLog/parmZone";
     move "parm/pk1" to "CM-BusinessAuditLog/parmXML1";
     invokeBO 'CM-BusinessAuditLog' using "CM-BusinessAuditLog" for add;

  • To reduce the performance impact of creating audit records (and to allow audit records to be added even when the prime object is accessed read only), it is recommended to create another Service Script (say CM-ZoneAuditing) and use F1-ExecuteScriptInNewSession to execute it in a new session. Remember to add the script as a Data Area. For example:

    move "parm/input" to "CM-AuditZone";
    move 'CM-AuditZone' to "F1-ExecuteScriptInNewSession/scriptName";
    move "CM-AuditZone" to "F1-ExecuteScriptInNewSession/scriptData";
    
    invokeBS 'F1-ExecuteScriptInNewSession' using "F1-ExecuteScriptInNewSession";
    • Add the schema to the script to accept the input from the Zone parameters as per the Help entry for the Audit Service Script. For example:
        <input type="group"> 
            <userId/>  
            <zoneCd/>  
            <portalName/>  
            <mo/>  
            <pk1/>  
            <pk2/>  
            <pk3/>  
            <pk4/>  
            <pk5/>  
        </input> 
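The intent of executing the audit script in a new session is to decouple the audit write from the user's inquiry. A rough analogy in Python (not product code; all names here are invented) is handing the audit record to a background worker so the read path is not delayed:

```python
# Hypothetical sketch of the "audit in a separate session" pattern:
# the inquiry thread queues the audit data and returns immediately,
# while a worker thread performs the (potentially slower) write.
import queue
import threading

audit_queue: "queue.Queue[dict]" = queue.Queue()
audit_log = []  # stands in for the Business Event Log table

def audit_worker() -> None:
    while True:
        record = audit_queue.get()
        if record is None:          # sentinel to stop the worker
            break
        audit_log.append(record)    # the "add" of the audit Business Object
        audit_queue.task_done()

worker = threading.Thread(target=audit_worker, daemon=True)
worker.start()

def read_with_audit(user: str, portal: str, zone: str, pk1: str) -> str:
    # Queue the audit record; do not wait for it to be written.
    audit_queue.put({"user": user, "portal": portal, "zone": zone, "pk1": pk1})
    return f"data for {pk1}"        # the actual inquiry result

print(read_with_audit("OPERATOR", "accountPortal", "acctZone", "1234567890"))
audit_queue.join()                  # drain before inspecting (demo only)
print(audit_log[0]["user"])
```

In the product the decoupling is done for you by F1-ExecuteScriptInNewSession; the sketch only illustrates why the extra script is worth the indirection.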
    
  • Attach the Audit Service Script (CM-ZoneAuditing) to the zone via the appropriate zone parameter. For example:

ss='CM-ZoneAuditing' input=[zone=zone portal=portal user=userId pk1=F1Label pk2=F1 pk3=F3Label pk4=F3]

This example is just a sample with some basic processing; it can be extended to capture additional information. It is recommended to use the Log child table on the object if one is available.

Monday May 12, 2014

SSO Integration Patterns

Single Sign-On support is one of the common questions I get asked by customers, partners and sales people.

Single Sign-On is basically an implementation mechanism or technology that allows users of multiple browser applications to specify credentials once (typically at login) and have them reused for subsequent applications in that session. This avoids logging on more than once. It also aids cross-product navigation: a user who logs onto one application and transfers to another avoids logging into that other product.

Single Sign-On is not a product requirement; it is an infrastructure requirement. Therefore there are infrastructure solutions available.

Typically there are two main styles of Single Sign On with different approaches for implementation.

The first style is best described as "Desktop" Single Sign-On. This is where you log on to your client machine (usually a Windows based PC) and the credentials you used to log on to that machine are reused for ANY product used after authentication. Typically this is implemented using the Kerberos protocol and the Simple and Protected GSS-API Negotiation (SPNEGO) protocol. This is restricted to operating systems (typically Windows) where you perform the following:

  • Set up the client machine browsers to accept and pass the credentials to the server. This sets the browser to read the Kerberos credentials and pass them to the server.
  • Set up the Microsoft Active Directory Services Network Domain Controller to accept Kerberos and pass it on to the subsequent applications.
  • Create a keytab file for Oracle WebLogic to use.
  • Configure the Oracle WebLogic Identity Assertion Provider to specify that the keytab is to be used and that Kerberos is to be used for the identity.
  • Configure Oracle WebLogic to startup using the provider and Kerberos.
  • Set the login preferences within OUAF to CLIENT-CERT to indicate the login is passed from somewhere else. This turns off our login screen.

As you can see the majority of the work is in Oracle WebLogic and is documented in Configuring Single Sign-On with Microsoft Clients.

The second style is best described as "Browser" Single Sign-On. This typically means you log in to the machine and then open the browser to log on. From that point, as long as the browser is open, any subsequent application will reuse the credentials specified for the browser session. This is the style implemented by SSO products such as Oracle Access Manager, Oracle Enterprise SSO and other SSO products (including third party ones). Typically, implementing this involves the following:

  • Setting Up Oracle Access Manager or the SSO product to your requirements. Oracle Access Manager supports lots of variations for SSO including Single Network Domain SSO, Multiple Network Domains, Application SSO, etc. This is all outlined in Introduction to Single Sign-On with Access Manager.
  • Setting up Oracle WebLogic with Oracle Access Manager (this allows Oracle WebLogic to get the credentials from Oracle Access Manager). This is outlined in Configuring Single Sign-On with Oracle Access Manager 11g.
  • Set the login preferences within OUAF to CLIENT-CERT to indicate the login is passed from somewhere else. This turns off our login screen.

Again, as you can see the majority of the work is in Oracle WebLogic and Oracle Access Manager.

Information about implementing Single Sign-On with our products (both styles) is contained in:

  • Single Sign On Integration for Oracle Utilities Application Framework based products (Doc Id: 799912.1) available from My Oracle Support.
  • Oracle Identity Management Suite Integration with Oracle Utilities Application Framework based products (Doc Id: 1375600.1) available from My Oracle Support.

While the first style is typically lower cost, it is restricted to specific platforms that support Kerberos and SPNEGO. It is also less flexible: the credentials are passed from the client all the way to the server, so they must match. Oracle Access Manager, on the other hand, is far more flexible, supporting a wide range of architectures as well as including access control, password control and user tracking features within WebGate. These allow additional capabilities to be implemented:

  • Access Control - This allows for additional security rules to be implemented. For example, turning off part of a product during time periods. I have heard of customers using Oracle Access Manager to stop online payments from being accessible after business hours from a call center, due to customer specific payment processes being implemented. This augments the inbuilt security model available from Oracle Utilities Application Framework.
  • User Tracking - Oracle Utilities Application Framework is stateless, therefore you can only see active users when they are actively running transactions, not when they are idle. WebGate has information about idle users as well as active users allowing for enhanced user tracking.

Whatever style you choose to adopt, we have a flexible set of solutions to implement SSO. The only common element, and the only step within Oracle Utilities Application Framework, is to change the J2EE login preference from the default FORM based to CLIENT-CERT.

Friday May 09, 2014

Archiving/ILM Part 2 - ILM Date And ILM Archive Flag

As part of the new data management capabilities of Oracle Utilities Application Framework V4.2.0.2.0, two new columns have been added to the product objects managed by this capability.

  • ILM Date (ILM_DT) - This field is populated with the system date at record creation time. Together with the retention period, it sets the starting date from which the ILM solution will evaluate the eligibility of the record for archiving via the ILM Crawler. The date is set by the Maintenance Object at object creation time but, like ANY other column on the object, can be altered by algorithms, batch processes etc. Manipulating the date can delay (or accelerate) ILM activities on a particular object. For example, an appropriate algorithm (reflecting your business practices) can set this date to control when a particular object is considered for ILM.
  • ILM Archive Flag (ILM_ARCH_FLG) - This is a flag, set to N by default, that determines whether the record is eligible for archiving (removal) or any other ILM activities. This column is maintained by the ILM Crawler assigned to the object, which will assess the rules for eligibility after the ILM date (plus retention) has passed. If the record is deemed eligible for archiving then the value will be set to Y to indicate that other ILM activities can be safely performed on this object.

The ILM Crawler uses these columns and the associated ILM Eligibility algorithm on the Maintenance Object to determine the eligibility of objects. These values are managed for you automatically. If the basic setup is not sufficient for your data retention needs, the ILM Eligibility algorithm can be altered to suit your needs, or other algorithms can be extended to help you set these values.
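A rough sketch, in Python rather than the product's algorithm framework, of the decision the crawler and eligibility algorithm make together (the field names follow the columns above; the retention value and business-rule outcome are invented for illustration):

```python
# Hypothetical sketch of the ILM Crawler decision for one record.
# ILM_DT is set at creation; the record is only considered once
# ILM_DT plus the retention period has passed, and the eligibility
# algorithm's business rules then decide whether ILM_ARCH_FLG
# flips from 'N' to 'Y'.
from datetime import date, timedelta

def assess_record(ilm_dt: date, retention_days: int, today: date,
                  business_rules_pass: bool) -> str:
    """Return the ILM_ARCH_FLG value ('Y' or 'N') for a record."""
    if today < ilm_dt + timedelta(days=retention_days):
        return "N"                      # retention period not yet elapsed
    return "Y" if business_rules_pass else "N"

# A record created 400 days ago, 365-day retention, rules pass:
today = date(2014, 5, 9)
print(assess_record(today - timedelta(days=400), 365, today, True))   # → Y
print(assess_record(today - timedelta(days=100), 365, today, True))   # → N
```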

Monday May 05, 2014

Archiving/ILM Introduction - Part 1

As part of Oracle Utilities Application Framework 4.2.0.2.0 and Oracle Utilities Customer Care And Billing 2.4.0.2.0, a new Archiving/Data Management engine has been released, based around the Information Lifecycle Management (ILM) capabilities within the Oracle Database (with options).

The first part of the solution is actually built into the Oracle Utilities Application Framework to define the business definition of active transaction data. Active transaction data is transactional data that is regularly added, changed or deleted as part of a business process. Transaction data that is read regularly is not necessarily active from an ILM point of view. Data that is only read can be compressed, for example, with little impact on the performance of access to that data.

Note: The ILM solution only applies to objects that are transactional and for which ILM has been enabled. Refer to the DBA Guide shipped with the product for a list of objects that are shipped with each product.

The business definition of active transaction data is set using a master configuration record for ILM. For example:

Master Configuration

It is possible to define the data retention period, in days, for individual objects that are covered by the data management capability. These settings are set on the Maintenance Object Options shipped with the ILM solution. For example:

Maintenance Object Options

Essentially the configuration allows for the following:

  • A global retention period can be defined, in days. Objects that are covered by ILM can inherit this setting if you do not want to manage at the Maintenance Object level.
  • Each Maintenance Object that is enabled for ILM has a number of Maintenance Object options that define the following:
    • ILM Retention Period In Days - Sets the retention period for the object at creation time.
    • ILM Crawler Batch Control - The batch control for the crawler which will traverse the objects and set the ILM dates and ILM flags.
    • ILM Eligibility Algorithm - The algorithm containing the business rules to assess the eligibility of individual objects for data management. This algorithm can be altered to implement additional business rules or additional criteria to implement object specific rules. For example, it is possible to implement specific rules for specific object definitions (i.e. say, have different rules for residential customers to industrial/commercial customers).
  • An ILM crawler has been provided for each object to set ILM dates and assess eligibility for objects. This batch process can be run whenever the business rules need to be implemented for data management and also used for when the business rules need to be changed, due to business changes.
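The retention resolution described above can be sketched as a simple fallback (in Python; the option names and values here are illustrative, not the product's internal representation): a Maintenance Object level "ILM Retention Period In Days" option wins when set, otherwise the global retention from the ILM master configuration applies.

```python
# Hypothetical sketch of retention-period resolution per Maintenance Object.
GLOBAL_RETENTION_DAYS = 730   # example global value from the master configuration

mo_options = {
    # maintenance object -> retention override in days (None = inherit global)
    "TO-DO-ENTRY": 365,
    "FT": None,
}

def retention_days(maintenance_object: str) -> int:
    """Resolve the retention period: MO-level option, else the global setting."""
    override = mo_options.get(maintenance_object)
    return override if override is not None else GLOBAL_RETENTION_DAYS

print(retention_days("TO-DO-ENTRY"))  # → 365 (MO-level option)
print(retention_days("FT"))           # → 730 (inherited global)
```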

At the end of this stage, a number of ILM specific fields on those objects have been set ready for the technical implementation of the ILM facilities in the database (which will be a subject of a future post).

The date that is set by this configuration does not mean that this data will disappear, it just defines the line where the business hands the data over to the technical database setup.

As you can see from this post, the data management capability from the business perspective is simple and flexible. You can take the default eligibility rules and setup as provided, or customize this first stage to implement more complex rules that match your data retention policies.

Friday May 02, 2014

Information Lifecycle Management or Archiving V2

Oracle Utilities Application Framework V4.2.0.2.0 has been released with Oracle Utilities Customer Care and Billing V2.4.0.2.0. In this major release, a new data management facility has been released to replace the original Archiving facility that was provided with Oracle Utilities Customer Care and Billing V2.1/V2.2/V2.3.1.

This new facility has a number of major advantages for effective data management:

  • A new set of fields have been added to objects specifically to allow implementations to control the data lifecycle for those objects. This includes dates and a flag to determine what the lifecycle for objects is as well as the eligibility of individual objects for data management.
  • The facility allows the customer to define how long key objects are active for across an implementation. This allows a new date within these objects to be specifically set for data management activities independent of when they are actually active. This allows flexible data retention policies to be implemented.
  • A crawler batch job per object, implements business and integrity rules to trigger data management activities. This allows data retention policies to be adjusted for changes to business needs.
  • The data management capability uses Oracle's Information Lifecycle Management capabilities to implement the storage and data retention policies based upon the data management dates and flags. This allows IT personnel to define the physical data retention policies to perform the following types of activities:
    • Allows use of compression, including base compression in the Oracle Database, Advanced Compression, or Hybrid Columnar Compression on Oracle Exadata. Externalized compression in SAN hardware is also supported.
    • Allows the ability to, optionally, use lower cost storage to manage groups of data using Oracle Partitioning. This allows saving of costs of storing less active data in your implementation.
    • Allows specification and simulation of data management policies using ILM Assistant including estimating expected storage cost savings.
    • Allows data management manually or automatically using Automatic Storage Management (ASM), Automatic Data Optimization (ADO) and/or Heat Maps. The latter are available in Oracle 12c to provide additional facilities.
    • Allows use of transportable tablespaces via Oracle Partitioning to quickly remove data that has been identified as archived.
  • The definition of the lifecycle can be simple or as complex as your individual data retention policies dictate, with the business and IT together defining the business and technical implementations of the rules within the product and the Information Lifecycle Management components within the database.
  • The Oracle Utilities Application Framework has been altered to recognize data management policies. This means that once a policy has been implemented, access to that data will conform to that policy. For example, data that is removed via transportable tablespaces will be recognized as archived and the online/batch process will take this into account.

Data Management documentation is provided with the products to allow implementations to take advantage of this new capability. This allows data management retention policies to be flexible and use the data management capabilities within the database to efficiently manage the lifecycle of critical data in Oracle Utilities Applications.

Over the next few weeks a number of blog entries will be published to walk through the various aspects of the solution.

About

Anthony Shorten
Hi, I am Anthony Shorten, the Principal Product Manager for the Oracle Utilities Application Framework. I have been working in the IT business for over 20 years and am the author of many technical whitepapers, manuals and training materials. I am one of the product managers working on strategy and designs for the next generation of the technology used for the Utilities and Tax markets. This blog is provided to announce new features, document tips and techniques and also outline features of the Oracle Utilities Application Framework based products. These products include Oracle Utilities Customer Care and Billing, Oracle Utilities Meter Data Management, Oracle Utilities Mobile Workforce Management and Oracle Enterprise Taxation and Policy Management. I am also the product manager for the Management Pack for these products.
