Friday May 27, 2016

Embedded mode limitations for Production systems

In most implementations of Oracle Utilities products, the installer creates an embedded mode installation. It is called embedded because the domain configuration is embedded in the application itself. This is ideal for demonstration and development environments, where the default setup is sufficient for those types of activities.

Over time, though, customers and partners will want to use more and more of the Oracle WebLogic domain facilities, including advanced setups such as multiple servers, clusters and advanced security configurations. Here are a few important things to remember about embedded mode:

  • The embedded mode domain setup is fixed: a single server houses both the product and the administration server, with the internal basic security setup. In non-production this is reasonable, as the requirements for the environment are simple.
  • The domain file (config.xml) is generated by the product, using a template, assuming it is embedded only.
  • When implementations need additional requirements within the domain there are three alternatives:
    • Make the changes in the domain from the administration console and then convert the new config.xml generated by the console into a custom template. This is necessary because whenever Oracle delivers ANY patch or upgrade (or whenever you make configuration changes) you need to run initialSetup[.sh] to apply the patch, upgrade or configuration to the product. That utility resets the file back to the factory-provided template unless you are using a custom template. In short, if you choose this option and do not implement a custom template, you will lose your changes every time.
    • In later versions of OUAF we introduced user exits. These allow implementations to add to the configuration using XML snippets, and we have sprinkled user exits throughout the configuration files to allow extensions. It does require you to understand the configuration file being manipulated. Using this method, you make changes to the domain using the configuration files, examine the changes to the domain file, decide which user exit is available to reflect that change, and add the relevant XML snippet. Again, you must understand the configuration file to make sure you do not corrupt the domain.
    • The easiest option is to migrate to native mode. This removes the embedded nature of the domain and houses it within Oracle WebLogic. This is explained in the Native Installation whitepaper (Doc Id: 1544969.1) available from My Oracle Support.

Native installations allow you to use the full facilities within Oracle WebLogic without the restrictions of embedded mode. The advantages of native installations are the following:

  • The domain can be setup according to your company standards.
  • You can implement clusters and multiple servers, including dynamic clustering.
  • You can use the security features of Oracle WebLogic to implement complex security setups including SSO solutions.
  • You can lay out the architecture according to your volumes to manage within your SLAs.
  • You can implement JDBC connection pooling, Work Managers, advanced diagnostics etc.

Oracle recommends that native installations be used for environments where you need to take advantage of the domain facilities. Embedded mode should only be used within the restrictions it poses.

Monday May 09, 2016

Additional CCB 2.5 benchmark information

Recently I published a link to a summary report for the recent Oracle Utilities Customer Care and Billing 2.5 benchmark. Due to popular demand, we have released additional information about the benchmark, including some configuration advice, in a new whitepaper, Oracle Utilities Customer Care and Billing V2.5 and 2.4 Comparison Benchmark Whitepaper (Doc Id: 2135359.1), now available from My Oracle Support.

This whitepaper was provided by our performance team and provides additional technical information about the benchmark setup as well as the results.

Monday May 02, 2016

DISTRIBUTED mode deprecated

Based upon feedback from partners and customers, the DISTRIBUTED mode used in the batch architecture has been deprecated in Oracle Utilities Application Framework V4.3.x and above. DISTRIBUTED mode was originally introduced to the batch cluster architecture back in Oracle Utilities Application Framework V2.x and was popular, but suffered from a number of restrictions. Given that the flexibility of the batch architecture was expanded in newer releases, it was decided to deprecate DISTRIBUTED mode to encourage more effective use of the architecture.

It is recommended that customers using this mode migrate to CLUSTERED mode using one of the following techniques:

  • For customers on non-production environments, it is recommended to use CLUSTERED mode with the single server (ss) template used by the Batch Edit facility. This is a simple cluster that uses CLUSTERED mode without the advanced configurations of a clustered environment. It is restricted to a single host server, so it is not typically recommended for production or for clustered environments that use more than one host server.
  • For customers on production environments, it is recommended to use CLUSTERED mode with the unicast (wka) template used by the Batch Edit facility. This allows flexible configuration without the use of multi-cast, which can be an issue on some implementations using CLUSTERED mode. The advantage of Batch Edit is that it has a simple interface that allows you to define this configuration without too much fuss.

The advantage of Batch Edit when building your new batch configurations is that it is simple to use and it generates an optimized set of configuration files that can be used directly by the batch architecture. When executing jobs, remove the DISTRIBUTED settings from the command lines or configuration files to use the new architecture.

Customers should read the Batch Best Practices (Doc Id: 836362.1) and the Server Administration Guide shipped with your product for advice on Batch Edit as well as the templates mentioned in this article.

Friday Apr 29, 2016

Migrating Oracle Utilities products from On Premise to Oracle Public Cloud

A while back Oracle Utilities announced that the latest releases of the Oracle Utilities Application Framework applications were supported on Platform As A Service (PaaS) on Oracle Public Cloud. As part of that support a new whitepaper has been released outlining the process of migrating an on-premise installation of the product to the relevant Platform As A Service offering on Oracle Public Cloud.

The whitepaper covers the following from a technical point of view:

  • The Oracle Cloud services required to house the products, including Oracle Java Cloud Service and Oracle Database As A Service with the associated related services.
  • Setup instructions on how to configure the services in preparation to house the product.
  • Instructions on how to prepare the software for transfer.
  • Instructions on how to transfer the product schema to an Oracle Database As A Service instance using various techniques (see the sketch after this list).
  • Instructions on how to transfer the software and make configuration changes to realign the product installation for the cloud. The configuration must follow the instructions in the Native Installation Oracle Utilities Application Framework (Doc Id: 1544969.1) available from My Oracle Support which has also been updated to reflect the new process.
  • Basic instructions on using the native cloud facilities to manage your new PaaS instances. More information is available in the cloud documentation.
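As an illustration of one such schema transfer technique, the sketch below uses the Oracle Data Pump PL/SQL API to export the product schema, producing a dump file that can then be copied to the cloud and imported into the Database As A Service instance with the corresponding IMPORT operation. This is a minimal sketch only, assuming the product schema is named CISADM and that the DATA_PUMP_DIR directory object exists; the whitepaper describes the supported techniques in detail.

  -- Minimal Data Pump export sketch. Assumptions: product schema CISADM,
  -- directory object DATA_PUMP_DIR. Adjust both for your installation.
  DECLARE
    l_handle NUMBER;
    l_state  VARCHAR2(30);
  BEGIN
    -- Open a schema-mode export job
    l_handle := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
    -- Nominate the dump file and log file in the directory object
    DBMS_DATAPUMP.ADD_FILE(l_handle, 'cisadm_export.dmp', 'DATA_PUMP_DIR');
    DBMS_DATAPUMP.ADD_FILE(l_handle, 'cisadm_export.log', 'DATA_PUMP_DIR',
      filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    -- Restrict the export to the product schema
    DBMS_DATAPUMP.METADATA_FILTER(l_handle, 'SCHEMA_EXPR', 'IN (''CISADM'')');
    -- Run the job and wait for it to finish
    DBMS_DATAPUMP.START_JOB(l_handle);
    DBMS_DATAPUMP.WAIT_FOR_JOB(l_handle, l_state);
    DBMS_OUTPUT.PUT_LINE('Export completed with status: ' || l_state);
  END;
  /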

The whitepaper applies to the latest releases of the Oracle Utilities Application Framework based products only. Customers and partners wanting to establish new environments (with no previous installation) can use the same process with the addition of actually running the installation on the cloud instance.

Customers and partners considering using Oracle Infrastructure As A Service can use the same process with the addition of installing the prerequisites.

The Migrating From On Premise To Oracle Platform As A Service (Doc Id: 2132081.1) whitepaper is available from My Oracle Support. This will be the first in a series of cloud based whitepapers.

Saturday Apr 23, 2016

Oracle Utilities Customer Care And Billing 2.5 Benchmark available

Oracle Utilities Customer Care and Billing v2.5.x marked a major change in application technology as it is an all-Java-based architecture. In past releases, both Java and COBOL were supported. Over the last few releases, COBOL support has been progressively replaced to optimize the product.

In recently conducted performance benchmark tests, it was demonstrated that the performance of Oracle Utilities Customer Care and Billing v2.5.x, an all-Java-based release, is at least 15 percent better than that of the already high performing Oracle Utilities Customer Care and Billing v2.4.0.2, which included the COBOL-based architecture for key objects, in all use cases tested.

The performance tests simulated a utility with 10 million customers with both versions running the same workloads. In the key use cases tested, Oracle Utilities Customer Care and Billing v2.5.x performed at least 15% faster than the previous release.

Additionally, Oracle Utilities Customer Care and Billing v2.5.x processed 500,000 bills (representing the nightly batch billing for a utility serving 10 million customer accounts being divided into twenty groups, so that 5% of all customers are billed each night on each of the 20 working days during the month) within just 45 minutes.

The improved Oracle Utilities Customer Care and Billing performance ultimately reduces the utility staff overtime hours required to oversee batch billing, allows utilities to consolidate tasks on fewer servers (reducing the data center size and cost required), and enables utilities to confidently explore new business processes and revenue sources, such as running billing services for smaller utilities.

A whitepaper is available summarizing the results and details of the architecture used. 

Tuesday Apr 12, 2016

Using Database Resource Plans for effective resource management

In a past article we announced the support for Database Resource Plans. This facility is a technique that can be used by implementations to set limits and other resource constraints on processing to help optimize resource usage for implementations of Oracle Utilities products.

I have been asked a couple of follow-up questions about use cases that can exploit this facility. Here are a few things that might encourage its use:

  • Database Resource Plans can help multiple channels share resources, helping to avoid database contention between channels. For example, typically most utilities will not run batch processes during online hours, as batch processes may cause contention with online users, causing both channels to run slower. Using Database Resource Plans you can tell the database to share the resources more effectively and constrain batch to have minimal impact on the online users. Of course, batch will borrow resources used by online, but by using resource plans you can constrain it as much as practical (see the plan sketch after this list).
  • Database Resource Plans are very flexible. You can set plans for time periods to reflect different resource profiles by channel by time of day. Using the batch/online use case in the last point, you can set batch to use fewer resources during the day and more at night. Conversely, you can set online to use more resources during the day and fewer at night. This balances resources with their optimal use.
  • Database Resource Plans can be set globally or at low levels. In past releases of Oracle Utilities Application Framework, a set of database session visibility variables were set so that the database connection can be identified for monitoring. These same variables can now be used with resource plans. They include the program/batch job, threadpool/thread, client authorization user, client user tag etc. This means, if you desire, you can define fine-grained rules based upon session characteristics in your database resource plans using Consumer Groups.
  • Database Resource Plans feature monitoring at the plan, directive and consumer group levels to assess the effectiveness of those resource plans. This is available from database monitoring products including Oracle Enterprise Manager.
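To make the batch/online use case concrete, the sketch below builds a simple plan that favors online work over batch during the day. It is an illustration only: the plan, group and directive names, the percentages and the module mapping value are assumptions for this example, not recommended values. The session visibility variables set by the product (such as the module) can be used in the mapping rules.

  -- Illustrative only: names, percentages and the module value 'BATCH'
  -- are assumptions for this sketch, not recommendations.
  BEGIN
    DBMS_RESOURCE_MANAGER.CREATE_PENDING_AREA();
    DBMS_RESOURCE_MANAGER.CREATE_CONSUMER_GROUP('ONLINE_GRP', 'Online users');
    DBMS_RESOURCE_MANAGER.CREATE_CONSUMER_GROUP('BATCH_GRP',  'Batch processing');
    DBMS_RESOURCE_MANAGER.CREATE_PLAN('DAYTIME_PLAN', 'Favor online during the day');
    -- Online gets 75% of CPU at the first level, batch 25%
    DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
      plan => 'DAYTIME_PLAN', group_or_subplan => 'ONLINE_GRP',
      comment => 'Online users first', mgmt_p1 => 75);
    DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
      plan => 'DAYTIME_PLAN', group_or_subplan => 'BATCH_GRP',
      comment => 'Constrain batch during the day', mgmt_p1 => 25);
    -- Every plan must include a directive for OTHER_GROUPS
    DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
      plan => 'DAYTIME_PLAN', group_or_subplan => 'OTHER_GROUPS',
      comment => 'Everything else', mgmt_p2 => 100);
    -- Map sessions to the batch group using a session attribute; 'BATCH' is
    -- a hypothetical module value for this sketch. Note that sessions also
    -- need switch privileges via DBMS_RESOURCE_MANAGER_PRIVS.
    DBMS_RESOURCE_MANAGER.SET_CONSUMER_GROUP_MAPPING(
      DBMS_RESOURCE_MANAGER.MODULE_NAME, 'BATCH', 'BATCH_GRP');
    DBMS_RESOURCE_MANAGER.VALIDATE_PENDING_AREA();
    DBMS_RESOURCE_MANAGER.SUBMIT_PENDING_AREA();
  END;
  /

The plan is then activated with ALTER SYSTEM SET RESOURCE_MANAGER_PLAN = 'DAYTIME_PLAN', or via scheduler windows to switch plans by time of day, matching the time-period flexibility described above.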

Database Resource Plans are another feature you can use from the database to effectively manage your resource usage to ensure each channel stays within its allocated resource profile. It is all about sharing the available resources and minimizing contention whilst harnessing the processing power available more effectively.

Monday Apr 04, 2016

OEM and Passwords

I wanted to outline an interesting experience I had recently around security. Oracle, like a lot of companies, requires its employees to regularly change their passwords, as this is considered good security practice. There are strict rules around the password formats and their history. Luckily Oracle uses its own Identity Management solutions, so the experience is simple and quick.

Recently my passwords were set to expire. I have a process I use to ensure the passwords are changed across all the technologies I use. I usually do that one morning a couple of days before they are due to expire. This time I did it at the end of the day, as it was a particularly busy day. It was a Friday and all was well.

Except I forgot one important change: my credentials in my demonstration instance of Oracle Enterprise Manager. I have a demonstration environment where I do research and development, record training and run demonstrations. After that weekend I logged in to my demonstration environment to see alerts that it could not connect using some credentials.

I have three credentials to worry about in Oracle Enterprise Manager:

  • There is a credential for Oracle Enterprise Manager to connect to My Oracle Support. This is used for checking patches, looking for critical advice, as well as registering Service Requests directly from Oracle Enterprise Manager in online mode. Typically, you would nominate an account to link to My Oracle Support (along with a Service Identifier for your site).
  • I have two named credentials I use regularly for host interaction such as installations and running regular jobs on the machines. These are administration accounts used for the product at the operating system level. The way the machine is set up, I use two: one is the administration account and the other is a privileged account used for low-level administration. In some cases sites will only need one per user.

I was able to correct the passwords and all my environments reported back correctly. Credential management is one of the strengths of Oracle Enterprise Manager. Next time I will add the OEM credentials to my checklist.

Wednesday Mar 30, 2016

Oracle Coherence Use in the product

One of the most common questions I get from people is about the use of Oracle Coherence in our product.

We bundle a subset of the Oracle Coherence libraries for use in our Batch Architecture. The Coherence libraries permit our threadpools to be clustered and to communicate (via Coherence) with each other in an efficient manner. This includes our submitters (the threads that are submitted) as well as the threadpools.

We introduced Oracle Coherence to manage our batch architecture in previous releases, and there are a few things that need to be clarified about the support:

  • We bundle a set of Coherence libraries that are used by the product. The libraries are only a subset of the full Coherence stack. They represent a Restricted Use License (RUL) for the provided use (i.e. managing the batch cluster). The libraries are listed in the ouaf_jar_versions.txt file in the etc directory of the product installation. You do not need to purchase Oracle WebLogic with Coherence to use the libraries for their licensed purpose.
  • As part of the Restricted Use License you cannot use the libraries in customizations so you cannot extend past the provided use. If you want to extend the use of Coherence in your custom solutions then you will need to purchase a FULL additional license for Oracle Coherence.
  • As the libraries are a subset of what is available in Oracle Coherence, it is NOT recommended to use the Oracle Coherence pack for Oracle Enterprise Manager with our products. This is because the pack assumes you are using the full stack and can return erroneous information when attempting to use it with the batch cluster.

Essentially we bundle a subset of Coherence libraries that we use internally for our clustered batch architecture. These are locked down for that clustering purpose only. You do not need to extend the license to use them for this purpose. If you want to use them beyond this purpose, you can purchase a full license.

Thursday Mar 24, 2016

Service Based Testing

The Oracle Functional/Load Testing Advanced Pack for Oracle Utilities is a service based automated testing solution based around the popular Oracle Application Testing Suite. The main focus of this product is to allow implementations of Oracle Utilities products to adopt automated testing quickly, using prebuilt service based components to verify the product against your business processes and with your data. This is a fundamental principle of the solution.

Traditionally automated testing uses the user interface as the conduit to perform functional/load testing. There are a number of issues with that approach:

  • Traditionally you have to record a session to build testing assets. The data along with the user interaction are recorded and converted into a programmable script (using some scripting language). The data is typically associated with the test, and to reuse the same process with different data you would have to either re-record the test or manually edit the script, which requires some programming experience, to put new data into it. This can involve quite a bit of test asset building and management. By the way, you can use Oracle Application Testing Suite in this mode as well if you did not have the Oracle Functional/Load Testing Advanced Pack for Oracle Utilities, but the unique advantage of the Oracle Application Testing Suite is that the user interface is componentized for reuse.
  • If you use the user interface as the basis of testing, then ANY change to the user interface that you perform (or the vendor performs) will invalidate the recorded script. One big example of this is that all the latest Oracle Utilities products are moving to a new user interface, to support a wide range of devices, which required the user interface to change. This change alone would invalidate user-interface-based scripting and require those assets to be rebuilt.

The Oracle Functional/Load Testing Advanced Pack for Oracle Utilities uses a service based approach which utilizes the service layer that the user interface passes data to (and from). The solution passes the same data as the screens would internally pass to the underlying services. There are a number of distinct advantages of this approach:

  • The service based approach is isolated from any user interface changes whether the change was introduced in a new version or as part of your implementation. The main focus is always functionality testing of the underlying business services.
  • The service based testing components are prebuilt against our base services. They are verified against the product, as the product QA teams use these components to verify the product in QA. If a prebuilt service component is not appropriate for your implementation, or you have custom functionality that is beyond the scope of the product, we supply a component builder, built in OpenScript, that reads our metadata and generates a service based definition which can be loaded into the provided library of components. We also ship a component verifier, built in OpenScript, that helps ensure that your generated components are still valid when you make administration or configuration data changes.
  • The service layer in the product is common across ALL channels (i.e. online, web services, batch and mobile). All the business logic and rules are stored and verified at that layer and applied regardless of the channel used. There are no business rules in the user interface in the base product. There are usability features that look like rules, to improve usability of the product, but they are NOT business rules.
  • The service layer encapsulates all the business rules and validations. This greatly simplifies testing as the testing tool just needs to interface to the layer to take advantage of those rules. Just like any channel when a business rule is broken then the product will respond with an appropriate message (the same message as the online user would get). The testing tool will recognize these error conditions.
  • This solution separates usability testing (which is typically done manually to assess the screen for usability) from verifying your functionality in the product against your business processes and your data. Assessment of screens for usability is best performed by a person in your organization.

Now, we also understand that some implementations may have introduced business rules into the user interface for various reasons. While this is not ideal, as you will be missing those business rules in non-user-interface-based interfaces, you can use the power of the Oracle Application Testing Suite to record a user-interface-based component. That component can be mixed with the service based components to incorporate into a flow.

The service based approach is different to the user interface based approach used in a lot of other tools but we feel it is the most efficient means of testing your product implementation and upgrade both quickly and easily.

Wednesday Mar 16, 2016

Enterprise Manager: Using Metrics Extensions (SQL)

One of the major features of Oracle Enterprise Manager (OEM) is the ability to create Metrics Extensions. These are metrics you want to track that may or may not be provided with the underlying products. I want to illustrate this point in a series of articles on using Oracle Enterprise Manager with Oracle Utilities products.

The first article is about how to use the basic metrics extension capability with a simple SQL statement. This is a little unusual, as it will be part of the database targets (not the Oracle Utilities targets), but I feel it introduces specific techniques that we will reuse a lot in subsequent articles, and it serves as a really good starting point.

A couple of things before we start:

  • The Metrics Extension part of OEM is basically a facility for you to add all sorts of custom metrics for OEM to track. You create the extension and then associate it with the targets to track.
  • The Metrics Extension component allows for incremental development. You specify and test the metric first in the user interface. You can then mark it as deployable, which will create a version. You then deploy the metric extension to be tracked on targets. The version tracking is useful as you can have different versions of the metric deployed to different targets in different stages of development. I will only touch on this here. More information is in the Metrics Extension documentation for the version of OEM you are using.
  • The screen dumps and example in this article are based upon a tracking query outlined in the Batch Troubleshooting Guide which flattens the Batch Run Tree and summarizes it. It is not a base view but a custom view that is used for illustrative purposes only. Refer to Performance Troubleshooting Guideline Series (Doc Id: 560382.1) from My Oracle Support.
  • The example shown is for Oracle Enterprise Manager 13c but can apply to other versions of Oracle Enterprise Manager.
  • The example will use SQL and in future articles we will explore other adapters.
  • The example is for illustrative purposes only.

To perform this task you need to be authorized to use the Metrics Extension facility and the targets you will associate with the metric. Check your installation's security setup to confirm that is the case.

To setup the Metrics Extension, the following process can be used:

  • Navigate to the Metrics Extension facility. This can be done from the link page or menu (Monitoring --> Metric Extensions). For example:

Metrics Extension Menu

  • From the Create menu, select Metric Extension. For example:

 Create Metric Extension

  • Specify the Metric Name, Target Type (Database Instance in this case), Display Name, Adapter (SQL in this case), Description and other attributes for the metric including default collection frequency. For example:

Metric Extension General Properties

  • You might notice the Select Advanced Properties option, which allows you to specify other attributes on the target to specialize the metric. This is new to OEM 13c and, in this case, will allow you to target multi-tenant databases (or not), for example.
  • Now, as this is an SQL based metric, you need to specify the SQL statement to execute to gather the data. In this example, we are using the custom view from the Performance Troubleshooting Guideline Series (Doc Id: 560382.1) from My Oracle Support. In my example I hardcoded the owner of the view; this is just an illustration, and you can avoid it by making sure the credentials have access to the view or by creating a synonym. Remember the database user must have SELECT access as a minimum. The example of the SQL is shown below, followed by an illustrative sketch:

SQL Example
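To give a flavor of what such a statement looks like, here is an illustrative sketch only. The view name and columns are hypothetical stand-ins for the custom view from the Performance Troubleshooting Guideline Series (Doc Id: 560382.1); each selected column becomes a column of the metric extension in the next step.

  -- Illustrative sketch: CM_BATCH_RUN_SUMMARY and its columns are
  -- hypothetical stand-ins for the custom view from Doc Id: 560382.1.
  SELECT batch_cd,          -- the batch control (job) identifier, used as a key
         max_elapsed_time,  -- longest elapsed time recorded for the job
         max_throughput     -- peak records processed per period
    FROM cm_batch_run_summary
   ORDER BY batch_cd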

  • For each column in the query you need to define it as part of the metric. You do not have to define all of them, but it is recommended for full reuse. For each column, define its attributes, including whether it is a data or key value. Key values are used for SLA tracking. You can also define more metadata to allow OEM to determine how to process it. The columns for our example are shown below:

Example Column definitions

  • Now we extend the metric by adding a few deltas. Deltas are virtual columns that compare the last value with the current value, which is great for checking changes in values at the metric level. In our sample I will add two deltas: one for the Maximum Elapsed Time, to see if the job elapsed time is getting worse, and one for Maximum Run Rate (Throughput), to track if the number of records processed per period is getting lower. To do this, select the field and create the Delta on that field. For example:

Max Elapsed Time Delta

  • The Delta column can also hold the Alert Threshold, which is the default SLA, including the messages that are available. For the Maximum Elapsed Time I want to detect whether the value has increased (a delta greater than 0), and you can even set specific limits. I set a Critical SLA for a delta above 10 (as an example). For example:

Delta Definition with SLA - Max Elapsed Time

  • Repeat for the Max Throughput, as that should be tracked to see if it goes down (fewer records processed per minute). For example:

Adding Delta on Throughput

  • Again, set up the delta and SLA for the Max Throughput. For example:

Delta definition for throughput

  • Now the metric is complete with all the fields defined. For example:

Complete Metric definition

  • The credentials for the metric need to be defined; when you create a metric you simply attach the credentials for it to use. Again, ensure that the credential is valid for the query. In my example I will use the standard database monitoring credential. For example:

Credentials

  • You can attach a database and run the test to verify the metric. This does not attach the metric to the target. It just tests it. For example:

 Testing the Metric

  • Review before saving the metric. At any time you can change the metric before you publish it. For example:

Review the Metric


  • At this point the metric is still in editable mode, so it can be edited as much as necessary. This is indicated on the metric screen. For example:

Summary of Metric

  • To implement the metric you must save it as a Deployable Draft from the Actions Menu. For example:

Save As Deployable Draft

  • A version number is locked in and it is marked as deployable. For example:

Marked As Deployable

  • Now you need to identify the targets you want to deploy this metric to. Select the metric and use Deploy To Targets from the Actions menu. For example:

Deploy to Targets

  • In this example, we will select the databases that will use this metric. Note that if you specified Additional Parameters on the Target Type selection, those will be applied to the search. In my example, a standalone database and the CDB version are available (PDBs are not listed). For example:

Selecting Targets

  • OEM will then copy the metric to the targets supplied as a background job. For example:

Scheduled Deployment to Targets

  • You can set target-specific values using the Metrics and Collection Settings on the individual target. For example:

Setting Target specific values

  • Scroll down to see the metric and set the appropriate values. If not set, they will be defaulted from the metric itself. For example:

Example Metric in the Target


This is the conclusion of this article. Obviously I cannot cover everything you need to know in one article but hopefully you can see how easy it is to add custom metric extensions. In other articles I will add more detail and add other types of metrics.

Thursday Mar 10, 2016

Application Testing: The Oracle Utilities Difference

Late last year we introduced a new product to the Oracle Utilities product set. It was the Oracle Functional/Load Testing Advanced Pack for Oracle Utilities. This pack is a set of prebuilt content and utilities based upon Oracle Application Testing Suite.

One of the major challenges in any implementation, or upgrade, is the amount of time that testing takes in relation to the overall time to go live. Typically testing is on the critical path for most implementations and upgrades. Subsequently, customers have asked us to help address this for our products.

Typically, one technique to reduce testing time is to implement automated testing as much as possible. The feedback we got from most implementations was that the initial cost of adopting automated testing tools was quite high, as you needed to build and maintain the assets for the automated testing to be cost effective. This typically requires specialist skills in the testing tool.

This also brought up another issue with traditional automated testing techniques. Most traditional automated testing tools use the user interface to record their automation scripts. Let me explain. Typically, using traditional methods, the tool will "record" your interactions with the online system, including the data you used. This is then built into a testing "script" to reproduce the interactions and automate them. This is limiting: to use the same script with another set of data, for alternative scenarios, you have to get a script developer involved, and this requires additional skills. This is akin to programming.

Now let me explain the difference with Oracle Application Testing Suite in combination with the Oracle Functional/Load Testing Advanced Pack for Oracle Utilities:

  • Prebuilt Testing Assets - We provide a set of prebuilt component based assets that the product developers use to QA the product. These greatly reduce the need for building assets from scratch and get you testing earlier.
  • One pack, multiple products, multiple versions - The pack contains the components for the supported Oracle Utilities products and versions.
  • Service based not UI based - The components in the pack are service based rather than using the UI approach traditionally used. This isolates your functionality from any user experience changes. In a traditional approach, any change to the user interface would require either re-recording the script or making programming changes to it. This is not needed for the service based approach.
  • Supports Online, Web Services and Batch - Traditional approaches typically would cover online testing only. Oracle Application Testing Suite and the pack allows for online, web services and batch testing as well which greatly expands the benefits.
  • Component Generator Utility - Whilst the pack supplies the components you will need, we are aware of the fact that some implementations are heavily customized, so we provide a Component Generator which uses the product metadata to generate a custom component that can be added to the existing library.
  • Assemble not code - We use the Oracle Flow Builder product, used by many Oracle eBusiness Suite customers, to assemble the components into a flow that models your business processes. Oracle Flow Builder simply generates the script that is executed, without the need for technical script development.
  • Upgrade easier - The upgrade process is much simpler, with the flows simply pointed at the new version of the supplied components to perform your upgrade testing.
  • Can Co-exist with UI based Components - Whilst our solution is primarily service based, it is possible to use all the facilities in Oracle Application Testing Suite to build components, including traditional recording, to add any logic introduced on the browser client. The base product does not introduce business logic into the user interface so the base components are not user interface based. We do supply a number of UI based components in the Oracle Utilities Application Framework part of the pack to illustrate that UI based components can co-exist.
  • Cross product testing - It is possible to test across Oracle Utilities products within a single flow. As the license includes the relevant Oracle Application Testing Suite tools (Flow Builder, OpenScript etc) it is possible to add components for bespoke and other solutions, that are web or service based, in your implementation as well.
  • Flexible licensing - The licensing of the testing solution is very flexible. You not only get the pack and the Oracle Application Testing Suite but the license allows the following:
    • The license is regardless of the number of Oracle Utilities products you use. Obviously, customers with more than one Oracle Utilities product will see a greater benefit, but it is cost effective regardless.
    • The license is regardless of the number of copies of the products you run the testing against. There is a server enablement that needs to be performed as part of the installation, but you are not restricted in the number of non-production copies you run the solution against.
    • The license conditions include full use of the Oracle Application Testing Suite for licensed users. This can be used against any web or Web Service based application on the site so that you can include third party integration as part of your flows if necessary.
    • The license conditions include OpenScript which allows technical people to build and maintain their own custom assets to add to the component libraries to perform a wide range of ancillary testing.
  • Data is separated from process - In the traditional approach you included the data as part of the test. Using this solution, the flow is built independent of the data. The data, in the form of databanks (CSV, MS Excel etc) can be attached at the completion of the flow, in the flow definition or altered AFTER the flow has been built. Even after the script has been built, Oracle Flow Builder separates the data from the flow so that you can substitute the data without the need to regenerate the script. This means you have greater reuse and greater flexibility in your testing.
  • Flexible execution of Testing - The Flow Builder product generates a script (that typically needs no alteration after generation). This script can be executed in OpenScript (for developers), using the optional Oracle Test Manager product, loaded into the optional Oracle Load Testing product for performance/load testing or executed by a third party tool via a command line interface. This flexibility means greater reuse of your testing assets. 

Support for Extensions

One of the most common questions I get about the pack is about the support for customization (or extensions as we call them). Let me step back before answering and put extensions into categories.

When I discuss extending our product there is a full range of facilities available. To focus on the impact of extensions I am going to categorize these into three simple categories:

  • User Interface extensions - These are bits of code in CSS or JavaScript that extend the user interface directly or add business logic to the browser front end. These are NOT covered by the base components, as the product has all the business logic in the services layer. The reason for this is that the same business rules can be reused regardless of the channel used (such as online, web services and batch); if you have a rule in just one channel then you miss it elsewhere. To support these you can use the features of Oracle Application Testing Suite to record that logic and generate a component for you. You can then include that component in any flow, with other relevant components, to test that logic.
  • Tier 1 extensions - These are extensions that alter the structure of the underlying object; anything that changes the API to the object falls into this category. Examples include custom schemas which alter the structure of the object (e.g. flattening data, changing tags, adding rules in the schema etc). These will require the use of the Component Generator, as the API will be different from the base component.
  • Tier 2 extensions - These are extensions within the objects themselves that alter behavior. For example, algorithms, user exits and change handlers are examples of such extensions. These are supported by the base components directly, as they alter the base data not the structure. If you have a combination of Tier 1 and Tier 2 then you must use the Component Generator, as the structure is altered.

Customers will use a combination of all three and in some cases will need to use the component generators (the UI one or the metadata one), but generally the components supplied will be reused for at least part of the testing, which saves time.

We are excited about this new product and we look forward to adding more technology and new features over the next few releases.

Wednesday Mar 09, 2016

OSB 12c Adapter for Oracle Utilities

In Oracle Utilities Application Framework V4.2.0.3.0 we introduced Oracle Service Bus adapters to allow the product to process Outbound Messages and, for Oracle Utilities Customer Care And Billing, Notification and Workflow records.

These adapters were compatible with Oracle Service Bus 11g. We have now patched these adapters to be compatible with the new facilities in Oracle Service Bus 12c. The following patches must be applied:

 Version     Patch Number
 4.2.0.3.0   22308653
 4.3.0.0.1   21760629
 4.3.0.1.0   22308684

Tuesday Mar 08, 2016

Optimizing Your To Do Experience

One of the key objects in the Oracle Utilities Application Framework is the To Do object. It is one of the most commonly used objects, and customers and partners often ask me about techniques they can use to manage the generated To Do records efficiently. Part of the issue with the To Do object is that it is sometimes used incorrectly, causing long-term implementation issues.

Before I outline some advice on how to optimize the use of To Do's I want to spend some time describing the concept of the To Do object.

Primarily the product is used to automate business processes within a utility organization (gas, electricity, water, waste water etc). Sometimes, due to some condition or data issue, typically an exception, the product cannot automatically resolve the condition or data to satisfy the business process. In that case, a human needs to intervene to correct the condition or data and allow the process to proceed (usually on the next execution of the process). The automation of the business process will create a To Do record outlining the type of exception (expressed as a To Do Type), the priority of the issue, extra information, as well as links back to the relevant record that created the issue (for navigation). The product will allocate the To Do record to a To Do Role, which represents the group of people allocated to address the exception. One of those people will work on the exception to resolve it and then mark the To Do as complete. The product will reprocess the original object that caused the exception whenever it is next scheduled within the product.

In summary, whenever an exception is detected that requires human intervention, a To Do is created and then managed by designated individuals to resolve the exception for reprocessing. For example, say you are billing a customer and that customer has some information that is not complete for a successful bill to be generated. The product would raise a relevant To Do of a particular type and indicate a group of people to resolve the missing information so that the product can successfully bill that customer.

With all these facts in mind, here is my advice:

  • To Do's are transient. They should be treated as such. They are created as needed and, once completed, should be retained only for a short time and then removed. We supply a purge process, F1-TDPG, in the products to remove completed To Do's after a period of time. We also supply an ILM based solution for To Do's if you wish to retain the data for longer periods. There is an article outlining the purge process.
  • Do not use the To Do object for anything other than managing exceptions. I have seen it used to record business events and other data (including for bespoke analytics). There are other methods for satisfying such business requirements. For example, log entities can be used on most objects to record events, and we also have a generic FACT object that can be used for all sorts of extensions.
  • Examine each To Do Type and see if someone in your organization is actually doing anything with those To Do entries. If there is no business process to deal with the exceptions, then you should reexamine whether you need to generate the To Do in the first place. Having To Do entries sit there and never be closed is not recommended, as they will just build up slowly. If you do not have a business process for the exception then consider turning off that To Do generation. You must do this for base To Do Types as well as any custom To Do Types.
  • For custom To Do Types, check for duplicate To Do entries. This does happen with customizations. When an exception occurs where a To Do needs to be generated, the customization should check whether an open To Do already exists before creating a new one (see the sketch after this list).
  • Examine anywhere within the product where a To Do is created and completed within the same transaction. This is a sign that the To Do probably should not be created in the first place. Consider turning off the To Do creation. If it is needed for some business process, look at running the purge process regularly to keep these under control.
  • Optimize the To Do Roles allocated to the To Do Type. The demonstration database is shipped with a single To Do Role per To Do Type, but this is not the only configuration. You can use To Do Roles to manage teams of people and then allocate them to the To Do Types they work on. You can have many To Do Types with many To Do Roles (and vice versa). You do need to nominate a default To Do Role on the To Do Type, which represents the default group of people to manage the To Do's of that To Do Type if no To Do Role is specified at creation time.
  • The To Do Type has a number of algorithms that allow for greater control of the To Do:
    • Calculate Priority - By default the priority on the To Do record is inherited from the To Do Type but it is possible to alter the Priority based upon additional information in the object or in your processing using this algorithm.
    • External Routing - Routing To Do information to external systems or other objects.
    • To Do Post Processing - Process the To Do after it is created or updated. For example, if the To Do is updated you can use this algorithm to pass additional information or state to another system or dashboard application.
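As an illustration of the duplicate check mentioned in the list above, a customization might run a query along these lines before creating a new To Do. This is a sketch only: the table and column names (CI_TD_ENTRY, TD_TYPE_CD, ENTRY_STATUS_FLG) and the open status value 'O' are assumptions that may differ in your version, a real check would also match the key of the related object, and within an algorithm you would typically use the provided business services rather than direct SQL.

  -- Sketch only: names and the status value are assumptions for illustration.
  -- Create the new To Do only when this count is zero.
  SELECT COUNT(*)
    FROM ci_td_entry
   WHERE td_type_cd = 'CM-TDTYP'    -- hypothetical custom To Do Type
     AND entry_status_flg = 'O';    -- assumed value representing an open entry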

These are some of the techniques that will optimize your experience with To Do. Remember, the volume of To Do's is really an indicator of your data quality, so improving the quality of your data is also a valid technique for minimizing To Do management.

There is additional advice on optimal managing of To Do in the whitepaper Overview and Guidelines for Managing Business Exceptions and Errors (Doc Id: 1628358.1) from My Oracle Support.

Saturday Mar 05, 2016

Cloud and not to Cloud

At the Oracle Utilities Edge Conference the most used word was "Cloud", and whilst I tried to avoid overusing the word in my sessions, I think it is important to clarify and understand what this means to your implementation.

  • We announced our Software As A Service plans and capabilities at the Oracle Utilities Edge Conference. This is a full service offering from us.
  • We announced our support for Platform As A Service (a whitepaper will be out on this in a few weeks' time) and support for Private Clouds.
  • We announced a series of capabilities across deploying, extending and managing our cloud implementations available now, soon and in the future.

Even if you are not intending to use ANY cloud facilities now or in the future, there are some key messages to remember:

  • EVERYTHING that is applicable that we are introducing to make our cloud implementations easier and more efficient will be rolled into the products (either into the product itself or into the Oracle Application Management Pack for Oracle Utilities) available for on-premise and Private Cloud implementations. This is a key thing to remember. Just because we are focusing some of our efforts on the cloud, we are not abandoning other implementation styles. Anything we make easier will make your implementations easier as well.
  • We are not assuming all customers will use the cloud, so we are offering flexible cloud offerings to cover as much of the spectrum as we can. For example, you can use PaaS for your non-production environments while retaining on-premise for production. This gives you the same full access as you would have on site, with some of the cost moved away.
  • We are in fact duplicating some of our functionality across on-premise and cloud implementations. For example, I announced that all the cloud metrics we are exposing on the cloud will also be available in our Application Management Pack for Oracle Utilities for on-premise implementations. This is just one example of trying to give you what we use. Our Oracle Functional/Load Testing Advanced Pack for Oracle Utilities is another example: it contains the testing components we use in our QA cycles.

 

Extending the Product - Supporting technologies

First of all I want to thank all the customers and partners who attended the technical stream at the annual Oracle Utilities Edge Conference in Phoenix Arizona. I hope those attended found the sessions helpful. It was a pleasure to meet and chat to many customers and partners during and after the sessions.

We made a series of announcements at the sessions about the future directions of our products and our works program for the future. Customers and partners who did not attend the conference will get this information as we publicly release the facilities we discussed over the next few service packs and releases.

One of the announcements has caused confusion amongst the attendees and I wanted to clarify a few points to make sure that everyone understands the announcement.

We announced that we will be adding support for Groovy as an extension language in a future release. I want to point out a few things about this:

  • Groovy support for extensions was added primarily to support our Software As A Service (SaaS) offerings for our products. In SaaS we will lock down the facilities accessible to extensions to protect security and performance. As Java has access to low-level calls, we will not allow Java based extensions in the SaaS cloud. As Groovy is used for a similar role in other Oracle Cloud offerings, we are extending the OUAF to support Groovy based extensions, allowing extensions typically implemented in Java to be implemented in the cloud. We will be supporting Scripting and Groovy based extensions in SaaS offerings.
  • To protect security and performance, we will be whitelisting the parts of Groovy that extensions may use, to prevent inappropriate access. We use a similar technique for scripting and other extension support.
  • Java and scripting based extensions will continue to be supported in the OUAF. You will not be forced to migrate to Groovy for on-premise, Platform As A Service (PaaS) and Private Cloud implementations. We will continue to extend our Java, Groovy and scripting support over future releases. We are not dropping the existing technologies.
  • You can use Groovy for on-premise, Platform As A Service (PaaS) and Private Cloud implementations if you desire to use that technology. Java programmers can migrate to Groovy skills pretty easily as Groovy is very similar to Java (at the end of the day Groovy code becomes Java byte code anyway).

In summary, Groovy is being supported as an alternative to scripting and Java; it is not replacing them. It is primarily intended to address the lack of access to Java for building extensions in Oracle Utilities SaaS offerings. It can be used by any style of implementation if you desire, but if you are not using our SaaS offerings you can continue to use all the styles of extension we have available.

Now that I have clarified that, I want to clarify one other misconception, and that is the performance profile of the different styles:

  • When loaded into memory, which is what happens to all extensions regardless of technology, the raw performance of Java, scripting and Groovy is similar. Java does have a slight edge as it is compiled byte code, while Groovy and scripting are interpreted with a small initial overhead. There is not much difference in performance once the code is in memory. You have the flexibility of picking any of these technologies with confidence that the code will perform well with efficient programming.
  • Each language has its facilities and there are some clauses that are unique to each. For example, Java has full flexibility, especially with computational types, and also has a wide API. Groovy has a very similar API to Java but, in our implementation of Groovy, we will be whitelisting clauses to protect security and performance. Scripting has a generous set of programming clauses but is smaller in scope. For the vast majority of extensions, scripting is more than enough, but you can choose to use Java or Groovy (in the future). The choice of language comes down to skills and the scope of what you are building.
  • In most cases it is recommended to use scripting first; if that is not appropriate you will soon be able to use Groovy, and if even that is not appropriate, use Java. Java has deployment overheads, whereas scripting and Groovy are stored in our metadata.

I hope these clarifications will assist you in extending our products now and in the future. We are looking forward to providing a rich development experience no matter which style or technique you choose to use.

 

