Thursday Feb 27, 2014

Identifying HumanTask TaskId from IdentificationKey using XREF

Since Oracle BPM PS4FP there is an activity called UpdateTask, which is able to alter a task within the process. With it, we can Withdraw, Suspend or Resume a task based on the name of the task (implicitly using the activityId), on all instantiated tasks within the process (generally used when we want to terminate the process), or explicitly on an instantiated task id.

Problem: when multiple instances of the same task exist (instantiated within a multi-instance subprocess, for example), how can I withdraw one specific instance only? On paper, as mentioned above, we can use the TaskId option of the UpdateTask activity to achieve this goal... but how do we get the TaskId?

For once, the documentation was of no help... it mentions that we can provide a "custom" TaskId during the Human Task instantiation, but without any further indication. So after multiple attempts, including modifying execData/systemAttributes/taskId directly, I gave up on the idea of setting my own TaskId. Instead, I explored another direction: leveraging the IdentificationKey, over which I have full control, and which, when set properly, gives me a clean one-to-one mapping with the TaskId. The challenge is then to extract that mapping.
Long story short, I will skip all the detailed steps I went through before coming up with the following solution. I can only say that I initially started with an Event Task/Mediator to capture the IdentificationKey/TaskId couple, then created an XREF definition and a utility service to handle the data lifecycle in the XREF map... A long, but necessary journey to end up with this final, clean solution, which can be considered a generic pattern for other usages.

And as usual, you can download the complete project of the solution beforehand here.

1. First of all, create a simple process with correlation, including a multi-instance subprocess in parallel mode, so that we can instantiate multiple instances of a same Human Task at once. Of course, make sure that each instance can be uniquely identified by the IdentificationKey. In my case, I append the correlationId and the pre-defined variable loopCounter, separated by a "#" character, as sketched below. (It is important to have the correlationId in the IdentificationKey; I will explain why later.)
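
As an illustration, the IdentificationKey can be built in the Human Task data association with an XPath expression along these lines. This is a sketch only: the data object name vCorrelationId is an assumption, and I use bpmn:getDataObject for both values for illustration; in practice you may need to copy the pre-defined loopCounter into a process data object first, depending on how your BPM version exposes it:

    concat(bpmn:getDataObject('vCorrelationId'), '#', bpmn:getDataObject('loopCounter'))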



2. Create an event subprocess taking as input argument the IdentificationKey, from which we can extract the correlationId by leveraging the "#" character specified above, as shown in the expression below.
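
For instance, the correlationId can be recovered with the standard XPath substring-before function (assuming the key is held in a data object named vIDKey, the same name used in the lookup expression further below):

    substring-before(bpmn:getDataObject('vIDKey'), '#')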



3. Now we need to create an XREF definition file. This XREF definition will contain 2 columns: IdentificationKey & TaskId.



4. In the XREF editor, set the Optimize option to "Yes", then click "Generate Table DDL". By doing so, you will see a DDL statement to be executed against your SOAINFRA database. This table will be used as a key-value map, accessible via the out-of-the-box XREF XPath functions (more detail on XREF here).




5. This is the tricky part: DO NOT EXECUTE the DDL statement as is!
XREF is a generic feature for developers to temporarily store process data directly in SOAINFRA, shareable across multiple processes running in different contexts. In our case, the data is already present in the SOAINFRA schema, via the WFTASK table. So we do not need to create the table, nor populate it, nor clean data out of it. We just need to convert the CREATE TABLE DDL statement into a CREATE VIEW DDL statement to map out the IdentificationKey/TaskId (or any other process/task metadata that you would like to access from the BPM repository in your own process!).

6. Here's the CREATE VIEW that I used, but you can amend the WHERE condition to better fit your specific needs. Run the view creation DDL against your database:

CREATE OR REPLACE VIEW XREF_TASKIDREF
AS SELECT IDENTIFICATIONKEY, TASKID
    FROM WFTASK
  WHERE STATE='ASSIGNED'
     AND IDENTIFICATIONKEY IS NOT NULL;
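
Once the view is in place, you can sanity-check the mapping with a quick query while a task is assigned (the key value below is hypothetical, following the correlationId#loopCounter convention above):

    SELECT TASKID
      FROM XREF_TASKIDREF
     WHERE IDENTIFICATIONKEY = 'myCorrelationId#1';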




7. To finish, configure your UpdateTask activity to extract the TaskId using the XREF lookup function with the IdentificationKey as argument, and you are done! (The last argument, false(), simply indicates that no exception should be raised if the key is not found.)

xref:lookup('TaskIDRef.xref','identificationKey',bpmn:getDataObject('vIDKey'),'taskId',false())




Now, let's do a simple test by creating a process with 4 tasks instantiated together:




Then, just pick one of the tasks to withdraw by selecting the withdrawTask operation and passing the IdentificationKey as argument.



Voilà !!!


Monday Jan 27, 2014

Extract Human Task Event payload detail in SOA Mediator

Oracle SOA/BPM Human Task provides a very powerful feature to generate a Business Event upon task assignment or completion. The associated Business Event is sent to the Oracle Event Delivery Network, where it can be captured by a Mediator component, which in turn executes other services or processes, like creating a Microsoft Exchange Task with Reminder as nicely described in this blog entry.


When the information that you need from the Business Event is only related to the Task metadata (taskId, assignee, taskURL…), the configuration is pretty straightforward, as each desired element can easily be extracted with the XSL mapping tool. And there are plenty of blogs out there showing the step-by-step configuration for this purpose.


However, if you need to extract a specific 'value' from the Task payload, the picture suddenly becomes more complicated: the 'payload' element in the Business Event has the xsd:anyType type, so it is not possible to "browse" the payload details, and hence map them out easily, from the XSL mapping tool in JDeveloper for further processing.


To address this problem there are 2 options:



  1. (Easy but restrictive) Map the specific values to be captured using the Human Task Mapped Attributes (or directly in the systemMessageAttributes in the execData). When doing so, those values can not only be used for queries from the BPM Workspace, but they are also exposed as Task metadata, so you can easily link them up during the XSL transformation. Nevertheless, every time you want to extract additional values, the Human Task has to be modified and the associated process needs to be redeployed. It is a valid option if you know exactly what you need upfront.

  2. (Relatively difficult but flexible) Fully utilize the XSL transformation capability to extract the "hidden" values from the payload element. This option requires more XML/XSL knowledge and is not necessarily easy to comprehend. However, it provides much more flexibility, and it can have a completely independent design lifecycle from the original business process. This is the option we will explore in this post. If you want to get the source code of the example below beforehand, you can download it from here.




First of all, consider the following BPM process with a very simple Human Task taking 2 data arguments defined by the Business Objects FirstBO and SecondBO. Please note that a third Business Object, ThirdBO, is encapsulated within SecondBO.






Now let's enable the Human Task to generate a Business Event upon task assignment. To do this, open the Human Task, go to Events and check the OnAssigned box (please note the other available options, but they are not key to our current demonstration).



From the Business Process perspective we are all set. Now let's consider a separate SOA Composite Application which only contains a Mediator to capture the associated Business Event and a File Adapter to write the extracted data into a file.



Once we link the Mediator to the File Adapter, we can generate an XSL stylesheet to transform the Human Task Event into a service call that writes the data into a file.



Now the challenge is to extract the payload information within the stylesheet. This can be done in the XSL source mode with 3 major steps:



  1. Add all the namespaces of the objects within the payload. These namespaces can be retrieved either directly from the BO definitions or from the Human Task configuration page:

          xmlns:p1="http://xmlns.oracle.com/bpm/bpmobject/Data/FirstBO"
          xmlns:p2="http://xmlns.oracle.com/bpm/bpmobject/Data/SecondBO"
          xmlns:p3="http://xmlns.oracle.com/bpm/bpmobject/Data/SubData/ThirdBO"


  2. Create XSL variables pointing to each specific argument of the Human Task within the payload, using the namespaces defined above. In our case we have 2 arguments, so we create 2 variables:

    <xsl:variable name="p1:firstPayload"
                  select="/tns:taskAssignedMessage/task:task/task:payload/p1:FirstBO"/>
    <xsl:variable name="p2:secondPayload"
                  select="/tns:taskAssignedMessage/task:task/task:payload/p2:SecondBO"/>

  3. Finally, you can manually extract each desired value from the XSL variables by using XPath expressions:

          <imp1:FirstBO.one>
            <xsl:value-of select="$p1:firstPayload/p1:one"/>
          </imp1:FirstBO.one>
          <imp1:FirstBO.two>
            <xsl:value-of select="$p1:firstPayload/p1:two"/>
          </imp1:FirstBO.two>
          <imp1:FirstBO.three>
            <xsl:value-of select="$p1:firstPayload/p1:three"/>
          </imp1:FirstBO.three>
          <imp1:SecondBO.one>
            <xsl:value-of select="$p2:secondPayload/p2:one"/>
          </imp1:SecondBO.one>
          <imp1:SecondBO.two>
            <xsl:value-of select="$p2:secondPayload/p2:two"/>
          </imp1:SecondBO.two>
          <imp1:SecondBO.three.ThirdBO.one>
            <xsl:value-of select="$p2:secondPayload/p2:three/p3:one"/>
          </imp1:SecondBO.three.ThirdBO.one>
          <imp1:SecondBO.three.ThirdBO.two>
            <xsl:value-of select="$p2:secondPayload/p2:three/p3:two"/>
          </imp1:SecondBO.three.ThirdBO.two>
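
Putting it all together, the overall stylesheet looks roughly like this. It is a minimal sketch only: the namespace URIs for tns, task and imp1, and the imp1:MessageOut root element, are assumptions standing in for whatever JDeveloper generated in your project, so keep the generated ones and only add the variables and value-of mappings:

    <?xml version="1.0" encoding="UTF-8"?>
    <xsl:stylesheet version="1.0"
                    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                    xmlns:tns="http://xmlns.oracle.com/bpel/workflow/taskMessage"
                    xmlns:task="http://xmlns.oracle.com/bpel/workflow/task"
                    xmlns:imp1="http://example.com/fileWrite"
                    xmlns:p1="http://xmlns.oracle.com/bpm/bpmobject/Data/FirstBO"
                    xmlns:p2="http://xmlns.oracle.com/bpm/bpmobject/Data/SecondBO"
                    xmlns:p3="http://xmlns.oracle.com/bpm/bpmobject/Data/SubData/ThirdBO">

      <!-- Step 2: variables pointing into the xsd:anyType payload -->
      <xsl:variable name="p1:firstPayload"
                    select="/tns:taskAssignedMessage/task:task/task:payload/p1:FirstBO"/>
      <xsl:variable name="p2:secondPayload"
                    select="/tns:taskAssignedMessage/task:task/task:payload/p2:SecondBO"/>

      <xsl:template match="/">
        <imp1:MessageOut>
          <!-- Step 3: extract each desired value from the variables -->
          <imp1:FirstBO.one>
            <xsl:value-of select="$p1:firstPayload/p1:one"/>
          </imp1:FirstBO.one>
          <!-- ... remaining value-of mappings as listed above ... -->
        </imp1:MessageOut>
      </xsl:template>
    </xsl:stylesheet>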

To get a complete view of the final XSL stylesheet, please download the source code of the 2 projects illustrated here.

Now you can deploy both projects into your BPM domain, and initiate the SampleProcess to start the testing:



After the execution, monitor the process thread in FMW Control and you should see the execution of the Mediator upon the Human Task initialization.




By clicking on the Mediator link you will be able to see the payload retrieved by the Mediator (Business Event):




And of course the transformed payload using the XSL Stylesheet we specified above:





Voilà ! Hope this entry is useful to you.

Wednesday Jul 04, 2012

Understanding the 'High Performance' meaning in Extreme Transaction Processing

Despite my previous blog entries on SOA/BPM and Identity Management, the domain I am the most passionate about is definitely Extreme Transaction Processing, commonly called XTP.
I came across XTP back in 2007 while I was still an FMW Product Manager in EMEA. At that time Oracle acquired a company called Tangosol, which owned a unique product called Coherence, which we renamed Oracle Coherence. Besides this innovative renaming of the product, to be honest, I didn't know much about it, except that it was a "distributed in-memory cache for Extreme Transaction Processing"... still not very helpful.

In general, when people don't fully understand a technology or a concept, they tend to find shortcuts, correct or not, to justify their lack of understanding... and of course I was part of this category of individuals. And the shortcut was "Oracle Coherence Cache helps to improve Performance". An excellent marketing slogan... but still not very meaningful.

By chance I was able to get away quickly from that group in July 2007 at Thames Valley Park (UK), after I attended one of the most interesting workshops of my 10-year career at Oracle, delivered by Brian Oliver. The biggest mistake I made was to assume that the performance improvement with Coherence was related to response time. That could be considered legitimate at the time because, after all, caches help to reduce latency on cached data access, hence reduce the response time. But like all caches, you need to define caching and expiration policies, think about the cache-miss strategy, and most of the time you have to partially rewrite your application in order to work with the cache. As a result, the expected benefit vanishes... so, not very useful then?

The key mistake I made was my perception, or obsession, about how performance improvement should be driven, and I strongly believe this is still a common problem for most developers. In fact, we all know that the performance of a system is generally represented by its Capacity (or Throughput), with 2 important dimensions, Speed (response-time) and Volume (load):

Capacity (TPS) = Volume (T) / Speed (S)
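
To make the formula concrete with hypothetical numbers: a system handling a Volume of 100 transactions at a Speed of 0.5 second per transaction delivers a Capacity of 100 / 0.5 = 200 TPS; halving the response-time to 0.25 s, or doubling the sustainable Volume to 200, would each double the Capacity to 400 TPS.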

To increase the Capacity, we can either reduce the Speed (in terms of response-time) or increase the Volume. However, we tend to focus only on the Speed dimension, perhaps because it is more concrete and tangible to measure, and nicer to present to our management because there is a direct impact on the end-user experience. On the other hand, we assume the Volume can be addressed by the underlying hardware or software stack, so if we need more capacity (scale out), we just add more hardware or software. Unfortunately, reality proves that IT is never as ideal as we assume...

The challenge with the Speed improvement approach is that it is generally difficult and costly to make things that are already fast... faster. And adding Coherence will not necessarily help either. Even if we manage to do so, the Capacity cannot increase forever, because... the Speed is influenced by the Volume. For every system, we end up with a performance curve as follows:





In all traditional systems, increasing the Volume (Transactions) will, at some point, also increase the Speed (Response-Time). The reason is simple: most of the time the application logic was not designed to scale. As an example, if you have a while-loop in your application, it is natural to expect that parsing 200 entries will require double the execution time of 100 entries. If you need to "speed up" the execution, you can only upgrade your hardware (scale up) with faster CPUs and/or a faster network to reduce latency. This is technically limited and economically inefficient. And this is exactly where XTP and Coherence kick in.


The primary objective of XTP is to design applications that can scale out to increase the Volume, by applying coding techniques that keep the execution time as constant as possible, independently of the amount of runtime data being manipulated. It is actually not just about having an application running as fast as possible, but about having a much more predictable system, with constant response-time and linear scalability, so we can easily increase throughput by adding more hardware in parallel. It is generally combined with the Low Latency Programming model, where we try to optimize the network usage as much as possible, either from the programmatic angle (fewer network hops to complete a task) and/or from the hardware angle (faster network equipment). In this picture, Oracle Coherence can be considered a software-level XTP enabler, via its Distributed Cache, because it can guarantee:

- Constant Data Object access time, independently of the number of Objects and of the Coherence Cluster size
- Data Object Distribution by Affinity, for in-memory data grouping
- In-place Data Processing, for parallel execution

To summarize, Oracle Coherence is indeed useful to improve your application performance, just not in the way we commonly think. It is not about the Speed itself, but about the overall Capacity under Extreme Load while keeping a consistent Speed. In the future I will keep adding new blog entries around this topic, sharing some sample code and experiences that I have captured over the last few years. In the meanwhile, if you want to know more about Oracle Coherence, I strongly suggest you first check how our worldwide customers are using Oracle Coherence, then start playing with the product through our tutorial.

Have Fun !

Tuesday Jun 19, 2012

Oracle SOA Foundation Practitioner Certification

Great news for those who are trying to get the Oracle SOA Practitioner Certification (1Z0-451): Packt Publishing just released a book with the essential Oracle SOA knowledge, and most importantly a list of self-test questions, to give you a better chance of passing the certification:

http://www.packtpub.com/oracle-soa-infrastructure-implementation-certification-handbook/book

Good luck with the exam !


Monday Jun 04, 2012

Addressing threats introduced by the BYOD trend

With the growth of the mobile technology segment, enterprises are facing a new type of threat introduced by the BYOD (Bring Your Own Device) trend, where employees use their own devices (laptops, tablets or smartphones), not necessarily secured, to access the corporate network and information.

In the past - actually even right now - enterprises used to provide laptops to their employees for their daily work, with specific operating systems including anti-virus and desktop management tools, in order to make sure that the pool of allocated laptops was spyware- and trojan-horse-free when accessing the internal network and sensitive information. But the BYOD reality is breaking this paradigm and opening new security breaches for enterprises, as most username/password based systems, especially internal web applications, can be accessed from poorly protected or unprotected devices.

To address this reality we can adopt 3 approaches:

1. Coué's approach: close your eyes and assume that your employees are mature enough to know what they should or should not do.

2. Consensus approach: provide a list of restricted and 'certified' devices allowed on the internal network.

3. Military approach: access internal systems from certified laptops ONLY.

If you choose option 1: thanks for visiting my blog, and I hope you find the other entries more useful :)

If you choose option 2: The proliferation of new hardware and software updates every quarter makes this approach very costly and difficult to maintain.

If you choose option 3: you need to find a way to allow access to your sensitive applications from corporate-authorized machines only, managed by the IT administrators... but how?

The challenge with option 3 is to find a way to restrict access to certain sensitive applications to authorized machines only; or, seen from another angle, to ensure end-users cannot access the sensitive applications unless they are using an authorized machine... So what if we found a way to store the application credentials secretly from the end-users, and then automatically submit them when the end-users access the application? With this model, end-users do not know the username/password of the applications, so even if they use their own devices they will not be able to log in. Also, there is no need to reconfigure existing applications to adapt to a new authentication scheme, given that we still leverage the same username/password authentication model at the application level.


To adopt this model, you can leverage Oracle Enterprise Single Sign-On. In short, Oracle ESSO is a desktop-based solution capable of storing credentials of web and native applications. At application startup, if it is configured as an esso-enabled application - check out my previous post on how to make Skype esso-enabled - Oracle ESSO automatically takes over the sign-in sequence with the stored credentials on behalf of the end-users.
Combined with Oracle ESSO Provisioning Gateway, the credentials can be 'pushed' in advance from an actual provisioning server, like Oracle Identity Manager or Tivoli Identity Manager, so end-users can log into a sensitive application without even knowing the actual username and password, and therefore cannot log in from machines other than those secured by Oracle ESSO.

Below is a graphical illustration of this approach:



With this model, not only can you protect access to sensitive applications from authorized machines only, you can also implement much stronger Password Policies in terms of Password Complexity and Password Reset Frequency, since end-users no longer need to remember the passwords.

If you are interested, do not hesitate to check out the Oracle Enterprise Single Sign-on products from OTN !


Tuesday May 29, 2012

Handling Hybrid Applications in Oracle ESSO

In a recent project involving Oracle ESSO (Oracle Enterprise Single Sign-On, a desktop-based Single Sign-On solution that Oracle acquired from Passlogix in 2011), I stated to the customer that Oracle ESSO was flexible enough to handle automatic sign-on for most web and native applications running on PCs, including text-based applications through a terminal... And of course, after such a statement, you can imagine how satisfied the customer was when he found a very common application to prove me wrong! And this application was nothing other than Skype, the popular VoIP application.


Without getting into technical details, this is basically how Oracle ESSO works: it is able to identify the login form of any web-based application (by recognizing the URL and the HTML form) or any Windows-native application (by recognizing the executable signature and the UI forms within the application). Once recognized, it takes over the login process by providing the appropriate credentials, either recorded during a previous manual login or provisioned by a provisioning system such as Oracle Identity Manager or Tivoli Identity Manager.


The challenge with Skype was... it is neither a web nor a Windows-native application. It is a new type of application called a hybrid application, with an embedded web server and browser to serve the HTML pages that render the UI. The business logic (JavaScript) is either stored locally or accessed remotely through SOAP or REST services from the Skype servers. This is a way to simplify the development effort by having a consistent UI and logic across different platforms, including mobile devices.


Now, it is not completely true that ESSO is unable to handle Skype. It does actually recognize the application as a web application, and it is able to store the credentials in the ESSO repository. This is an out-of-the-box mode which allows ESSO to store any website credentials centrally and in a secure way, rather than relying on the browser's "remember credentials" capability. But in this mode we do not have control over the web application, such as preventing an automatic re-login after an explicit logout.


In order to add more control logic to an application that we want to "esso"-enable, we need to use the Oracle ESSO Logon Manager Admin Console to create an application template. But in this case we cannot capture the application as a native Windows application, because we cannot drill into the UI form; and we cannot capture it as a web application either, because we do not have the actual URL... Fortunately, in Oracle ESSO 11g, we have a new option to create an application template. In the past we needed to specify in advance the type of application that we wanted to capture (web or native). Now, we can use the Title Bar button directly from the application that we want to add ESSO controls to. Here are the steps, making sure that the ESSO-LM Admin Console and Skype are already started:


1. Create a template from the Skype application Title Bar button



2. Ignore the JavaScript errors... we do not need them anyway



3. Double-check that ESSO has successfully recognized the "username" and "password" fields, and change the form name to match your needs (Skype Login in this case)



4. Move to the Fields tab in the [Web] window



5. Select SendKeys as the Transfer method



6. Add the 'Enter' key as the last action; because the submit button is not explicitly present, we have to reproduce the login sequence manually:



7. Now your template is complete and you can add ESSO controls to fit your requirements! In this case, I set the Logon Loop Grace Period to 480 minutes, so when an end-user logs in and then decides to log out within this window, Oracle ESSO will not attempt to re-login again. This timer is reset if Skype is restarted manually.



Hope you enjoyed the reading, and don't hesitate to download ESSO for your own testing!



Thursday May 03, 2012

Extract emails list from Group ID for BPEL/BPM Notification service

Within Oracle SOA Suite, it is possible to create email, SMS, fax or instant messaging based notifications via the Notification task, usable within BPEL or BPMN processes. For setting up the different communication channels, you can refer to this excellent blog from Rubicon Red - http://www.rubiconred.com/blog/email-notification-with-soa-suite-11g/

However, unlike the notification within the Human Task, where we can simply provide a user id or group id, we need to explicitly specify the email addresses or phone numbers when using the Notification service independently.

To address this need, we can leverage the Identity Service functions available in BPEL to extract user properties. You can use the instructions below to extract those properties, either from a userid or from a groupid. But if you are lazy, you can download the SOA project with the processes below from here :)



Extract the email address and mobile number from a userid


This step is fairly simple. We can use the Identity Service function ids:getUserProperty to extract the user's email address, or any other attribute available within the realm.

1. Create a synchronous BPEL process with the following input and output message type




2. Drag an Assign activity into the process 



3. Open the Assign task, and drag an XPath Expression onto the email attribute



4. Use the following expression to extract the email address from the userid:

ids:getUserProperty($inputVariable.payload/client:userid,'mail')



5. Then map the phone attribute in the same way, using the expression ids:getUserProperty($inputVariable.payload/client:userid,'mobile')

6. Finally, deploy and test the process







Extract an email list from a groupid


To extract an email list from a groupid, the process is a little bit more complicated. To summarize, there are 2 major steps:

a. Extract the list of userids from the groupid, using the Identity Service function ids:getUsersInGroup()
b. Parse the userid list and concatenate the emails extracted from each userid into a list, using the same ids:getUserProperty() shown above

The challenge is to extract the userid list, which is dynamic, and then to iterate over this dynamic list to extract the property that we need. After some investigation, here is a way to achieve it with a synchronous BPEL process:

1. Create a synchronous BPEL process with the following variables

- usersList is mapped to the Users element specified in the XML schema here
- index and size are two int variables, with index initialized to "1"



2. Drag and drop the Assign and While activities as shown below





3. In ExtractUsersList assign activity:

- Map the following functions onto $usersList/ns1:Users/ns1:user (using CopyList assignment) and $size:

ids:getUsersInGroup($inputVariable.payload/client:groupid,true())
--> $usersList/ns1:user

count($usersList/ns1:user)
--> $size




4. In the ParseUsersList while activity, use the following condition to loop over the $usersList/user elements (a sketch of the condition is shown below)
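
The condition itself boils down to comparing the two counter variables defined in step 1 (a sketch, assuming the variable names above):

$index <= $size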



5. Finally, in the ConstructEmailsList assign activity, add the logic to extract each email address and concatenate the results into a list with the separator:


concat($outputVariable.payload/client:emails,
ids:getUserProperty($usersList/ns1:user[$index],'mail'),
$inputVariable.payload/client:separator)

==>  $outputVariable.payload/client:emails

$index+1
==> $index



6. The BPEL process is now complete. Deploy it and test it out!











About

Hi, I am Manh-Kiet Yap (known as Kiet @oracle) and I'm currently the Technical Director at APAC Advanced Customer Services.

I've recently received my 15-year long-service award, after having been successively a Technical Consultant in France, Presales in Hong Kong, FMW Product Manager in EMEA, Presales Manager in APAC and finally an Architect at Oracle ACS.

With my 15 years of experience around Middleware, I hope you will find this blog valuable if you are navigating around Oracle Fusion Middleware !
