Monday Oct 17, 2011

OIA: Entitlements outside roles in BI Publisher

My colleague Rene has some great postings explaining the importance of keeping an eye on entitlements that fall outside roles whilst developing an RBAC model, and how to achieve that within OIA.

In this posting I add another example report which exposes entitlements falling outside roles, this time formatted for use with BI Publisher.  It will be useful to customers wishing to use BI Publisher as their reporting tool.

When loaded into BI Publisher the report will generate a listing, by Business Unit, by Resource and by user, of all the user's entitlements that fall outside the role model.  You can specify a Business Unit and Resource as parameters to the report.  The report only includes attributes flagged as minable, which allows it to avoid attributes that are unimportant from an entitlement point of view (Firstname, for example).

The two BI Publisher files which define this report--the .xdo file which contains the SQL definition and the .rtf formatting file--are available here.

Wednesday Sep 14, 2011

Oracle Identity Analytics 11gR1 PS1 quick installation

The latest version of the Oracle Identity Analytics product, OIA, contains some considerable enhancements in terms of Certification usability and risk management as well as improvements in the integration with OIM.  See the documentation for this version here.

This sample ant project helps prepare a deployment and takes care of most of the standard manual tasks.  See the README.txt and the ant script usage screen for prerequisites and usage.

The usage screen is shown here:

     [java] Buildfile: build.xml

Helper script to install OIA11gR1PS1.

You should give the App server at least 1024m (-Xms1024m -Xmx1024m)


* Download the product zip file from the Oracle website and copy it to the ./product-zips directory
* Download and prep the required jar file dependencies as described in the Installation Guide and copy them to the ./custom directory
* Download the Oracle JDBC driver for your Oracle Database and copy it to the ./custom directory
* Fixup the

Typical sequence of commands:
    ant -projecthelp
    ant show-build-properties
    ant clean
    ant dist
    ant repo-drop-oracle
    ant repo-create-oracle

        ant deploy-rbacxhome       -- copy the built rbacxhome to the deployment location
        ant deploy-rbacx           -- deploy to WLS

Deploy rbacx to WLS:
        ant deploy

Undeploy rbacx from WLS:
        ant undeploy

Main targets:

 checkAntVersion        Ensure that we're running ant 1.7
 clean                  delete the dist and staging directories
 dist                   cleans the build directories and then rebuilds the war file
 repo-create-oracle     Run the repo set up script. Make sure neither rbacx schema nor rbacxservice user exist
 repo-drop-oracle       drop the schema and rbacxservice user
 show-build-properties  show the build properties
Default target: usage

Total time: 0 seconds

Friday Nov 13, 2009

In the mire: cleaning open provisioning tasks in Oracle Identity Manager

Looking at Oracle Identity Manager (OIM) I found I was accumulating open provisioning tasks that had either failed or that I simply no longer wanted.  The problem is that the product does not offer a way to delete them.  Furthermore some resources (like the Sun DS Connector) will not allow you to start a new provisioning task for that resource until the old one has completed.  The obvious scheduled task 'Remove Open Tasks', as described here, appears to be more of a cosmetic cleanup of some views into open tasks.  In fact, looking into the task code, it does a time based 'DELETE from' on the OTI table--this turns out not to resolve the open provisioning task problem.  So, short of reverting to a clean snapshot of the database, what to do?  I should say that the following is more in the way of an exploration than a recommendation.

The folks on the OTN forum offered some help but that was inconclusive.  Time to roll up the sleeves.

There are 225 tables in the OIM schema.  Which ones are involved when a provisioning task is created?  What we need is a way to diff the database before and after an operation.  This will give an idea of the tables involved and the data model OIM is using, and may allow us to remove those tasks--however recommended that may be.

For the purpose of reverse engineering table usage for individual application actions (such as, in my case, initiating a provisioning task) what we want is something that will indicate table data that has changed and do its best to present those changes in text form.  The delta will typically be small so we do not require a fancy diff capability.  It would be nice if the tool could work against Oracle, MS SQL Server and MySQL.  'Ah feel a wee tool comin' oooon.

This JDBC based tool, DbDump, does its best to dump a specified set of tables to a text file.  You will need to configure the JDBC properties and copy in the database driver jar files.  We can then run a diff tool to compare the before and after outputs.  The command line diff tool is adequate, though a GUI tool such as windiff.exe, for example, is easier to read.
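The essence of the tool can be sketched in Python with sqlite3 (purely illustrative: the table and column names below only mimic the output format, and the real DbDump is JDBC based).  Dump each table as one line of text per row, then diff the before and after snapshots:

```python
import sqlite3

def dump_tables(conn, tables):
    """Render every row of every table as one line of diffable text."""
    lines = []
    for table in tables:
        cur = conn.execute(f"SELECT * FROM {table}")
        cols = [d[0] for d in cur.description]
        for row in cur:
            cells = ",".join(f"'{c}:{v}'" for c, v in zip(cols, row))
            lines.append(f"'{table}'::{cells}")
    return lines

# Toy stand-in for the OIM schema: snapshot, perform an operation, snapshot again.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE OTI (OTI_KEY INTEGER, ORC_KEY INTEGER)")
before = dump_tables(conn, ["OTI"])          # snapshot before the operation
conn.execute("INSERT INTO OTI VALUES (190, 170)")
after = dump_tables(conn, ["OTI"])           # snapshot after
new_rows = [line for line in after if line not in before]
print(new_rows)  # ["'OTI'::'OTI_KEY:190','ORC_KEY:170'"]
```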

< 'ADMINSERVER_WLSTORE'::'ID:-1','TYPE:-1','HANDLE:2''RECORD:000000707B7365727665723D41646D696E53657276657221686F73743D3132372E302E302E3121646F6D61696E3D626173655F646F6D61696E2173746F72653D41646D696E5365727665725F4F494D5F4A44424353544F5245217461626C653D41646D696E5365727665725F574C53746F72657D980976C5055875C800000124D3BC92EF'
> 'ADMINSERVER_WLSTORE'::'ID:-1','TYPE:-1','HANDLE:2''RECORD:000000707B7365727665723D41646D696E53657276657221686F73743D3132372E302E302E3121646F6D61696E3D626173655F646F6D61696E2173746F72653D41646D696E5365727665725F4F494D5F4A44424353544F5245217461626C653D41646D696E5365727665725F574C53746F72657D980976C5055875C800000124D3C43CE8'
> 'AUD_JMS'::'AUD_JMS_KEY:187','AUD_CLASS:UserProfileAuditor','IDENTIFIER:86','JMS_VALUE:BLOB','DELAY:1','FAILED:0','CREATE_DATE:2009-11-8','UPDATE_DATE:2009-11-8''PARENT_AUD_JMS_KEY:null'
> 'OBI'::'OBI_KEY:145','OBJ_KEY:11','REQ_KEY:null','ORC_KEY:null','OBI_STATUS:Approved','OBI_DEP_REQUIRED:null','OBI_STAGE_FLAG:2','QUE_KEY:null','USR_KEY:null','OBI_DATA_LEVEL:null','OBI_CREATE:2009-11-8','OBI_CREATEBY:1','OBI_UPDATE:2009-11-8','OBI_UPDATEBY:1','OBI_NOTE:null''OBI_ROWVER:0000000000000000'
> 'OSH'::'OSH_KEY:257','SCH_KEY:241','STA_KEY:4','OSH_ACTION:Engine','OSH_ASSIGN_TYPE:Default task assignment','OSH_ASSIGNED_TO_USR_KEY:1','OSH_ASSIGNED_TO_UGP_KEY:null','OSH_ASSIGNED_BY_USR_KEY:null','OSH_ASSIGN_DATE:2009-11-8','OSH_DATA_LEVEL:null','OSH_CREATE:2009-11-8','OSH_CREATEBY:1','OSH_UPDATE:2009-11-8','OSH_UPDATEBY:1','OSH_NOTE:null''OSH_ROWVER:0000000000000000'
> 'OSI'::'SCH_KEY:241','ORC_KEY:170','MIL_KEY:146','REQ_KEY:null','TLG_KEY:null','RSC_KEY:null','OSI_RECOVERY_FOR:null','OSI_RETRY_FOR:null','OSI_ASSIGNED_TO:null','TOS_KEY:32','PKG_KEY:34','ACT_KEY:1','ORD_KEY:1','ORC_SUPPCODE:00     ','OSI_ASSIGN_TYPE:Default task assignment','OSI_ESCALATE_ON:null','OSI_ASSIGNED_TO_USR_KEY:1','OSI_ASSIGNED_TO_UGP_KEY:null','OSI_RETRY_ON:null','OSI_RETRY_COUNTER:null','OSI_CHILD_TABLE_KEY:null','OSI_CHILD_OLD_VALUE:ecFFyIei7ntqs5tETSu38w==','OSI_ASSIGNED_DATE:2009-11-8','SCH_INT_KEY:null','OSI_LOG_KEY:null','OSI_DATA_LEVEL:null','OSI_CREATE:2009-11-8','OSI_CREATEBY:1','OSI_UPDATE:2009-11-8','OSI_UPDATEBY:1','OSI_NOTE:null''OSI_ROWVER:0000000000000000'
> 'OTI'::'OTI_KEY:190','SCH_KEY:241','SCH_TYPE:null','SCH_STATUS:P','SCH_DATA:null','SCH_PROJ_START:2009-11-8','SCH_PROJ_END:2009-11-8','SCH_ACTUAL_START:2009-11-8','SCH_ACTUAL_END:null','SCH_ACTION:null','SCH_OFFLINED:0','ORC_KEY:170','MIL_KEY:146','OSI_RETRY_FOR:null','OSI_ASSIGNED_TO:null','PKG_KEY:34','REQ_KEY:null','OSI_ASSIGNED_TO_USR_KEY:1','OSI_ASSIGNED_TO_UGP_KEY:null','OSI_ASSIGNED_DATE:2009-11-8','ACT_KEY:1','OSI_ASSIGN_TYPE:Default task assignment','PKG_TYPE:Provisioning','STA_BUCKET:Pending','OBJ_KEY:11','OTI_CREATE:2009-11-8','OTI_UPDATE:2009-11-8','OTI_CREATEBY:1','OTI_UPDATEBY:1','OTI_ROWVER:0000000000000000','OTI_DATA_LEVEL:0''OTI_NOTE:null'
> 'SCH'::'SCH_KEY:241','SCH_TYPE:null','SCH_STATUS:P','SCH_PROJ_START:2009-11-8','SCH_PROJ_END:2009-11-8','SCH_ACTUAL_START:2009-11-8','SCH_ACTUAL_END:null','SCH_DATA:null','SCH_REASON:null','SCH_ACTION:null','SCH_DATA_LEVEL:null','SCH_CREATE:2009-11-8','SCH_CREATEBY:1','SCH_UPDATEBY:1','SCH_UPDATE:2009-11-8','SCH_NOTE:null','SCH_ROWVER:0000000000000000''SCH_OFFLINED:null'

So the tables involved are: OSI, OSH, SCH, OIU, OTI, ORC and OBI.

Using the ORC_KEY and SCH_KEY the relevant rows can be deleted from the above tables in the order listed.  An order is implied as there are constraints on the tables.  Given an ORC key, the corresponding SCH_KEY can be found with:

select sch_key from OIMUSER.OTI where orc_key=XXX;
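The cleanup can be sketched against an in-memory SQLite mock of a few of the tables (column names taken from the diff output above; the real OIM constraint graph is richer, so this is illustrative only, not a supported procedure):

```python
import sqlite3

# Miniature stand-ins for three of the OIM tables, just to show that
# child rows (OSI) must be deleted before their parents (SCH, ORC).
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE ORC (ORC_KEY INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE SCH (SCH_KEY INTEGER PRIMARY KEY)")
conn.execute("""CREATE TABLE OSI (SCH_KEY INTEGER REFERENCES SCH(SCH_KEY),
                                  ORC_KEY INTEGER REFERENCES ORC(ORC_KEY))""")
conn.execute("INSERT INTO ORC VALUES (170)")
conn.execute("INSERT INTO SCH VALUES (241)")
conn.execute("INSERT INTO OSI VALUES (241, 170)")

# Recover the task instance key from the ORC key, then delete in
# child-to-parent order so the constraints are satisfied.
(sch_key,) = conn.execute("SELECT SCH_KEY FROM OSI WHERE ORC_KEY = 170").fetchone()
for stmt in (f"DELETE FROM OSI WHERE SCH_KEY = {sch_key}",
             f"DELETE FROM SCH WHERE SCH_KEY = {sch_key}",
             "DELETE FROM ORC WHERE ORC_KEY = 170"):
    conn.execute(stmt)
```

Deleting in the reverse order raises a foreign key violation, which is the implied order mentioned above.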

So that is one way to remove the tasks brutally--whether supported or not.

Using the Client API to complete the tasks

Rather than deleting rows one can try to at least complete the tasks manually, making a call to the tcProvisioningOperationsIntf interface (see here for code examples):

tcProvisioningOperationsIntf provIntf =
   (tcProvisioningOperationsIntf) getUtilityFactory().getUtility("Thor.API.Operations.tcProvisioningOperationsIntf");

The tasksArray contains a list of task instance ids.  What are these? The Description field of the open provisioning task in the admin interface contains a number. It turns out that this number is the ORC key. However, in order to call into the Oracle API, as suggested on the OTN forum, we need the task instance id.  This turns out to be the SCH_KEY, which can be recovered using the query shown above.

To use the API to complete the tasks appropriately actually requires two steps: first make sure that the appropriate status mapping takes place when the task is completed, then make the API call itself.  Do the mapping in the Design Console by going to the Process Definition of the task, for example 'Create User' for your resource, and then to the 'TaskToObjectStatusMapping' tab.  When a task is completed from the API, the completion code goes to 'MC', for 'Manually Complete'.  So we need to map 'MC' to 'Provisioned' in order to get the task to complete appropriately.  If the task was stuck in the 'System Validation' phase then mapping 'MC' to, for example, 'Revoked' will cause the provisioning event to be completed, allowing us to launch further provisioning tasks without any issues.

Of course these manually completed tasks are still visible in the admin interface.  Perhaps the best way to clean them out would be to run the Task Archival maintenance scripts that come with the product as described here.


OIM does not appear to offer any easy way to clean up unwanted open provisioning tasks.  In fact it does not appear to offer any documented way at all to clean them up.  The best one can do in terms of what the product supports is to complete them and possibly then hide these tasks from view or archive them off.  In dev environments and possibly POCs going straight to the tables may be best, but without an official spec of the data model this is always risky.

Monday Aug 10, 2009

Climbing Mount RBAC: shun the snowy bit

There is an image I use that offers an informal way to understand one process for creating roles as part of an Enterprise Role Model (ERM) project.  You will probably never capture 100% of your entitlements in your ERM, but you will capture enough to realize significant Return on Investment (ROI) by improvements to business, infrastructure and compliance processes.  Presenting it in this way as 'Mount RBAC' is an idea I first saw expressed by Squire Earl in the green pastures of the Sun campus in Austin, Texas.

So here is the image:

Where to start: the bottom

Start with the observation that there are typically small sets of entitlements that most users will receive. Usually this is not hard to identify: for example desktop login and email access.  Other candidates at this level would be an entry in, and anonymous search access to, a corporate white pages directory.  Typically this role would be called something like 'BaseAccess' or 'Employee'.  This level of the model can be thought of as being linked to the notion of worker type.  Typical worker types might be permanent employees, contractors, interns and so on.  We can think of these roles as being quite crude: they capture large numbers of users and define vanilla access to standard systems.  This approach also obeys the principle of least privilege: we will then go on to add additional entitlements to the user based on a finer grained analysis of his business functions and HR attributes.  We can see that this aids automation of the hiring process, for, once a worker is identified with a type, we can provision the systems required to get him productive on day one of his job.
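As a toy sketch of that bottom layer (all names invented for illustration), worker type alone is enough to provision day-one access:

```python
# Hypothetical worker-type layer of the role model: each worker type maps
# to one coarse base role granting vanilla access to standard systems.
BASE_ROLES = {
    "Employee":   {"desktop-login", "email", "whitepages-entry"},
    "Contractor": {"desktop-login", "email"},
}

def day_one_entitlements(worker_type, finer_grained_roles=()):
    """Least privilege: start from the base role for the worker type only;
    finer-grained roles get unioned in later, after deeper analysis."""
    entitlements = set(BASE_ROLES[worker_type])
    for role in finer_grained_roles:
        entitlements |= role
    return entitlements

print(sorted(day_one_entitlements("Contractor")))  # ['desktop-login', 'email']
```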

Where next: finer grained entitlements

We can proceed with analyzing the HR attributes to uncover further role definitions, linking entitlements to sets of users defined at the large scale in structures like 'Division', continuing down to 'Department' and 'JobFunction' or 'JobTitle'.  Of course the sets are not necessarily all contained inside one another Russian doll style: other attributes like 'Location' or 'BuildingNumber' or say 'ProjectName' are orthogonal to the pure job function and business activities structures.  What we find is that as we move to finer grained analysis it becomes more useful to use tooling to uncover the relationships between entitlements and sets of users.  So we can kick-start the ERM definition process by _defining_ obvious roles but use a tool based role mining process once we have exhausted the more easily defined roles.  Experience here is that efforts that rely solely on the definition approach tend to flounder in the mire of committee-like attempts to determine what the roles should be.  A better approach at this level of granularity is to let tooling mine out the existing relationships and use those roles as the basis from which to refine further.
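A naive version of that mining step can be sketched as follows (toy data, invented names): group users by an HR attribute, then propose one candidate role per group from the entitlements every member shares:

```python
from collections import defaultdict

# Toy input: per-user HR attributes and currently assigned entitlements.
users = {
    "alice": {"dept": "Finance", "ents": {"gl-read", "gl-post", "email"}},
    "bob":   {"dept": "Finance", "ents": {"gl-read", "email"}},
    "carol": {"dept": "IT",      "ents": {"root-prod", "email"}},
}

def mine_roles(users, attribute):
    """Propose one candidate role per attribute value: the intersection of
    the entitlements held by every user sharing that value."""
    groups = defaultdict(list)
    for name, u in users.items():
        groups[u[attribute]].append(u["ents"])
    return {value: set.intersection(*ents) for value, ents in groups.items()}

print({k: sorted(v) for k, v in mine_roles(users, "dept").items()})
# {'Finance': ['email', 'gl-read'], 'IT': ['email', 'root-prod']}
```

Real mining tools use far smarter clustering, but the shape of the problem is the same: turn observed entitlement assignments into candidate roles to refine by hand.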

Where to stop: the snowy bit

Now one problem projects can run into is 'role explosion': so many roles are identified that managing them starts to be even more costly than managing the original lists of entitlements that we started out with.  This is why Mount RBAC has a snowy bit: we recognize up front that there will be aspects of a user's access rights that are exceptional, temporary or otherwise not worth the effort of bringing into the role model. This does not pose an audit or compliance risk because we do track those entitlements even though they lie outside the role model.
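Keeping an eye on the snowy bit then reduces to a set difference: everything a user holds minus everything the role model grants (toy example, invented names):

```python
# Toy data: what one user actually holds, and the roles assigned to them.
user_entitlements = {"email", "gl-read", "legacy-mainframe-acct"}
role_grants = [{"email"}, {"gl-read"}]   # entitlements granted via roles

# Entitlements outside the role model: track these for audit purposes.
outside_roles = user_entitlements - set().union(*role_grants)
print(sorted(outside_roles))  # ['legacy-mainframe-acct']
```

This is the same quantity the reports mentioned at the top of this page expose, per Business Unit and Resource.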


If you put your project's Business Analysts, HR, IT people, and middleware software in a giant bucket and shook it for 12 months, the chances of a meaningful ERM deployment emerging are pretty low.  An alternative is an evolutionary path, each step offering tangible ROI. The refinement process described here, where we start with large sets of users and work towards smaller sets with finer grained business roles, provides one such approach.  At appropriate points you will need to deploy the right tooling--and stop when you start to get cold feet.

Thursday Jan 29, 2009

'Government' is not a four letter word

In certain circles 'Government' is a dirty word.  Amusing then to watch those financial chaps running back to Government for help.  However in a brazen display, worthy of Dick Turpin, they have put a new perspective on the phrase 'Tax is Theft' by themselves making off with our tax euros! Public subsidy, private profit.  Lovely.

In Identity circles the Government sector presents interesting challenges.  Some of the issues that come up in the government sector are federation, scalability, compliance and usability.

  • Federation is key for government because they have requirements to federate user identity and data across different government organizations--the department dealing with your health care data is not the one managing your driving license.  Here is a nice description of one EU federation use case.
  • The scalability issue arises due to the size of the user population (think of China!).   For example, most of the provisioning solutions on the market have grown out of enterprise scale deployments--moving up to scalability levels for government deployments requires innovative techniques and architectures.
  • Compliance is key because governments, in principle, are subject to more stringent laws on managing information about people.
  • Usability is key because we can make no assumptions about the nature of the user's knowledge.  In an enterprise we can ensure everyone has taken the 'Managing your Profile' training. We cannot do this so easily with 60 million users.  Internationalization, for example, is something that must be addressed up front, not as an afterthought.

The EU appears to be trying to get its multiple heads around 'identity' as part of the eGovernment program.  There you can find a link to their 'eIDM' roadmap.

Well I look forward to some fun POCs with the Sun Identity Management products in response to EU eGovernment RFPs!

Blog name could be better

In which Rob reflects on the quality of his blog name.


