Tuesday Feb 28, 2012

Importing hierarchical entitlement data in OIA - part 3

As I explained in my previous blog entry, I will now discuss importing account and entitlement information from Oracle Health Insurance (OHI) Back Office (BO) into Oracle Identity Analytics (OIA). The OHI BO application is used by healthcare insurers/payers and supports the administrative processing of member data and claims, as well as the product data (including the brands and available distribution channels) and healthcare procurement data required for this type of processing.

More importantly, I will use the <attributeValue> / <attributeValueRef> element pair in the XML data to be imported into OIA, so that entitlement data can be defined once and referred to many times from the account data. In this particular case that is very useful, since OHI Roles are referenced many times from several accounts.

OHI BO's entitlement data consists of OHI application Roles that contain zero or more so-called 'Moduleautorisaties' (Module Authorizations). This means we deal with hierarchical entitlement data. Similar to what I described in my previous blog entry, I have used the Talend ELT tool to extract the entitlement data (via SQL) from the OHI database and write it to XML files to be imported into OIA.

For more information about this ELT job design, please contact me. Anyhow, in the last step I write the output to XML, as can be seen from the first lines below (expanded elements are marked '-', collapsed elements '+'):

 <?xml version="1.0" encoding="UTF-8"?>
-<rbacx>
    <namespace namespaceShortName="OHI" namespaceName="Oracle Health Insurance"/>
   -<attributeValues>
      +<attributeValue id="Role_BHCC_0001">
      +<attributeValue id="Role_BHPA_0001">
      +<attributeValue id="Role_BHTP_0001">
      +<attributeValue id="Role_DOORSTART">
      +<attributeValue id="Role_DVWZVNTST_ROL">
      +<attributeValue id="Role_GBTP_OVIZA">
      +<attributeValue id="Role_GBTP_TUSSENP1">
      +<attributeValue id="Role_GEBRUIKER_ROL">
      +<attributeValue id="Role_INMAUT001">
      +<attributeValue id="Role_INMAUT002">
      +<attributeValue id="Role_MANAGER_ROL">
      +<attributeValue id="Role_OPENZORG_ROL">
      +<attributeValue id="Role_OPL MERK OPL">
      +<attributeValue id="Role_SCRIPTS">
      +<attributeValue id="Role_XYZ">
    </attributeValues>
   -<accounts>
      +<account id="AADOULI">
      -<account id="ABRUIJN">
         -<name>
             <![CDATA[ABRUIJN]]>
          </name>
          <endPoint>OHI</endPoint>
          <domain>Production</domain>
         -<attributes>
            -<attribute name="Role">
               -<attributeValues>
                   <attributeValueRef id="Role_DOORSTART"/>
                   <attributeValueRef id="Role_INMAUT001"/>
                   <attributeValueRef id="Role_MANAGER_ROL"/>
                </attributeValues>
             </attribute>
          </attributes>
       </account>
      +<account id="ALAKERVE">
      +<account id="ASIEGERS">
      +<account id="BGLAZEMA">
      +<account id="BLAAK">
      +<account id="BLAGEMAA">
...

As can be seen from these lines, all the OHI application Roles are defined once in the first <attributeValues> element section. These Roles are then referenced from the <accounts> element section below it, fully compliant with the accounts.xsd schema file as shipped with OIA.
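This define-once / refer-many structure is straightforward to generate programmatically. As a minimal sketch (hypothetical sample data, Python's standard xml.etree module; the real export wraps values in CDATA and carries more detail per role than shown here):

```python
import xml.etree.ElementTree as ET

# Hypothetical sample data: each role is defined once and can be
# referenced by any number of accounts.
roles = {"Role_DOORSTART": "DOORSTART",
         "Role_INMAUT001": "INMAUT001",
         "Role_MANAGER_ROL": "MANAGER_ROL"}
accounts = {"ABRUIJN": ["Role_DOORSTART", "Role_INMAUT001", "Role_MANAGER_ROL"]}

rbacx = ET.Element("rbacx")
ET.SubElement(rbacx, "namespace", namespaceShortName="OHI",
              namespaceName="Oracle Health Insurance")

# Define every role exactly once under <attributeValues>.
defs = ET.SubElement(rbacx, "attributeValues")
for role_id, role_name in roles.items():
    av = ET.SubElement(defs, "attributeValue", id=role_id)
    ET.SubElement(av, "value").text = role_name

# Each account only *refers* to the roles via <attributeValueRef>.
accs = ET.SubElement(rbacx, "accounts")
for account_id, role_ids in accounts.items():
    acc = ET.SubElement(accs, "account", id=account_id)
    ET.SubElement(acc, "name").text = account_id
    ET.SubElement(acc, "endPoint").text = "OHI"
    ET.SubElement(acc, "domain").text = "Production"
    attrs = ET.SubElement(acc, "attributes")
    attr = ET.SubElement(attrs, "attribute", name="Role")
    vals = ET.SubElement(attr, "attributeValues")
    for role_id in role_ids:
        ET.SubElement(vals, "attributeValueRef", id=role_id)

xml_out = ET.tostring(rbacx, encoding="unicode")
print(xml_out)
```

However many accounts carry a role, the role's definition appears only once in the output, which keeps large exports compact.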

Finally, all the OHI account and entitlement data is imported into OIA (along with the accompanying glossary data) and correlated with the global users. An example of how the data can be examined within OIA can be seen in the picture below:

Have fun, René!

Thursday Feb 16, 2012

Importing hierarchical entitlement data in OIA - part 2

In my last blog entry I talked about importing hierarchical entitlement data into Oracle Identity Analytics (OIA). Today I want to discuss another example, this time involving Microsoft Windows shared file and folder permissions, and show how easily these data can be transformed and imported into OIA for attestation and/or auditing purposes.

All of the data is represented in two input files: one containing an AD user export and one containing the file and folder permissions.

File #1 containing AD users adusers.csv is as follows:
(metadata: DN|CN|memberOf|sAMAccountName|displayName|sn|givenName):

CN=Rene Klomp,CN=Users,DC=domain,DC=com|Rene Klomp|CN=datagroup,CN=Users, DC=domain,DC=com;CN=homegroup,CN=Users,DC=domain,DC=com|renek|Rene Klomp|Klomp|Rene
CN=John Doe,CN=Users,DC=domain,DC=com|John Doe|CN=datagroup,CN=Users, DC=domain,DC=com;CN=homegroup,CN=Users,DC=domain,DC=com|johnd|John Doe|Doe|John
...

File #2 containing files and folders permissions shares.txt is as follows:
(metadata: share;group;permission):

home;homegroup;FULL CONTROL
SYSVOL;Everyone;READ
SYSVOL;Administrators;FULL CONTROL
SYSVOL;Authenticated Users;FULL CONTROL
data;datagroup;FULL CONTROL
NETLOGON;Everyone;READ
NETLOGON;Administrators;FULL CONTROL
...

This time I have used the tool 'Talend Open Studio for Data Integration' to join these two input datasets and transform the data into the right XML format for importing into OIA. In Talend you design a Job, which is made up of several components and the data flow connecting them. The Job I designed for these particular datasets is rather straightforward and easy to understand, as can be seen in the screenshot below (by right-clicking on the image you should be able to examine it at its original size).

Within Talend Open Studio I start with two tFileInputDelimited components, each reading one of the two files. The first file, adusers.csv, has a memberOf attribute, which is multivalued: it can contain a list of groups separated by ';'. Therefore the next step after reading this file is normalizing the data in the memberOf column using the tNormalize component. The next thing we need to do is join both datasets; for this I have used the tMap component.
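The same normalize-and-join logic is easy to prototype outside Talend. The sketch below (Python, hypothetical in-memory records mirroring the two sample files) splits the multivalued memberOf column into one row per group and joins it against the share permissions on group name:

```python
# Hypothetical in-memory versions of the two input files.
adusers = [
    {"sAMAccountName": "renek", "displayName": "Rene Klomp",
     "memberOf": "CN=datagroup,CN=Users,DC=domain,DC=com;"
                 "CN=homegroup,CN=Users,DC=domain,DC=com"},
]
shares = [
    {"share": "home", "group": "homegroup", "permission": "FULL CONTROL"},
    {"share": "data", "group": "datagroup", "permission": "FULL CONTROL"},
    {"share": "SYSVOL", "group": "Everyone", "permission": "READ"},
]

# Step 1 - normalize: one row per (user, group), like tNormalize.
normalized = []
for user in adusers:
    for dn in user["memberOf"].split(";"):
        group = dn.split(",")[0].removeprefix("CN=")
        normalized.append({"sAMAccountName": user["sAMAccountName"],
                           "displayName": user["displayName"],
                           "group": group})

# Step 2 - join on group name, like the tMap expression:
# the group in shares.txt must appear in the user's memberOf.
joined = [dict(n, share=s["share"], permission=s["permission"])
          for n in normalized for s in shares if s["group"] == n["group"]]

for row in joined:
    print(row["sAMAccountName"], row["share"], row["permission"])
```

Shares whose group matches no user membership (SYSVOL/Everyone here) simply drop out of the inner join, exactly as in the tMap flow.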

As you can see, it is pretty straightforward to connect the input streams/attributes to the output streams/attributes and perform the join based on a simple expression (as a trivial example: the group in input file shares.txt needs to appear in the memberOf value in file adusers.csv).

Now that both sets are joined, we can transform the data and write it to XML. For that I have used the tAdvancedFileOutputXML component, which writes the output to an intermediate XML file (in this case: out.xml). Again, it is pretty straightforward to define the structure and the looping and grouping of elements, as you can see in the picture below. The schema is still rather arbitrary, but I will use XSLT to transform it into the right schema for OIA in the next step.

For that last step in the ELT transformation process I use the tXSLT component and an appropriate XSL Transformation file (in this case: AD_01_accounts.xsl). It picks up the file written in the previous step, transforms it according to the rules defined in the XSL file, and writes the output to our final AD_01_accounts.xml file.

If this whole process ends successfully, there is one more step that I have added: using the tXSDValidator component in Talend to check the result against a predefined schema. Here I obviously use the accounts.xsd schema file as shipped with OIA (in this case version 11.1.1.5.0). As you can see in the first picture, this validation step also completes successfully, outputting '[job AD] File is Valid'.

Now we are ready to import this XML file into OIA - of course we have to configure a namespace for this particular resource first. This whole exercise took me less than 20 minutes to set up and finish!

All the files mentioned above can also be downloaded in this single package: data.zip. If you open the files individually in a browser by clicking on one of the links above, be sure to look at the source, or save the file and open it in an XML or other editor... otherwise the browser might just show you a blank page or a page with little information.

Have fun, René!

PS. I have formatted the final AD_01_accounts.xml document using XmlPad so it is easier to read than the default output which is not using any formatting at all - this is obviously just a visual thing for this blog and not needed for importing.

Tuesday Jul 12, 2011

Importing hierarchical entitlement data in OIA - a practical example

Oracle Identity Analytics (OIA) provides organizations with the ability to engineer and manage roles and automate critical identity-based controls and processes. In order to do so, we need to feed OIA with the relevant data: typical HR data (e.g. business structures, users, etc.) and account information from all the managed systems in scope, e.g. application user profiles containing common attributes like userid, firstname, lastname, and employeeid, but, more interestingly, also the authorization data or entitlements (application roles, groups, permissions, privileges, responsibilities, or whatever they are called within the system).

All of this data is stored within OIA in the so-called Identity Warehouse. This Identity Warehouse is a central repository that contains all of the important entitlement data for the organization (and of course also the other data as mentioned above). This data is imported from the organization's databases on a regular, scheduled basis. The Oracle Identity Analytics software has an import engine that supports complex entitlement feeds. The engine accepts either CSV flat files or XML files, and includes Extract, Transform, and Load (ETL) processing capabilities. Flat files require a schema for the import process. XML files are recommended for accounts with multi-value attributes or n-level hierarchies. Another way of feeding the Identity Warehouse is through an integration with a provisioning solution like Oracle Identity Manager (OIM). Such an integration will give OIA direct access to all of the entitlement data in the systems managed by OIM.

In this blog entry I want to focus on feeding the Identity Warehouse with multi-value attributes or n-level hierarchies through the import engine via XML files. As an example I will use some sample data coming from a Peoplesoft Financials system. This system is relatively simple in the sense that its authorizations are modeled two levels deep: Users are assigned to Roles, and Roles are assigned to Permission Lists. Again, this is just a simple two-level hierarchical example, but the approach applies equally to any system with n-level (n > 2) hierarchical data.

The data that I want to import lives in 3 tables: a User table, a User - Role relationship table, and a Role - Permission relationship table. The Entity Relationship Diagram is as shown in the adjacent diagram.

Examples of table data for two user entries can be downloaded via the following files: user.csv, userrole.csv, and roleperm.csv.
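Before looking at the ETL graph, the join itself is simple to reason about. The sketch below (Python, hypothetical rows using the field names from the sample data) nests the three flat tables into the user -> role -> permission shape we ultimately want:

```python
# Hypothetical rows mirroring the three input tables.
users = [{"User_ID": "P123456", "Description": "Janssen, J."}]
userrole = [{"User_ID": "P123456", "Role_Name": "K_GL_INQUIRER"}]
roleperm = [{"Role_Name": "K_GL_INQUIRER", "Permission_List": "K_GL_INQUIRY"},
            {"Role_Name": "K_GL_INQUIRER", "Permission_List": "K_REPORT_DEC"}]

# Join the flat tables into the nested user -> role -> permission shape.
nested = []
for u in users:
    roles = []
    for ur in userrole:
        if ur["User_ID"] == u["User_ID"]:
            perms = [rp["Permission_List"] for rp in roleperm
                     if rp["Role_Name"] == ur["Role_Name"]]
            roles.append({"Role_Name": ur["Role_Name"], "Permissions": perms})
    nested.append(dict(u, Roles=roles))

print(nested)
```

This nesting is exactly what the ETL graph below produces, with the XML writer serializing the nested structure instead of a Python list.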

The data in these tables needs to be joined and transformed into the right XML format. For that I have used an ETL tool called CloverETL, a Java-based ETL framework that can be used to transform structured data and, to some degree, free-form data. Of course, any other data integration platform, e.g. Oracle Data Integrator, can be used for this purpose. Basically, we read the data, transform it, and finally write it to our final XML file. The graph in the screenshot below shows three so-called UniversalDataReaders (in green), each reading one of the input tables. Their output is joined and written to XML through an XMLWriter component (in blue).

However, the XML data coming out of this XMLWriter is not in the format that we need to import it into Oracle Identity Analytics. This XML output has specific tags based on the field names from the input tables, like:

<?xml version="1.0" encoding="UTF-8"?>
<rbacx component="XML_WRITER0" graph="PSFT" created="Mon May 09 01:34:05 CEST 2011">
  <User>
    <User_ID>P123456</User_ID>
    <Description>Janssen, J.</Description>
    <Last_Change>4/7/2008</Last_Change>
    <Locked_Out>0</Locked_Out>
    <Failed_Logins>0</Failed_Logins>
    <Enabled>0</Enabled>
    <Last_Signon_Dat>2008-04-07-15.28.50.000000</Last_Signon_Dat>
    <Role>
      <Role_Name>K_GL_INQUIRER</Role_Name>
      <Perm>
        <Permission_List>K_GL_INQUIRY</Permission_List>
      </Perm>
      <Perm>
        <Permission_List>K_REPORT_DEC</Permission_List>
      </Perm>
    </Role>
  ... 

Therefore we use an XSLTransformer component to transform this XML into the more generic format that we need, and finally we write this XML data to a file ready for import through a UniversalDataWriter component. This XML format mostly uses the more specific tags <attributes>, <attribute>, <attributeValues>, <attributeValue>, and <value>.

<?xml version="1.0" encoding="UTF-8"?>
<rbacx>
  <namespace namespaceName="Peoplesoft" namespaceShortName="PSFT" />
  <accounts>
    <account id="P123456">
      <name><![CDATA[P123456]]></name>
      <endPoint>PSFT</endPoint>
      <domain>Production</domain>
      <comments />
      <description>Janssen, J.</description>
      <attributes>
        <attribute name="Last_Change">
          <attributeValues>
            <attributeValue>
              <value>4/7/2008</value>
            </attributeValue>
          </attributeValues>
        </attribute>
	...several lines skipped...
        <attribute name="Last_Signon_Dat">
          <attributeValues>
            <attributeValue>
              <value>2008-04-07-15.28.50.000000</value>
            </attributeValue>
          </attributeValues>
        </attribute>
        <attribute name="ROLE">
          <attributeValues>
            <attributeValue>
              <value>K_GL_INQUIRER</value>
              <attributes>
                <attribute name="PERMISSION_LIST">
                  <attributeValues>
                    <attributeValue>
                      <value>K_GL_INQUIRY</value>
                    </attributeValue>
                    <attributeValue>
                      <value>K_REPORT_DEC</value>
                    </attributeValue>
                  </attributeValues>
                </attribute>
              </attributes>
            </attributeValue>
            ...

The XSL transformation file that I used to transform the XML can be downloaded via this link: PSFT_01_accounts.xsl.
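The actual stylesheet is in the download; purely as an illustration of the restructuring it performs, the sketch below (Python's standard xml.etree module, simplified to the fields shown above) turns the writer's field-oriented XML into the nested attribute/attributeValue form:

```python
import xml.etree.ElementTree as ET

# Simplified version of the XMLWriter output (field-oriented tags).
src = ET.fromstring("""
<rbacx>
  <User>
    <User_ID>P123456</User_ID>
    <Role>
      <Role_Name>K_GL_INQUIRER</Role_Name>
      <Perm><Permission_List>K_GL_INQUIRY</Permission_List></Perm>
      <Perm><Permission_List>K_REPORT_DEC</Permission_List></Perm>
    </Role>
  </User>
</rbacx>""")

# Build the generic <attribute>/<attributeValue> structure OIA expects.
out = ET.Element("rbacx")
ET.SubElement(out, "namespace", namespaceName="Peoplesoft",
              namespaceShortName="PSFT")
accounts = ET.SubElement(out, "accounts")
for user in src.findall("User"):
    acc = ET.SubElement(accounts, "account", id=user.findtext("User_ID"))
    attrs = ET.SubElement(acc, "attributes")
    role_attr = ET.SubElement(attrs, "attribute", name="ROLE")
    role_vals = ET.SubElement(role_attr, "attributeValues")
    for role in user.findall("Role"):
        av = ET.SubElement(role_vals, "attributeValue")
        ET.SubElement(av, "value").text = role.findtext("Role_Name")
        sub_attrs = ET.SubElement(av, "attributes")
        perm_attr = ET.SubElement(sub_attrs, "attribute",
                                  name="PERMISSION_LIST")
        perm_vals = ET.SubElement(perm_attr, "attributeValues")
        for perm in role.findall("Perm"):
            pv = ET.SubElement(perm_vals, "attributeValue")
            ET.SubElement(pv, "value").text = perm.findtext("Permission_List")

result = ET.tostring(out, encoding="unicode")
print(result)
```

The XSLT version does the same walk declaratively: a template per source element, emitting the generic tags around the copied values.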

After defining a Peoplesoft resource type / namespace within Oracle Identity Analytics according to the screenshot shown here, we are now able to import the resulting XML file. After the hierarchical XML data has been successfully imported, the entitlement data can be used in attestation processes, for role mining, for defining audit policies, etc. As an example of how the account data is represented within OIA, we can take a look at the actual data for a user, e.g. P123456, within the Identity Warehouse. This results in the following two screenshots. Looking at the general details for this user's Peoplesoft account, we see in the left screenshot all the attributes as defined in this resource type's namespace. Clicking on the attribute named 'Role Name' shows the list of roles for this account and also that this attribute has 'sub attributes'. Double-clicking one of the roles drills down into its sub attributes, and we indeed find that the Role Name attribute has a sub attribute named 'Permission List'.

 Have fun, René!

Sunday Mar 13, 2011

Entitlements outside Roles Report in Oracle Identity Analytics

As a follow-up to my previous blog entry on reporting in Oracle Identity Analytics (OIA), I have looked at another, probably very common, OIA report. This report lists all the entitlements (imported into the Identity Warehouse) that are outside of (not contained in) any role. Such a report can be very useful during the role mining process to see which entitlements are not yet contained in any role.

I have set up the SQL query so that it looks at all attributes (entitlements, e.g. 'groups' in AD) that are marked as 'minable' in the resource type configuration. Basically, the SQL query finds all values of these attributes (SELECT ... FROM ... WHERE ...) and checks whether they are contained in any role through the relation role->policy->attribute (... AND NOT EXISTS (SELECT ... FROM ... WHERE ...)). The final SQL looks as follows:

 SELECT
     ns1.namespacename   AS NAMESPACENAME,
     att1.name           AS ATTRIBUTENAME,
     av1.attribute_value AS ATTRIBUTEVALUE
 FROM
     attributes att1,
     attributecategories ac1,
     attribute_values av1,
     namespaces ns1
 WHERE
     ns1.namespacekey = ac1.namespacekey
 AND ac1.attributecategorykey = att1.attributecategorykey
 AND av1.attribute_id = att1.attributekey
 AND att1.isminable = '1'
 AND NOT EXISTS
     (
         SELECT
             1
         FROM
             roles rs,
             role_policies rp,
             role_versions rv,
             policies ps,
             policy_versions pv,
             policy_attributes pa,
             policy_attr_hier_nodes pahn,
             policy_attr_hier_nodes pahn2,
             attribute_values av
         WHERE
             rs.statuskey = 1
         AND rv.version_status_id = 1
         AND rs.rolekey = rp.rolekey
         AND rp.role_version_id = rv.id
         AND rp.policykey = ps.policykey
         AND ps.policykey = pa.policy_id
         AND ps.current_version_id = pa.policy_version_id
         AND pa.policy_attr_hier_id = pahn.id
         AND pahn.id = pahn2.root_id
         AND pahn2.attribute_value_id = av.id
         AND av.id = av1.id
     )
 ORDER BY
     NAMESPACENAME,
     ATTRIBUTENAME,
     ATTRIBUTEVALUE

I have taken this SQL as the basis for my iReport design for the report that I have called 'User Entitlements outside Roles'. It will display an ordered list (grouped by namespace) of all attributes that are set as minable and their values (entitlements) not contained in any role. The resulting jrxml file can be found here: UserEntitlementsOutsideRoles.jrxml.
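For orientation, this is roughly how the query and its fields sit inside a jrxml template (a sketch only, with the query elided and the layout bands omitted; element names follow the JasperReports jrxml schema):

```xml
<jasperReport xmlns="http://jasperreports.sourceforge.net/jasperreports"
              name="UserEntitlementsOutsideRoles">
  <queryString><![CDATA[
    SELECT ns1.namespacename AS NAMESPACENAME, ...
  ]]></queryString>
  <field name="NAMESPACENAME" class="java.lang.String"/>
  <field name="ATTRIBUTENAME" class="java.lang.String"/>
  <field name="ATTRIBUTEVALUE" class="java.lang.String"/>
</jasperReport>
```

Each <field> must match a column alias in the SELECT list; the band definitions then reference these fields for grouping and display.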


An example of what the final report looks like is shown below. In this example I have run it against a sample dataset where, for two of the resources (Microsoft SQL Server, Windows Active Directory), some attributes have been set as minable (e.g. 'serverRoles', 'groups', etc.). As said before, they have been set as minable here for the sake of reporting, but these are typically the same attributes taking part in a role mining process, and hence more or less automatically the ones we are interested in...

Have fun, René...

Wednesday Jan 12, 2011

Soll Ist (Roles vs Actuals) Report in Oracle Identity Analytics

Oracle Identity Analytics (OIA; formerly Sun Role Manager) provides enterprises with the ability to engineer and manage roles and automate critical identity-based controls. Once roles are defined, certified, and assigned, the solution continues to deliver value throughout the user access lifecycle by providing a complete view of access-related data, automating certification, providing evidence of compliance, and enabling streamlined access changes.

During a Proof-of-Concept I have been looking into how to extend OIA (11gR1 BP01) with a so-called 'Soll Ist' or 'Roles vs Actuals' report. Such a report should detail the discrepancies between the entitlements that a user should have, based on his or her roles within the organization, and the actual entitlements that this user has on the various IT systems and applications. This serves both as a practical example of how to add custom reports to OIA in general and as a starting point for creating your own 'Soll Ist'-like reports.

Getting and Staying in Control

Oracle Identity Analytics (OIA) revolves around the core ‘Identity Warehouse’, within which we build a consolidated view of employee identities across the organization. In order to do this we first build an organizational or business unit structure against which we can begin to map a reporting hierarchy for the attestation process. This can be done in a number of ways but is typically achieved via the organization's HR system or via an existing IAM solution. Once the outline structure of the organization has been created, we then define in OIA the ‘namespaces’, i.e. the attributes within directories and applications that we are interested in reviewing from a compliance and role mining perspective. Once created, we can then import employee identity data from these namespaces showing who currently has access to what ('Actuals').

Having constructed the Identity Warehouse, OIA provides a single point of reference for employee identities across the business, showing a comprehensive breakdown by business unit and individual of all entitlements that exist. Using the OIA web client we can then begin the attestation process to cleanse this information to ensure accuracy and compliance with audit standards. Each ‘Business Unit Manager’ or responsible individual will receive an email from OIA asking them to log into the web client to perform the attestation task. This process may run through multiple iterations until the organization deems the identity data to be clean and representative of a correct operational state of the business.

At this stage, OIA is capable of producing reports for management and audit purposes as well as building rules and policies around access rights and entitlement privileges such as Segregation of Duties and associated preventative / detective controls. Dashboard views of compliance activities are also available for high level management and tracking of the process.

Having reached a position at which identity data is clean and operational rules defined, the business can then look to begin the process of role mining and definition. Using a hybrid approach to role mining, OIA provides the organization with the capability to analyze the physical elements of employee roles along with their IT entitlements or permissions to give the most comprehensive view of an ‘Enterprise Role’ possible. OIA generates suggested Roles via the use of mathematical algorithms that can then be presented back to the business and adopted as appropriate.

With our roles defined, a Roles vs Actuals report could be another helpful tool in staying in control, in addition to the functionality already present for automating access governance processes such as periodic attestation or certification.

Oracle Identity Analytics Reports

Many out-of-the-box reports can be generated in Oracle Identity Analytics. Reports are valuable tools that auditors and end-user managers can use to evaluate, analyze, and review access controls in the organization.

Reports are broadly classified as follows:

  • Business structure reports: Out-of-the-box reports that run on selected business structures.
  • System reports: Out-of-the-box reports that are run on all users, roles, or policies in Oracle Identity Analytics.
  • Identity Audit reports: Open-audit exception reports based on audit policy scans.
  • Custom reports: Reports customized according to the requirements of your organization.

In order to create our aforementioned Soll Ist report, we will need to define a custom report.

Custom Reports

The following steps are involved in creating and running custom reports:

  1. Creating a reports template using JasperReports / iReport. JasperReports is an open source Java reporting tool that can write to screen, to a printer, or to various file formats, including PDF, HTML, Microsoft Excel, RTF, ODT, comma-separated value (CSV), and XML. It reads its instructions from an XML or .jasper file.
  2. Using the Oracle Identity Analytics user interface, upload the reports template to Oracle Identity Analytics.
  3. Running or scheduling the report as needed.

First we need to define the SQL query that gives us exactly what we want: the excessive actual entitlements in the user's various accounts that are not reflected in the roles assigned to this person. Basically, what we want is a combination of two existing reports: UserEntitlements.jrxml, giving the actuals or 'Ist', and UserRoleBasedAccess.jrxml, giving the role-based access or 'Soll'. For these exceptions we want to display the following fields: namespacename, firstname, lastname, username, rolename, accountname, endpointname, attributevalue, and attributename. The query that gives us the excessive 'Ist' entitlement information should look something like this:

 SELECT <fields>
 FROM <IST tables>
 WHERE <IST condition>
 AND NOT EXISTS
     (
         SELECT 1 FROM <SOLL tables> WHERE <SOLL condition>
     )
 ORDER BY <order fields>

It would of course be possible to make variations on this report. In this example we want to find all the exceptions concerning the 'groups' attribute in Active Directory. Now that we have our SQL query, we need to build a template that defines the look and feel, what will be printed where, etc.

The heart of the JasperReports interface is iReport, the visual report designer built specifically for JasperReports. iReport gives administrators and report designers total control over the contents as well as the look and feel of every report. As a starting point for this example I have taken the existing UserEntitlements.jrxml template and modified it for the Soll Ist report. This means editing the title and the SQL query as a bare minimum.

iReport

After we save our report, we can upload it to our OIA server via Reports -> Custom Reports -> New Custom Report.

To test the new Soll Ist report I have imported an example business structure, an example set of global users, and, for one user, an account on AD (via AD_01_accounts.csv). This user (Bernardine Mooney - bm20245) works in the Marketing organization, and her AD account has a multivalued group membership showing 7 groups. This is our 'Ist':

At the same time I have defined a Role called Marketing and a Policy called Marketing_AD that is contained in the Marketing role. The Marketing_AD policy defines only 5 group memberships as the 'Soll' state:

Marketing_AD policy

When we now run the Soll Ist report, we expect the 2 excessive groups present in this user's account to show up in the report, since they are not reflected in the roles assigned to this user. And this is exactly the output that we get, as you can see in the picture below.


If you are interested in the files that I have used (SollIst.jrxml, SQL query, etc.) please give me a ping.

Have fun, Rene!

Friday Jul 09, 2010

Oracle IRM and Hot Folders

Last week I attended a training on Oracle Information Rights Management. Oracle Information Rights Management (IRM) is a Fusion Middleware security service that uses encryption to secure and track all copies of an organization's most sensitive documents and emails, regardless of how many copies are made, or where those copies are stored and used – even when those copies are sent outside the firewall. In Oracle IRM terminology the process of encrypting documents and emails is referred to as “sealing”. Oracle IRM servers expose a comprehensive set of IRM web services to enable the easy integration of “sealing” within the workflows of content management repositories, collaborative web applications, content filtering/monitoring systems, etc.

One of the features (not part of the product but available as a separate download from samplecode.oracle.com) gives you additional functionality to automatically seal files that are copied or moved into a folder. The Oracle IRM "Hot Folders" application monitors a set of file system folders and uses the IRM web services to automatically seal files copied or moved into them to their associated IRM classifications. This enables organizations to apply IRM consistently and effectively, without requiring end users to explicitly seal files, by leveraging the familiar metaphor of placing confidential files “in a safe place”.

When communicating with Oracle IRM 11g, all web services traffic is secured by means of SSL encryption. Since in our setup all certificates are self-signed, the CA certificate (in the TrustMyOwnSelf.jks keystore) needs to be trusted by the Hot Folders Java application. For this we start the Hot Folders Java application with the Java options collected in the TRUST_HF variable:

 set IRM_HF=C:\Users\Rene\Desktop\HotFolders
 set TRUST_HF=-Djavax.net.ssl.trustStore=%IRM_HF%\TrustMyOwnSelf.jks -Djavax.net.ssl.trustStorePassword=welcome1 -Djavax.net.ssl.trustStoreType=JKS
 java -Xms128m -Xmx512m %TRUST_HF% -Djava.util.logging.config.file=%IRM_HF%\hotfolders-logging.properties -jar %IRM_HF%\hotfolders.jar %IRM_HF%\hot.properties

However, it appears that the certificate that I generated during the training (for one-way SSL encryption from the IRM server to e.g. a browser session) is not suitable for use with the Hot Folders application. This clearly generated some exceptions when trying to establish a connection from the Hot Folders application to the IRM server, the most relevant being:

 <Jul 8, 2010 11:48:43 PM PDT> <Warning> <Security> <BEA-090567> <The certificate chain
 received from irm.oracle.demo - 192.168.111.12 contained a V3 certificate which
 keyusage constraints forbid its key use by the key agreement algorithm.> 

Looking again very carefully at the Hot Folders documentation, it appears that the certificate should have very specific properties:

 a. Subject CN: should match hostname of web service URL
 b. Basic constraints: Subject Type=CA
 c. Key Usage: Non-critical, Key Encipherment and Certificate Signing

Especially 'c. Key Usage' was not what I had enabled when generating my self-signed certificates; mine was actually marked 'Critical, Certificate Signing' only.

So I went back to the server machine and generated a new key pair and certificate, now with some extra options (-keyusagecritical and -keyusage) added to the utils.CertGen command:

 java utils.CertGen
         -selfsigned
         -certfile MyOwnSelfCA.cer
         -keyfile MyOwnSelfKey.key
         -keyfilepass welcome1
         -cn "irm.oracle.demo"
         -keyusagecritical false
         -keyusage digitalSignature,keyEncipherment

After restarting the IRM server and trusting the new certificate in the Hot Folders application, everything works like a charm, as you can see in the following recording...! Here you can see the Hot Folders application doing its work, scanning (in this case every 30 seconds, but configurable) for new documents to be sealed in the defined "hot folders". In this case I have configured only one such folder: C:\Users\Rene\Desktop\HF-Public. All files dropped into that (hot) folder are destined to be sealed to the Public context as defined in the Oracle IRM administration GUI.

First we take a look at the IRM admin GUI, where we can see the application user _HotFolderApp having the "Sealer" Role in the Public context. In short, this means that this user has all the rights needed to seal documents to the Public context. Next, we see how we start the Hot Folders application, and finally we drop a new document into the HF-Public folder and see how the document gets sealed automagically!

Friday Dec 04, 2009

Portrait mode Sun Ray - 2

As a follow-up to my earlier blog entry (Portrait mode Sun Ray), here are the scripts that I used with VDI 3.1. With these scripts it is very straightforward to have a couple of Sun Ray displays rotated in VDI 3.1. Just register the pseudo.token within the Sun Ray admin interface, choose any Owner name, but enter the value 'rotate' in the field 'Other Information'...

Download the two scripts (vda and vda-info.sh) and put them into /etc/opt/SUNWkio/sessions/vda (be sure to back up the original vda script, e.g. 'cp vda vda.orig'). Next, reset the session for this Sun Ray DTU, after which it should come back with a rotated screen.

Have fun, Rene.

Monday Nov 16, 2009

Managed Smartcard Services for VDI

One of our partners is offering Managed Smartcard Services. This means that an enterprise is able to outsource all of its smartcard and PKI services to a 3rd party. Part of their card lifecycle management service is the whole enrollment, personalization and issuance of the smartcards. This means that a new employee will get an invitation to visit one of a chain of certified professional photographers in his or her hometown neighborhood to take the picture that goes on the card. The smartcard is then printed with the company's and personal information, including the employee's photo, and is provisioned with the user's certificates. The certificate will contain the user's UPN, which also needs to get into the company's Active Directory so that employees can use smartcard login to their desktop. Many scenarios can be thought of to make this happen, ranging from fully automated provisioning of AD by tools like Sun Identity Manager to manually adding new users from a batch CSV file. After this is all done, the card is sent to the employee's home address together with a PIN letter, ready to activate and use.
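The batch-CSV route mentioned above could look something like this (a simplified sketch; the CSV column names and the UPN suffix are my own assumptions, not the partner's or customer's actual format):

```python
import csv
import io

def csv_to_ad_entries(csv_text, upn_suffix="example.com"):
    """Turn a batch CSV of new employees into simple AD-style records.

    Each record carries the userPrincipalName that must match the UPN
    inside the certificate on the user's smartcard, so that smartcard
    logon against AD can succeed."""
    entries = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        upn = "%s@%s" % (row["login"], upn_suffix)
        entries.append({
            "cn": "%s %s" % (row["first"], row["last"]),
            "userPrincipalName": upn,
        })
    return entries
```

An admin could feed the resulting records into an LDIF file or a PowerShell import; the key point is simply that the UPN on the card and the UPN in AD must line up.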

Recently I have been investigating how our partner's Managed Smartcard Service could be used to give employees easy access to their Windows desktops via strong authentication. The basis for this setup is a customer using Citrix XenApp (formerly known as Presentation Server) to access its desktops from Sun Ray thin clients.

The entire demo environment has three servers, two clients and a live Internet connection. The clients are a fat laptop client and a Sun Ray thin client. The servers are a Sun Ray server (with PCSC lite software), a Citrix XenApp server with the AET middleware (used by our partner) installed on it, and an AD Domain Controller that is set up to trust certificates issued by our Managed Smartcard Service provider's CA. This environment is connected to the Internet so the validity of certificates can be checked via OCSP at the Managed Smartcard Service OCSP responders. As can be seen in the short demonstration video (02:27) below, both fat and thin clients can be used interchangeably without any problem.

Since the issued smartcards are sent directly from the 3rd party (our partner) to the company's employees, we needed to find a way to register the cards in the company's Sun Ray (or VDI) environment. The Sun Ray token ID is composed from several characteristics of the card, namely its fabricator, type, batch and serial number. Within the Sun Ray server software, this token ID is composed automatically based on the corresponding smartcard config file. Composing this token ID beforehand through a script is theoretically possible, but such a script is easily broken since e.g. type and batch numbers could change without notice. Another idea would be to register the smartcard when it is used for the first time by the customer's employee.

However, even simpler is to make the Sun Ray services layer completely 'transparent' to all the backend services (as with VDI). This is what is used in this setup: just the certificates are read and transferred to the backend Citrix XenApp servers, which have the AET middleware on them and where all the OCSP checking and the final authentication based on the UPN in the certificate take place.

All in all, we found that Managed Smartcard Services can be used very easily together with Sun VDI and Citrix, where both fat and thin clients can be used together...

Have fun, Rene.

Wednesday Nov 11, 2009

OpenSSO User Group Meets in North Europe

In cooperation with The Open Web SSO Project (OpenSSO), several OpenSSO User Group meetings will be organized in North Europe from 23 November until 10 December. All individuals and organisations who work professionally with OpenSSO in North Europe are invited to attend half a day of high-quality content and networking.

To ensure this, an agenda has been drawn up with some familiar names as presenters, who have themselves worked professionally on some of the largest OpenSSO solutions in Europe.

If you're interested in OpenSSO, its features (present & future) and like concrete demos then this is an opportunity not to be missed!

Please check out this web site with agenda, dates & locations:

http://www.supportrock.net/wiki/index.php/OpenSSO_User_Group_Meetings

Sunday Nov 08, 2009

Is it a card...? a token...? no wait, it's a Smart DisplayCard Token!

Recently I received some 'fairly new' smartcards from ActivIdentity called Smart DisplayCard Tokens, and I decided to integrate the Smart DisplayCard Token with VDI. The idea is that we use the smartcard functionality to access our VDI desktop from a Sun Ray thin client (e.g. from our workplace at the office), and access that same desktop from a fat client through Secure Global Desktop using the OTP token functionality in the exact same card (e.g. at home or while traveling).

Smart DisplayCard Token

"The ActivIdentity Smart DisplayCard Token combines the security of a token with public key infrastructure (PKI) features for online authentication in a smart card form factor. The ActivIdentity Smart DisplayCard Token is embedded with a smart chip that supports standard smart card PKI capabilities such as email encryption and digital signatures. The token supports two user authentication modes: connected smart card mode for corporate-issued machines or disconnected Smart DisplayCard mode for authentication using a kiosk or mobile device."

This integration builds on work done some time ago (see my previous blog entry "Integrating Sun Secure Global Desktop with Radius Authentication"). There I had integrated Sun Secure Global Desktop with ActivIdentity 4TRESS AAA Server in order to get Radius Authentication.

As outlined before, the card can be used as a true "smart" smartcard that holds the user's certificates for smartcard logon to his or her desktop (see my previous blog entry "UZI-card VDI integration" for an example of how this could work). However, in this integration demo I use it more as a "dumb" smartcard that is assigned to a desktop or user in VDI, with the authentication done against AD by username / password. This is sufficiently secure for many scenarios where people access their desktops within the company network, and we still have all the benefits of using a card, such as easy session mobility.

Again, while traveling or at home, we access our desktop by logging in to Secure Global Desktop and entering a one-time password (OTP) for Radius Authentication. Although the card does not have a keypad, we can still use it for multi-factor authentication (something you have, something you know): for each card we can generate a server-side PIN code, which the user enters right after the OTP in the password field (the order is configurable).
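Server-side, the combined check then amounts to splitting the password field and validating both parts, roughly like this (a simplified sketch under my own assumptions: OTP-then-PIN order, a fixed PIN length, and `check_otp` standing in for the real Radius/4TRESS OTP validation):

```python
def check_password(password, pin, check_otp, pin_length=4):
    """Validate a password field holding the OTP followed by the
    server-side PIN.

    `check_otp` is a placeholder for the actual OTP validation done by
    the AAA server; the OTP-then-PIN order and 4-digit PIN length are
    assumptions for this sketch (both are configurable in reality)."""
    if len(password) <= pin_length:
        return False
    otp, entered_pin = password[:-pin_length], password[-pin_length:]
    # Both factors must pass: something you know (PIN) and
    # something you have (the card generating the OTP).
    return entered_pin == pin and check_otp(otp)
```

Only when both the PIN and the OTP check out does the Radius authentication succeed.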

Please have a look at the short demo video (02:50) below which should give you an impression about how this could work...

All in all, I believe this ActivIdentity Smart DisplayCard Token is a great card with many possibilities, especially in combination with Sun VDI. It's a great thing to have if you are going for a multi-channel authentication strategy.

Have fun, Rene.

About

This blog covers exciting things I encounter about Oracle's software and related; that is Identity & Access Management, SOA, Security, Desktop, etc. The views expressed on this blog are my own and do not necessarily reflect the views of Oracle.
