Thursday Jul 31, 2014

Identity Management at Oracle OpenWorld 2014


Are you registered for Oracle OpenWorld 2014, to be held in San Francisco from September 28th to October 2nd? Visit the Oracle OpenWorld 2014 site today for registration and more information. We have highlighted some of the most talked-about sessions that attendees will be trying to get in to see this year. For the latest information on sessions (such as changes to dates, times, or venue locations), please continue to check back at the links below.

Business Transformation Case Studies in Identity Consolidation (CON7989) - This session will explore how customers are using Oracle Identity Management to deliver a unified identity management solution that gives users access to all their data from any device while providing an intelligent centralized view into user access rights. See how Oracle Identity management can securely accelerate your adoption of cloud services in the new digital economy.

Identity Governance Across the Extended Enterprise (CON7968) - In this session, see how Oracle's Identity Governance solution reduces risks and costs while providing fast access to new services through an intuitive user self-service experience, helping organizations thrive in today's economy.

Securing The New Perimeter: Strategies for Mobile Application Security (CON7993) - In this session, we will cover how enterprise mobility and the Internet of Things introduce new IT endpoints that require melding device and user identities for security.

Access Without Fear: Delivering an Optimal Multi-Channel User Experience (CON7995) - In this session, we will review the role of the Oracle Access Management Platform and how it delivers an optimal user experience while guaranteeing the security of all access events.

Identity as a Service - Extend Enterprise Controls and Identity to the Cloud (CON8040) - In this session, we will cover how the Oracle Cloud Identity Service extends enterprise controls to the cloud, automating SaaS account provisioning, enabling single sign-on and providing detailed activity reports for today's customers.

Check back often for a complete listing of all sessions available at Oracle OpenWorld 2014.

Identity Management executives and experts will also be on hand for discussions and follow-ups. And don’t forget to catch live demonstrations of our complete Oracle Identity Management solution set while at OpenWorld.

Follow the conversation on Oracle OpenWorld 2014 on Twitter with #OOW14 and as always, engage with us @oracleidm.

We recommend using the Schedule Builder tool to plan your visit to the conference and to pre-enroll in sessions of interest. You can find identity management sessions by searching for the term “identity management” in the Content Catalog. We hope to see you there!

Wednesday Jul 30, 2014

Exploring the OIM API Wrapper (Part 2 of 2)

This is part 2 of a 2 part series. In part 1, we discussed developing these web service wrappers and handling security for both the OIM credentials and web service endpoints. In part 2, we'll demonstrate how to invoke these web services from your BPEL Approval Workflow (and even how to store your web service user credentials in the CSF).

We wanted to pass along a suggestion: use Fault Policies around your web service calls to retry the operation in the event of network issues. We won't cover Fault Policies in this series of posts, but may discuss them in a future post. For more information about Fault Handling in BPEL specifically, check out this document from Oracle Documents Online.

Invoking the Web Service
Now that you have deployed your web service and protected it with an OWSM policy, you will need to configure your BPEL Approval workflow to invoke the web service. This is actually quite simple and JDeveloper does most of the work for you.

To start, we will assume you already have created a workflow (if not, see Oracle's How-To document for more information).

Once you have a new workflow, you must create a new partner link. To do this, open the bpel file for your workflow (such as ApprovalProcess.bpel) and drag the Partner Link activity from the Component Palette onto the Partner Links swim lane section of your workflow screen.

The Create Partner Link window will appear. Here you will specify the name of the Partner Link, as well as the WSDL URL. After typing in the WSDL URL, click the Parse WSDL button. You will see a prompt notifying you that there are no Partner Link Types defined in the current WSDL. Click Yes. This prompt may appear twice, so click Yes both times. You will see the Partner Link Type field has been populated. Finally, under Partner Role, choose the role listed and then click OK. You will see the new Partner Link appear in the Partner Links swim lane.



Now that you have a Partner Link defined, you must define an Invoke activity by dragging and dropping it from the Component Palette into the main swim lane. Double click the new Invoke activity and the properties window will appear.

Type in a name for the Invoke activity, and then choose a Partner Link using the Partner Link Chooser (select the one you just created). You will see a list of operations to choose from. In our case, we’ll select Disable User.

You will need to create the Input and Output variables by clicking the + icon, starting with the Input variable. When the Create Variable dialog box appears, click OK to accept the defaults. Repeat this process to create the Output variable.



Finally, click OK to close the Invoke properties box. You will see a line connecting the Invoke activity you just created to the Partner Link you created previously. Make sure you save the bpel file in JDeveloper.


Now that you have defined an Invoke activity for the new Partner Link, you must use the Assign activity to assign the proper input values to the Input variable you created in the previous step. Drag and drop an Assign activity from the Component Palette onto the BPEL workflow. As with any other BPEL assignment, simply choose the source value on the left side of the Copy Rules screen, and drag to a corresponding variable element on the right side, then click OK.



Repeat this process for the Output variable, if necessary. You have now successfully configured your BPEL workflow to invoke the custom web service. In the next section, we will cover how to pass credentials to the web service using the OWSM Client Policy.

Configure OWSM Client Policy
Previously, we protected the web service endpoint with an OWSM policy that requires a username and password to be provided along with the SOAP request, so we will have to configure our Partner Link to supply these credentials when the service is invoked. This is actually quite easy in JDeveloper. You could also do this in Enterprise Manager at runtime, but the change will not persist if you redeploy the BPEL Approval workflow.

In your BPEL Workflow project, open the composite.xml file. On the right under the External Service swim lane, right click on your Partner Link and click Configure WS Policies. Beside Security, click the + sign to add a Security policy.





Choose oracle/wss_username_token_client_policy and click OK. Back on the Configure SOA WS Policies screen, select the policy under Security and click the pencil icon to edit the policy settings. For the csf-key row, you can specify a CSF key name under Override Value or use the default value (basic.credentials). You must use a CSF key that has been defined in the oracle.wsm.security CSF map; this is very important, as only keys defined in oracle.wsm.security will work. In our case, we defined a custom key called owsmUserCred that contains a valid username and password. At runtime, WebLogic will retrieve this CSF credential and use it to authenticate.
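The owsmUserCred key is typically created in Enterprise Manager (under Credentials) or with the WLST createCred command, but it can also be seeded programmatically through the CSF API. The following is only a rough sketch, assuming placeholder account values and the key name from our example:

import oracle.security.jps.service.JpsServiceLocator;
import oracle.security.jps.service.credstore.CredentialFactory;
import oracle.security.jps.service.credstore.CredentialStore;
import oracle.security.jps.service.credstore.PasswordCredential;

// Sketch: seed the owsmUserCred key into the oracle.wsm.security map so the
// client policy can find it at runtime. Requires a JPS-configured environment
// and write permission on the credential store.
public class SeedOwsmCredential {
    public static void main(String[] args) throws Exception {
        CredentialStore store =
            JpsServiceLocator.getServiceLocator().lookup(CredentialStore.class);

        // Placeholder values -- use the account that the service-side OWSM policy should accept.
        PasswordCredential cred = CredentialFactory.newPasswordCredential(
            "wsServiceUser", "wsServicePassword".toCharArray());

        // Only keys defined in the oracle.wsm.security map are honored by the client policy.
        store.setCredential("oracle.wsm.security", "owsmUserCred", cred);
    }
}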



Click OK, and then click OK again to close the Configure SOA WS Policies window. Save the composite.xml file, then deploy your web service to the SOA server and associate it to an OIM Approval Policy as needed.

You now have successfully configured your BPEL Approval workflow to use the custom Web Service and to pass the credentials necessary to satisfy the OWSM policy assigned to the endpoint.

Justin Hinerman is an Identity and Access Management Engineer with IDMWORKS. As a key Oracle Partner, IDMWORKS takes a focused approach to the implementation of Service-Oriented Architecture and Identity Management-based solutions.

Thursday Jul 17, 2014

Exploring the OIM API Wrapper (Part 1 of 2) - IDMWORKS

The need for custom OIM API operations within BPEL approval workflows arises more often than one might think. While there is a capability to embed Java code within a BPEL workflow (the Java Embedding activity), this is far from ideal, as anyone who has tried it will understand. In fact, the Java Embedding activity is designed to provide easy access to some basic utility code, not hundreds of lines worth of functionality. Therefore, we recommend that clients deploy custom web service wrappers for the OIM API calls.

This is part 1 of a 2 part series. In part 1, we will discuss developing these web service wrappers and handling security for both the OIM credentials and web service endpoints. In part 2, we'll demonstrate how to invoke these web services from your BPEL Approval Workflow (and even how to store your web service user credentials in the CSF).

Development

We’re not going to dig deep into the details of developing these web services, mostly because it is outside the scope of this post, and there are several other fine resources that can walk you through creating JAX-WS web services. Refer to Oracle's documentation at the Oracle JDeveloper Tutorial page for more information.

At a high level, you can create a dynamic web project in Eclipse and then create your classes and methods however you want. Every class that exposes a web service must be annotated with @WebService, and every method you want to expose as an operation must be annotated with @WebMethod. Note that there are some limitations on input and return parameters with web services created this way, most notably around collections. For example, you cannot return a HashMap<String, String> directly from a web service, but if you wrap the HashMap in a wrapper class, it works fine.

For example:

public class Response {

    private HashMap<String, String> items;

    public HashMap<String, String> getResponse() { return items; }

    public void setResponse(HashMap<String, String> items) { this.items = items; }
}

@WebMethod
public Response webOperation(String input) { … }
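For reference, a minimal sketch of the surrounding service class might look like the following; the class name, service name, and placeholder result are purely illustrative, and a real wrapper would invoke the OIM API inside the operation.

import java.util.HashMap;
import javax.jws.WebMethod;
import javax.jws.WebService;

// Hypothetical wrapper service class; names are illustrative only.
@WebService(serviceName = "OimApiWrapperService")
public class OimApiWrapperService {

    // Exposed as a web service operation. Returns the Response wrapper because
    // a bare HashMap<String, String> cannot be marshalled directly by JAX-WS.
    @WebMethod
    public Response webOperation(String input) {
        HashMap<String, String> items = new HashMap<String, String>();
        items.put("status", "SUCCESS"); // placeholder; real logic would call the OIM API here
        Response response = new Response();
        response.setResponse(items);
        return response;
    }
}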

OIM Authentication

When invoking OIM API calls, you will need to authenticate as a user with the appropriate administrative rights within OIM, such as xelsysadm. Creating a new OIMClient instance requires the username, password, and OIM t3 URL, and the Credential Store Framework (CSF) is perfectly suited to storing these values. In our case, we store the OIM credentials using a Password key type in CSF and the OIM t3 URL using a Generic key type.
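As a rough sketch, creating the client and logging in looks something like the code below; the t3 URL and credentials shown are placeholders, and in our implementation they are read from the CSF as described next.

import java.util.Hashtable;
import javax.security.auth.login.LoginException;
import oracle.iam.platform.OIMClient;

// Sketch: build an OIMClient from a t3 URL and admin credentials.
// In our case these values come from the CSF rather than being hard-coded.
private OIMClient createOimClient(String t3Url, String user, char[] password) throws LoginException {
    Hashtable<String, String> env = new Hashtable<String, String>();
    env.put(OIMClient.JAVA_NAMING_FACTORY_INITIAL, "weblogic.jndi.WLInitialContextFactory");
    env.put(OIMClient.JAVA_NAMING_PROVIDER_URL, t3Url); // e.g. a t3://host:port URL (placeholder)

    OIMClient client = new OIMClient(env);
    client.login(user, password); // e.g. xelsysadm and its password
    return client;
}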



Once the credentials are in place in the CSF, we simply invoke the CSF API (reference documentation) to retrieve them. Note that the out-of-the-box JPS policy should allow access to a key stored in the OIM map by default if your application is deployed on the WebLogic server and your classpath contains the jps-api.jar file located in the $MW_HOME/oracle_common/modules/oracle.jps_11.1.1/ directory. Otherwise, you will have to define an explicit policy (in Enterprise Manager, on the System Policies screen).
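A minimal retrieval sketch, reusing the createOimClient helper sketched above and assuming a Password key and a Generic key stored under map and key names of our own choosing (the names below are illustrative, not OIM defaults):

import oracle.security.jps.service.JpsServiceLocator;
import oracle.security.jps.service.credstore.CredentialStore;
import oracle.security.jps.service.credstore.GenericCredential;
import oracle.security.jps.service.credstore.PasswordCredential;

// Sketch: read the OIM admin credentials (Password key) and the t3 URL (Generic key).
// Map and key names are examples; use whatever you provisioned in the CSF.
private OIMClient createClientFromCsf() throws Exception {
    CredentialStore store =
        JpsServiceLocator.getServiceLocator().lookup(CredentialStore.class);

    PasswordCredential oimCred = (PasswordCredential) store.getCredential("oim", "oimAdminCred");
    GenericCredential urlCred = (GenericCredential) store.getCredential("oim", "oimUrl");

    // The Password key carries the username and password; the Generic key holds the t3 URL.
    return createOimClient((String) urlCred.getCredential(), oimCred.getName(), oimCred.getPassword());
}

// Note: depending on where this code runs, the getCredential calls may need to be wrapped
// in AccessController.doPrivileged and granted CredentialAccessPermission in the JPS policy.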

Configure Web Service Policy in OWSM

Obviously, exposing a web service that can create and modify users, provision accounts, and so on without any authentication would be a huge security risk. Fortunately, you can use Oracle Web Services Manager (OWSM) to require authentication when invoking the web services. If you use JDeveloper or the Oracle Enterprise Pack for Eclipse, you can define OWSM policies locally in your IDE. You can also do this via WLST. In our case, we’ll show you how to use Enterprise Manager to define these policies after you deploy your application.

To do this, log in to Enterprise Manager and navigate to WebLogic Domain -> Domain Name -> Server Name (for example, IDMDomain -> AdminServer). Right-click the server and click Web Services. You will see a list of web services deployed on your server.


Choose the Endpoint Name you wish to protect. The Web Service Endpoint screen will appear. Choose the OWSM Policies tab, and then click Attach/Detach. On the Attach/Detach Policies screen, select the “oracle/wss_username_token_service_policy” policy. This will enforce username and password authentication on the web service call. You will see the policy appear in the “Attached Policies” section at the top of the screen.


Click OK. You will be returned to the Web Service Endpoint screen and the attached policy will be listed in the OWSM Policies list.

If you click Web Services Test (or use a similar tool such as SoapUI), you can validate that the policy has been applied. Expand the Security tab, select the OWSM Security Policies radio button, and choose oracle/wss_username_token_client_policy from the list of available client policies. Provide the credentials of any user in the WebLogic domain security realm (such as the weblogic user), and click Test Web Service. Depending on your implementation, you may have to provide parameters in the Input Arguments tab; in our case, passing no input simply returns an error, which is enough to validate that the security policy is being enforced.
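If you would rather exercise the secured endpoint from a standalone JAX-WS client instead of the Enterprise Manager test page or SoapUI, one hand-rolled alternative (not part of the steps above, shown only as a sketch) is a client-side SOAP handler that injects a plain-text WS-Security UsernameToken header into each outbound request:

import java.util.Collections;
import java.util.Set;
import javax.xml.namespace.QName;
import javax.xml.soap.SOAPElement;
import javax.xml.soap.SOAPEnvelope;
import javax.xml.soap.SOAPException;
import javax.xml.soap.SOAPHeader;
import javax.xml.ws.handler.MessageContext;
import javax.xml.ws.handler.soap.SOAPHandler;
import javax.xml.ws.handler.soap.SOAPMessageContext;

// Sketch: adds a wsse:UsernameToken (PasswordText) header, which is the kind of
// token oracle/wss_username_token_service_policy expects on incoming requests.
public class UsernameTokenHandler implements SOAPHandler<SOAPMessageContext> {

    private static final String WSSE_NS =
        "http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd";

    private final String user;
    private final String password;

    public UsernameTokenHandler(String user, String password) {
        this.user = user;
        this.password = password;
    }

    @Override
    public boolean handleMessage(SOAPMessageContext ctx) {
        // Only decorate outbound (request) messages.
        if (!Boolean.TRUE.equals(ctx.get(MessageContext.MESSAGE_OUTBOUND_PROPERTY))) {
            return true;
        }
        try {
            SOAPEnvelope env = ctx.getMessage().getSOAPPart().getEnvelope();
            SOAPHeader header = (env.getHeader() != null) ? env.getHeader() : env.addHeader();
            SOAPElement security = header.addChildElement("Security", "wsse", WSSE_NS);
            SOAPElement token = security.addChildElement("UsernameToken", "wsse");
            token.addChildElement("Username", "wsse").addTextNode(user);
            SOAPElement pwd = token.addChildElement("Password", "wsse");
            pwd.setAttribute("Type",
                "http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText");
            pwd.addTextNode(password);
        } catch (SOAPException e) {
            throw new RuntimeException("Unable to add UsernameToken header", e);
        }
        return true;
    }

    @Override
    public boolean handleFault(SOAPMessageContext ctx) {
        return true;
    }

    @Override
    public void close(MessageContext ctx) {
    }

    @Override
    public Set<QName> getHeaders() {
        return Collections.emptySet();
    }
}

Register the handler on the generated port's handler chain (via the port's Binding) before invoking the operation; OWSM should then accept the request just as it does from the Enterprise Manager test page.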


One important point here is that if you redeploy the web services application, you must re-apply the policies using the steps above.

That covers it for Part 1, and we hope you will check back next week for Part 2 in this blog series. 

Tuesday Jul 15, 2014

Three Reasons Management Will Thank You For Implementing IDM Monitoring - Aurionpro

Identity Management (IDM) platforms protect your most critical enterprise assets: your apps and your enterprise data. Many companies make significant investments in designing and implementing IDM solutions, but alarmingly few actively monitor their health. That’s like driving a new car for 30,000 miles without checking the oil. Like cars, all software products require maintenance. Active monitoring provides information in advance of potential failures and will help keep your IDM solution running smoothly. Since IDM solutions typically involve various layers of technology and include integrations with a number of source systems, monitoring should be seen as a critical component of a successful long-term IDM strategy.

Unfortunately, IDM monitoring is often evaluated only after the IDM solution is already in place, and significant benefits can be overlooked as a result. Three of the most compelling reasons to implement it are:

1.    Up to 10X reduction in cost of issue resolution

It’s a well-known fact that issues are much more expensive to address in a production environment than during testing cycles. The computer scientist Barry Boehm quantified this: finding and fixing a software problem after delivery is often 100 times more expensive than finding it earlier in the cycle. In our experience, the factor is closer to 10X, but either way, it’s clear that the earlier you find an issue, the better.

Active monitoring can be an enormous cost saver due to its early symptom identification capabilities. Catching an issue before it strikes, based on early warnings uncovered by active monitoring technologies, and resolving it in a development or testing environment avoids a far more expensive production fix. If you’ve ever had to solve a complex performance- or integration-related issue in a production environment, you can appreciate just how important this can be.

In a large-scale IDM deployment, for example, there can be any number of root causes that might result in a Single Sign On (SSO) failure. The issue may reside at the application layer, the integration layer, the network layer, or the database layer.  Without a comprehensive monitoring solution that consolidates the data from each of the system’s components, it could be an onerous effort to sift through the extensive set of logs with the hope (and a prayer) that the issue can be identified.  We experienced this exact scenario recently and, thankfully, we had Oracle’s Enterprise Manager in place, which helped us to determine that our Directory replication was failing. Without this monitoring tool, it would have been a much more tedious and costly process to identify and resolve the issue.

The beauty of an active monitoring solution is that it immediately alerts you about the issue and provides sufficient information to initiate quick remedial action.  It also provides detailed reports that aid in the understanding of the system performance and stability trends.

2.    Most companies achieve ROI break even within 1-2 years

Putting an active monitoring solution in place is primarily a one-time effort and cost, as the ongoing resources needed to support the technology post-deployment are minimal. The million-dollar question is whether the cost of the technology and the resources needed to set up such a solution are worth it. The short answer is yes. Avoiding a single production-level issue (such as the one described above) might pay for the entire system by itself. IDM monitoring solutions also reduce manual monitoring costs while minimizing system downtime, both of which add up to hard cost benefits. We have often observed that the cost reductions and cost avoidance that result from an active Identity Management monitoring solution pay for the cost of the solution within a 1-2 year period.

3.    Identity Management monitoring solutions can be implemented quickly, and in phases


As is the case with most software categories these days, there are a number of options available that can help achieve the benefits of active IDM solution monitoring. We’ve had a great deal of success with Oracle Enterprise Manager (OEM) 12c, Oracle’s integrated enterprise IT management product line. Oracle Enterprise Manager creates business value by leveraging the built-in management capabilities of the Oracle stack for traditional and cloud environments, allowing customers to achieve efficiencies while significantly increasing service levels. If you’re deploying parts of Oracle’s Identity Management Suite, you’ll want to seriously consider deploying OEM.

Key OEM features include:

•    Automated Discovery of Identity Management Components
•    Performance and Availability Monitoring
•    Service Level Management

•    Configuration Management

There are also other licensed and open source monitoring solutions available on the market today. An interesting alternative to check out is Nagios, a viable open source solution for network and application monitoring. Homegrown solutions can also meet many system and network monitoring needs.

Regardless of the technology selected, a phased approach to implementation is recommended in many cases. This allows the processes for ongoing monitoring, and for addressing potential issues flagged by the monitoring solution, to be ironed out while proving the value and importance of the solution. The solution needs to cover the critical failure points across the database, application, network, machine, and hardware layers. For many Identity Management deployments, database failures are often the culprit behind production-level issues. In provisioning solutions, connectivity to target systems needs to be monitored closely, as the integrations are often the failure points. Based on the type of IDM solution being implemented, monitoring should be set up for the most likely failure points during the early phases of the monitoring solution deployment.

Conclusion

Monitoring is an important component to ensure a successful Identity Management solution and greatly helps to improve the health and stability of any IDM platform. To learn more about our best practices gained from leading hundreds of Identity Management implementations, please contact Kunwar Nitesh, an Associate Director in Aurionpro's India-based IDM delivery center, and a true domain and implementation expert across Oracle's Identity and Access Management solutions.

About

Oracle Identity Management is a complete and integrated next-generation identity management platform that provides breakthrough scalability; enables organizations to achieve rapid compliance with regulatory mandates; secures sensitive applications and data regardless of whether they are hosted on-premise or in a cloud; and reduces operational costs. Oracle Identity Management enables secure user access to resources anytime on any device.
