Friday Aug 15, 2014

Ensuring High Availability in ADF Task Flows

Just a quick article today on ADF Controller scopes, and specifically on ensuring that your application correctly propagates state stored in PageFlow and View Scope across the cluster. This information can be found in the product documentation and in Jobinesh Purushothaman's excellent book (Chapter 12 - Ensuring High Availability); however, more references means more eyes and fewer mistakes!

Some Background

When you store state in a managed bean scope, how long does it live and where does it live?  Hopefully you already know the basic answers here, and for scopes such as Session and Request we're just dealing with very standard stuff. One thing that might be less obvious, though, is how PageFlow and View Scope are handled.  These scopes (generally) persist for more than one request, so there is obviously the possibility of a fail-over between two of those requests.  A Java EE server of whatever flavour doesn't know anything about these extra ADF memory scopes, so it can't automatically manage the propagation of their contents, can it?  Well, the answer is yes and no.  These "scopes" that we reference from the ADF world are ultimately stored on the Session (albeit with a lifetime managed by the framework), so you'd think that everything should be OK and no further work is needed to ensure that any state in these scopes is propagated - right?  Well no, not quite; it turns out that several key tasks are often missed. So let's look at those.

First of All -  Vanilla Session Replication

Assuming that WebLogic is all configured, this bit at least is all automatic, right? Well, no. In order to "know" that an object in the session needs to be replicated, WebLogic relies on the HttpSession.setAttribute() API being used to put it onto the session. If you instantiate a managed bean in SessionScope through the standard JSF mechanisms then this will be done for you and you're golden.  Likewise, if you grab the Session through the Faces ExternalContext (e.g. using the getSession() API) and then call the setAttribute() API on HttpSession, you've correctly informed WebLogic of the new object to propagate.

You might already see, though, that there is a potential problem in the case where the object stored in the session is a bean and you're changing one of its properties. Just calling an attribute setter on an object stored on the session is not a sufficient trigger to have that updated object re-propagated, so the version of the object elsewhere in the cluster will be stale.  So when you update a bean on the session in this way and want to ensure that the change is propagated, re-call the setAttribute() API.
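As a quick illustrative sketch (the "recentItems" key and the surrounding method are invented here, not from any real application), updating a value that already lives on the session and then re-registering it might look like this:

import java.util.List;
import java.util.Map;
import javax.faces.context.FacesContext;

public void addRecentItem(String itemId) {
  Map<String, Object> sessionMap =
      FacesContext.getCurrentInstance().getExternalContext().getSessionMap();

  // "recentItems" is just an illustrative session key for this sketch
  @SuppressWarnings("unchecked")
  List<String> recentItems = (List<String>) sessionMap.get("recentItems");
  recentItems.add(itemId);

  // Re-putting the value translates to HttpSession.setAttribute() under the
  // covers, which is what tells WebLogic the updated object needs re-replicating
  sessionMap.put("recentItems", recentItems);
}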
Got it? OK, on to the ADF scopes:

Five Steps to Success For the ADF Scopes 

The View and PageFlow scopes are, as I mentioned, ultimately stored on the session.  Just as in the case of any other object stored in that way, changing an internal detail of those representative objects will not trigger replication. So we need some extra steps, and of course we need to observe some key design principles whilst we're at it:

  1. Observe the UI Manager Pattern and only store state in View and PageFlow scope that is actually needed and is allowed (see 2)
  2. As for any replicatable Session scoped bean, any bean in View or PageFlow scope must be serializable (there are audits in JDeveloper to gently remind you of this).
  3. Only mark for storage that which cannot be re-constructed. Again a general principle; we wish to replicate as little as possible, so use the transient marker in your beans to exclude anything that you could possibly reconstruct over on the other side (so to speak). 
  4. In the setters of any attributes in these beans (that are not transient), call the ControllerContext markScopeDirty(scope) API, e.g. ControllerContext.getInstance().markScopeDirty(AdfFacesContext.getCurrentInstance().getViewScope()); This does the actual work of making sure that the server knows to re-replicate this state across the cluster (see the example bean after this list).
  5. Finally, set the HA flag for the controller scopes in the .adf/META-INF/adf-config.xml file. This corresponds to the following section inside the file:
<adf-controller-config xmlns="http://xmlns.oracle.com/adf/controller/config">
  <adf-scope-ha-support>true</adf-scope-ha-support>
</adf-controller-config> 
If this flag is not set, the aforementioned markScopeDirty() API will be a no-op. So this flag provides a master switch to throw when you need HA support, and avoids the cost when you do not.
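To pull steps 2 to 4 together, here's a minimal sketch of what a View or PageFlow scoped bean following these rules might look like. The bean name and its properties are invented purely for illustration:

import java.io.Serializable;
import java.util.List;
import oracle.adf.controller.ControllerContext;
import oracle.adf.view.rich.context.AdfFacesContext;

public class SearchCriteriaBean implements Serializable {
  private static final long serialVersionUID = 1L;

  // State we genuinely cannot reconstruct, so it must be replicated
  private String searchTerm;

  // Re-creatable state is marked transient and excluded from replication
  private transient List<String> cachedSuggestions;

  public String getSearchTerm() {
    return searchTerm;
  }

  public void setSearchTerm(String searchTerm) {
    this.searchTerm = searchTerm;
    // Tell ADFc that viewScope has changed so it gets re-replicated.
    // Remember this is a no-op unless <adf-scope-ha-support> is true.
    ControllerContext.getInstance().markScopeDirty(
        AdfFacesContext.getCurrentInstance().getViewScope());
  }
}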

So if you've not done so already, take a moment to review your managed beans and check that you really are doing all of this correctly. Even if you don't need to support HA today, you might tomorrow...

Wednesday Aug 13, 2014

Maven and ADFBC Unit Tests in 12.1.3

An issue that has come up recently revolves around setting up your Maven POM in 12.1.3 such that you can run ADF BC JUnit tests successfully, both interactively in the IDE and headless through Maven, perhaps in your Hudson jobs.  Out of the box, the default POM that you end up with will be missing a couple of vital bits of information and will need a little extra configuration.

Here are the steps you'll need to take:

Step 1: Use The Correct JUnit Dependency

Once you have created some unit tests, JDeveloper will have added dependencies for JUnit from the JDeveloper JUnit extensions, something like this:

<dependency>
  <groupId>com.oracle.adf.library</groupId>
  <artifactId>JUnit-4-Runtime</artifactId>
  <version>12.1.3-0-0</version>
  <type>pom</type>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>com.oracle.adf.library</groupId>
  <artifactId>JUnit-Runtime</artifactId>
  <version>12.1.3-0-0</version>
  <type>pom</type>
  <scope>test</scope>
</dependency> 

Delete both of these entries, if they exist, and drop in a dependency on the vanilla JUnit 4.11 library instead:

<dependency>
  <groupId>junit</groupId>
  <artifactId>junit</artifactId>
  <version>4.11</version>
  <type>jar</type>
  <scope>test</scope>
</dependency> 

Failing to make this change will result in the following error:

 java.lang.NoClassDefFoundError: org/junit/runner/notification/RunListener 

Step 2: Configure the Surefire Plugin to Select the Correct JUnit Version

This is probably not strictly needed, but it ensures that Surefire is left in no doubt about which version of JUnit it should be working with (in this case a version of 4.7 or higher):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.17</version>
  <dependencies>
    <dependency>
      <groupId>org.apache.maven.surefire</groupId>
      <artifactId>surefire-junit47</artifactId>
      <version>2.17</version>
    </dependency>
  </dependencies>
</plugin>

Step 3: Identify Your connections.xml File

When running outside of JDeveloper, we need to set things up so that the unit tests can actually find the connection information that defines the datasource your Application Modules are using.  To do this, we need to add a configuration section to the Surefire plugin to put that location onto the classpath.  Add the following configuration into the Surefire plugin, after the <dependencies> section:

<configuration>
  <additionalClasspathElements>
    <additionalClasspathElement>
      ${basedir}/../.adf
    </additionalClasspathElement>
  </additionalClasspathElements>
</configuration> 

This will ensure that the connection information can be found. If you forget this step you'll get a stack trace including the message:

MDS-00013: no metadata found for metadata object "/META-INF/connections.xml" 

Step 4: Supply the Missing JPS Library

Finally, we need to supply the location of one extra required library.  This requirement will hopefully be removed in the next release, but for now it is needed.  Again, this is added to the Surefire plugin configuration, inside <additionalClasspathElements>:

<additionalClasspathElement>
  ${oracleHome}/oracle_common/modules/oracle.jps_12.1.3/jps-manifest.jar
</additionalClasspathElement> 

Omitting this will result in the error:

WARNING: No credential could be loaded for Reference = Reference Class Name: oracle.jdeveloper.db.adapter.DatabaseProvider

For reference, here's the complete Surefire plugin definition:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.17</version>
  <dependencies>
    <dependency>
      <groupId>org.apache.maven.surefire</groupId>
      <artifactId>surefire-junit47</artifactId>
      <version>2.17</version>
    </dependency>
  </dependencies>
  <configuration>
    <additionalClasspathElements>
      <additionalClasspathElement>
        ${basedir}/../.adf
      </additionalClasspathElement>
      <additionalClasspathElement>
        ${oracleHome}/oracle_common/modules/oracle.jps_12.1.3/jps-manifest.jar
      </additionalClasspathElement>
    </additionalClasspathElements>
  </configuration>
</plugin> 
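For context, the kind of test this configuration enables is the usual ADF BC fixture pattern. The sketch below is only indicative: the application module definition, the "AppModuleLocal" configuration name, and the "EmployeesView1" view object are placeholders you would replace with your own names.

import static org.junit.Assert.assertNotNull;

import oracle.jbo.ApplicationModule;
import oracle.jbo.ViewObject;
import oracle.jbo.client.Configuration;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;

public class AppModuleTest {
  // Placeholder names: substitute your own application module and configuration
  private static final String AM_DEF = "model.AppModule";
  private static final String AM_CONFIG = "AppModuleLocal";

  private static ApplicationModule am;

  @BeforeClass
  public static void setUp() {
    // This is the call that needs connections.xml (and the JPS jar) on the classpath
    am = Configuration.createRootApplicationModule(AM_DEF, AM_CONFIG);
  }

  @Test
  public void testEmployeesViewObjectExists() {
    // Placeholder view object name, purely for illustration
    ViewObject vo = am.findViewObject("EmployeesView1");
    assertNotNull("Expected EmployeesView1 to be defined", vo);
  }

  @AfterClass
  public static void tearDown() {
    Configuration.releaseRootApplicationModule(am, true);
  }
}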

