Friday Jan 16, 2015

Using the Oracle Public Maven Repository With ADF Applications

Apache Maven lovers, rejoice! Thanks to the Fusion Middleware team, Oracle now has a public repository for you to use! It is located at http://maven.oracle.com. The official Fusion Middleware documentation even contains a section on how to configure your environment to leverage it. While accurate, this documentation is fairly generic. My aim in this post is to explain how to use the repository with ADF applications. I recorded a video demonstration as well.

I started to use Maven with ADF back in 2007, when the first technical previews for 11g were made public. I had used Maven on other Java projects before. I was the architect on a large software development project and wanted my team to implement a workflow built around continuous integration principles. At the time, JDeveloper didn't support Maven. Thus, we had to deploy our own internal repository. I wrote a tool to scan the JDeveloper extensions, extract the various JAR files and import them into the repository. Nowadays, this process is automated; you simply have to use the Maven Synchronization Plug-in.

JDeveloper is not only compatible with Maven, but actually ships with it. This started with Maven 3.0.4 back in version 12.1.2. JDeveloper 12.1.3 brought small but significant improvements to Maven support, and updated the Maven version to 3.0.5. I strongly recommend using the Maven release that ships with JDeveloper, as the various Oracle plugins involved may not have been tested with later releases. You will be on your own if you do otherwise.

Let's now see in detail how to set up your environment to compile and deploy ADF applications with Maven from the command line.


Step 1: Register 

The repository may be public, but you need to register in order to access it. This is to ensure you have accepted the license agreement for the artifacts it contains. You can register here:

https://www.oracle.com/webapps/maven/register/license.html.

Every time you access the repository, you will need to provide your OTN user name and password.


Step 2: Create environment variables and alter the path. 

The Maven distribution bundled with JDeveloper is located in the following directory. 

JDEV_HOME/oracle_common/modules/org.apache.maven_<version>

You have to create two environment variables: M2_HOME, which must point to the directory above, and M2, which must be set to %M2_HOME%\bin on Windows or $M2_HOME/bin on Linux and OS X. You must add the M2 variable to the PATH as well. In addition, ensure JAVA_HOME exists and points to a valid Java install.

Once you are done, you may test your setup by executing the mvn -version command on the command line. Below is the output I get on a Windows 8.1 machine.

Microsoft Windows [Version 6.3.9600]
(c) 2013 Microsoft Corporation. All rights reserved.

C:\Users\Frédéric>mvn -version
Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; 2013-02-19 08:51:28-0500)
Maven home: C:\Oracle\jdev1213\oracle_common\modules\org.apache.maven_3.0.5
Java version: 1.7.0_60, vendor: Oracle Corporation
Java home: c:\Java64\jdk1.7.0_60\jre
Default locale: en_CA, platform encoding: Cp1252
OS name: "windows 8.1", version: "6.3", arch: "amd64", family: "windows"
C:\Users\Frédéric>


Step 3: Define the repository 

You must provide Maven with the information it needs to connect to the repository. This can be achieved by adding a repository definition either to your POM or to Maven's settings.xml file. Personally, I prefer to go the settings.xml route, since such information will likely be shared among several applications.

Typically, settings.xml is found in the user's home directory under the .m2 subdirectory. On my Windows machine, the full path is c:\users\Frédéric\.m2. The equivalent path would be /Users/Frédéric/.m2 under OS X. You can learn more about this file on the official Maven website.

To ensure that the repository definitions contained in settings.xml will be available at runtime, they should reside in a profile marked as active by default. 

<profiles>
    <profile>
      <id>Oracle</id>
      <activation>
          <activeByDefault>true</activeByDefault>
      </activation>
      <repositories>
        ...
      </repositories>
      <pluginRepositories>
        ...
      </pluginRepositories>
    </profile>
</profiles>

The repository definition  must look like this:

<repositories>
    <repository>
      <id>maven.oracle.com</id>
      <releases>
        <enabled>true</enabled>
        <updatePolicy>never</updatePolicy>
      </releases>
      <snapshots>
        <enabled>false</enabled>
        <updatePolicy>never</updatePolicy>
      </snapshots>
      <url>https://maven.oracle.com</url>
      <layout>default</layout>
    </repository>
</repositories>
<pluginRepositories>
    <pluginRepository>
      <id>maven.oracle.com</id>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
      <url>https://maven.oracle.com</url>
      <layout>default</layout>
    </pluginRepository>
</pluginRepositories>

Only releases are enabled, since Oracle will not deploy snapshots to the repository. In addition, I set the update policy to never since Oracle will not provide patches or other updates. New artifacts will be published for future releases of ADF, but you will have to explicitly reference the new version numbers in your POMs to pick them up. We did this for stability reasons.

The repository is defined twice: once as an artifact repository and once as a plugin repository. The latter is required since the ojmake and ojdeploy plugins are needed to build ADF applications. Moreover, the WebLogic plugin might be required for runtime deployment. Maven makes this distinction between artifact repositories and plugin repositories, so both declarations are necessary.


Step 4: Update Wagon-http 

Wagon-http is a component used by Maven to access remote repositories. The Oracle public Maven repository is protected by the same single sign-on (SSO) solution that other Oracle web sites use. Unfortunately, older versions of wagon-http are not compatible with such enterprise-grade infrastructures. However, it is possible to override the version that is bundled with Maven.

To use the Oracle repository, you will need wagon-http 2.8. Simply download the required JAR file from the following location:

 http://central.maven.org/maven2/org/apache/maven/wagon/wagon-http/2.8/wagon-http-2.8-shaded.jar

You will then need to copy the file to the M2_HOME\lib\ext directory. 

Please note that wagon-http 2.8 is included in Maven 3.2.5 and higher. Thus, this step will likely become unnecessary in the future, since new releases of JDeveloper will probably provide an even more recent build of Maven.  


Step 5:  Define the server

While it is possible to define repositories in the POM, certain settings must absolutely be defined in the settings.xml file. Here is what the server definition looks like.

<server>
  <id>maven.oracle.com</id>
  <username>blueberry.coder@oracle.com</username>
  <password>{HsygnP77JNIHdWNRgDiuknhzqnt0NFtIpTlQ4jlwOGk=}</password>
  <configuration>
	<basicAuthScope>
	  <host>ANY</host>
	  <port>ANY</port>
	  <realm>OAM 11g</realm>
	</basicAuthScope>
	<httpConfiguration>
	  <all>
		<params>
		  <property>
			<name>http.protocol.allow-circular-redirects</name>
			<value>%b,true</value>
		  </property>
		</params>
	  </all>
	</httpConfiguration>
  </configuration>
</server>

 In theory, you could write your password in clear text and things would work. However, Oracle strongly recommends that you encrypt the password using the tools provided by Maven.


Step 6: Encrypt your password 

There are two tasks to perform in order to complete this step.

First, you must generate and store a master password. This password is used to encrypt other passwords and should not be the same as your Oracle account password. To generate it, open a command prompt and execute the following command:

mvn --encrypt-master-password <password>

This will give you back a string such as {RpmTqVoMD0kHBbAIe2Jq1vdcM8HuPb/uvdnO+R4c67g=}. You should then create a file called settings-security.xml in the same folder as the settings.xml file used by Maven. The contents of this file should look like this:

<settingsSecurity>
  <master>{RpmTqVoMD0kHBbAIe2Jq1vdcM8HuPb/uvdnO+R4c67g=}</master>
</settingsSecurity>

 Once you are done with the master password, you can encrypt the actual password for your account using the command below.

mvn --encrypt-password <password>

Simply paste the returned string into the server definition you created in step 5.


Step 7: Add the WebLogic plugin to your POM 

If you want to deploy your ADF application to WebLogic through Maven, you will need to reference the appropriate plugin in its top-level POM. 

<plugin>
    <artifactId>weblogic-maven-plugin</artifactId>
    <groupId>com.oracle.weblogic</groupId>
    <version>12.1.3-0-0</version>
    <configuration/>
    <executions>
        <execution>
            <id>wls-deploy</id>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>deploy</goal>
            </goals>
            <configuration>
                <remote>false</remote>
                <adminurl>t3://127.0.0.1:7777</adminurl>
                <user>weblogic</user>
                <password>welcome1</password>
                <source>${basedir}/deploy/FullMavenTest_Project1_FullMavenTest.ear</source>
                <targets>DefaultServer</targets>
                <verbose>true</verbose>
                <name>${project.build.finalName}</name>
            </configuration>
        </execution>
    </executions>
</plugin>

In this example, the WLS administrative password has been specified in clear text. The userConfigFile and userKeyFile parameters make it possible to encrypt the password, however. In addition, the source parameter must point to the EAR produced by the build process. The name for that EAR is managed through JDeveloper settings, and not through the POM. Moreover, please note that the value for the targets parameter should be set to DefaultServer if you want to deploy to the JDeveloper integrated WLS instance.

The WebLogic plugin requires a specific JAR in the classpath in order to execute. This requirement can be fulfilled by adding this dependency to the POM:

<dependency>
    <groupId>com.oracle.weblogic</groupId>
    <artifactId>weblogic-classes</artifactId>
    <version>12.1.3-0-0</version>
    <scope>system</scope>
    <systemPath>${oracleHome}/wlserver/server/lib/weblogic-classes.jar</systemPath>
</dependency>

The systemPath parameter must be the absolute path to the weblogic-classes.jar file. 


Step 8: Update your JDBC settings

If you plan to deploy your ADF application to WebLogic using Maven, you need to change a specific JDBC setting in JDeveloper. The reason for this is that, normally, the IDE will regenerate the WLS JDBC descriptors at deployment time. Unfortunately, this results in an error message when deploying without JDeveloper, whether through the command line or the WLS console. The specific error message is: « No credential mapper entry found for password indirection »

To avoid this, create a JDBC data source on the WLS server and uncheck the setting that regenerates the JDBC descriptors at deployment time in the Application Properties dialog.

Please note that defining a credential mapping using the WLS console will not work.  

If you have deployed the application from JDeveloper before, you will need to delete the generated weblogic-jdbc.xml and remove the reference to it inside the weblogic-application.xml descriptor. You will find both files under the src/META-INF subdirectory of the application workspace.


Step 9: Let's do this!

You are now ready to compile and deploy your application. You can do so on the command line only for the time being. Moreover, you will need to specify a value for the ORACLE_HOME environment variable, since this is normally something JDeveloper will do for you. Here is an example:

mvn pre-integration-test -Denv.ORACLE_HOME=c:\Oracle\jdev1213

 Just remember to make sure the JDeveloper integrated WLS instance is running before starting the build process if you want to deploy on it.

At this time, it is not possible to index the contents of the Oracle public repository. We will need to update JDeveloper in order to make this possible. 

Thursday Dec 18, 2014

GeneratedPassword: a hidden MAF gem

The transition from ADF Mobile to the Oracle Mobile Application Framework earlier this year brought us many great new features. Some of them were very visible, such as new Data Visualization components or the new Oracle Alta UI skin. Others are more difficult to find. In this post, I want to introduce you to one of the latter, namely the GeneratedPassword class, which is part of the oracle.adfmf.framework.api package.

The sole function of GeneratedPassword is to generate and manage random passwords. Each of the passwords is identified by a key, which will be used to retrieve it when needed. The passwords are saved to an encrypted credential store, similar to the one used to store user passwords related to login connections. To create a password, simply pass the desired key to the appropriate method along with a seed which will add entropy to the generation algorithm. Subsequent calls to that method using the same key and seed will result in a different password every time. 

At this point, maybe you are asking yourself what those random passwords are good for. Personally, I find them very useful to enhance the protection of encrypted SQLite local databases. In the current version of Oracle MAF, it is necessary to provide a password in order to encrypt or decrypt a database. If you hardcode that password, it will be shared by all deployed instances of your application. If one instance is compromised, then all others will be at risk. Using a random password for that use case is a good mitigation measure, since every instance uses a different encryption key. Compromising multiple instances is thus much more time consuming.
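
To make this more concrete, here is a minimal sketch of how the class could be used to protect a local database. It assumes that the static setPassword and getPassword methods take a key and a seed, that getPassword returns a char array, and that the database connection has already been opened; double-check the MAF Javadoc for the exact signatures before reusing it. The key and seed values are arbitrary.

package oracle.sample.maf.security;

import java.sql.Connection;

import oracle.adfmf.framework.api.AdfmfJavaUtilities;
import oracle.adfmf.framework.api.GeneratedPassword;

public class LocalDatabaseProtector {

    private static final String KEY = "db.encryption.key"; // arbitrary key chosen for this sketch
    private static final String SEED = "8fB6k9qX";          // arbitrary seed; adds entropy to the generation

    // Call once, right after the database file has been created.
    public static void protect(Connection connection) throws Exception {
        // Generate a random password and store it in the encrypted credential store.
        GeneratedPassword.setPassword(KEY, SEED);
        String password = new String(GeneratedPassword.getPassword(KEY, SEED));
        // Encrypt the SQLite database with that password.
        AdfmfJavaUtilities.encryptDatabase(connection, password);
    }

    // Retrieve the same password later, for example before opening the encrypted database.
    public static String retrievePassword() {
        return new String(GeneratedPassword.getPassword(KEY, SEED));
    }
}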

Tuesday Sep 09, 2014

Mobile and Social login connections in Oracle MAF: a few hints to install the server-side components

Of all the new security-related features introduced in the Oracle Mobile Application Framework, the most interesting ones are, in my opinion, support for the OAuth 2.0 protocol and the tight integration with Oracle's identity management solutions. The former enables you to build MAF applications that will integrate with popular public APIs, such as the ones offered by Google and Facebook. The latter makes the implementation of comprehensive access control scenarios significantly easier, while keeping things extremely simple from a developer's perspective. This is all thanks to the Oracle Access Management Mobile and Social (OAMMS) component of IDM. Don't believe me? Have a look at this recording I made for the Oracle Mobile Platform channel on YouTube.

If you want to try OAMMS for yourself, you will need to install it in your own environment. Overall, the process is fairly painless and is similar to other Fusion Middleware products. However, there are a few things you should be aware of. Here are a few hints to guide you along the way.

  1. You need at least IDM 11gR2 PS2
    In other words,  MAF is certified with OAMMS 11.1.2.2 or later. Ensure you download the correct version! 


  2. Use JDK 7
    Java 8 has been with us since earlier this year. Public updates for Java 6, on the other hand, ended back in 2013. If you are installing a production server today, I strongly recommend using Java 7.

    Oracle WebLogic Server 10.3.6 is certified for use with Java 7 on Windows, Linux and other platforms. The official documentation explains at length how to use both together. The critical part is to override some of the standard JDK classes with the ones provided with WebLogic:

    After installing WebLogic Server, copy the following files from WL_HOME/modules to JAVA_HOME/jre/lib/endorsed, where WL_HOME is the WebLogic Server installation home directory: javax.annotation_1.0.0.0_1-0.jar, javax.xml.bind_2.1.1.jar and javax.xml.ws_2.1.1.jar


  3. Install both OAM and OAMMS
    Technically, OAMMS can be installed in standalone mode. However, you will get a much more useful setup if you deploy it alongside OAM, since you will gain the capacity to configure SSO for web service calls and remote URL access. In addition, OAMMS is already preconfigured to use OAM for authentication when you install both at the same time.

    I do not recommend installing Oracle Adaptive Access Manager (OAAM) if you are building a development environment. Some of the features of the product, such as IP address geolocation, require third party dependencies that cannot be obtained for free.


  4. Don't forget to configure the security store
    Once the software has been installed, it is essential to perform an additional configuration process for the database security store. For a brand new install, you should execute the command shown below. In this case, WebLogic was installed in /oracle/wls1036, the IDM binaries were in /oracle/wls1036/Oracle_IDM1 and I had created a domain named idmps2. The value for the -p parameter is the password for the OPSS schema you created using the Repository Creation Utility (RCU) before installing the IDM software. 

    /oracle/wls1036/oracle_common/common/bin/wlst.sh /oracle/wls1036/Oracle_IDM1/common/tools/configureSecurityStore.py -d /oracle/wls1036/user_projects/domains/idmps2/ -c IAM -p oracle -m create 


  5. Upgrade the OPSS schema
    Another thing you need to do before starting your OAMMS WebLogic domain for the first time is to update the OPSS schema using the patch set assistant. This is necessary to ensure that the versions for the database and the binaries are in sync.


  6. Install the most recent Identity Management Suite Bundle Patch
    Finally, it is essential to deploy the latest bundle patch for the product. At the time of writing, this was patch 18662903. The patch corrects an important problem in the user interface for the OAuth authentication service, among other things. This install is done through OPatch, by the way. The necessary executable is installed alongside the IDM binaries; you do not need to have your own OPatch installation.

Once you are done, you will need to configure OAMMS properly before your MAF applications can authenticate against it. Fortunately, you can learn about what you need to do on YouTube.

Monday Jun 30, 2014

JDeveloper 12.1.3 is good news for Maven fans

JDeveloper 12c 12.1.3 is finally available. The list of new features and enhancements is quite impressive. Have a look! You will not be disappointed. Personally, I was very happy to discover that Apache Maven support has been enhanced in two small but critical areas: paths and archetypes. Don't get me wrong: Maven support in 12.1.2 is leaps and bounds ahead of what 11g offers. But the tweaks brought in 12.1.3 make a significant difference.

The main issue with the Maven support in 12.1.2 is that the POM files generated by JDeveloper contain absolute paths. This is problematic, since applications will not compile correctly unless the code resides in the same location on all developer workstations and build servers. This is not always possible or even desirable. Thus, I described how to replace those absolute paths with relative ones in a previous blog post. Fortunately, JDeveloper 12.1.3 does things differently and writes its POMs with relative paths instead. 

Maven archetypes help developers create new applications from scratch  from the command line. It is now possible to build a new ADF application that way using the oracle-adffaces-ejb archetype introduced by Oracle in JDeveloper 12.1.3. The resulting application will use EJB for its model layer. To use the archetype, simply issue a command like the one below:

mvn archetype:generate
 -DarchetypeGroupId=com.oracle.adf.archetype
 -DarchetypeArtifactId=oracle-adffaces-ejb
 -DarchetypeVersion=12.1.3-0-0
 -DgroupId=oracle.test
 -DartifactId=my-maven-test
 -Dversion=1.0-SNAPSHOT

 Obviously, this command will be successful only if the Maven binaries directory has been added to the path. Remember that Maven is provided by JDeveloper and can be found in the ORACLE_HOME/oracle_common/modules directory - although you can use your own install instead.

 If you prefer to use a GUI, you can create an application from inside JDeveloper by using a little known option introduced in 12.1.2. First, open the New... gallery and select the Maven subcategory (under the General category). Then, select the Generate from Archetype item.

This will bring up the dialog shown below.

Fill in the various values per Maven conventions. The application top level directory will be created under the directory you specify. To select the archetype to use, click on the magnifying glass beside the Maven archetype field. JDeveloper will then display the Search for Archetypes dialog. To use it, simply type a search string and press Enter. JDeveloper will list all the matching archetypes available for the selected repositories.

Simply select the appropriate archetype and click on OK to create the new application. 

Depending on how your environment has been set up, it is possible that your local Maven repository doesn't contain an archetype catalog. If that's the case, the Local Repository option will be grayed out in the Search for Archetypes dialog. To fix this, execute the command below.

mvn archetype:crawl -Dcatalog=<local repository folder>/archetype-catalog.xml

 On Linux and OS X, the local repository is usually found at $HOME/.m2.  On Windows, the default location is %HOMEDRIVE%%HOMEPATH%/.m2. 

Thursday Apr 03, 2014

Going to Vegas? Don't gamble with your learning and attend my sessions!

Next week, I will be in Las Vegas for the Collaborate 2014 conference. I have the privilege to deliver two sessions there. Are you intrigued by ADF Essentials? Would you like to know how to use APEX applications as a back-end for mobile applications? Here is your chance to learn more!

178: A Free Toolkit for Modern Web Applications: A Look at ADF Essentials
Date: 04/08/2014 (Tuesday)
Time: 4:15 PM
Location: Level 3, Murano 3205

179: Bring Your APEX Application to iOS and Android with ADF Mobile
Date: 04/10/2014 (Thursday)
Time: 4:15 PM
Location: Level 3, Murano 3203

I will probably spend a lot of time at our Mobility booth as well. Drop by and say hello if you're there!

Friday Mar 28, 2014

ADF Mobile: the Access Control Service

ADF Mobile applications use standard HTTP mechanisms for authentication.  The HTTP protocol, however, does not handle authorization. Thus, to enable applications to obtain the roles and privileges of a specific user,  you need to implement a REST web service called the Access Control Service. In this post, I will show you how to implement the foundations for that service. 

The product documentation states that the Access Control Service consumes and produces JSON data. The snippet below illustrates how parameters are passed to the service. 

{
        "userId": "johnsmith",
        "filterMask": ["role", "privilege"],
        "roleFilter": [ "role1", "role2" ],
        "privilegeFilter": ["priv1", "priv2", "priv3"] 
}

All parameters other than userId are optional. The filterMask parameter specifies which filters should be applied to the request (role, privilege or both), while roleFilter and privilegeFilter enumerate the filter values. If no filters are specified, the web service should return the list of all roles and privileges for the user. Otherwise, the service should only verify whether the user belongs to the roles listed in roleFilter and has been granted the privileges listed in privilegeFilter.

The ADF Mobile documentation gives the JSON snippet below as an example of a service response.

{
        "userId": "johnsmith",
        "roles": [ "role1" ],
        "privileges": ["priv1", "priv3"] 
}

As you can see, the service returns only the roles and privileges that actually apply to the user. 

The easiest way to implement the service is to take advantage of the POJO mapping feature offered by some JAX-RS implementations, such as Jersey. The first step is to build POJO class definitions for the service request and response. JDeveloper made this very easy. I only had to type a few lines of code for the class attributes and generated the accessors and constructors. 

This is the code for the request class. 

package oracle.sample.maf.accesscontrol.bo;

import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement
public class ACSRequest {
    
    private String userId;
    private String[] filterMask;
    private String[] roleFilter;
    private String[] privilegeFilter;

    public ACSRequest() {
        super();
    }

    public ACSRequest(String userId, String[] filterMask, String[] roleFilter, String[] privilegeFilter) {
        super();
        this.userId = userId;
        this.filterMask = filterMask;
        this.roleFilter = roleFilter;
        this.privilegeFilter = privilegeFilter;
    }


    public void setUserId(String userId) {
        this.userId = userId;
    }

    public String getUserId() {
        return userId;
    }

    public void setFilterMask(String[] filterMask) {
        this.filterMask = filterMask;
    }

    public String[] getFilterMask() {
        return filterMask;
    }

    public void setRoleFilter(String[] roleFilter) {
        this.roleFilter = roleFilter;
    }

    public String[] getRoleFilter() {
        return roleFilter;
    }

    public void setPrivilegeFilter(String[] privilegeFilter) {
        this.privilegeFilter = privilegeFilter;
    }

    public String[] getPrivilegeFilter() {
        return privilegeFilter;
    }
}

The @XmlRootElement annotation makes it very easy to generate both JSON and XML from a single set of objects; I added it to the class even though it wasn't necessary in this specific use case.

The code for the response class is straightforward.  

package oracle.sample.maf.accesscontrol.bo;

import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement
public class ACSResponse {
    
    private String userId;
    private String[] roles;
    private String[] privileges;
    
    public ACSResponse() {
        super();
        roles = new String[0];
        privileges = new String[0];
    }
    
    public ACSResponse(String p_userId, String[] p_roles, String[] p_privileges) {
        super();
        userId = p_userId;
        roles = p_roles;
        privileges = p_privileges;
    }

    
    public String getUserId(){
        return userId;
    }
    
    public void setUserId(String p_id){
        userId = p_id;
    }
    
    public String[] getRoles(){
        return roles;
    }
    
    public void setRoles(String[] p_roles){
        roles = p_roles;
    }

    public String[] getPrivileges(){
        return privileges;
    }
    
    public void setPrivileges(String[] p_privileges){
        privileges = p_privileges;
    }
}
  

By default, POJO mapping is not enabled in Jersey. It is thus essential to add the following init parameter to the Jersey servlet declaration in the web.xml for the application.

        <init-param>
            <param-name>com.sun.jersey.api.json.POJOMappingFeature</param-name>
            <param-value>true</param-value>
        </init-param>

The service itself is implemented as yet another POJO. I generated a skeleton for the class using the REST web service wizard in JDeveloper. The method that will process the request must be configured to process requests made using the POST HTTP verb. Hence, the single method in the class is annotated with @POST.

package oracle.sample.maf.accesscontrol.service;

import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.Response;

import oracle.sample.maf.accesscontrol.bo.ACSRequest;
import oracle.sample.maf.accesscontrol.bo.ACSResponse;

@Path("user")
@Consumes("application/json")
@Produces("application/json")
public class UserRessource {
    public UserRessource() {
    }

    @POST
    public Response postData(ACSRequest request) {
        
        // Replace this with actual logic.
        ACSResponse rolesAndPrivileges = new ACSResponse(request.getUserId(), 
                                                         new String[] { "user" }, 
                                                         new String[] { "user" });
        Response.ResponseBuilder builder = Response.ok(rolesAndPrivileges);
        return builder.build();
    }
}

The @Path annotation at the class level determines the path to the REST resource, whereas @Consumes and @Produces specify the expected data formats for the method's input and output.

The sample class above doesn't contain error handling code and returns a hardcoded response. A production implementation should use the various static methods in javax.ws.rs.core.Response to build responses. The ok(Object) method will typically be used; other methods such as noContent(), notAcceptable(List) and serverError() will be called instead when specific conditions are met or if exceptions have been raised.

There are many ways to obtain the data needed to populate the response. You could query a database, for example, or use the OPSS APIs to query an LDAP server. Whatever option you choose, your service implementation should ensure that users cannot query another user's roles and privileges unless they have administrative privileges themselves. In other words: exposing a web service on the internet that enables anybody to identify privileged user accounts in your back-end is a bad idea. Ideally, your service should:

  • Accept connections over HTTPS only
  • Check that the credentials used to establish the SSL / TLS connection match the userId in the service input - unless the user can manipulate the roles and privileges of other users (a sketch of this check follows below)
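
Here is a minimal sketch of how that second check could be implemented in a Jersey resource similar to the one shown earlier. It assumes the container has already authenticated the caller (for instance through HTTP basic authentication over HTTPS or two-way SSL), and the acs-admin role name is purely illustrative.

package oracle.sample.maf.accesscontrol.service;

import java.security.Principal;

import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.Response;
import javax.ws.rs.core.SecurityContext;

import oracle.sample.maf.accesscontrol.bo.ACSRequest;
import oracle.sample.maf.accesscontrol.bo.ACSResponse;

@Path("secureuser")
@Consumes("application/json")
@Produces("application/json")
public class SecureUserResource {

    @POST
    public Response postData(@Context SecurityContext securityContext, ACSRequest request) {
        Principal caller = securityContext.getUserPrincipal();

        // Reject anonymous calls outright.
        if (caller == null) {
            return Response.status(Response.Status.UNAUTHORIZED).build();
        }

        // Users may only query their own roles and privileges, unless they are administrators.
        boolean queryingSelf = caller.getName().equals(request.getUserId());
        if (!queryingSelf && !securityContext.isUserInRole("acs-admin")) {
            return Response.status(Response.Status.FORBIDDEN).build();
        }

        // Replace this with a real lookup (database, OPSS APIs, LDAP...).
        ACSResponse rolesAndPrivileges =
            new ACSResponse(request.getUserId(), new String[] { "user" }, new String[] { "user" });
        return Response.ok(rolesAndPrivileges).build();
    }
}
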
The Access Control service is an essential component of the ADF Mobile architecture. Since Oracle only specifies the service signature, you can implement it in the way that best fits your infrastructure and the level of authorization granularity you expect.  

Thursday Feb 27, 2014

Sometimes a new skin is not enough

I often have the opportunity to present about Oracle's mobile technologies. In those presentations, I usually explain to the audience that building and maintaining a mobile application will force them to make their software development processes more agile. The reason for this is simple. Mobile technologies evolve at consumer speed. Mobile operating systems are updated frequently, and applications must follow suit. The best proof I can give is the evolution of ADF Mobile itself. The initial version of the framework was released in October 2012. Between that date and January 2014, we published one new version and five distinct patch sets for it; that's one update every 11 weeks on average. Such frequent releases are unusual for Oracle. In the mobile space, they're the norm.

Recently, I installed ADF Mobile Patch 5. This version includes a new skin which brings an iOS 7 look and feel for Apple devices as well as a native look and feel for Android devices. In addition, JDeveloper now supports Xcode 5 to package and deploy iOS applications. After the update, my applications didn't look right when I deployed and ran them. It turns out you must update the skin-family in adfmf-config.xml from this

<skin-family>mobileFusionFX</skin-family>

to this

<skin-family>mobileAlta</skin-family>

In addition, I suggest you perform a clean all (Build > Clean All) before deploying your applications. Mine would not pick up the new skin otherwise. 

Monday Jan 27, 2014

Picking up the threads in ADF Mobile

There is a huge difference between the actual performance of an application and the user's perception of that performance. Typically, developers will try to improve the latter by delegating time-consuming tasks to background threads; in other words: asynchronous processing makes it possible to keep the user interface responsive at all times. This is especially important in mobile applications, where network bandwidth and latency can fluctuate wildly in a short time frame. Users are not necessarily aware of changes in network conditions, and thus will readily ascribe any slowdown to the application itself. Consequently, multithreaded programming is an essential part of the mobile developer's tool set.

ADF Mobile applications run on a Java virtual machine. Therefore, they can start threads that will exist in the context of the JVM process. In the current release, the ADF Mobile JVM follows the JavaME CDC specification, which is based on Java 1.4. This means that, unfortunately, the improvements brought by JSR 166 (java.util.concurrent)  are not available. On the other hand, threads are well integrated in the ADF Mobile framework. They can invoke AdfmfJavaUtilities.invokeDataControlMethod or AdfmfJavaUtilities.setELValue, for example. This makes it possible for you to update the user interface or refresh a bound collection in memory from a thread among other things.

The Apple iOS and Google Android operating systems manage application-related resources themselves. In iOS, when you switch to another application, the current application is suspended. On the other hand, Android's behavior in the same scenario will vary depending on the free memory available on the device. Typically, the processes belonging to an application will continue to run in the background after the switch; when memory is scarce, the operating system may force-kill the process. What happens when you switch away from an ADF Mobile application is thus dependent on the underlying OS. Any threads started by the application process will behave in the same way as the process itself.  By default, threads will suspend and resume by themselves on iOS; they will still run in the background on Android. 

If you want to implement multithreading in your application, my recommendation is to always manage the state of your threads explicitly and to interrupt them when the application is deactivated or suspended. This will ensure the integrity of your data and will make the application behave the same way independently of the operating system. Interrupting a thread is done by calling the interrupt() method of the Thread class and by checking the return values of the interrupted() or isInterrupted() methods inside the run() method of the thread or of the runnable. The proper location for the call to interrupt() is a listener class implementing the oracle.adfmf.application.LifeCycleListener interface; such listeners must be registered in adfmf-application.xml. The activate() and deactivate() methods it specifies will be invoked even if the application is killed through the Android task manager. Typically, in addition to interrupting the threads, the application will need to do the following in order to ensure a proper deactivation:

  • Write any restorable state to an appropriate store
  • Close database cursors and connections
  • Defer pending web service requests
  • Release resources such as files

These tasks can be performed by the threads themselves, by their associated java.lang.Runnable instances, or somewhere else. Be careful, though, since activate() and deactivate() will not be called if the application is terminated. It is also possible to implement listeners at the feature level if more granularity is needed. Such listeners implement the oracle.adfmf.feature.LifeCycleListener interface instead. Please note calls to activate() and deactivate() are blocking; you will need to be careful to ensure the application doesn't look unresponsive to the user.
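
To illustrate the pattern, here is a minimal sketch of an application lifecycle listener that interrupts a background worker on deactivation. The static worker field and its setter are simplifications introduced for this sketch; a real application would more likely keep track of its threads in a dedicated registry.

package oracle.sample.mobile.lifecycle;

import oracle.adfmf.application.LifeCycleListener;

public class WorkerAwareLifeCycleListener implements LifeCycleListener {

    // Hypothetical: the background worker registers itself here when it starts.
    private static volatile Thread worker;

    public static void setWorker(Thread thread) {
        worker = thread;
    }

    public void start() {
    }

    public void activate() {
        // Restart or resume background work here if appropriate.
    }

    public void deactivate() {
        Thread current = worker;
        if (current != null && current.isAlive()) {
            // The run() method must poll interrupted() or isInterrupted()
            // and exit cleanly once the flag is set.
            current.interrupt();
        }
        // This is also the place to write restorable state, close cursors
        // and connections, defer pending web service requests and release files.
    }

    public void stop() {
        deactivate();
    }
}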

Resource contention is without a doubt one of the greatest challenges any multithreaded application must solve. In ADF Mobile, each local database corresponds to a single file; the SQLite database engine thus implements a complex but reliable locking system. Fortunately, ADF Mobile encapsulates all the complexity. If two threads - each possessing its own JDBC connection to the database - try to write at the same time, no exception will be thrown. One of the threads will own the write lock and will be able to proceed, while the other will wait. In other words: there can be only one database connection in write mode at any given time. All other connections will be in read-only mode until they can acquire the write lock. This will influence the design of your application. For example: if you have to insert a sizable number of records in a background thread, you should perform the operation in smaller batches in order to yield the lock to other threads of higher priority, as sketched below.
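
Here is a minimal sketch of such a batched insert using plain JDBC, written against the Java 1.4 level APIs available in the ADF Mobile JVM. The table name, column and batch size are arbitrary, and the connection is assumed to be opened (and its password handled) elsewhere.

package oracle.sample.mobile.db;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchInserter {

    private static final int BATCH_SIZE = 50; // arbitrary; tune to your own workload

    // Inserts the given names into a hypothetical COUNTRIES table, committing in small
    // batches so that the write lock is released periodically for other connections.
    public static void insertCountries(Connection connection, List names) throws SQLException {
        connection.setAutoCommit(false);
        PreparedStatement statement = connection.prepareStatement("INSERT INTO COUNTRIES (NAME) VALUES (?)");
        try {
            int pending = 0;
            for (int i = 0; i < names.size(); i++) {
                statement.setString(1, (String) names.get(i));
                statement.executeUpdate();
                pending++;
                if (pending == BATCH_SIZE) {
                    connection.commit(); // yields the write lock until the next batch
                    pending = 0;
                    // Stop early if the application is being deactivated.
                    if (Thread.interrupted()) {
                        return;
                    }
                }
            }
            connection.commit();
        } finally {
            statement.close();
        }
    }
}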

Writing a good application is not easy, nor is writing a well-performing one. Multithreading can help with the latter, but you must be careful not to waste resources when the application is not in the foreground. After all, performance is not the only component in the user's perception of your application; battery life counts as well...

Friday Oct 25, 2013

Juggling with JDKs on Apple OS X

I recently got a shiny new MacBook Pro to help me support our ADF Mobile customers. It is really a wonderful piece of hardware, although I am still adjusting to Apple's peculiar keyboard layout. Did you know, for example, that the « delete » key actually performs a « backspace »? But I digress... As you may know, ADF Mobile development still requires JDeveloper 11gR2, which in turn runs on Java 6. On the other hand, JDeveloper 12c needs JDK 7. I wanted to install both versions, and wasn't sure how to do it.

If you remember, I explained in a previous blog entry how to install JDeveloper 11gR2 on Apple's OS X. The trick was to use the /usr/libexec/java_home command in order to invoke the proper JDK. In this case, I could have done the same thing; the two JDKs can coexist without any problems, since they install in completely different locations. But I wanted more than just installing JDeveloper. I wanted to be able to select my JDK when using the command line as well. On Windows, this is easy, since I keep all my JDKs in a central location. I simply have to move to the appropriate folder or type the folder name in the command I want to execute. Problem is, on OS X, the paths to the JDKs are... let's say convoluted. 

Here is the one for Java 6.

/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home

The Java 7 path is not better, just different.

/Library/Java/JavaVirtualMachines/jdk1.7.0_45.jdk/Contents/Home

Intuitive, isn't it? Clearly, I needed something better...

On OS X, the default command shell is bash. It is possible to configure the shell environment by creating a file named « .profile » in a user's home folder. Thus, I created such a file and put the following inside:

export JAVA_7_HOME=$(/usr/libexec/java_home -v1.7)
export JAVA_6_HOME=$(/usr/libexec/java_home -v1.6)

export JAVA_HOME=$JAVA_7_HOME

alias java6='export JAVA_HOME=$JAVA_6_HOME'
alias java7='export JAVA_HOME=$JAVA_7_HOME'

 The first two lines retrieve the current paths for Java 7 and Java 6 and store them in two environment variables. The third line marks Java 7 as the default. The last two lines create command aliases. Thus, when I type java6, the value for JAVA_HOME is set to JAVA_6_HOME, for example. 

I now have an environment which works even better than the one I have on Windows, since I can change my active JDK on a whim. Here is a sample, fresh from my terminal window.

fdesbien-mac:~ fdesbien$ java6
fdesbien-mac:~ fdesbien$ java -version
java version "1.6.0_65"
Java(TM) SE Runtime Environment (build 1.6.0_65-b14-462-11M4609)
Java HotSpot(TM) 64-Bit Server VM (build 20.65-b04-462, mixed mode)
fdesbien-mac:~ fdesbien$ 
fdesbien-mac:~ fdesbien$ java7
fdesbien-mac:~ fdesbien$ java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)
fdesbien-mac:~ fdesbien$ 

Et voilà! Maximum flexibility without downsides, just as I like it.

Monday Oct 07, 2013

ADF at 12c: time to get the facts straight

The 12c release is a major milestone both for ADF and JDeveloper. Most of you already know how strategic ADF is to Oracle; day after day, thousands of our own developers use it to build Fusion Applications, Enterprise Manager, SOA Suite and WebCenter, among others. What is not readily apparent is how much maturity there is in the framework. The roots of ADF can be traced back to 1999, when the first release of Java Business Objects (JBO) was made available. The ancestor of today's ADF Faces components, User Interface XML (UIX), was introduced in 2002. More than ten years later, ADF is still going strong and the best is yet to come.

I have been developing with ADF since 2007. Since then, I often had the opportunity to introduce new developers to it. While I was often greeted with skepticism, the natural qualities of the framework and the productivity brought by JDeveloper usually won minds if not hearts. I saw this once again at OpenWorld 2013. Obviously, ADF is not perfect and there are several worthy alternatives in the market. But what surprises me is that many of the objections made against ADF stem from misconceptions - even after all those years. Here are five of the most common ones:

  • ADF is not open.
  • ADF is just for huge enterprise applications.
  • ADF is proprietary.
  • ADF is tied to JDeveloper, WebLogic and Oracle Database.
  • ADF is expensive.

My aim in this post is to get the facts straight. Let's discuss each of them.

ADF is not open

This is something I heard frequently. But what does « open » mean? Is it about access to the source code? About technical interoperability? Maybe about stewardship? Customers covered by a valid support contract can request access to the ADF source code. Not only that, but some of its components have been released under open-source licenses, the most significant being Apache MyFaces Trinidad. And since ADF is built on top of Java Enterprise Edition, it integrates with other solutions running on the platform. True, Oracle keeps full control over strategic orientations and new features. But our company is making significant efforts to better address the concerns of the community. The ADF Enterprise Methodology Group, for example, is a great forum to propose and discuss new features. We follow closely what is posted there and will never hesitate to open enhancement requests if needed.

ADF is just for huge enterprise applications

This is a funny one, and probably comes from the fact that ADF is based on Java. Yet, small and simple applications are a cinch to implement with the framework; it focuses on productivity first. Lots of people forget that ADF favors a code-last approach. In other words: most ADF artifacts can be implemented declaratively rather than through code. In addition, most of the time, developers will build the user interface simply by dragging and dropping attributes from the Data Controls palette. Moreover, ADF puts great emphasis on reuse. Entity Objects, View Objects, Task Flows and Page Fragments are inherently reusable. You can push this even further by using page fragments, JSF templates and declarative components. Thus, you can reduce the actual size of your applications by sharing code extensively between applications. It is also essential to remember that ADF implements several common software patterns, such as Model-View-Controller, for you. This results in a little more complexity, but ensures that even the smallest of your applications adhere to industry best practices.

ADF is proprietary

ADF is certainly unique to Oracle. In fact, it represents one of our biggest competitive advantages in the marketplace. Yet, some people conveniently forget it is a superset of Java Enterprise Edition first and foremost. ADF Faces, for example, is probably the most comprehensive set of JSF components available right now; the Data Visualization components now render to HTML 5 instead of Flash. On the other hand, ADF offers extensive support for the SOAP protocol and the WS-* extensions, which are industry standards. Yes, ADF deviates from JEE in some cases - but typically this is because it was ahead of the curve. ADF BC is rooted in JBO, a technology introduced in 1999. EJBs didn't deliver the performance and features required by developers at the time. In 2008, ADF Controller and Task Flows brought more flexibility than the standard JSF controller - which finally caught up in 2013 in JEE 7. We even make it possible to use EJB or JPA to implement business logic if you prefer them to ADF BC. 

Moving forward, you can expect ADF to integrate many more standards, but not at the cost of innovation.

ADF is tied to JDeveloper, WebLogic and Oracle Database

This one was true a few years ago. Nowadays, you can build ADF applications in Eclipse by installing the Oracle Enterprise Pack for Eclipse plugins. You can use almost any SQL92-compliant database with ADF, and we even offer optimizations specific to IBM DB2 and Microsoft SQL Server. Best of all, we offer integration to various Application Lifecycle Management platforms in JDeveloper and OEPE, but are not offering one ourselves. You get to choose the tools you prefer to support your development process. And with the free ADF Essentials, you can deploy your ADF applications to nearly all containers implementing the Java Enterprise Edition web profile. GlassFish server is an obvious choice here, but old favorites like Apache Tomcat and JBoss can be used too. 

ADF is expensive

No, it's not. For a long time, ADF has been merely inexpensive, as it was bundled with WebLogic Server. With ADF Essentials, the core features are free; the features cut from it are essentially hooks to other Fusion Middleware products. The developer tools, by the way, are completely free. You cannot really appreciate that fact unless you had to pay for multiple IDE licenses for your team, something I had to do earlier in my career when I was building software for IBM and Microsoft platforms.

Conclusion

ADF 12c is here. And it's here to stay. Maybe you should consider it...

Wednesday Sep 04, 2013

Out of the matrix: the ADF EMG goes to OpenWorld!

The ADF Enterprise Methodology Group is an invaluable resource for ADF architects and experienced developers. Where else on the web can you discuss best practices and methodologies to deliver enterprise-level ADF applications? If you like it as much as I do, you will be glad to know there is an ADF EMG day this year at Oracle OpenWorld. Yes, the EMG leaves the matrix and boldly comes to the real world for a full day of presentations by well-known experts from the ADF community. And there is no better way to prepare for the conference than to share stories from the trenches over a few beers at the ADF EMG social night. I hope to meet you there!

Sunday Sep 01, 2013

Think TV is boring? Think again!

Since July, I have been busy contributing to an exciting new initiative from Oracle. Our aim is simple: to revolutionize TV! While Apple's infamous iTV is still little more than a rumor, we are delivering! Ok. Maybe I got a little carried away... Anyway, if you are an ADF architect or developer, you really should check out our ADF Architecture TV channel on YouTube. There, you will find a series of weekly episodes which will help you learn ADF design and best practices from key Oracle ADF specialists. This is real, deep technical content; not marketing fluff. Make sure you subscribe, as we have over 100 episodes planned covering security, web services, PL/SQL integration, UI design and much more! This week's episode was recorded by Frank Nimphius and is about Task Flows.

The episodes I recorded are about internationalization. There are five of them, covering resource bundles, character encoding and time zone management, among other things. I will let you know when they are released.

Who would have thought TV could get this interesting? Here is the official trailer for the channel:


Monday Jul 29, 2013

JDeveloper 12c and Maven: Using relative paths in your POM files

One of the greatest things about JDeveloper 12c is the significantly improved support for Apache Maven. By default, the POM files produced by my favorite IDE contain some absolute paths, such as the folder for the ojmake/ojdeploy executables and references to project (.jpr) and application (.jws) files. Here is a sample of such paths extracted from an application I created on a Windows machine.

      <plugin>
        <groupId>com.oracle.adf.plugin</groupId>
        <artifactId>ojmake</artifactId>
        <version>12.1.2-0-0</version>
        <configuration>
          <ojmake>
            C:\Oracle\Middleware1212\jdeveloper\jdev\bin\ojmake.exe
          </ojmake>
          <files>
            C:\OracleData\JDEV_USER_DIR\mywork\MavenTest\Model\Model.jpr
          </files>
          <usemaven>
            true
          </usemaven>
        </configuration>
        <executions>
          <execution>
            <phase>compile</phase>
            <goals>
              <goal>compile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>

This is perfectly fine if your development team uses workstations that are configured in a consistent way and run under the same operating system. That is often not the case in the real world. Fortunately, Maven provides features that make it very easy to use relative paths instead.

The first thing you should know is that Maven supports the use of variables in POM files. Those variables are always referenced using the following syntax:

${variable_name}

There are several built-in variables provided by Maven. One of them is ${basedir}, which points to the folder where the current POM is located. In addition, Maven can access any environment variable defined by the operating system. This is achieved through the following syntax:

${env.variable_name}

 Thus, it is possible to remove all absolute paths from the sample above by using ${basedir} and referencing an environment variable. Suppose I created such a variable named OJ_HOME, which points to  C:\Oracle\Middleware1212\jdeveloper\jdev\bin. Then, the POM would look like this:

      <plugin>
        <groupId>com.oracle.adf.plugin</groupId>
        <artifactId>ojmake</artifactId>
        <version>12.1.2-0-0</version>
        <configuration>
          <ojmake>
            ${env.OJ_HOME}\ojmake.exe
          </ojmake>
          <files>
            ${basedir}\ViewController.jpr
          </files>
          <usemaven>
            true
          </usemaven>
        </configuration>
        <executions>
          <execution>
            <phase>compile</phase>
            <goals>
              <goal>compile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>

This POM will run on any workstation, provided the OJ_HOME variable is set to a suitable value.

Thursday Jun 20, 2013

Saved by the local database!

In my last post, I told you about my latest ADF Insider Essentials recording on the local database, and pointed you to the companion code sample. I had lots of feedback about both. I am glad to see I have so many viewers and readers!

Among all the questions I got, one was asked very frequently: « How can I transparently retrieve data from the database when a web service call fails? » In other words: how can the local database save my life if the web service doesn't respond? This is a bit different from what I had implemented initially. Thus, I built a new version of the sample application which does exactly that. In the original sample, the code simply detected the presence or absence of a network connection; an available connection meant the web service call was assumed to succeed. Otherwise, an exception was raised and displayed to the user. Consequently, the key change to obtain the desired behavior is simply to catch the exception. Then, it is easy to invoke the method that retrieves data from the database instead.

Here is the relevant method in the sample application.

    private CountryBO[] getCountriesFromWS() {
        try {
            GenericType genericReturnValue =
                (GenericType)AdfmfJavaUtilities.invokeDataControlMethod("HR_WS", null, "findCountry", new ArrayList(),
                                                                        new ArrayList(), new ArrayList());
            CountryBO[] returnValue =
                (CountryBO[])GenericTypeBeanSerializationHelper.fromGenericType(CountryBO[].class, genericReturnValue,
                                                                                "result");

            Arrays.sort(returnValue);
            return returnValue;

        } catch (AdfInvocationException aie) {
            if (AdfInvocationException.CATEGORY_WEBSERVICE.compareTo(aie.getErrorCategory()) == 0) {
                AdfmfContainerUtilities.invokeContainerJavaScriptFunction("oracle.adfinsider.localdb.countries",
                                                                          "navigator.notification.alert",
                                                                          new Object[] { "The web service is unavailable. \n\n Data has been retrieved from the local cache.",
                                                                                         "null", "Warning", "Ok" });

                return getCountriesFromDB();
            } else {
                throw new RuntimeException(aie);
            }
        } catch (Exception ex) {
            Utility.ApplicationLogger.severe(ex.getMessage());
            throw new RuntimeException(ex);
        }
    }

Another nice thing this code snippet demonstrates is how to call JavaScript code from Java business logic. In the catch block, I invoke a method which is part of the Apache Cordova library to display a warning message to the user. Cordova is an integral component of ADF Mobile, but your AMX pages must be properly configured in order to use it. I added the proper references to the countriesList.amx page like this:

  <amx:panelPage id="pp1">
    <amx:verbatim id="v1">
        <script type="text/javascript">if (!window.adf) window.adf = {}; adf.wwwPath = "../../../../www/";</script> 
        <script type="text/javascript" src="../../../../www/js/base.js"></script>
        <script type="text/javascript" src="../../../../www/js/cordova-2.2.0.js"></script>
    </amx:verbatim>
    <amx:facet name="header">
      <amx:outputText value="#{viewcontrollerBundle.COUNTRIES}" id="ot1"/>
    </amx:facet>
...
  </amx:panelPage>

I placed the script references inside a verbatim tag, which ensures that they will be rendered as is in the page.

While I was at it, I fixed a few other issues with the sample. In the original version, the database connection was closed inside the stop() method of the LifeCycleListenerImpl class. The stop() method is usually called when the user exits the application; there is no guarantee it will be, however. Thus, the connection wouldn't be closed properly in some corner cases. To fix this, I moved the code to the deactivate() method, which doesn't suffer from the same drawback and will be called each time the user switches to another application. This is much better, as the connection will be properly closed even if the device crashes while the application is inactive.
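
For reference, here is a minimal sketch of the kind of listener involved. The static connection field and its setter are simplifications introduced for this sketch; the actual sample shares its connection differently, so treat this only as an illustration of where the close belongs.

package oracle.sample.mobile.lifecycle;

import java.sql.Connection;
import java.sql.SQLException;

import oracle.adfmf.application.LifeCycleListener;

public class DatabaseLifeCycleListener implements LifeCycleListener {

    // Hypothetical: the data access code registers the shared connection here.
    private static volatile Connection connection;

    public static void setConnection(Connection conn) {
        connection = conn;
    }

    public void start() {
    }

    public void activate() {
    }

    // Called each time the user switches to another application, so the connection
    // is released even if stop() is never invoked.
    public void deactivate() {
        Connection conn = connection;
        if (conn != null) {
            try {
                conn.close();
            } catch (SQLException e) {
                // Nothing more can be done at this point; log and move on.
            } finally {
                connection = null;
            }
        }
    }

    public void stop() {
        deactivate();
    }
}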

You can download the refreshed sample application here.

Friday May 17, 2013

See the ADF Mobile local database in action

ADF Insider recordings are probably one of the best parts of my job. They are a lot of work, sure. But they are lots of fun to do and reach so many members of the ADF Community...

My latest recording is on the ADF Mobile local database. In it, I explore the various aspects of the feature and devote a healthy chunk of time to the management of the database file. The slides contain a few selected code snippets, but I thought it would be better to build a sample application to fully illustrate the concepts. In particular, I wanted to show how it is possible to retrieve data from either a web service or the local database while binding the UI to a POJO Data Control.

My sample application is made of two distinct components:

   - A simple SOAP web service (SDO view object) built on the top of the HR database schema. 

   - The ADF Mobile application itself, which demonstrates local database techniques and calls the web service when a network is available. Data is fetched from the local database when there is no connectivity.

I contributed the application to the ADF Enterprise Methodology Group samples repository. It is not listed on the web pages right now, but you can download it from the following location:

https://svn.java.net/svn/smuenchadf~samples/ADFMobileLocalDatabase.zip

My recording is available on our YouTube channel here: http://www.youtube.com/watch?v=-XzE1n_j5Nc

About

Frédéric Desbiens

The musings of a member of the Mobility and Development Tools Product Management team.

I focus here on my favorite development frameworks, namely Oracle ADF and the Oracle Mobile Application Framework (MAF), but also have a strong interest in SOA and web services.

The views expressed on this blog are my own and do not necessarily reflect the views of Oracle.
