Tuesday Jan 24, 2012

Visual Notification During Long Running Transactions

From time to time, there will be transactions within your application which do not finish within the "blink of an eye". Later this week I'll be writing up a series of articles specifically on launching long running tasks asynchronously, but first I wanted to cover the case of a transaction which has to happen in real-time but ties up the browser whilst it's doing so. 

A file upload is a typical example of this. The user selects a 500MiB file and presses submit; it's probably time to go and get a cup of tea. So the question is, how do you tell the user to go and find something more interesting to do for a while and, for that matter, please don't close your browser window...

Both Frank Nimphius and Andrejus Baranovskis have written articles on this in the past (How-to show a glasspane and splash screen for long running queries and Glasspane in ADF Faces RC, respectively). However, both of those articles concentrate on showing a dialog as a glasspane to block user input and notify the user of some information. In my case the upload was already in a dialog, and popping up another layered dialog on top of that would be ugly, so I wanted to find a way to display a loading indicator of some sort, in-line.

The result is a JavaScript routine that can be called from a clientListener. If you point it at a dialog it will pop the glasspane and dialog; otherwise it will display an inline component, such as an image with a spinning logo or some text.

The Script

Here is the JavaScript. The first method, showWhenBusy(), is the one called from the clientListener. It reads the ID of the component that we want to show/hide from a clientAttribute called loadingIndicatorId. This keeps the code nice and generic as we've not had to hardcode component IDs. 

//Global variable to hold the component ref.
var loadingIndicatorComponent; 

function showWhenBusy(event) {
  //get the dialog or other component we want to show and hide
  var componentId = event.getSource().getProperty('loadingIndicatorId');
  loadingIndicatorComponent = AdfPage.PAGE.findComponent(componentId);
    
  if (loadingIndicatorComponent != null) {
    AdfPage.PAGE.addBusyStateListener(loadingIndicatorComponent,handleBusyStateCallback);        
    event.preventUserInput();
  }
  else {
    AdfLogger.LOGGER.logMessage(AdfLogger.SEVERE, "Requested indicator component not found");
  }
}

As you can see, all that this method does is store the indicator component in a global JS variable and then register a busy state listener that the framework will invoke as it starts and ends the blocking operation.

The Listener

The listener is where all of the work happens. Here we first check whether the requested indicator component is a dialog or not, and then, based on the busy state, do the right thing to show or hide it. In the case of the dialog this is a simple matter of calling the show() and hide() methods on the component. In the case of any other component we achieve the effect by setting the CSS display style. Note that in order to do this, we need to get a handle to the real DOM element that represents the component; this is what the call to AdfAgent.AGENT.getElementById() is doing:

function handleBusyStateCallback(event) {

  if (loadingIndicatorComponent != null) {
    // Check if this is a dialog as
    // this needs different treatment
    var isDialog =
        (loadingIndicatorComponent.getComponentType() == "oracle.adf.RichPopup");

    if (event.isBusy()) {
      if (isDialog) {
        loadingIndicatorComponent.show();
      }
      else {
        var loadingIndicatorElement =
            AdfAgent.AGENT.getElementById(loadingIndicatorComponent.getClientId());
        loadingIndicatorElement.style.display = "inherit";
      }
    }
    else {
      if (isDialog) {
        loadingIndicatorComponent.hide();
      }
      else {
        var loadingIndicatorElement =
            AdfAgent.AGENT.getElementById(loadingIndicatorComponent.getClientId());
        loadingIndicatorElement.style.display = "none";
      }

      AdfPage.PAGE.removeBusyStateListener(loadingIndicatorComponent,
                                           handleBusyStateCallback);
    }
  }
}

Wiring it up 

In order to call the script we need a reference to it in the page. The normal place for this is an <af:resource> tag in the metaContainer facet of the document:

<af:document>
  ...
  <f:facet name="metaContainer">
    <af:resource type="javascript" source="/js/longRunningNotification.js"/>
  </f:facet>
</af:document>

Then the triggering component itself  uses a client listener to wire up the action and a clientAttribute to pass in the value of the required indicator component:

<af:commandButton text="Start Fong File Upload with Inline Message"
                  id="cb_upload_i"
                  partialSubmit="true">
  <af:clientAttribute name="loadingIndicatorId"
                      value="#{requestScope.uploadBB.loadingIndicatorId}"/>
  <af:clientListener method="showWhenBusy"
                     type="action"/>
</af:commandButton> 

Notice that in this case, rather than passing a hardcoded ID through to the clientAttribute, I'm calling a backing bean getter (#{requestScope.uploadBB.loadingIndicatorId}). The idea is that we can ask the component itself for its correct ID, reducing the margin for error. I have to give Frank the credit for this; it was his idea as we discussed the issue.

Set Up the Indicator Component 

For this to work, the component that I'm using as the indicator  needs a few attributes set:

  1. rendered and visible must be true
  2. clientComponent must be true 
  3. binding must be set to associate the component with a reference in a backing bean
  4. If the component is not a dialog then we need to set its initial display state to none so it will not be visible. This is done with inlineStyle.

Here's a sample of a panelBox that we might use as the "busy" indicator: 

<af:panelBox text="Uploading your large file...." id="pb1"
             clientComponent="true"
             binding="#{uploadBB.loadingBox}"
             inlineStyle="display:none;">
 <af:panelGroupLayout id="pgl5" layout="horizontal">
    <af:spacer width="60" height="10" id="s1"/>
    <af:image source="/images/working.gif" id="i1"/>
 </af:panelGroupLayout>
</af:panelBox>

Finally Wiring the ID 

The only missing bit now is how we get the ID of the component above into the clientAttribute that the JavaScript method is pulling. Recall that this was bound to the expression "uploadBB.loadingIndicatorId". So here's the implementation of that getter that lives in the page backing bean:

public String getLoadingIndicatorId() {
  return getLoadingBox().getClientId(FacesContext.getCurrentInstance());
} 
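For context, here's a minimal sketch of what the rest of that request-scoped backing bean might look like. Only the getLoadingIndicatorId() method above is from the post; the class name, the loadingBox property and its accessors are assumptions based on the binding expression used earlier.

import javax.faces.context.FacesContext;
import oracle.adf.view.rich.component.rich.layout.RichPanelBox;

// Hypothetical backing bean behind #{requestScope.uploadBB} - a sketch only.
public class UploadBackingBean {

    // Populated via binding="#{uploadBB.loadingBox}" on the indicator panelBox
    private RichPanelBox loadingBox;

    public void setLoadingBox(RichPanelBox loadingBox) {
        this.loadingBox = loadingBox;
    }

    public RichPanelBox getLoadingBox() {
        return loadingBox;
    }

    // From the post: hand the indicator's client ID to the clientAttribute
    public String getLoadingIndicatorId() {
        return getLoadingBox().getClientId(FacesContext.getCurrentInstance());
    }
}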

I think that this nicely extends Frank's technique to open up a whole new range of UI possibilities when you're doing something that is going to take some time and want to keep the user entertained. 

Friday Jan 06, 2012

APPC from ANT

Further to my earlier article on using the weblogic.appc command to precompile the pages in ADF applications, the question came up of how to automate the same thing from Ant. 

Now in principle this should be pretty simple as the same appc command is available as an Ant task.  However, the documentation is not particularly illuminating and completely lacks any examples so you can fumble around for hours if you're a skim reader and don't pay really close attention. 

Anyway, an example is worth a thousand words. Here's the same example as last time, but in Ant form. You will still need to follow the same process for digging out the libraries that you require.

build.xml

<project name="AppcTest" default="precompile" basedir=".">
  <description>Sample build file using Ant to call WLS APPC</description>

  <property name="wls_root" value="C:/builds/WLS_PS4" />
  <property name="wls_home" value="${wls_root}/wlserver_10.3" />
  <property name="adf_lib_root" value="${wls_root}/oracle_common/modules"/>
  <property name="common_lib_root" value="${wls_home}/common/deployable-libraries"/>  

  <path id="wls.classpath">
    <fileset dir="${wls_home}/server/lib">
      <include name="*.jar"/>
    </fileset>
  </path>

  <taskdef name="wlappc" classpathref="wls.classpath" 
           classname="weblogic.ant.taskdefs.j2ee.Appc"/>  

  <target name="precompile" description="Calls WLS APPC to pre-compile an EAR">
    <wlappc source="myapp.ear" 
            verbose="true" 
            classpath="${adf_lib_root}/oracle.adf.share_11.1.1/adfsharembean.jar" >
      <library file="${adf_lib_root}/oracle.adf.view_11.1.1/adf.oracle.domain.webapp.war"/>
      <library file="${adf_lib_root}/oracle.adf.model_11.1.1/adf.oracle.domain.ear"/>
      <library file="${common_lib_root}/jstl-1.2.war"/>
      <library file="${common_lib_root}/jsf-1.2.war"/>
    </wlappc>
  </target>

</project>

Wednesday Dec 21, 2011

ADF Performance Presentation from UKOUG

I've given this presentation a few times now at various events around the world, most recently at the UK Oracle Users Group. It's an ever evolving topic area so I'm sure the paper will change over time, but here's the version for now:

This is very much based on our experience of tuning real ADF Enterprise Applications, for both internal and external customers.  Hopefully there are a few useful nuggets of information in there for you.

Thursday Nov 24, 2011

JSP Precompilation for ADF Applications

A question that comes up from time to time, particularly in relation to build automation, is how to best pre-compile the .jspx and .jsff files in an ADF application, thus ensuring that the app is ready to run as soon as it's installed into WebLogic. In the normal run of things, the first poor soul to hit a page pays the price and has to wait a little whilst the JSP is compiled into a servlet. Everyone else subsequently gets a free lunch. So it's a reasonable thing to want to do...

Let Me List the Ways

So forth to Google (other search engines are available)... which led me to a fairly old article on WLDJ - Removing Performance Bottlenecks Through JSP Precompilation. Technology-wise it's somewhat out of date, but the one good point it makes is that it's really not very useful to try and use the precompile option in the weblogic.xml file. That's a really good observation - particularly if you're trying to integrate a pre-compile step into a Hudson Continuous Integration process. That same article mentions an alternative approach for programmatic pre-compilation using weblogic.jspc. This seemed like a much more useful approach for a CI environment. However, weblogic.jspc has now been superseded by weblogic.appc, so we'll use that instead. Thanks to Steve for the pointer there.

And So To APPC

APPC has documentation - always a great place to start, and supports usage both from Ant via the wlappc task and from the command line using the weblogic.appc command. In my testing I took the latter approach.

Usage, as the documentation will show you, is superficially pretty simple. The nice thing here is that you can pass an existing EAR file (generated of course using OJDeploy) and that EAR will be updated in place with the freshly compiled servlet classes created from the JSPs. Appc takes care of all the unpacking, compiling and re-packing of the EAR for you. Neat. 

So we're done right...? Not quite.

The Devil is in the Detail

 OK so I'm being overly dramatic but it's not all plain sailing, so here's a short guide to using weblogic.appc to compile a simple ADF application without pain. 

Information You'll Need

The following is based on the assumption that you have a stand-alone WLS install with the Application Development Runtime installed and a suitable ADF-enabled domain created. This could of course all be run off of a JDeveloper install as well.

1. Your Weblogic home directory. Everything you need is relative to this so make a note.  In my case it's c:\builds\wls_ps4.

2. Next deploy your EAR as normal and have a peek inside it using your favourite zip management tool. First of all look at the weblogic-application.xml inside the EAR /META-INF directory. Have a look for any library references. Something like this:

<library-ref>
   <library-name>adf.oracle.domain</library-name>
</library-ref> 

Make a note of the library ref (adf.oracle.domain in this case), you'll need that in a second.

3. Next open the nested WAR file within the EAR and then have a peek inside the weblogic.xml file in the /WEB-INF directory. Again  make a note of the library references.

4. Now start the WebLogic as per normal and run the WebLogic console app (e.g. http://localhost:7001/console). In the Domain Structure navigator, select Deployments.

5. For each of the libraries you noted down drill into the library definition and make a note of the .war, .ear or .jar that defines the library. For example, in my case adf.oracle.domain maps to "C:\ builds\ WLS_PS4\ oracle_common\ modules\ oracle. adf. model_11. 1. 1\ adf. oracle. domain. ear". Note the extra spaces that are salted throughout this string as it is displayed in the console - just to make it annoying, you'll have to strip these out.

6. Finally you'll need the location of the adfsharembean.jar. We need to pass this on the classpath for APPC so that the ADFConfigLifeCycleCallBack listener can be found. In a more complex app of your own you may need additional classpath entries as well. 

Now we're ready to go, and it's a simple matter of plugging the information we have gathered into the relevant command line arguments for the utility.

A Simple CMD File to Run APPC 

Here's the stub .cmd file I'm using on Windows to run this.

@echo off
REM Stub weblogic.appc Runner
setlocal

set WLS_HOME=C:\builds\WLS_PS4
set ADF_LIB_ROOT=%WLS_HOME%\oracle_common\modules
set COMMON_LIB_ROOT=%WLS_HOME%\wlserver_10.3\common\deployable-libraries
set ADF_WEBAPP=%ADF_LIB_ROOT%\oracle.adf.view_11.1.1\adf.oracle.domain.webapp.war
set ADF_DOMAIN=%ADF_LIB_ROOT%\oracle.adf.model_11.1.1\adf.oracle.domain.ear
set JSTL=%COMMON_LIB_ROOT%\jstl-1.2.war
set JSF=%COMMON_LIB_ROOT%\jsf-1.2.war
set ADF_SHARE=%ADF_LIB_ROOT%\oracle.adf.share_11.1.1\adfsharembean.jar

REM Set up the WebLogic Environment so appc can be found
call %WLS_HOME%\wlserver_10.3\server\bin\setWLSEnv.cmd
CLS

REM Now compile away!
java weblogic.appc -verbose -library %ADF_WEBAPP%,%ADF_DOMAIN%,%JSTL%,%JSF% -classpath %ADF_SHARE% %1

endlocal

Running the above on a target ADF .ear file will zip through and create all of the relevant compiled classes inside your nested .war file in the \WEB-INF\classes\jsp_servlet\ directory (but don't take my word for it, run it and take a look!).

And So...

In the immortal words of the Pet Shop Boys, Was It Worth It? Well, here's where you'll have to do your own testing. In my case, with a simple ADF application, pre-compilation shaved a non-scientific "3 Elephants" off of the initial page load time for the first access of each page. That's a pretty significant payback for such a simple step to add into your CI process, so why not give it a go.

Friday Sep 23, 2011

Adventures in Logging Index

It occurred to me that a lot of folks are referring to the Adventures in ADF Logging Series, but there was no master page as such which pulled them all together. Frankly I wanted a simple URL that I could then link from my various OOW, DOAG and UKOUG sessions this year.  So here are the individual entries in the series:

Article  Contents
Part 1   Basic introduction to the ADF logger covering programmatic logging
Part 2   Covers the JDeveloper Code templates I created to make it a snap to insert logging code
Part 3   How to control and configure your logging output
Part 4   Browsing and filtering log output results
Part 5   The true power of the ADF logger including ADF request tracing for performance analysis
Part 6   Seeing all your log output on the console in 12c

Related Logging and Diagnostics Topics

  1. Selective Suppression of Log Messages 
  2. Click History in ADF 12c
  3. Click History Part 2 - Access from Java

Bonus External Material

Friday Sep 02, 2011

Heatmap Styling in Tables

A question that has come up a couple of times recently is that of applying cell-level styling, for example background color / colour, within a conventional ADF table. In the pivot table this is trivial: you just apply the styleClass to the component inside the <dvt:dataCell>. For the conventional table, however, you'll run into a slight problem with this approach: the chances are that your styled inputText or outputText will not completely fill the cell, and you'll end up with whitespace around the component and a ragged right margin depending on the length of the data in each. 

The solution is, however, simple. Within the <af:column>, wrap the display component inside an <af:panelGroupLayout>. Set the PGL to a vertical layout so that it is stretched to fill the table cell, then just apply your styleClass expression to the PGL rather than to the input or output text within.

So you might end up with something like this:

<af:column headerText="#{bindings.Accounts.hints.AccountStatus.label}">
  <!-- styles with the same name as the Status value exist in the CSS --> 
  <af:panelGroupLayout layout="vertical" styleClass="#{row.bindings.Status.inputValue}">
    <af:outputText value="#{row.bindings.Status.inputValue}"/>
  </af:panelGroupLayout>
</af:column> 

Friday Jun 03, 2011

Adventures in ADF Logging - Part 3

Controlling Logging Output

So far in this series I've just been concentrating on how to add logging calls to your code. That's all very well but frankly, not a lot of use without some way to switch all that on.

When using a logging framework such as java.util.logging you generally have some external configuration file which you use to define which loggers are active, at what level, and for what class / package. In the case of basic Java logging this is accomplished with the logging.properties file that lives in the JRE /lib directory (or can be passed as a -D parameter as you start up the JVM). When using the ADFLogger the idea is similar, except this time logging is controlled using an XML configuration file - logging.xml - which lives in the WLS server's directory structure. It turns out that you don't need to know the exact location, so only go looking for it if you are really curious at this stage.

Making it All Too Easy

Editing the logging.xml file by hand would be OK; however, we like to make life easier than that, and so JDeveloper supplies a really simple-to-use graphical interface for editing this configuration. In fact, as we'll see, it does a bit more than that as well. Let's step through it now:

Step 1 - Start Your Engines

Or rather, start your embedded WebLogic instance. We can configure logging before WLS is running, but if we wait until it is actually running and your application has been deployed to it, we'll get visibility into all of the packages that are currently deployed into the running server. This will come in handy later on when we come to look at some of the really cool things that we can do with existing instrumentation, but I'll cover that in a later episode! For now just run your application.

Step 2 - Open the Logging Editor

There are two ways to get to the logging configuration screen.

  1. Open the Application Server Navigator (Ctrl + Shift + G), right mouse click on the IntegratedWeblogicServer and select Configure Oracle Diagnostic Logging for "IntegratedWeblogicServer".
  2. From the Log window (usually docked underneath your editor) pull down the Actions menu and select Configure Oracle Diagnostic Logging.

Both approaches open the same editor on top of the logging.xml file. Here it is, I've expanded the first couple of levels in the tree to make it interesting:

Image of the Logging configuration Screen

So what we see here is a hierarchy of all the packages in the running app server. Notice that some of the icons are green - these are packages or classes that already have a presence in the logging.xml - and some are yellow. This latter group are transient; that is, we can temporarily set up logging on that package or class for the duration of the application server session, but once the application server shuts down all of that logging configuration will be reverted. You'll notice the checkbox at the top of this screen labeled Hide Transient Loggers - you can guess what that does now.

Step 3 - Configure Some Logging

The rest is all fairly self-explanatory. To configure logging at any level you click on the level column and, from the pull-down, select the level you want to look at. In the image above you can see that I've set the logging for the Root Logger to the INFO level. This implicitly means that any child of that node (everything in this case, because it's the root) will also be logging at the INFO level unless it explicitly overrides that. Normally your root logger would log at WARNING or SEVERE; you just want to keep an eye out for any and all problems at that level.

So if, perhaps, we wanted to switch on CONFIG level logging for everything under the oracle.demo root, we would just set that in the level column. Of course, with CONFIG level set, INFO messages would still be printed out as well. So you can be as fine-grained about this as you want, controlling the logging level on a class-by-class or package-by-package basis.

If you need to set up a logger on a class / package that is not visible in the structure (yet), you can use the green add (+) button at the top left of the editor to add a new one. The loggers you add can be persistent (stored in the logging.xml) or transient (discarded at the end of the session). Note that transient loggers can only be created when you are actually running the internal WLS instance. 

As well as the logging levels it is also possible to configure the handlers.  In logging, the handler is the thing that takes your log message and does something with it - prints it to the console, writes it to a file etc. If you select the root logger in the editor here you will see that it defines three handlers listed in the Handler Declaration section of the screen at the bottom of the editor:

  • odl-handler
  • wls-domain
  • console

The first two would be the default for a standalone WLS instance; the console handler is automatically added for you in the case of the embedded WLS so that you will be able to see the output of your carefully placed logging commands in the log window of the IDE (hint: now is a good time to go to Tools > Preferences and increase the size of your logging buffer). I recommend that you do not change these handlers and just continue to inherit from the root; the reason for this will become evident in a later installment.

For example, in the sample I'm running here I have a few CONFIG and INFO level logging calls in the code, and these generate the following output on the console:

<Library> <initSampleData> Setting up Library sample data
<MainPageBackingBean> <<init>> Creating a new Instance
<MainPageBackingBean> <handleSlotCartridgeMove> Dropping Module : 1 : Left-12
<MainPageBackingBean> <<init>> Creating a new Instance
<MainPageBackingBean> <handleSlotCartridgeMove> Dropping Module : 1 : Left-14
<TaskFlowUtils> <diagnosticInitializer> Key:"displaySlot" Value:"Module : 1 : Left-9" Type:[oracle.demo.whippet.model.Slot]

Cryptic, yes - but I know what it all means in the context of this application and it's exactly the information that I need to monitor what's going on. It tells me that my backing bean classes are being created for every request, and in this case the drop handler for some drag and drop processing has received the payload I expect. Finally, one of my standard pieces of instrumentation, which I embed in my task flow initializer call, is reporting all of the parameters passed to it. All good stuff. 

Next time we'll take a look at this output in more detail and explore some of the real power of the logging once you start to relate all of this information together into sequences of events.

Tuesday May 31, 2011

Adventures in ADF Logging - Part 2

Logging Templates

OK last time I said I'd tell you about how to look at the logging output next, but then I got all enthusiastic this morning and thought I'd create some code templates to help you use the ADFLogger. Code templates are a really neat feature of JDeveloper and if there is some bit of code (like logging) that you use a lot then 5 minutes spent building a template can save you a bunch of time in the long run.

Here are the templates I've created:

Shortcut   Purpose
lgdef      A basic static class logger definition
lgdefr     A basic static class logger definition with resource bundle
lgdefp     A basic static package logger definition
lgi        Log statement for an informational message
lgc        Log statement for configuration information
lgw        Log statement for a warning message
lgs        Log statement for an error message
lgig       Guarded log statement for an informational message
lgcg       Guarded log statement for configuration information
lgwg       Guarded log statement for a warning message
lgsg       Guarded log statement for an error message
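
For a sense of what the guarded variants produce, here's a rough sketch of the kind of code the lgig shortcut might insert; the exact template body is my assumption, but the guarded pattern itself is the one covered in Part 1 of this series.

    // Sketch only - assumes a class-level ADFLogger named _logger (as created by lgdef)
    // and an import of java.util.logging.Level.
    if (_logger.isLoggable(Level.INFO)) {
        _logger.info("your informational message here");
    }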

Installing the Templates

I've made these templates available as an XML export that you can download from here: loggingTemplates.xml

To install these:

  1. Open  Tools > Preferences from the JDeveloper menu
  2. Expand the Code Editor  > Code Templates node in the preferences navigator
  3. Select the More Actions  > Import menu option as shown here and import the xml file.

Logging import

Friday May 27, 2011

Adventures in ADF Logging - Part 1

I see a lot of ADF code from both internal and external users of the framework and one thing that strikes me is how underused the ADFLogger is. There are a fair few blog articles in the community about the ADFLogger already, but they mostly repeat the basics and I wanted to go a little deeper here. 

The ADFLogger is a logging mechanism which is built into the ADF framework. It's basically a thin wrapper around the java.util.logging APIs with a few convenience methods thrown in and, most importantly, some specific features integrated into both JDeveloper and Enterprise Manager. All in all it's preferable to use this built-in logger for those reasons, plus it can help you avoid the kind of class-loading issues you might hit if you picked up some random version of something like Log4J.

Using the ADF Logger 

At its most basic you create and use one of these loggers like this:

First define the logger itself - usually a static variable in a particular class - for example one of your managed beans:

    private static ADFLogger _logger = 
            ADFLogger.createADFLogger(MainPageBackingBean.class); 

You'll notice here that the argument to the createADFLogger() function is the class. It can also be simply an identifying name or even a Java Package (check out the JavaDoc to learn more). By using a class reference here we are able to refine the logging output to focus very specifically on what's happening in this class. There is nothing wrong, however, in defining, say, a package level logger if you don't need that level of granularity / control. Indeed you probably want to define at least a parent logger at the top level of the hierarchy of loggers within your namespace (package structure) so that you can simply define a resource bundle for all the loggers to share. 

    private static ADFLogger _logger = 
                 ADFLogger.createADFLogger(
                              Package.getPackage("oracle.demo.whippet"),
                              "oracle.demo.whippet.LoggingResBundle"); 

This resource bundle will then* be inherited by any logger in the same hierarchy. Note also that the createADFLogger() function will create the logger instance for you, or, in this kind of scenario, return one that already exists. 

*OK well actually no. That should happen but in my testing it's actually not working correctly in Patchset 4. So for now if you want to associate a resource bundle with the logger do so at the level of the logger you are using to log rather than some parent.

Next, throughout your code you can sprinkle logging statements at various levels which are (I hope) fairly self-explanatory, for example in this constructor:

    public MainPageBackingBean() {
        super();
        _logger.info("Creating a new instance");
        if (BindingContext.getCurrent() == null) {
            _logger.warning("Injected Data-Binding Reference is null");
        } else {
            try {
                doSomethingComplex();
            } catch (WeirdAppException waex) {
                _logger.severe("Unexpected exception doing complex thing",
                               waex);
            }
        }
    }

As you can see, we generally just pass a String message to the logger, although in the error case we can pass a throwable exception as well. As I alluded to earlier, you can also grab your Strings from a resource bundle rather than hardcoding them. This probably makes sense for error messages but may be overkill for programmers'-eyes-only messages. To grab a message from the bundle you simply call the getFormattedMessage() function on the logger, or you can get hold of the resource bundle that it references using getResourceBundle().

    _logger.severe(_logger.getFormattedMessage("errors.db_connection")); 

Where the resource  bundle contains something like:

   errors.db_connection=Unable to establish connection with database!

You can pass parameters to inject into the resource string as well 
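
The post doesn't show that last point in code, so here's a minimal sketch of one way to do it, using the getResourceBundle() call mentioned above together with standard java.text.MessageFormat placeholders. The bundle key and the fileName / elapsedSeconds variables are illustrative assumptions, not part of the original example.

    // Sketch only: assumes a bundle entry such as
    //   errors.upload_failed=Upload of "{0}" failed after {1} seconds
    java.util.ResourceBundle bundle = _logger.getResourceBundle();
    String message = java.text.MessageFormat.format(
            bundle.getString("errors.upload_failed"), fileName, elapsedSeconds);
    _logger.severe(message);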

The final thing I wanted to mention in this posting was the use of guard conditions. Although logging calls themselves are cheap if logging is not enabled, you may actually need to do quite a lot of work to prepare the stuff you want to log. For example, I have a standardized method that I use as a Task Flow Initializer to dump out the contents of the PageFlowScope for that taskflow.  That's a pretty expensive operation so you want to bypass that whole thing if you know that every single log message within it will be ignored.

Therefore you can use the  isLoggable() method to wrap the whole thing. Here's the example:

    public void diagnosticInitializer() {

        //Only do the work if we need to
        if (_logger.isLoggable(Level.INFO)) {
            AdfFacesContext actx = AdfFacesContext.getCurrentInstance();
            FacesContext fctx = FacesContext.getCurrentInstance();
            //Gather key information to dump
            String windowName =  actx.getWindowIdProvider().getCurrentWindowId(fctx);
            String viewPort = ControllerState.getInstance().getCurrentViewPort().getClientId();
            Map pageflowScopeMap = actx.getPageFlowScope();

            _logger.info("TaskFlow Diagnostics for " + viewPort);

            if (!pageflowScopeMap.isEmpty()) {
                Iterator mapIter = pageflowScopeMap.entrySet().iterator();
                while (mapIter.hasNext()) {
                    Map.Entry entry = (Map.Entry)mapIter.next();
                    String varKey = entry.getKey().toString();
                    Object varValue = entry.getValue();
                    String varClass = "n/a";
                    if (varValue != null){
                        varClass = varValue.getClass().getName();
                    }
                    StringBuilder bldr = new StringBuilder();
                    Formatter formatter = new Formatter(bldr, Locale.US);
                    formatter.format("Key:\"%s\" Value:\"%s\" Type:[%s]", varKey, varValue, varClass);                    
                    _logger.info(bldr.toString());
                }
            } else {
                _logger.info("No PageFlowScope associated with this TaskFlow instance");
            }
        }
    }

That's all for now; next time we'll look at how you can actually view these log messages. 

Friday May 06, 2011

New Oracle Author Podcast

Just published is my recent interview with a well-known face in the ADF community and OTN ACE Director, Sten Vesterli of the aptly named Scott/Tiger. Sten's been busy working away on his second book, Oracle ADF Enterprise Application Development - Made Simple. The podcast is available now on OTN - download and listen.
