Monday Nov 21, 2011

Things to watch for at UK Oracle Users Group in Birmingham

Grant Ronald has put together a handy summary of all of the Oracle Application Development Framework sessions for the UKOUG Tech Conference this year:

If you're at the event then come along! 

Friday Sep 23, 2011

Adventures in Logging Index

It occurred to me that a lot of folks are referring to the Adventures in ADF Logging Series, but there was no master page as such which pulled them all together. Frankly I wanted a simple URL that I could then link from my various OOW, DOAG and UKOUG sessions this year.  So here are the individual entries in the series:

Article Contents

Part 1 - Basic introduction to the ADF logger covering programmatic logging
Part 2 - Covers the JDeveloper Code templates I created to make it a snap to insert logging code
Part 3 - How to control and configure your logging output
Part 4 - Browsing and filtering log output results
Part 5 - The true power of the ADF logger including ADF request tracing for performance analysis
Part 6 - Seeing all your log output on the console in 12c

Related Logging and Diagnostics Topics

  1. Selective Suppression of Log Messages 
  2. Click History in ADF 12c
  3. Click History Part 2 - Access from Java


Friday Sep 02, 2011

Heatmap Styling in Tables

A question that has come up a couple of times recently is that of applying cell-level styling, for example background color / colour, within a conventional ADF Table. In the Pivot Table this is trivial: you just apply the styleClass to the component inside the <dvt:dataCell>. For the conventional table, however, you'll run into a slight problem with this approach: chances are that your styled inputText or outputText will not completely fill the cell, so you'll end up with whitespace around the component and a ragged right margin depending on the length of the data in each row.

The solution is, however, simple.  Within the <af:column> wrap the display component inside of a <af:panelGroupLayout>. Ensure that you set the PGL to a vertical layout to ensure that it is stretched to fill the table cell.  Then just apply your styleclass expression to the PGL rather than the input or output text within.

So you might end up with something like this:

<af:column headerText="#{bindings.Accounts.hints.AccountStatus.label}">
  <!-- styles with the same name as the Status value exist in the CSS -->
  <af:panelGroupLayout layout="vertical" styleClass="#{row.bindings.Status.inputValue}">
    <af:outputText value="#{row.bindings.Status.inputValue}"/>
  </af:panelGroupLayout>
</af:column>

Thursday Aug 11, 2011

Arcane zip commands for your ADF App and Hudson

I'm in the process of automating the builds of some of my sample applications that I create for proofs of concept and the like, and given that I'm heavily involved in the Hudson project now it made a lot of sense to use Hudson to help me. Now this entry is not about Hudson per se; it is, however, related in that it's the type of syntax that you would use in a shell command executed from Hudson.

In this case I have a standard ADF project (Model, ViewController and the like)  which has been checked out from Subversion by Hudson and built with OJDeploy.  What I wanted to do in this case was also create a clean source archive from the checked out sources, without having to do a separate SVN export to get a clean copy.  So this involved working out the correct syntax of the unix zip command to exclude the artifacts (such as classes, WAR files and .svn directories) that I didn't want in the source zip file.

Anyway here's the command that creates the source zip in my top level deploy directory - hopefully it will save you a bunch of time if you ever need to do the same thing, it's one of those fiddly to get right things that always takes up way too much time.

The following is all one command; I've just split it onto two lines (with a continuation backslash) for readability:

zip -r ./deploy/${JOB_NAME}_${BUILD_NUMBER} * \
    -x 'deploy/*' -x '*/deploy/*' -x '*classes/*' -x '*.data*' -x '*.svn*'

Note that the exclusion patterns are all quoted so that the shell passes them through to zip rather than expanding them itself.

This command is issued as a Hudson build step from the ${WORKSPACE}, which is also the ADF Application workspace root directory.

Thursday Jul 14, 2011

Book Recommendation - Oracle ADF Enterprise Application Development - Made Simple

Throughout the early part of this year I was involved in the technical review of a new ADF book: Oracle ADF Enterprise Application Development - Made Simple (PACKT Press ISBN 978-1-849681-88-9), authored by Oracle ACE Director Sten Vesterli. Sten and I subsequently went on to do a podcast on the same subject. Well, as of a couple of weeks ago the book is now available for real and I've gotten my hands on a final copy to re-read, something which I've been doing over the last week or so.

I have to say that this is a book which I really really like. I think the approach is spot on in terms of what the book covers and the tone.  It's not attempting to re-hash the well trodden material that has already been covered in the doc set or some of the other books out there.  Rather it takes a much more pragmatic and practical approach to the tasks involved in really building out a practical ADF application in a team development environment, going beyond the coding mechanics and APIs into the practicalities.  A lot of Sten's real world experience is really shining through here, and it's great advice to listen to. 

I'd say this one is a must for any team leader or architect about to start their first ADF project in earnest.

Saturday Jun 11, 2011

Adventures in ADF Logging - Part 5

Unleashing the True Power of the ADFLogger

So far we've seen that the logging facility provided by the ADFLogger and the attendant tools in JDeveloper is all pretty useful and provides a powerful way of implementing and examining your logs. However, let's take it to the next level (here's where it gets really interesting).

You are, I'm sure, familiar with the -Djbo.debugoutput=console switch, a command line argument to WebLogic or the ADF/BC tester which provides a bunch of useful trace information such as when bindings are constructed and ADF/BC queries and bind variables are manipulated. It's generally regarded as a key tool in any ADF hacker's toolbox. However, you may have thought to yourself: this is great, but it's a pain having to re-run the application (and restart WLS) just to switch this diagnostic on... And you'd be right; it is a pain, and certainly not something you would entertain on a production system. But guess what - we can use the logging capabilities to do the same thing without a restart.

Debugoutput Without the Pain

 To work this magic, just run your application without the -Djbo.debugoutput flag and once it's running bring up the ODL Configuration Screen (Refer back to Part 3 in this series if you've not already read up on how to do that).

Now switch on FINEST for the following packages:

  • oracle.adf
  • oracle.adfinternal
  • oracle.jbo 

Here's what it should look like:

Switching on Logging for ADF
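As an aside, because the ADFLogger delegates to java.util.logging under the covers, flipping these levels is conceptually nothing more than the following sketch. This is plain JDK logging shown purely for illustration (the TraceSwitch class is invented for the example) - in practice you'd use the ODL configuration screen shown above rather than code like this:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class TraceSwitch {
    // Hold strong references: the LogManager only keeps weak ones, so
    // loggers configured here could otherwise be garbage collected
    private static final Logger[] TRACE_LOGGERS = {
        Logger.getLogger("oracle.adf"),
        Logger.getLogger("oracle.adfinternal"),
        Logger.getLogger("oracle.jbo")
    };

    // Switch the three diagnostic trees to FINEST, no restart required
    public static void enableFinest() {
        for (Logger logger : TRACE_LOGGERS) {
            logger.setLevel(Level.FINEST);
        }
    }

    public static void main(String[] args) {
        enableFinest();
        System.out.println(Logger.getLogger("oracle.jbo").getLevel()); // FINEST
    }
}
```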

Now go back to the web browser and carry out the activity that you're trying to trace (remember - no need to restart, the logging kicks in immediately). Once you're done open up the ODL Analyzer (refer to Part 4 in this series). The main thing that you'll observe is an awful lot of logging has happened. In this case I happen to have added my own logging call into an Entity Object accessor method so I can search the log for the call that is part of the transaction I'm interested in:

Search for an interesting record

Now I can use that Related By Request option that we looked at last time to see all the log messages associated with this one:

Filter By Request

Which leads to this more constrained list: 

Filtered List of results

Scrolling down this list of messages I'll start to encounter useful messages such as this set:

Interesting ADF Actions

In this case it's the SQL that is being generated to get the rowcount on a VO and I can look at the key information such as the actual count returned, the SQL and the bind variable substitution.

But Wait, There's More

The final piece in the jigsaw  is that mysterious Related By ADF Request option that I skipped over last time. Now that we have a full-fat diagnostic log let's use that option instead of plain Related by Request. When you do this you're taken to that By ADF Request tab in the analyzer. This is what you'll see:

ADF Request View

What you're seeing here is a complete breakdown of the ADF request by phase, with elapsed times - a really simple to understand view of what happened when, and how long it took. You can view the details of each message in the bottom pane, and in this case again we're looking at an Estimated Row Count call.

At this point I should probably shut up and let you drool. Enjoy!

Friday Jun 10, 2011

Adventures in ADF Logging - Part 4

Viewing The Results of Your Hard Work

Previously in this series we've seen how you can instrument your code and then subsequently switch logging on using the configuration editor for logging.xml. However, so far we've just looked at that output in the console / log window of JDeveloper, which is OK in the simple debugging scenario but really pales in comparison with what we could be doing here.

Recall from the previous installment of this series that the root logger defines several handlers for the log output and that console was on that list. Well, another handler on that list is the odl-handler - the Oracle Diagnostic Logging (ODL) utility. Let's look at how we can use that to get a bit more information out of the logging that we've painstakingly added to our code.

JDeveloper has a built-in viewer for the ODL logs which you can simply get to from the Tools > Oracle Diagnostic Log Analyzer menu option, or from the drop down menu in the log window:

Starting ODL Analyzer

You'll see that in the case of the Action Menu we have a shortcut to just open the log for the current session, which is what you need most of the time; however, you can also select any ODL diagnostic log (from any WebLogic instance, not just the integrated WLS) and diagnose that.

Now The Power is Yours

The ODL Analyzer will now appear and you can start to play. Note that once the ODL Analyzer is open you can click the magnifying glass icon at the top of the screen to browse the file system for other logs.  The file dialog that comes up already has a shortcut for you pointing to the WLS log directory of the integrated container:

ODL Log file selection

In this case I'm viewing the current log and I've filtered it to just show me logging generated in the last hour and coming from the oracle.demo.whippet package. However, you can see that there is a lot of filtering that you can do to slice and dice your logs:

Filtered ODL Screen

Notice that there are two tabs in the ODL Analyzer, make sure the one labelled By Log Message is selected at this stage.

This window actually has a detail view which in this case is collapsed below the table of log entries. That will give you more information about the selected log entry such as the authenticated user and more detail such as the stack trace if you logged a Throwable.

You can control the columns displayed in the table using the small downwards-facing arrow above the right hand scrollbar. Next, have a play with the Related pulldown. Click this for a particular log entry and it will present three options:

  • Related By Request - will filter the view to just show the logging entries that came out of the same HTTP Request 
  • Related By Time  - will filter the view to a specified timeslice (you'll get a pull-down to allow you to define how long back in time you want to look before the specified entry. The default is 30 seconds). This can be really handy when other things are happening on the system which may well be part of a separate request but impact the thing you are trying to diagnose - for example, an af:poll kicks off a PPR request that you were not expecting.
  • Related By ADF Request  - This one we'll cover in the next article in this series

So as you can see, the ODL Analyzer gives you a lot of power to both filter your logging output and start to logically trace what's happening within your application. Next time, though, I'll look at what happens when we incorporate that information into the context of the ADF Lifecycle. That's when things get extremely exciting.

Friday Jun 03, 2011

Adventures in ADF Logging - Part 3

Controlling Logging Output

So far in this series I've just been concentrating on how to add logging calls to your code. That's all very well but frankly, not a lot of use without some way to switch all that on.

When using a logging framework such as java.util.logging you generally have some external configuration file which you use to define which loggers are active, at what level, and for what class / package. In the case of basic Java logging this is accomplished with the logging.properties file that lives in the JRE /lib directory (or can be passed as a -D parameter as you start up the JVM). When using the ADFLogger the idea is similar, except this time logging is controlled using an XML configuration file - logging.xml - which lives in the WLS server's directory structure. It turns out that you don't need to know the exact location, so only go looking for it if you are really curious at this stage.
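For reference, that basic java.util.logging configuration file looks something like this - a minimal sketch for comparison only (the oracle.demo package name is illustrative, and you don't need this file at all when working with the ADFLogger):

```properties
# logging.properties - plain java.util.logging configuration
handlers=java.util.logging.ConsoleHandler
# default level for everything
.level=WARNING
# more detail for one package subtree
oracle.demo.level=CONFIG
java.util.logging.ConsoleHandler.level=ALL
```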

Making it All Too Easy

Editing the logging.xml file would be OK, however, we like to make life easier than that and so JDeveloper actually supplies a really simple to use graphical interface to edit this configuration. In fact, as we'll see it does a bit more than that as well. Let's step through that now:

Step 1 - Start Your Engines

Or rather, start your embedded WebLogic instance. We can configure logging before WLS is running, but if we wait until it is actually running and your application is deployed, we'll get visibility into all of the packages that are currently deployed into the running server. This will come in handy later on when we come to look at some of the really cool things that we can do with existing instrumentation, but I'll cover that in a later episode! For now just run your application.

Step 2 - Open the Logging Editor

There are two ways to get to the logging configuration screen.

  1. Open the Application Server Navigator (Ctrl + Shift + G), right mouse click on the IntegratedWeblogicServer and select  Configure Oracle Diagnostic Logging for "IntegratedWeblogicServer".
  2. From the Log window (usually docked underneath your editor) pull down the Actions menu and select Configure Oracle Diagnostic Logging.

Both approaches open the same editor on top of the logging.xml file. Here it is, I've expanded the first couple of levels in the tree to make it interesting:

Image of the Logging configuration Screen

So what we see here is a hierarchy of all the packages in the running app server. Notice that some of the icons are green - these are packages or classes that already have a presence in the logging.xml - and some are yellow. This latter group are transient; that is, we can temporarily set up logging on that package or class for the duration of the application server session, but once the application server shuts down all of that logging configuration will be reverted. You'll notice the checkbox at the top of this screen labeled Hide Transient Loggers - well, you can guess what that does now.

Step 3 - Configure Some Logging

The rest is all fairly self-explanatory. To configure logging at any level you click on the level column and from the pull-down select the level you want to log at. In the image above you can see that I've set the logging for the Root Logger to the INFO level. This implicitly means that any child of that node (everything in this case, because it's the root) will also log at the INFO level unless it explicitly overrides that. Normally your root logger would log at WARNING or SEVERE; you just want to keep an eye out for any and all problems at that level.

So if we wanted to switch on CONFIG level logging for everything under the oracle.demo root, we would just set that in the level column. Of course, with CONFIG set, INFO messages would still be printed out as well. So you can be as fine-grained about this as you want, controlling the logging level on a class-by-class or package-by-package basis.

If you need to set up a logger on a class / package that is not visible in the structure (yet) you can use the green add (+) button at the top left of the editor to add a new one in. The loggers you add can be persistent (stored in the logging.xml) or transient (discarded at the end of the session). Note that transient loggers can only be created when you are actually running the internal WLS instance.
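Under the covers the editor is just maintaining logging.xml entries for you. A persisted logger definition ends up looking roughly like this - a hand-written sketch only, and the exact format is really the editor's business:

```xml
<logger name="oracle.demo" level="CONFIG" useParentHandlers="true"/>
```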

As well as the logging levels it is also possible to configure the handlers.  In logging, the handler is the thing that takes your log message and does something with it - prints it to the console, writes it to a file etc. If you select the root logger in the editor here you will see that it defines three handlers listed in the Handler Declaration section of the screen at the bottom of the editor:

  • odl-handler
  • wls-domain
  • console

The first two would be the default for a standalone WLS instance, but the console handler has been automatically added for you in the case of the embedded WLS so that you will be able to see the output of your carefully placed logging commands in the log window of the IDE (hint: now is a good time to go to Tools > Preferences and increase the size of your logging buffer). I recommend that you do not change these handlers and just continue to inherit from the root. The reason for this will become evident in a later installment.
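If you're curious what a handler actually is underneath: in java.util.logging terms (the API the ADFLogger builds on), it's just a class that receives each LogRecord and does something with it. Here's a minimal, purely illustrative sketch - the CollectingHandler class is invented for this example, and the real odl-handler and console handlers are of course far more sophisticated:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public class HandlerDemo {
    // A trivial handler that just collects the messages it is given
    public static class CollectingHandler extends Handler {
        public final List<String> messages = new ArrayList<String>();
        @Override public void publish(LogRecord record) { messages.add(record.getMessage()); }
        @Override public void flush() { }
        @Override public void close() { }
    }

    public static void main(String[] args) {
        Logger logger = Logger.getLogger("oracle.demo.handlerdemo");
        logger.setUseParentHandlers(false); // don't also send output to the parent's console
        CollectingHandler collector = new CollectingHandler();
        logger.addHandler(collector);
        logger.setLevel(Level.INFO);
        logger.info("Creating a new Instance");
        System.out.println(collector.messages); // [Creating a new Instance]
    }
}
```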

For example, in the sample I'm running here I have a few CONFIG and INFO level logging calls in the code and this generates the following output on the console:

<Library> <initSampleData> Setting up Library sample data
<MainPageBackingBean> <<init>> Creating a new Instance
<MainPageBackingBean> <handleSlotCartridgeMove> Dropping Module : 1 : Left-12
<MainPageBackingBean> <<init>> Creating a new Instance
<MainPageBackingBean> <handleSlotCartridgeMove> Dropping Module : 1 : Left-14
<TaskFlowUtils> <diagnosticInitializer> Key:"displaySlot" Value:"Module : 1 : Left-9" Type:[oracle.demo.whippet.model.Slot]

Cryptic, yes - but I know what it all means in the context of this application and it's exactly the information that I need to monitor what's going on. It tells me that my backing bean classes are being created for every request, and in this case the drop handler for some drag and drop processing has received the payload I expect. Finally, one of my standard pieces of instrumentation, which I embed in my task flow initializer call, is reporting all of the parameters passed to it. All good stuff.

Next time we'll take a look at this output information  in more detail and explore some of the real power of this logging once you start to relate all of this information together into sequences of events.

Tuesday May 31, 2011

Adventures in ADF Logging - Part 2

Logging Templates

OK last time I said I'd tell you about how to look at the logging output next, but then I got all enthusiastic this morning and thought I'd create some code templates to help you use the ADFLogger. Code templates are a really neat feature of JDeveloper and if there is some bit of code (like logging) that you use a lot then 5 minutes spent building a template can save you a bunch of time in the long run.

Here are the templates I've created:

Shortcut  Purpose
lgdef     A basic static class logger definition
lgdefr    A basic static class logger definition with resource bundle
lgdefp    A basic static package logger definition
lgi       Log statement for an informational message
lgc       Log statement for configuration information
lgw       Log statement for a warning message
lgs       Log statement for an error message
lgig      Guarded log statement for an informational message
lgcg      Guarded log statement for configuration information
lgwg      Guarded log statement for a warning message
lgsg      Guarded log statement for an error message

Installing the Templates

I've made these templates available as an XML export that you can download from here: loggingTemplates.xml

To install these:

  1. Open  Tools > Preferences from the JDeveloper menu
  2. Expand the Code Editor  > Code Templates node in the preferences navigator
  3. Select the More Actions  > Import menu option as shown here and import the xml file.

Logging import

Friday May 27, 2011

Adventures in ADF Logging - Part 1

I see a lot of ADF code from both internal and external users of the framework and one thing that strikes me is how underused the ADFLogger is. There are a fair few blog articles in the community about the ADFLogger already, but they mostly repeat the basics and I wanted to go a little deeper here. 

The ADFLogger is a logging mechanism which is built into the ADF framework. It's basically a thin wrapper around the java.util.logging APIs with a few convenience methods thrown in and, most importantly, some specific features integrated into both JDeveloper and Enterprise Manager. All in all it's preferable to use this built-in logger for these reasons, plus it can help you avoid the kind of class-loading issues you might hit if you picked up some random version of something like Log4J.

Using the ADF Logger 

At its most basic, you create and use one of these loggers like this:

First define the logger itself - usually a static variable in a particular class - for example one of your managed beans:

    private static ADFLogger _logger = ADFLogger.createADFLogger(MainPageBackingBean.class);

You'll notice here that the argument to the createADFLogger() function is the class. It can also be simply an identifying name or even a Java Package (check out the JavaDoc to learn more). By using a class reference here we are able to refine the logging output to focus very specifically on what's happening in this class. There is nothing wrong, however, in, say, defining a package level logger if you don't need that level of granularity / control. Indeed you probably want to define at least a parent logger at the top level of the hierarchy of loggers within your namespace (package structure) so that you can simply define a resource bundle for all the loggers to share.

    private static ADFLogger _logger =
        ADFLogger.createADFLogger(MainPageBackingBean.class,
                                  "oracle.demo.whippet.LogMessages"); //second argument names the resource bundle (an illustrative name)

This resource bundle will then* be inherited by any logger in the same hierarchy. Note also that the createADFLogger()  function will create the logger instance for you, or, in this kind of scenario return one that already exists.

*OK well actually no. That should happen but in my testing it's actually not working correctly in Patchset 4. So for now if you want to associate a resource bundle with the logger do so at the level of the logger you are using to log rather than some parent.

Next, throughout your code you can sprinkle logging statements at various levels which are (I hope) fairly self explanatory, for example in this constructor:

    public MainPageBackingBean() {
        super();
        _logger.info("Creating a new instance");
        if (BindingContext.getCurrent() == null) {
            _logger.warning("Injected Data-Binding Reference is null");
        } else {
            try {
                doComplexThing(); //placeholder for the work that might fail
            } catch (WeirdAppException waex) {
                _logger.severe("Unexpected exception doing complex thing", waex);
            }
        }
    }
As you can see, we generally just pass a String message to the logger, although in the error case we can pass a throwable exception as well. As I alluded to earlier, you can also grab your Strings from a resource bundle rather than hardcoding them. This probably makes sense for error messages but may be overkill for programmers'-eyes-only messages. To grab a string from the bundle you simply call the getFormattedMessage() function on the logger, or you can get hold of the resource bundle that it references using getResourceBundle().


For example:

    _logger.severe(_logger.getFormattedMessage("errors.db_connection"));

where the resource bundle contains something like:

   errors.db_connection=Unable to establish connection with database!

You can pass parameters to inject into the resource string as well.
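The ADFLogger calls above need the ADF runtime, but the underlying mechanics - pulling a message out of a bundle by key and injecting parameters into it - can be sketched with plain JDK classes. Everything here (the class names, the bundle key, and the formatted helper) is invented for the illustration; in the real thing the messages live in a .properties file and getFormattedMessage() on the logger does this work for you:

```java
import java.text.MessageFormat;
import java.util.ListResourceBundle;
import java.util.ResourceBundle;

public class BundleMessageDemo {
    // Inline stand-in for a real .properties resource bundle
    public static class Messages extends ListResourceBundle {
        @Override protected Object[][] getContents() {
            return new Object[][] {
                {"errors.db_connection", "Unable to establish connection with database {0}!"}
            };
        }
    }

    // Look up the message for a key and inject any parameters
    public static String formatted(String key, Object... params) {
        ResourceBundle bundle = ResourceBundle.getBundle(Messages.class.getName());
        return MessageFormat.format(bundle.getString(key), params);
    }

    public static void main(String[] args) {
        System.out.println(formatted("errors.db_connection", "HR"));
        // Unable to establish connection with database HR!
    }
}
```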

The final thing I wanted to mention in this posting was the use of guard conditions. Although logging calls themselves are cheap if logging is not enabled, you may actually need to do quite a lot of work to prepare the stuff you want to log. For example, I have a standardized method that I use as a Task Flow Initializer to dump out the contents of the PageFlowScope for that taskflow.  That's a pretty expensive operation so you want to bypass that whole thing if you know that every single log message within it will be ignored.

Therefore you can use the  isLoggable() method to wrap the whole thing. Here's the example:

    public void diagnosticInitializer() {

        //Only do the work if we need to
        if (_logger.isLoggable(Level.INFO)) {
            AdfFacesContext actx = AdfFacesContext.getCurrentInstance();
            FacesContext fctx = FacesContext.getCurrentInstance();
            //Gather key information to dump
            String windowName = actx.getWindowIdProvider().getCurrentWindowId(fctx);
            String viewPort = ControllerState.getInstance().getCurrentViewPort().getClientId();
            Map pageflowScopeMap = actx.getPageFlowScope();

            _logger.info("TaskFlow Diagnostics for " + viewPort);

            if (!pageflowScopeMap.isEmpty()) {
                Iterator mapIter = pageflowScopeMap.entrySet().iterator();
                while (mapIter.hasNext()) {
                    Map.Entry entry = (Map.Entry)mapIter.next();
                    String varKey = entry.getKey().toString();
                    Object varValue = entry.getValue();
                    String varClass = "n/a";
                    if (varValue != null) {
                        varClass = varValue.getClass().getName();
                    }
                    StringBuilder bldr = new StringBuilder();
                    Formatter formatter = new Formatter(bldr, Locale.US);
                    formatter.format("Key:\"%s\" Value:\"%s\" Type:[%s]", varKey, varValue, varClass);
                    _logger.info(formatter.toString());
                }
            } else {
                _logger.info("No PageFlowScope associated with this TaskFlow instance");
            }
        }
    }
That's all for now; next time we'll look at how you can actually view these log messages.


Hawaii, Yes! Duncan has been around Oracle technology way too long but occasionally has interesting things to say. He works in the Development Tools Division at Oracle, but you guessed that right? In his spare time he contributes to the Hudson CI Server Project at Eclipse


