Adventures in ADF Logging - Part 3

Controlling Logging Output

So far in this series I've just been concentrating on how to add logging calls to your code. That's all very well, but frankly it's not a lot of use without some way to switch all of that on.

When using a logging framework such as java.util.logging you generally have some external configuration file which you use to define which loggers are active, at what level, and for which classes or packages. In the case of basic Java logging this is accomplished with the logging.properties file that lives in the JRE's lib directory (or can be passed as a -D parameter as you start up the JVM). When using the ADFLogger the idea is similar, except that this time logging is controlled using an XML configuration file - logging.xml - which lives in the WLS server's directory structure. It turns out that you don't need to know the exact location, so only go looking for it if you are really curious at this stage.
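
Just to make the link concrete (and as a quick recap of the earlier parts of this series), the logger you create in code carries a name, and it is that name which the external configuration matches against. The class below is purely illustrative - it just contrasts the two flavours side by side:

    import java.util.logging.Logger;
    import oracle.adf.share.logging.ADFLogger;

    public class Library {
        // Plain JDK logging: whether messages from this logger actually appear
        // is decided externally, in logging.properties.
        private static final Logger JDK_LOGGER =
            Logger.getLogger(Library.class.getName());

        // ADF logging: same idea, but the control file is the logging.xml
        // discussed in this article. The logger name (here the class name)
        // is what the configuration entries match against.
        private static final ADFLogger ADF_LOGGER =
            ADFLogger.createADFLogger(Library.class);
    }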

Making it All Too Easy

Editing the logging.xml file by hand would be OK; however, we like to make life easier than that, and so JDeveloper actually supplies a really simple-to-use graphical interface for editing this configuration. In fact, as we'll see, it does a bit more than that as well. Let's step through it now:

Step 1 - Start Your Engines

Or rather, start your embedded WebLogic instance. We can configure logging before WLS is running, but if we wait until it is actually running and your application has been deployed to it, we'll actually get visibility into all of the packages that are currently deployed into the running server. This will come in handy later on when we come to look at some of the really cool things that we can do with existing instrumentation, but I'll cover that in a later episode! For now just run your application.

Step 2 - Open the Logging Editor

There are two ways to get to the logging configuration screen.

  1. Open the Application Server Navigator (Ctrl + Shift + G), right-click on the IntegratedWeblogicServer and select Configure Oracle Diagnostic Logging for "IntegratedWeblogicServer".
  2. From the Log window (usually docked underneath your editor), pull down the Actions menu and select Configure Oracle Diagnostic Logging.

Both approaches open the same editor on top of the logging.xml file. Here it is; I've expanded the first couple of levels in the tree to make it more interesting:

[Image: the logging configuration screen]

So what we see here is a hierarchy of all the packages in the running application server. Notice that some of the icons are green; these are packages or classes that already have a presence in logging.xml. Others are yellow; this latter group are transient, that is, we can temporarily set up logging on such a package or class for the duration of the application server session, but once the application server shuts down all of that logging configuration will be reverted. You'll notice the checkbox at the top of this screen that is labeled Hide Transient Loggers - well, you can guess what that does now.

Step 3 - Configure Some Logging

The rest is all fairly self-explanatory. To configure logging at any level you click in the Level column and, from the pull-down, select the level you want. In the image above you can see that I've set the logging for the Root Logger to the INFO level. This implicitly means that any child of that node (everything, in this case, because it's the root) will also log at the INFO level unless it explicitly overrides that. Normally your root logger would log at WARNING or SEVERE; at that level you just want to keep an eye out for any and all problems.

So if, say, we wanted to switch on CONFIG level logging for everything under the oracle.demo root, we would just set that in the Level column. Of course, with CONFIG set, INFO messages would still be printed out as well, because CONFIG is a finer level than INFO. So you can be as fine-grained about this as you want, controlling the logging level on a class-by-class or package-by-package basis.
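
To make that concrete, here is a sketch of what a CONFIG setting on oracle.demo means for a class somewhere underneath that package. The class name, package and messages are invented for illustration; only the level behaviour matters:

    package oracle.demo.whippet.model; // some class under the oracle.demo root (illustrative)

    import java.util.logging.Level;
    import oracle.adf.share.logging.ADFLogger;

    public class ShelfManager {
        private static final ADFLogger LOGGER =
            ADFLogger.createADFLogger(ShelfManager.class);

        public void rebuildIndex() {
            // With oracle.demo set to CONFIG, both of these are published,
            // because INFO and CONFIG are at or above the effective level.
            LOGGER.info("Rebuilding the shelf index");
            LOGGER.log(Level.CONFIG, "Index rebuild requested");

            // Anything finer is filtered out unless a more specific logger
            // (this class or its package) overrides the inherited level.
            if (LOGGER.isLoggable(Level.FINE)) {
                LOGGER.log(Level.FINE, "Detailed index statistics ...");
            }
        }
    }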

If you need to set up a logger on a class or package that is not visible in the structure (yet), you can use the green add (+) button at the top left of the editor to add a new one in. The loggers you add can be persistent (stored in logging.xml) or transient (discarded at the end of the session). Note that transient loggers can only be created when you are actually running the internal WLS instance.
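
Conceptually, a transient logger is nothing more exotic than a level change applied to the logger object inside the running server's JVM rather than to the file on disk - something along these lines (the class and package names here are just throwaway examples):

    import java.util.logging.Level;
    import java.util.logging.Logger;

    public class TransientLoggerDemo {
        public static void main(String[] args) {
            // The level is changed on the in-memory Logger of the running JVM;
            // nothing is written to logging.xml, so the setting evaporates
            // when that server instance shuts down.
            Logger.getLogger("oracle.demo.whippet").setLevel(Level.CONFIG);
        }
    }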

As well as the logging levels, it is also possible to configure the handlers. In logging, a handler is the thing that takes your log message and does something with it - prints it to the console, writes it to a file and so on. If you select the root logger in the editor you will see that it defines three handlers, listed in the Handler Declaration section at the bottom of the editor:

  • odl-handler
  • wls-domain
  • console

The first two would be the default for a standalone WLS instance, but the console handler is automatically added for you in the case of the embedded WLS so that you will be able to see the output of your carefully placed logging commands in the log window of the IDE. (Hint: now is a good time to go to Tools > Preferences and increase the size of your logging buffer.) I recommend that you do not change these handlers and just continue to inherit from the root; the reason for this will become evident in a later installment.
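
If you are curious about what a handler actually is in java.util.logging terms, the little fragment below (purely illustrative - you don't need anything like this in your application) just lists the handlers attached to the root logger; run inside the server you would expect to find the console handler amongst them:

    import java.util.logging.Handler;
    import java.util.logging.Logger;

    public class HandlerPeek {
        public static void main(String[] args) {
            // A handler is the object that receives each published LogRecord
            // and does something with it: writes it to the console, to a
            // diagnostic log file, and so on.
            Logger rootLogger = Logger.getLogger("");
            for (Handler handler : rootLogger.getHandlers()) {
                System.out.println(handler.getClass().getName());
            }
        }
    }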

For example, in the sample I'm running here I have a few CONFIG and INFO level logging calls in the code, and these generate the following output on the console:

<Library> <initSampleData> Setting up Library sample data
<MainPageBackingBean> <<init>> Creating a new Instance
<MainPageBackingBean> <handleSlotCartridgeMove> Dropping Module : 1 : Left-12
<MainPageBackingBean> <<init>> Creating a new Instance
<MainPageBackingBean> <handleSlotCartridgeMove> Dropping Module : 1 : Left-14
<TaskFlowUtils> <diagnosticInitializer> Key:"displaySlot" Value:"Module : 1 : Left-9" Type:[oracle.demo.whippet.model.Slot]

Cryptic, yes - but I know what it all means in the context of this application, and it's exactly the information that I need to monitor what's going on. It tells me that my backing bean classes are being created for every request, and that in this case the drop handler for some drag and drop processing has received the payload I expect. Finally, one of my standard pieces of instrumentation, which I embed in my task flow initializer call, is reporting all of the parameters passed to it. All good stuff.
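
That kind of initializer is easy to write yourself. The sketch below is a simplified version of the idea only (the class name matches the one in the output above, but the body, package and formatting here are just illustrative): a method invoked from a method-call activity at the start of a bounded task flow, which walks pageFlowScope and reports each parameter it finds:

    package oracle.demo.whippet.view; // illustrative package

    import java.util.Map;
    import java.util.logging.Level;
    import oracle.adf.share.ADFContext;
    import oracle.adf.share.logging.ADFLogger;

    public class TaskFlowUtils {
        private static final ADFLogger LOGGER =
            ADFLogger.createADFLogger(TaskFlowUtils.class);

        // Called from a method-call activity at the start of the task flow;
        // it simply reports each parameter sitting in pageFlowScope.
        public void diagnosticInitializer() {
            if (LOGGER.isLoggable(Level.CONFIG)) {
                Map<String, Object> scope =
                    ADFContext.getCurrent().getPageFlowScope();
                for (Map.Entry<String, Object> entry : scope.entrySet()) {
                    Object value = entry.getValue();
                    LOGGER.log(Level.CONFIG,
                        "Key:\"" + entry.getKey() + "\""
                        + " Value:\"" + value + "\""
                        + " Type:[" + (value == null ? "null"
                                       : value.getClass().getName()) + "]");
                }
            }
        }
    }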

Next time we'll take a look at this output in more detail and explore some of the real power of this logging once you start to relate all of this information together into sequences of events.
