Analyze server logs in GlassFish 3.1

How to analyze logs in GlassFish 3.1?

There are two options available:
1. Fire up GlassFish, open the admin web UI in a browser (for example http://localhost:4848), and launch the Log Viewer.
2. Use the collect-log-files subcommand in remote mode to collect log files into a ZIP archive.

With both options, the log files are downloaded even if your instance is down. This means you can analyze the logs of an instance that has crashed.

How to use the web UI?

1. Fire up GlassFish and open the admin web UI in a browser (for example http://localhost:4848).
2. Click 'Server (Admin Server)', then click the 'View Log Files' button.
3. A new window opens with the most recent log records.
4. If you have multiple instances running, select another instance from the 'Server' drop-down list and click the 'Search' button.
5. If you have multiple log files (rotated server log files), select another file name from the 'Log File' drop-down list and click the 'Search' button.

How to use the collect-log-files command?

Use the collect-log-files subcommand in remote mode to collect log files into a ZIP archive. Optionally, you can target a server name, cluster name, or standalone instance name; the default target is the server. For the given target, the command downloads all log files from the log folder and packs them into a ZIP archive. By default the ZIP files are stored under GlassFish_Home/domains/domain1/collected-logs. If you need to store them under a specific folder instead, pass --retrieve true followed by the folder path.

1. Run the following command to collect log files for the server:

./asadmin collect-log-files
Created Zip file under /space/gfv3/v3setup/glassfish3/glassfish/domains/domain1/collected-logs/log_2010-12-15_15-46-23.zip.
Command collect-log-files executed successfully.

2. Run the following command to collect log files for a cluster with two running instances:

./asadmin collect-log-files --target cluster1
Log files are downloaded for instance1.
Log files are downloaded for instance2.
Created Zip file under /space/gfv3/v3setup/glassfish3/glassfish/domains/domain1/collected-logs/log_2010-12-15_15-54-06.zip.
Command collect-log-files executed successfully.

3. Run the following command to collect log files for the same cluster and store them under the /space/output directory:

./asadmin collect-log-files --target cluster1 --retrieve true /space/output
Log files are downloaded for instance1.
Log files are downloaded for instance2.
Created Zip file under /space/output/log_2010-12-15_15-55-54.zip.
Command collect-log-files executed successfully.
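Because each archive name embeds its creation timestamp (log_YYYY-MM-DD_HH-MM-SS.zip), a plain lexicographic sort of the names also sorts them chronologically. Here is a minimal sketch for picking the newest archive out of a collected-logs directory; the helper name latest_zip is hypothetical, not part of asadmin:

```shell
# Hypothetical helper: print the newest collect-log-files archive in the
# directory passed as $1. The timestamped names mean a lexicographic sort
# puts the most recent archive last.
latest_zip() {
    ls "$1"/log_*.zip 2>/dev/null | sort | tail -n 1
}
```

For example, after running the third command above, `latest_zip /space/output` would print the path of the most recently created archive, which you can then pass to unzip.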

Unzip the generated ZIP file and analyze your logs. When you unzip log_2010-12-15_15-55-54.zip, the directory structure looks like this:

GlassFish_Home/domains/domain1/collected-logs/logs
	- instance1
		- server.log
		- .....
	- instance2
		- server.log
		- .....
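Once the archive is unpacked, a quick first pass is to count SEVERE and WARNING records per instance. Below is a minimal sketch assuming the logs/<instance>/server.log layout shown above; the function name scan_levels is hypothetical, while SEVERE and WARNING are the standard java.util.logging levels that GlassFish writes:

```shell
# Hypothetical helper: count SEVERE and WARNING records in each instance's
# server.log under the unpacked "logs" directory passed as $1.
scan_levels() {
    for f in "$1"/*/server.log; do
        [ -f "$f" ] || continue                 # skip if the glob matched nothing
        instance=$(basename "$(dirname "$f")")  # e.g. instance1
        severe=$(grep -c 'SEVERE' "$f" || true)
        warning=$(grep -c 'WARNING' "$f" || true)
        echo "$instance: $severe SEVERE, $warning WARNING"
    done
}
```

Running `scan_levels GlassFish_Home/domains/domain1/collected-logs/logs` prints one summary line per instance, which tells you at a glance where to start reading.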

Comments:

Great, thanks!

Posted by Zezo on December 21, 2010 at 07:08 AM IST #

Thanks for your post. Since I have to deal with some Tomcat clusters every day, I suppose my experience is quite similar to yours. I think that collecting all log files in one place is not the solution; it doesn't solve the main problem, which is how to analyze the logs. I'm persuaded that the best practice for clusters is to configure log4j to write to a central repository. I will implement it as soon as possible.

Posted by alexdp on February 01, 2011 at 08:19 PM IST #

About

Naman
