Performance Regression Testing Using Japex and WSTest

by Deep Singh and Bharath Mundlapudi

Good performance can be a competitive advantage for any product, and a code change in any module can negatively impact that performance. For that reason, it's important for developers to do performance testing after making code changes; in other words, to do performance regression testing.

This Tech Tip focuses on performance regression testing. In examining that task, the tip illustrates the use of an open source performance regression testing framework called Japex. Note that other performance regression tools are also available.

Japex is a micro-benchmark development and testing framework. A benchmark is a test or set of tests that assesses the performance of a product. A micro-benchmark is a simpler form of benchmark, often used to test a specific module of code. Japex allows you to write a micro-benchmark for a module, automatically execute the micro-benchmark, and produce results that identify performance impacts due to changes in the module. The Japex framework is highly configurable and can be used with practically any Java product.
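Although the sample in this tip drives everything through Ant, it helps to know how Japex is typically invoked. The following is only a sketch, assuming the standard Japex distribution; the jar name and driver class path are placeholders, and on Windows you would use ; instead of : as the path separator:

   java -cp japex.jar:<driver-classes> com.sun.japex.Japex config.xml

Japex reads the configuration file, runs the drivers and test cases it defines, and writes HTML reports.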

The Sample Application: Using WSTest and Japex

A sample application accompanies this tip. The sample demonstrates performance regression testing using Japex and WSTest, an open source micro-benchmark developed in the wstest project. WSTest is a client-server micro-benchmark for Java web services that stresses Java EE web services technologies. The micro-benchmark simulates a multi-threaded server program that processes multiple SOAP-based web services requests in parallel.

The sample application uses a subset of the tests provided by WSTest and includes the driver class for Japex. A sample configuration file that configures the tests is also included.

Now let's examine how to set up and run the sample application.

Setting Up Your Environment

The sample for the tip uses an open source reference implementation of Java EE 5 called GlassFish. If you haven't already done so, download the GlassFish application server from the GlassFish Community Downloads page.

You need to install two different builds of the GlassFish application server (any two different builds will do) and run their domain instances on ports 8080 and 9090, respectively. The sample application executes performance tests against the two builds and shows the performance results for each. The performance differences are the result of differences in the web services technology code between the two builds, so you can gauge the impact of code changes between the builds. Note that only server-side performance is tested; the sample does not test client-side performance.
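If the second installation's default domain does not already listen on port 9090, you can recreate it with a different instance port. For example (a sketch only; the asadmin option names shown are assumptions based on GlassFish builds of this vintage, so check asadmin create-domain --help on your build):

   asadmin delete-domain domain1
   asadmin create-domain --adminport 5858 --instanceport 9090 domain1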

You also need JDK 5.0, which you can download from the Java SE Downloads page, and Apache Ant 1.6.5, which is in the GlassFish bundle (in Windows, it's in the lib\ant subdirectory).

To run the performance tests, you need Japex, which you can download from the Japex Documents & Files page.

Installing the Sample Application

Download the sample package and unzip its contents. The root directory for the sample is techtip\Sample. Change the current directory to the techtip\Sample directory. Edit the script -- setenv.sh (for UNIX) or setenv.bat (for Windows) -- to reflect your build environment. For example, change the value of the JAVA_HOME environment variable in the script to the location of JDK 5.0 on your system. Also set the AS_HOME environment variable to the installation directory of one of the GlassFish builds; because only server-side performance is tested, you need to set only one AS_HOME variable on the client side. Then execute the script to set up your environment.
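For example, on Windows a minimal setenv.bat might contain lines like these (all paths are placeholders for your system):

   set JAVA_HOME=C:\jdk1.5.0
   set AS_HOME=C:\glassfish-build1
   set PATH=%JAVA_HOME%\bin;%AS_HOME%\lib\ant\bin;%PATH%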

Building the Web Service

The sample builds a web service using a WSDL file that is packaged with WSTest. WSTest provides an endpoint implementation class, TestServiceImpl.java.

WSTest also provides a build file, build.xml, that specifies Ant tasks for the sample. The ant build task generates portable artifacts from the WSDL and compiles them into class files; it also compiles the endpoint implementation class and creates a deployable WAR file. Another task in the build.xml file, ant deploy, deploys the WAR file. A sketch of what such a build task involves appears below.
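This is not the actual WSTest build.xml, just a sketch of the kind of targets involved. It assumes the JAX-WS wsimport Ant task (com.sun.tools.ws.ant.WsImport) is available, and all property names are placeholders:

   <!-- Sketch only: the real WSTest build.xml differs in detail. -->
   <taskdef name="wsimport" classname="com.sun.tools.ws.ant.WsImport">
       <classpath path="${jaxws.lib.path}"/>   <!-- assumed property -->
   </taskdef>

   <target name="build">
       <!-- generate portable artifacts from the WSDL -->
       <wsimport wsdl="${wsdl.file}" destdir="${build.classes}" keep="true"/>
       <!-- compile the endpoint implementation class -->
       <javac srcdir="src" destdir="${build.classes}"/>
       <!-- package everything into a deployable WAR file -->
       <war destfile="WSTest.war" webxml="web/WEB-INF/web.xml">
           <classes dir="${build.classes}"/>
       </war>
   </target>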

You can get more details about building and deploying WSTest from the wstest project site.

Building the Client

Here are the steps to build the Japex-based web service client:

  1. Write the driver class.
  2. Write configuration files.
  3. Generate portable artifacts for web service execution.
  4. Compile the client.

Write the Driver Class

A driver class is needed to simulate virtual users and generate load on the server machine. A Japex-based driver class, TestServiceXMLDriver.java, is packaged with the sample application; you can find it in the src\client directory. The sample configures two Japex drivers, one for each build of the application server, so that the performance of the two builds can be compared.

A Japex driver class should extend the JapexDriverBase class, which is part of the Japex driver framework and implements the following methods of the JapexDriver interface:

   public void initializeDriver();

   public void prepare(TestCase testCase);
   public void warmup(TestCase testCase);
   public void run(TestCase testCase);
   public void finish(TestCase testCase);

   public void terminateDriver();

You can get more details about the JapexDriver interface and the JapexDriverBase class from the Japex project site.
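To make the driver life cycle concrete, here is a minimal skeleton (the class name is illustrative, and you need to override only the methods your test requires):

   import com.sun.japex.JapexDriverBase;
   import com.sun.japex.TestCase;

   public class MinimalDriver extends JapexDriverBase {

       public void initializeDriver() {
           // Called once per driver: create expensive shared resources,
           // such as the web service port in this tip's sample.
       }

       public void prepare(TestCase testCase) {
           // Called once per test case: read test parameters and build
           // input data, for example:
           // String method = testCase.getParam("methodName");
       }

       public void run(TestCase testCase) {
           // Timed by Japex: keep this method lean so the measurement
           // reflects only the work under test.
       }

       public void terminateDriver() {
           // Called once per driver: release shared resources.
       }
   }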

In the sample, the TestServiceBaseDriver.java class in the src\client directory extends the JapexDriverBase class. The TestServiceBaseDriver class overrides the initializeDriver(), prepare(), and run() methods of the JapexDriverBase class.

The initializeDriver() method in the TestServiceBaseDriver.java class creates a service port and initializes it with an endpoint address.

   public void initializeDriver() {
       TestService ts = new TestService();
       stub = ts.getTestServicePort();
       ....
       ((BindingProvider) stub).getRequestContext().put(
           BindingProvider.ENDPOINT_ADDRESS_PROPERTY,
           getParam("endpoint"));
       ....

The prepare() method populates variables, objects, and data structures, and readies them for the next phase.

   public void prepare(TestCase testCase) {
       methodName = testCase.getParam("methodName").intern();      

       if (methodName == "echoDate") {
           try {
               xmlCal = DatatypeFactory.newInstance()
                   .newXMLGregorianCalendarDate(2005, 4, 25, 0);
           }
       ...

The run() method makes web service calls to methods on the server using the service port created during the initialization phase. (The == comparisons on methodName are safe here because the string was interned in prepare(), so reference equality holds against string literals.)

   public void run(TestCase testCase) {
       if (methodName == "echoVoid") {
           stub.echoVoid();
       } 
       ...

For more details about writing a driver class, see the Japex manual.

Write Configuration Files

The Japex configuration file defines test suites. A test suite can have one or more drivers and/or test cases, and each driver can have one or more test cases. The configuration file can also have zero or more global parameters. In the configuration file, the global parameters appear before the drivers, and the drivers appear before the test cases.

You can use parameters in the configuration file to configure the performance tests. A parameter can have one of three scopes: global, driver, or test case. Global parameters apply to all drivers and test cases, driver parameters apply to a specific driver and its test cases, and test case parameters apply to a specific test case.
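Schematically, then, a configuration file has this shape (element contents elided):

   <testSuite name="..." xmlns="http://www.sun.com/japex/testSuite">
       <param .../>           <!-- global parameters -->
       <driver name="...">
           <param .../>       <!-- driver parameters -->
       </driver>
       <testCase name="...">
           <param .../>       <!-- test case parameters -->
       </testCase>
   </testSuite>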

The Japex configuration file for the sample application is TestServiceXMLDriver-for-build-comparison.xml. The file is located in the configs directory. The test suite is named WSpex. The file defines a number of global parameters.

   <testSuite name="WSpex" 
          xmlns="http://www.sun.com/japex/testSuite"
       xmlns:xi="http://www.w3.org/2001/XInclude">

       <param name="japex.numberOfThreads" value="1"/>
       <param name="japex.warmupTime" value="00:00:10"/>
       <param name="japex.runTime"    value="00:00:10"/>
       
       <param name="japex.warmupsPerDriver" value="1"/>
       <param name="japex.runsPerDriver" value="1"/>
       <param name="japex.reportsDirectory" value="reports"/>

The parameters that begin with the keyword japex are reserved for Japex. The japex.numberOfThreads parameter defines the number of simulated users. The japex.warmupTime and japex.runTime parameters set the warmup and run durations for all test cases. The japex.warmupsPerDriver and japex.runsPerDriver parameters define the number of warmups and runs that should be executed for each driver.

The last parameter, japex.reportsDirectory, specifies the location of the report files.

A few more configuration files are created in the configs\entities directory. The files TestServiceXMLDriver-for-build-one.xml and TestServiceXMLDriver-for-build-two.xml define drivers for two different builds of the application server.

   <driver name="TestServiceXMLDriver-for-build-one" 
          xmlns="http://www.sun.com/japex/testSuite">

       <param name="japex.driverClass" 
         value="com.sun.wstest.TestServiceXMLDriver"/>
       <param name="description" 
         value="TestService XML driver using JAX-WS from AS"/>
       <param name="endpoint.port" value="8080"/>
       <param name="endpoint" 
         value= http://${endpoint.host}:${endpoint.port}/WSTest/TestService"/>

These files define which driver class to use, as well as the port number and endpoint URL. In the sample, both drivers use the same driver class, TestServiceXMLDriver; the two builds of the application server should have their instances listening on different port numbers.
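The build-two file is essentially the same apart from its name and port. A sketch, based on the build-one file above (the actual file's description text may differ):

   <driver name="TestServiceXMLDriver-for-build-two" 
          xmlns="http://www.sun.com/japex/testSuite">

       <param name="japex.driverClass" 
         value="com.sun.wstest.TestServiceXMLDriver"/>
       <param name="endpoint.port" value="9090"/>
       <param name="endpoint" 
         value="http://${endpoint.host}:${endpoint.port}/WSTest/TestService"/>
   </driver>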

The configuration file classpath-as.xml defines the class path that Japex uses to configure and run the tests. The file all-tests.xml defines all the test cases and any applicable input parameters. Both files are also in the configs\entities directory.

   <testCaseGroup xmlns="http://www.sun.com/japex/testSuite">
       <testCase name="echoVoid">
          <param name="methodName" value="echoVoid"/>
       </testCase>
       ...
       <testCase name="echoSynthetic4K">
           <param name="methodName" value="echoSynthetic"/>
           <param name="size" value="4096"/>
       </testCase>
       ...
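
The main configuration file pulls these fragments in with XInclude, using the xi prefix declared on its testSuite element. A sketch, assuming the sample's directory layout:

   <xi:include href="entities/classpath-as.xml"/>
   <xi:include href="entities/TestServiceXMLDriver-for-build-one.xml"/>
   <xi:include href="entities/TestServiceXMLDriver-for-build-two.xml"/>
   <xi:include href="entities/all-tests.xml"/>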

For more details about writing a configuration file, see the Japex manual.

Generate Portable Artifacts and Compile the Client

The way to generate portable artifacts and compile all the Java classes for the client is similar to that for the server. The ant build task does this automatically.

Running the Sample

After running the script to set up your environment, you can run the sample as follows:

  1. Start both GlassFish instances by entering the following command for each installation (shown here with Windows variable syntax; adjust the variable syntax and path separators for your platform):

      %AS_HOME%\bin\asadmin start-domain domain1

  2. Build the web service, the client, and needed artifacts, and package them into a WAR file by entering the following command in the techtip\Sample directory:

      ant build

  3. Deploy the application by entering the following command:

      ant deploy

  4. Run the application by entering the following command:

      ant run

  5. You should see output that looks something like this:

        run:
             [java] Reading configuration file 'configs/TestServiceXMLDriver-for-build-comparison.xml' ...
             [java] Estimated warmup time + run time is 1 minutes
             [java]   TestServiceXMLDriver-for-build-one using 1 thread(s) on 1 cpu(s)
             [java]     Run 1: echoVoid,177.661,echoInteger,148.959,echoFloat,140.094,echoString,162.918,echoDate,84.997,echoStruct,166.121,echoSynthetic4K,72.235,echoSynthetic8K,122.979,echoSynthetic12K,86.514,echoArray40,28.863,echoArray80,18.589
        ...

            [java] Generating reports ...

     The application generates reports in the reports directory. You can explore a specific report by viewing the index.html page in the directory for the report of interest. Each report is placed in a separate directory below the reports directory, and each report directory has a name that represents the time stamp of when the report was created.

    Each report shows a summary of the run as well as details of each test case, including graphs comparing the results.

    For example, here is part of a graph that compares transactions-per-second for a test run against two builds.

     [Result Summary graph]

    You can learn more about the reports on the Japex site.

  6. After running the application, you can undeploy it by entering the following command:

      ant undeploy

  7. Delete all classes and the WAR file generated for the application by entering the following command:

      ant clean

About the Authors

Deep Singh and Bharath Mundlapudi are staff members of the Java Performance Engineering group at Sun Microsystems.
