By Jonathan Benoit on Sep 26, 2011
A micro-benchmarking framework has been created for Jersey/JAX-RS. It allows us to do continuous performance testing with Hudson to make sure Jersey keeps performing well as we add functionality. The test suite also lets us compare different JAX-RS versions, which helps us catch any regressions in Jersey 2 with respect to Jersey 1.
I am testing basic functionality such as media type to Java type conversions, and vice versa, covering the most commonly used media types. The baseline results cover XML, plain text, and JSON payloads. Beyond payload types, I am also testing:
- HTTP methods via the JAX-RS annotations @GET, @PUT, @POST, and @DELETE.
- parameter annotations such as @QueryParam.
Testing coverage can be increased to exercise additional features as required.
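The measurement pattern underlying such a suite — warm up the JIT, then time a fixed number of iterations and report throughput — can be sketched in plain Java. This is only an illustrative stand-in for what Japex automates; the `BenchmarkLoop` class and the string-splitting workload are hypothetical, not part of the actual suite:

```java
// Minimal warmup-then-measure loop, the pattern a micro-benchmark
// framework like Japex automates per test case.
public class BenchmarkLoop {

    /** Runs the task repeatedly and returns throughput in operations/second. */
    public static double measure(Runnable task, int warmupIters, int measuredIters) {
        // Warmup: let the JIT compile the hot path before timing anything.
        for (int i = 0; i < warmupIters; i++) {
            task.run();
        }
        long start = System.nanoTime();
        for (int i = 0; i < measuredIters; i++) {
            task.run();
        }
        long elapsedNanos = System.nanoTime() - start;
        return measuredIters / (elapsedNanos / 1_000_000_000.0);
    }

    public static void main(String[] args) {
        // Illustrative workload: a trivial text "conversion".
        String payload = "a,b,c,d,e";
        double tps = measure(() -> payload.split(","), 10_000, 100_000);
        System.out.println("throughput ~ " + (long) tps + " ops/sec");
    }
}
```

A real run would of course measure the Jersey request path rather than a toy workload, and would repeat the measurement to average out scheduler jitter.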
I am using the Japex micro-benchmark framework to implement the test suite. Japex includes a Maven 2 plugin, which we use to run the micro-benchmarks from our Maven build, and Hudson has a Japex plugin that displays Japex trend reports directly within Hudson. Here is a snapshot of the Japex trend report for the JUnit test of the sample Root Element Collection method:
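For context, a Japex suite is driven by an XML configuration file that names a driver class and the test cases to run. A minimal sketch might look like the following; the driver class, test case names, and custom parameters here are illustrative assumptions, not the actual suite's configuration:

```xml
<testSuite name="JerseyMicroBenchmark"
           xmlns="http://www.sun.com/japex/testSuite">
    <!-- Hypothetical driver class; a Japex driver extends
         com.sun.japex.JapexDriverBase and implements prepare()/run(). -->
    <param name="japex.driverClass" value="com.example.JerseyBenchmarkDriver"/>
    <!-- Report results as throughput (transactions per second). -->
    <param name="japex.resultUnit" value="tps"/>

    <driver name="JerseyDriver"/>

    <!-- One test case per media-type conversion being measured. -->
    <testCase name="RootElementCollection-XML">
        <param name="mediaType" value="application/xml"/>
    </testCase>
    <testCase name="RootElementCollection-JSON">
        <param name="mediaType" value="application/json"/>
    </testCase>
</testSuite>
```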
I am also working to eliminate "noise" from these performance numbers. Initially the numbers were less stable because the Hudson slave was configured with multiple executors, allowing several jobs to run concurrently. It has been changed to run a single executor while the micro-benchmark tests execute, and I have verified that no cron jobs are triggered on the slave during performance testing. This is a work in progress; the goal is more deterministic results going forward.
I've configured the Japex Hudson plugin, via the configure page of our Hudson job, to notify us whenever performance changes by more than 1%:
Using Hudson together with Japex allows us to continuously monitor Jersey's performance as the project moves forward.