IWS Reports are a new addition to the Oracle SOA diagnostics toolset. Refer to this article for a quick overview.
Once enabled, the SOA runtime automatically and periodically captures key statistics such as resource usage, flow rates at key points in the system, backlogs, and execution times. This lets you go back in time, pick an arbitrary time window, and generate a report to gain insights or to create a baseline. Comparing a report from a period when the system is slow or bottlenecked against the baseline can provide valuable analysis.
In this entry I will use a simple scenario to showcase how IWS can help diagnose performance issues in SOA.
Consider the following simple SOA project, consisting of a single composite that exposes one SOAP endpoint wired to one asynchronous BPEL process. This main process in turn calls a synchronous sub-process and an external web service, and finally writes to a file.
Let's say that under normal circumstances, with a certain workload, the application produces output at a certain rate, but due to some issue the output has slowed down and you are tasked with diagnosing the problem.
Since there are multiple components and services involved, the issue could really be anywhere:
- perhaps the system is low on resources (this composite may be one of many deployed composites, and overall load on the system may have increased)
- perhaps system resources (work managers/data sources) need reconfiguration as a result of changes to the load profile
- maybe the external service has slowed down
- maybe the underlying database is slow
- maybe the BPEL engine needs tuning
Let's see how IWS can help us diagnose the issue above. We will now walk through the steps required to enable IWS, feed data, generate reports, and finally do the analysis.
Log on to EM and go to the IWS home page.
Click the “Configure” button to enable IWS (by default IWS is disabled).
Pick a Snapshot Interval and Data Collection Level:
For the Snapshot Interval you can pick anywhere from 1 minute to 1 hour. For this example we will choose 5 minutes.
For the Collection Level we pick Finest.
Click OK to apply and dismiss the dialog box.
IWS is now enabled and will start capturing a snapshot every 5 minutes, persisting the snapshots to the database. Also note the various data collection levels: at the FINEST level, IWS collects data for all metrics.
Next we will feed some data.
By design, the input value passed to the composite is forwarded to the external service, whose response time depends on that value.
Using the JDeveloper composite test facility we will perform two test runs, feeding 500 messages for each:
- Feed messages with input value “0” (the external service responds with no delay).
- It takes a little under 4 1/2 minutes to process 1000 requests.
- Note down the start time and end time of the test (when all messages are processed). We need this information later to generate the IWS report for this test period.
- Test Start Timestamp: Dec 10 13:14:37 PST 2015
- Test End Timestamp: Dec 10 13:19:12 PST 2015
- Wait at least 5 minutes for another snapshot to be captured.
- Feed messages with an input value of 30, which causes the external service to respond with a 30-second delay.
- It takes a little over 6 minutes to process 1000 requests, i.e. performance has degraded by about 33% (roughly 6 minutes vs. 4 1/2 minutes). Note that it takes almost the same amount of time as in Test 1 to feed the messages.
- Note down the start time and end time.
- Test Start Timestamp: Dec 10 13:27:10 PST 2015
- Test End Timestamp: Dec 10 13:33:17 PST 2015
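If you prefer scripting the load instead of using the JDeveloper test facility, the feed step can be sketched as below. The endpoint URL, operation name, namespace, and payload element names here are illustrative assumptions; take the real values from the composite's WSDL (visible on the composite's test page in EM):

```python
import urllib.request

# Hypothetical endpoint for the composite's SOAP service; adjust to your environment.
ENDPOINT = "http://soahost:8001/soa-infra/services/default/MyComposite/myservice_client_ep"


def build_envelope(value: int) -> bytes:
    """Build a minimal SOAP 1.1 envelope; element and namespace names are illustrative."""
    return (
        '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" '
        'xmlns:ns="http://xmlns.oracle.com/MyComposite">'
        "<soapenv:Body>"
        f"<ns:process><ns:input>{value}</ns:input></ns:process>"
        "</soapenv:Body></soapenv:Envelope>"
    ).encode("utf-8")


def send(value: int) -> None:
    """POST one request; for an asynchronous BPEL entry point we only need the HTTP ack."""
    req = urllib.request.Request(
        ENDPOINT,
        data=build_envelope(value),
        headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": "process"},
    )
    urllib.request.urlopen(req).read()


if __name__ == "__main__":
    # Test 1: input "0", so the external service responds with no delay.
    for _ in range(500):
        send(0)
```

For the second run, change `send(0)` to `send(30)` to trigger the 30-second delay in the external service.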
Generating IWS Report
Now we will navigate to the IWS UI page and generate IWS reports for the two test periods:
For Test 1, select a start date and end date that wrap around the test's start/end timestamps, then click the HTML button. This generates the IWS report in HTML format (the other supported formats are XML and CSV). This report forms the baseline and will be compared against the report generated for the second test.
Note that, optionally, for large installations you can filter the results by selecting the SOA Folder, the list of composites of interest, and the number of records for each metric (green box in the screenshot below).
Do the same for Test 2 by picking a start date and end date that surround the timestamps recorded for that test. So we generate the report for start time 12/10/15 1:22:00 PM and end time 12/10/15 1:37:00 PM.
Save the report.
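If you export the two reports in CSV format, a quick first-pass comparison against the baseline can also be scripted. The column names below (`Metric`, `Value`) are assumptions for illustration; match them to the actual headers in your exported report:

```python
import csv


def load_metrics(path: str, key_col: str = "Metric", value_col: str = "Value") -> dict:
    """Read an IWS CSV export into {metric name: numeric value}; column names are assumed."""
    with open(path, newline="") as f:
        return {row[key_col]: float(row[value_col]) for row in csv.DictReader(f)}


def compare(baseline: dict, current: dict) -> list:
    """Return (metric, % change vs. baseline) pairs, worst regressions first."""
    deltas = {
        k: (current[k] - baseline[k]) / baseline[k] * 100.0
        for k in baseline
        if k in current and baseline[k] != 0
    }
    return sorted(deltas.items(), key=lambda kv: kv[1], reverse=True)
```

With the two exports loaded via `load_metrics`, `compare(baseline, test2)` puts the metrics that degraded the most at the top of the list, which is a reasonable starting point for the analysis in the next entry.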
In the next blog entry we will analyze the issue by comparing the two reports.