Exploring Mobility, Chatbots, Blockchain and Augmented Reality solutions in the Cloud to re-imagine Education & Research


Kevin Roebuck
Director, Digital Experience

Scalability testing in our development labs is one of the really great things we do for our education ISVs. Yesterday, the good folks at SirsiDynix, from the Dynix side of the acquisition, visited the EBC to talk about next steps in clustering and performance for their Horizon library application. Here's the executive summary of the test results:

Horizon 8.0/Sun UltraSPARC
Benchmark Results
Executive Summary
July 2006
Prepared by SirsiDynix

The original benchmarking test results from the Sun Microsystems Labs indicated less-than-ideal Horizon 8.0 performance on Sun Fire V490 and Sun Fire V890 systems, both using Sun UltraSPARC IV/IV+ processors.  Prior to testing, it was expected that the V490/V890 with the Solaris operating system would exhibit superior throughput, relative to the number of processors on the respective systems, while maintaining response times within contractual limits.  For many operations, Sun Fire V890 performance was only marginally better than that of the Sun Fire V490, exhibiting no significant increase in throughput, and both systems' response times were unacceptable.  This, combined with higher expectations of overall performance, led SirsiDynix (SD) to conduct another round of benchmarking, analysis and optimization.  Concern was mitigated by the realization that during the first round of tests, only out-of-the-box measurements were taken.  That is, no significant operating system, database or application modifications were made to ensure peak application performance on a given platform, as the intent of the tests was to measure results on a diverse group of Sun hardware, including systems based on the UltraSPARC IV/IV+, UltraSPARC T1 and AMD Opteron processors.  The round of tests described in this report focused on optimizing performance on UltraSPARC IV/IV+-based systems.

In response to the concerns outlined above, SD formed the Sun Performance Team (SPT), consisting of database, systems and applications experts from the Horizon and Unicorn teams.  After receiving access to the Sun Microsystems labs and having Sun assign a Java and systems applications expert, basic guidelines were established and the project was initiated.  Over the course of approximately six weeks, SirsiDynix, working closely with Sun Microsystems, focused on the analysis and optimization of Horizon 8.x and Oracle on the Sun UltraSPARC architecture.  Based on operations considered representative of Horizon functionality, SD executed its standard benchmarking test suite while carefully monitoring application performance.  After each run, metrics were recorded and analyzed.  Based on those results, short-term goals were set to verify hypotheses and to modify the application to improve performance.
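The measure-record-analyze loop described above can be sketched in Java. The percentile helper and the stand-in operation below are illustrative assumptions; SirsiDynix's actual test suite and the Horizon operations it drives are not public.

```java
import java.util.Arrays;

public class BenchmarkSketch {
    /** Runs op `runs` times and returns per-run latencies in nanoseconds. */
    static long[] measure(Runnable op, int runs) {
        long[] samples = new long[runs];
        for (int i = 0; i < runs; i++) {
            long start = System.nanoTime();
            op.run();
            samples[i] = System.nanoTime() - start;
        }
        return samples;
    }

    /** Nearest-rank percentile of the recorded samples (p in 0..100). */
    static long percentile(long[] samples, double p) {
        long[] sorted = samples.clone();
        Arrays.sort(sorted);
        int idx = (int) Math.ceil(p / 100.0 * sorted.length) - 1;
        return sorted[Math.max(idx, 0)];
    }

    public static void main(String[] args) {
        // The lambda stands in for a representative Horizon operation
        // (e.g. a checkout or catalog search); it is a placeholder only.
        long[] samples = measure(() -> Math.log(42.0), 1_000);
        System.out.printf("p95 latency = %d ns%n", percentile(samples, 95));
    }
}
```

After each run, percentiles like the one printed here would be compared against the contractual response-time limits before setting the next short-term goal.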

Assumptions originally made about the Sun UltraSPARC hardware were quickly proven wrong. Early on, Horizon application design and configuration proved at least partly responsible for the poor readings. Three areas of concern were noted: 1) an inordinate amount of time was spent in memory management; 2) few of the 16 cores available on the V890 were being utilized, due to serialization; and 3) Oracle performance was relatively poor.  We immediately focused on memory management while, simultaneously, the database experts reviewed the Oracle configuration settings to ensure optimal use of available resources.
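The serialization symptom, many cores idle while threads queue on a single lock, can be illustrated with a minimal Java sketch. The counter workload is an assumption for illustration; Horizon's actual hot paths were not disclosed in the summary.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.LongAdder;

public class ContentionSketch {
    // Coarse-grained lock: every thread funnels through one monitor, so
    // adding cores adds waiting rather than throughput (the V890 symptom).
    static long countWithGlobalLock(int threads, int perThread) throws InterruptedException {
        final Object lock = new Object();
        final long[] counter = {0};
        runAll(threads, () -> {
            for (int i = 0; i < perThread; i++) {
                synchronized (lock) { counter[0]++; }
            }
        });
        return counter[0];
    }

    // Striped alternative: LongAdder spreads updates across cells, so
    // threads rarely contend and the available cores can stay busy.
    static long countWithLongAdder(int threads, int perThread) throws InterruptedException {
        final LongAdder counter = new LongAdder();
        runAll(threads, () -> {
            for (int i = 0; i < perThread; i++) counter.increment();
        });
        return counter.sum();
    }

    private static void runAll(int threads, Runnable task) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int t = 0; t < threads; t++) pool.execute(task);
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }

    public static void main(String[] args) throws InterruptedException {
        // Both variants produce the same count; under load the first one
        // serializes on the shared monitor while the second scales out.
        System.out.println(countWithGlobalLock(8, 100_000));
        System.out.println(countWithLongAdder(8, 100_000));
    }
}
```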

The Oracle configuration issues were the first to be uncovered.  By methodically modifying parameters and verifying their effect on performance, the team was able to arrive at what are now believed to be the optimal settings.  This had a profound effect on the database tier and removed a potential bottleneck at relatively little cost.
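The one-factor-at-a-time method (change a single setting, rerun the suite, record the result) can be sketched as a simple sweep. The workload model below is purely a stand-in, since the summary does not disclose which Oracle parameters were tuned or to what values.

```java
public class ParameterSweep {
    // Stand-in for "run the benchmark suite with this setting and report
    // throughput". The toy model (throughput grows with the setting up to
    // a knee at 2048) is illustrative only, not measured Oracle behavior.
    static double runSuite(int settingMb) {
        return Math.min(settingMb, 2048) / 10.0;
    }

    /** Changes one setting at a time, reruns, and keeps the best result. */
    static int sweep(int[] candidates) {
        int best = candidates[0];
        double bestTps = runSuite(candidates[0]);
        for (int i = 1; i < candidates.length; i++) {
            double tps = runSuite(candidates[i]);
            System.out.printf("%4d MB -> %.1f ops/s%n", candidates[i], tps);
            if (tps > bestTps) { bestTps = tps; best = candidates[i]; }
        }
        return best;
    }

    public static void main(String[] args) {
        int best = sweep(new int[]{256, 512, 1024, 2048, 4096});
        System.out.println("best setting: " + best + " MB");
    }
}
```

The key discipline is varying one parameter per run; otherwise an improvement cannot be attributed to a specific change.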

The remaining two issues were coupled: one of the reasons so few of the available cores were being used was the cost of memory management.  It was determined that both application configuration and design or implementation issues had led to inefficient use of application resources.  As each of these and other problems was found, changes were quickly put in place – OS, application or database reconfiguration, or modification of application code – and the test suite was rerun to verify the improvement in performance.

With the culmination of this phase of the project, SirsiDynix is pleased with the progress that has been made.  Against the server-side performance requirements originally laid down, SirsiDynix has exceeded the targets by more than a factor of 10.  The original concerns regarding Sun UltraSPARC have been alleviated by our testing: server-side response times are well within the contractual limits, as is system throughput.  Further, the addition of clustering, scheduled for Q2 2007, should yield an even greater increase in throughput via the addition of Java instances, while keeping latency low by avoiding instance saturation.  Based on these findings, SirsiDynix recommends the Sun UltraSPARC architecture based on its current demonstrated performance as well as its future growth potential once clustering is implemented.
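The summary does not describe how the planned cluster will distribute work, but the idea of adding Java instances to raise throughput while keeping any one instance below saturation can be sketched with a simple round-robin dispatcher. Instance names here are hypothetical.

```java
import java.util.concurrent.atomic.AtomicLong;

public class RoundRobinDispatcher {
    private final String[] instances;
    private final AtomicLong next = new AtomicLong();

    RoundRobinDispatcher(String... instances) {
        this.instances = instances;
    }

    /** Picks the next instance in rotation. Adding instances raises
     *  aggregate capacity and keeps each one below the saturation point,
     *  beyond which latency climbs sharply. */
    String pick() {
        int i = (int) (next.getAndIncrement() % instances.length);
        return instances[i];
    }

    public static void main(String[] args) {
        RoundRobinDispatcher d = new RoundRobinDispatcher("jvm-1", "jvm-2", "jvm-3");
        for (int request = 0; request < 6; request++) {
            System.out.println(d.pick());  // cycles jvm-1, jvm-2, jvm-3, ...
        }
    }
}
```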
