Monday Feb 15, 2010

SPECtacular awards - Java

Last December SPEC released the SPECjEnterprise2010 benchmark, the third generation of Java Enterprise Edition performance tests from the Standard Performance Evaluation Corporation (SPEC) consortium. The new benchmark tests Java EE 5.0's significantly expanded and simplified programming model, with a realistic workload stressing the entire system, including the JVM, middleware, database, CPU, disks, and servers. Yet although the benchmark is much broader in scope, it is simpler than ever to run because it takes advantage of Java EE 5.0 features and because it uses the open source Faban general-purpose driver.

One of the most enjoyable duties of SPEC President is thanking the people from all the member institutions who make SPEC's success possible. I fear that it is all too easy for SPEC people's achievements to go unrecognized, in an environment where their successes are most visible to their competitors rather than to their own management. So each January at the annual meeting we present awards to SPECtacular contributors. And now I write here to give them a bit of public recognition. I start this year with award recipients from the Java committee. Awards were presented by Alan Adamson (on right in photo, presenting award to Steve Realmuto). Alan is a SPEC Awards Committee member, member of the Board of Directors, and former chair of the Java committee.

Akara Sucharitakul - Oracle

The silent partner. Akara developed Faban, made it available to SPEC, and implemented new features needed to facilitate its use in SPECjEnterprise2010, so that the benchmark can be broader in scope and still be simpler to run.

Anil Kumar - Intel

The greenie. Although SPECjEnterprise2010 version 1.0 does not include an energy metric, the code is there thanks to Anil, awaiting adoption of suitable run and reporting rules.

Anoop Gupta - Oracle

The quiet achiever. Anoop seldom took part in the committee's sometimes raucous debates, because he was busy working on the code, making sure the workload is correct and correctly balanced.

Bernhard Riedhofer - SAP

Mr Specification. Bernhard speaks very quietly and politely, but the development group learned early on that when Bernhard speaks, it pays to listen. Of his many code contributions, those that make the database loader run truly parallel are going to be most appreciated by folks running large submissions.

David Keenan – Oracle

The chair. The job of chairing one of SPEC's largest and sometimes fractious committees is not an easy one, particularly while running results review for four benchmarks plus development of two new ones. David combines a soft touch with firm determination to get the job done.

John Stecher - IBM

The closer. At the end, when "only" a few tough action items stood in the way of benchmark release, John got additional resources committed from IBM to "get this thing done," pitching in for plenty of the closing work himself too.

Rahul Biswas - Oracle

Mr WebServices. Rahul provided most of the web services code, plus the code and ant scripts to integrate the benchmark into the Faban harness.

Robert Wisniewski - IBM

The reporter. Rob wrote the reporter code and built the test kits. He also served as secretary: taking good notes is vital to an open development process of multiple (competing) vendors.

Saraswathy Narayan - Oracle

The architect. Sara took the time to make a deep study of the entire benchmark from a transactional and data flow perspective, ensuring the correctness and function of the database schema and benchmark data partitioning.

Steve Realmuto - Oracle

The chief of police. Steve contributed much to the organization of the benchmark development effort, and helped the team follow SPEC policies. He was editor of the run rules, driving the review and ensuring correctness.

Tom Daly – Oracle

The instigator. Tom was a relentless leader in driving the project forward, and a tireless worker in helping to push at every stage. The benchmark became a much richer and diverse test of Java middleware because of Tom’s influence.


Congratulations to these outstanding engineers, and to the entire SPEC Java team!


Friday Jun 05, 2009

SPECtacular awards & new web performance/energy benchmark

The last of the 2009 SPECtacular awards. SPECweb2005 is the industry standard performance metric for web servers, and today it is joined by SPECweb2009, the industry standard performance and energy metric for web servers. The benchmark includes a banking workload (all SSL), a support workload (no SSL), and an ecommerce workload (mixed). This is the first application of the SPECpower methodology to potentially large system-under-test configurations. In the initial benchmark results you can see one system with and one without external storage, and the test report lets you see the power consumption of just the server, of the storage, and of the entire configuration at various utilization levels. The entire committee did a fantastic job with this benchmark. As always, I won't list anyone's name without permission. (But give me the okay and I'll update this posting!) SPEC recognizes:

Gary Frost (AMD) who stepped in to fill a key developer role in an emergency with the release clock ticking. He took over the control code after a sudden reassignment, and frankly we handed him quite an undocumented mess. Gary was up to the challenge and produced the finished code.

Another engineer from AMD had primary responsibility for the reporting page generator. You often can't know exactly what information ought to go into a full disclosure report (FDR) until you see it. Nor how you want it organized and arranged. Nor what data integrity cross checks need be present to avoid errors. So the committee changed requirements often during development. But no matter how many requirements were placed on him, he turned around with the needed code within a week!

An engineer from Fujitsu Technology Solutions became the de facto quality assurance officer because of his thorough and methodical testing practices. If there are a hundred ways software in general can go wrong, then there are a thousand ways benchmark software can go wrong, as by its nature it runs on systems stressed to the limit. When SPEC benchmark software just works, that is largely due to people like this engineer who foresee, test, and diagnose every possible failure unanticipated by the authors.

And, if you'd like to see all of the SPECtacular awards, then follow the tags!

Wednesday Jun 03, 2009

Alan Adamson is SPECtacular

Another SPECtacular award from the SPEC annual meeting: Alan Adamson has retired from IBM, where he was their primary SPEC representative. He held a number of different elective positions in SPEC and earned deep respect and trust from his colleagues. Coming from the IBM Toronto Software Lab, Alan was a natural to lead SPEC's Java committee. Having put that very large committee in smooth running order, Alan was elected secretary to the Power committee, helping it to produce the first industry standard power performance benchmark. Meanwhile he led the OSG steering committee, which coordinates activities of all the SPEC OSG committees.

Alan genuinely cares about the well-being of SPEC and the people involved. He demonstrates incredible thoughtfulness and effectiveness in thinking about SPEC's benchmark development. He fosters the fun and friendly SPEC culture where there is always time to share a joke or a funny story if appropriate. At the same time he creates space for candid discussions of serious matter. Alan's leadership and personal effort has been a big contributor to the success of SPEC.

Alan continues to hold one position in SPEC, as a director, because members of the board of directors are elected as individuals, not as companies. Alan serves as a general chair of the 2010 WOSP/SIPEW International Conference on Performance Engineering, a joint conference of SPEC and ACM which brings together top academic researchers and industry practitioners in performance engineering.

You can follow Alan on his blog, for interesting insights on art, technology, politics, and life - where he is just as opinionated as ever, just as modest as ever, just as intolerant of stupidity, and just as tolerant of the people involved - even when we are opinionated, immodest, and stupid at times. For all his hard work in SPEC I can think of nobody more deserving of a relaxing retirement than Alan, and nobody whom we will miss more than him!

Monday Jun 01, 2009

Klaus-Dieter Lange is SPECtacular

Another SPECtacular award from the SPEC annual meeting: Klaus Lange (HP) has become a valuable conduit across different levels of the organization and across benchmark subcommittees, by virtue of becoming indispensable in all of them. Though Klaus is an experienced "SPEC hand" he never forgot what he faced as a newcomer, and took it on himself to organize a new member orientation program to help new institutions integrate into SPEC more easily and effectively. As chair of the SPECpower committee Klaus delivered the industry's first energy efficiency benchmark, and leads the committee in aiding other groups as they add energy metrics to a wide range of benchmarks. These groups include many SPEC committees as well as other industry consortia. As HP's representative on the OSG steering committee Klaus has earned respect for his opinions with his diligence and fair mindedness. As a member of the Board of Directors he is often the first to step up to volunteer for important projects, as well as exercising sound judgment in conducting SPEC's business operations.

Friday May 29, 2009

John Henning is SPECtacular

Another SPECtacular award from the SPEC annual meeting: John Henning of Sun Microsystems is secretary of the Open Systems Group steering committee. John has been the driving force behind improvements to our policy document. This is crucial to efficient operation of the organization, especially as so many new organizations have joined SPEC and so many new participants have joined into the work even from long time SPEC member companies. John is also the one who reminds all of us to pause in our lecturing and really listen to our adversaries, the dissident minority voice. Sometimes they have a point that is valuable to the task at hand, if we only recognize it, and thereby harness all of the energy and creativity of the group.

Thursday May 28, 2009

David Morse is SPECtacular

Another SPECtacular award from the SPEC annual meeting: David Morse (Dell) served as vice-chair and now chair of the Open Systems Group steering committee. He was honored for his effective organization and leadership of a rather fractious bunch, for the successful release of many benchmarks, and for the formalization of rules and procedures that put everyone on an even footing with the "good old boys" and reduce risk and uncertainty in members' use of the benchmarks. Another example of his dedication is his implementation of bookmarkable search extensions to benchmark result queries on spec.org. David is equally comfortable and competent in the most complicated leadership roles and in the most difficult and detailed technical roles.

Wednesday May 27, 2009

Paula Smith is SPECtacular

Another SPECtacular award from the SPEC annual meeting: Paula Smith (VMware) was honored for her tireless, competent, and patient work managing the SPEC office and the people there. Paula consistently exhibits what makes SPEC a unique place. The attention and enthusiasm she brings to her volunteer work make her a pleasure to interact with. She goes above and beyond in everything she does, and is often able to turn emergencies into opportunities. Most impressive is how she maintains this over time and in every interaction, despite many competing pressures for her attention. Beyond this management work, she also manages to handle the organizational and technical work of chairing the Virtualization committee, and of course her day job at VMware.

Tuesday May 26, 2009

Cloyce Spradling is SPECtacular

Another SPECtacular award from the SPEC annual meeting: Cloyce Spradling (Sun Microsystems) was honored for continued timely support of SPEC CPU, HPG and editorial tools. The key factor is his timeliness, in that he responds to unplanned, asynchronous requests, if not with a solution then at least with a map to help people find their way out of the woods. And that's on top of his day job at Sun.

Friday May 22, 2009

SPEC awards, power performance

More 2009 SPECtacular awards. The SPECpower committee has been busy. They released version 1.10 of the SPECpower_ssj2008 benchmark as a no-cost upgrade to existing licensees. It adds support for measurement of multi-node (blade) servers, improves usability, and adds a graphical display of power data during benchmark execution. Review and publication of benchmark results continues apace, with a spirited competition for first place, and with ever more power analyzers accepted for testing, and more test labs qualified for independent publication. They have also been assisting several other benchmark committees inside SPEC, and other industry standard benchmark organizations, to implement energy measurement for their benchmarks. SPECpower is more than just a benchmark; it is a methodology, and the methodology is modified and expanded as necessary over time to accommodate energy measurements for all the different workloads which are relevant to the real world in those market segments. In alphabetical order SPEC recognizes:

  • Chris Boire (Sun Microsystems) – As release manager he coordinated and integrated development activities to keep the deliverables on schedule.

  • David Schmidt (HP) – He created stand-alone and network-integrated tools for automated results checking to help ensure that results submissions are correct and complete.

  • Greg Darnell (Dell) – Author of the PTDaemon, he helped many other groups get started measuring power for their benchmarks. He helps out with whatever needs to be done, technical or organizational.

  • Hansfried Block (Fujitsu Technology Solutions) - He automated the process of determining power analyzer precision, handled the acceptance of several new power analyzers, and was instrumental in getting multi-channel analyzers accepted.

  • Harry Li (Intel) – He was primary developer of the Visual Activity Monitor, giving a unique view of the system's activity.

  • Jeremy Arnold (IBM) – If I tried to recount all the accomplishments Jeremy was cited for I'd probably run into some internal blog size limit. Suffice it to say he is a primary developer on many parts of the code, who never turns down a plea for help, and who is never satisfied until the entire benchmark package is right.

  • Karl Huppler (IBM) – As primary author/editor of the Power and Performance Methodology, he organized the document to capture deep technical consensus in the committee, and made it readable and understandable for people new to the field.

  • Matthew Galloway (HP) – He designed the control software to drive multiple JVMs, enabling multi node (blade) testing.

  • An engineer (AMD) – Who created and maintained much of the web content explaining the benchmark and methodology to the public.

Thursday May 14, 2009

SPEC awards, virtualization

More 2009 SPECtacular awards. SPEC's forthcoming virtualization benchmark will provide meaningful metrics of hardware and software performance in data center consolidation. As complex as this benchmark is, running several different benchmarks together in virtual machines on a host system under test, the code is only half the story. As with all benchmarks the workload is vital, to represent realistic usage scenarios so that performance improvements made on the benchmark will also benefit real world users. And the run rules are vital, needing to accommodate technology improvements over the lifetime of the benchmark, while precluding unrepresentative optimizations exploiting rule loopholes (or what the layman might call "cheating"). There is spirited debate from companies representing rather diverse user communities, all with an interest in seeing that their customers' needs are addressed by the benchmark. In the end, when this group of top engineers reaches a consensus, you know they've come up with a benchmark that is as rock solid as it is possible to make. From among this great team of partners and competitors, three were singled out for SPECtacular awards:

Andrew Bond of HP always steps forward when a person is needed to test new code, features, or parameter tuning. He performed many experiments whose results showed the committee the sensitivity of the benchmark to various parameters, sizes, and configuration options, so that the right choices could be made for fair benchmark comparisons. He also created scripts to set up and configure new guest VMs for each workload.

Chris Floyd of IBM improved and tailored the mail server and application server workloads for the new benchmark. He's revamped these workloads several times to improve the I/O profiles and add burstiness to the application server transaction injection. He helps the other developers at regular on-line coding sessions, explaining new features, and resolving problems. He even helps out when on vacation.

Greg Kopczynski of VMware developed a (necessarily) complex and feature extensive harness for the benchmark. He responds to countless pleas for help, assistance, debugging, etc., in true SPEC fashion without asking whether the help is for a partner or a competitor. He added burstiness to the web server workload. And he integrates new code and changes from all the developers for each development kit revision.

Thanks for your great efforts!

Wednesday May 13, 2009

SPEC awards, graphics

More 2009 SPECtacular awards. Sometimes even success doesn't succeed, at first. SPEC developed a workstation energy consumption benchmark, and a lot of people worked extra hard to deliver it in time for the EPA to consider using it in the Energy Star program, which is being extended beyond PCs to also include workstations, servers, thin clients, and storage. Although the EPA decided not to use our test for the workstation program at this time, the work is still important and I am confident it will be used in some way. A graphics processor can easily use more energy than a CPU, especially a high performance accelerated 3D processor. For their exceptional work in producing this benchmark I thank David Reiner of AMD, Joerg Grosshennig of Fujitsu Technology Solutions, Paul Besl of Intel, and an engineer from NVIDIA.

Tuesday May 12, 2009

SPEC awards, HPC

More 2009 SPECtacular awards. SPEC released an update to our MPI2007 benchmark of Message Passing Interface performance. It allows evaluation of MPI-parallel, floating point, compute intensive performance across a wide range of cluster and SMP hardware. MPI2007 continues the SPEC tradition of giving HPC users the most objective and representative benchmark suite for measuring the performance of SMP (shared memory multi-processor) systems. The update, provided at no cost to existing MPI2007 licensees, improved compatibility, stability, documentation and ease of use. SPEC gave awards to:

  • Brian Whitney of Sun Microsystems for meticulous care as release manager in scheduling, and putting it all together,

  • Carl Ponder of IBM for the development and management of documentation, especially with respect to the run rules, FAQ, and the configuration file.

  • Håkon Bugge of Platform Computing for outstanding testing skills during the benchmark development.

Monday May 11, 2009

SPEC award, mail server benchmark

At SPEC's 2009 annual meeting, awards were given for SPECtacular contributions. When your competitors and partners alike join to honor one of your own it indicates that person has truly excelled. There are 77 member organizations in SPEC including hardware vendors, software vendors, universities, government agencies, and more. We are joined by a common belief that the industry as a whole is well served by a common base of reliable and representative measures of computer system performance and energy. Thereby our companies benefit from more effective test results at lower cost. And for academia, the SPEC benchmarks provide a common reference point from which to begin performance and energy related studies.

It takes a lot of hard work to produce these benefits. Each year the individuals who are recognized by their peers as having done the most to advance SPEC's mission are singled out for awards. And now I have the pleasure of thanking these exceptional people publicly. I won't list everyone since some people don't want their names posted; but you know who you are.

I'll start today by thanking Michael Abbott of Apple who carried the brunt of the new profile and code changes to the SPECmail2009 benchmark, and contributed invaluable insight and analysis on message characteristics to improve the representativeness of the benchmark. SPECmail2009 simulates corporate mail server workloads ranging from 250 to 10,000 or more users, using industry standard SMTP, IMAP4, SSL v3.0, and TLS 1.0 protocols. Folder and message MIME structures accommodate traditional office documents and a variety of rich media content.

The SPECtacular award winners like Michael are making a positive difference in the industry, and so I say thank you! (More award winners coming...)

Wednesday Apr 09, 2008

SPEC does not certify results

Nothing is more fun than arguing with BM Seer. He usually helps me more than anyone in keeping everyone at Sun in compliance with SPEC's fair use rules. But in a recent posting on SPECweb2005 for Sun SPARC Enterprise T5220 he refers to SPEC published results as "Certified." Actually as the official SPEC disclaimer spells out, "the contents of any SPEC reporting page are the submitter's responsibility. SPEC makes no warranties about the accuracy or veracity of this data."

Most SPEC benchmark results can be used without SPEC review. They must comply with all the run and reporting rules, including the requirement for a full disclosure report. And their rules compliance can be challenged on the basis of the details in that report. There is a real value to readers, and hence to vendors, of publishing a result at spec.org. Such results are peer reviewed by other SPEC committee members including competitors, prior to publication. If a result is found to be not in compliance with the run rules it is not published, and the result cannot be used elsewhere either. However, passing this review is not a guarantee or certification that the result is accurate.

Instead of a paid independent audit process, SPEC relies on full disclosure and peer review to increase confidence in the reliability of results. From the details in the full disclosure report, anyone should be able to reproduce the performance experiment and obtain substantially the same results. From time to time competitors will conduct such replication experiments on each other's systems, and if they cannot get the same number they bring it to SPEC either to obtain details of the test configuration that were erroneously left out of the full disclosure report, or to have the published result marked non-compliant. By this method SPEC dramatically lowers the cost of benchmarking, making it possible to have the thousands of results posted on spec.org, while keeping vendors honest through the fear of exposure and humiliation.


Sunday Mar 30, 2008

SPEC awards, Elves working in the back

Finishing my list of SPECtacular awards given at SPEC's 2008 annual meeting in San Francisco, I want to thank some of the many people who have made invaluable contributions to the organization behind the scenes. As before I won't cite names without permission, but will add them later if given the okay.

All those thousands of SPEC results make their way through committee peer review and publication, thanks to the efforts of a web site editor volunteer from Intel, and two SPEC IT staff, Jason Glick and Cathy Sandifer. Behind the public web site is a members only web site and other servers, for collaboration on benchmark development and results review. Their job is merely keeping all this infrastructure up and running through disasters, natural and man-made (telco made, that is), while the number of benchmarks, member institutions, and participants grows rapidly, and the services to members continue to expand.

Distributing the network load worldwide, and providing redundancy in case of outages, are mirror sites at the University of Miami, U.S., and the University of Pavia, Italy. The IT coordinators at those universities were given awards for their work.

Though the benchmarks keep getting more complex, it keeps getting easier to run them, to collect, review, and publish the data without errors, and then to search and select the desired information to view. Along with our SPEC IT staff, Cloyce Spradling of Sun has done spectacular work building and maintaining these tools, and adapting them for new benchmarks.

Nobody enjoys flying these days, but we have to collaborate to get our work done. And coming from 82 different companies and institutions each with its own internal IT standards, no single vendor solution is going to work for us. Our virtual meeting facilities project was driven by Alan Adamson of IBM, an engineer from Dell, and an engineer from BEA. It has allowed us to be more productive than ever while cutting the number of physical meetings. Result: a little less CO2 dumped in the atmosphere, a little less green spent on travel by member companies, and some fewer hours of confinement in airplanes endured by SPEC participants.

Organizing the work of 82 member institutions and SPEC staff is like the proverbial cat herding. Paula Smith of VMware and Alan Adamson of IBM have managed to do it. Paula is a Vice-president of SPEC in charge of headquarters operations, and Alan is chair of the Open Systems Group steering committee. John Henning of Sun, secretary of the Open Systems Group steering committee led a comprehensive overhaul of organization policy - clearer, fewer loopholes, better guidance. And Klaus Lange of HP helped make sure that the result was accessible to new members so you don't have to be in an "old boys network" to work in SPEC.

A staff member from SPEC HQ organizes our meetings, among other jobs. This is everything from finding meeting hotels and negotiating rates to acquiring an extra projector or speaker phone, and essentially solving any problem that comes up. She makes sure our meetings are productive and low cost.

Bob Cramblitt certainly deserves some good publicity for all the work he has done for SPEC over the years. Of course trying to give him publicity by posting on my personal blog is ironic because his business is public relations. I may get only a few dozen readers, but whenever we really need to reach our target audience to get the word out about a new benchmark, Bob's the guy to get it done. He also has a very good sense for when a reporter does not want to talk to a PR flack, and needs to get the information directly from the engineers.

About

I am a software engineer in San Diego, president of the Standard Performance Evaluation Corporation (spec.org), formerly a mathematician and a violist.
