Chapter 19. Miscellaneous Performance Information
19.1 Public Benchmarks (TPC-C, SAP, NotesBench, SPECjbb2000, VolanoMark)
iSeries systems have been represented in several public performance benchmarks. The purpose of these
benchmarks is to give an indication of relative strength in a general field of computing. Benchmark
results can give confidence in a system's capabilities, but should not be viewed as a sole criterion for the
purchase or upgrading of a system. We do not include specific benchmark results in this chapter, because
the positioning of these results is constantly changing as other vendors submit their own results. Instead,
this section references several locations on the Internet where current information may be found.
A good source of information on many benchmark results can be found at the ideasInternational
benchmark page, at http://www.ideasinternational.com/benchmark/bench.html.
TPC-C Commercial Performance
The Transaction Processing Performance Council's TPC Benchmark C (TPC-C (**)) is a public
benchmark that stresses systems in a full integrity transaction processing environment. It was designed to
stress systems in a way that is closely related to general business computing, but the functional emphasis
may still vary significantly from an actual customer environment. It is fair to note that the business model
for TPC-C was created in 1990, so computing technologies that were developed in subsequent years are
not included in the benchmark.
There are two methods used to measure the TPC-C benchmark. One uses multiple small systems
connected to a single database server. This implementation is called a "non-cluster" implementation by
the TPC. The other implementation method grows this configuration by coupling multiple database
servers together in a clustered environment. The benchmark is designed in such a way that these clusters
scale far better than might be expected in a real environment. Less than 10% of the transactions touch
more than one of the database server systems, and for that small number the cross-system access is
typically for only a single record. Because the benchmark allows unrealistic scaling of clustered
configurations, we would advise against making comparisons between clustered and non-clustered
configurations. All iSeries and AS/400 results in this benchmark are non-clustered configurations,
showing the strengths of our system as a database server.
The most current level of the TPC-C benchmark standard is Version 5. It requires the same performance
reporting metrics as previous versions, but configuration pricing must now include 24-hour x 7-day-a-week
maintenance rather than 8-hour x 5-day-a-week maintenance, and there are some additional changes in how
the communication connections are priced. Vendors with submissions under previous versions have been
offered the opportunity to simply republish their results under these new pricing ground rules. As of April
2001, not all vendors have chosen to republish their results to the new Version 5 standard. iSeries and
pSeries results have been republished.
For additional information on the benchmark and current results, please refer to the TPC's web site at:
http://www.tpc.org
SAP Performance Information
Several Business Partner companies have defined benchmarks by which their applications can be rated on
different hardware and middleware platforms. Among the first to do this was SAP. SAP has defined a
suite of "Standard Application Benchmarks", each of which stresses a different part of SAP's solutions.