Version 1.03
Last Modified: August 19, 2010
The SPECjEnterprise2010™ benchmark is an end-to-end benchmark that allows performance measurement and characterization of Java EE 5 servers and supporting infrastructure such as the JVM, database, CPU, disk, and servers.
The workloads consist of an end-to-end web-based order processing domain, an RMI- and Web Services-driven manufacturing domain, and a supply chain model utilizing document-based Web Services. The application is a collection of Java classes, Java Servlets, JavaServer Pages, Enterprise JavaBeans, Java Persistence entities (POJOs), and Message Driven Beans.
This document is a guide for setting up and running the SPECjEnterprise2010 benchmark. For a description of the rules and restrictions pertaining to SPECjEnterprise2010, we strongly recommend that you read the SPECjEnterprise2010 Run and Reporting Rules document contained in the SPECjEnterprise2010 benchmark kit or available online at the SPECjEnterprise2010 website. For an overview of the benchmark architecture, see the SPECjEnterprise2010 Design Document, also contained in the benchmark kit.
The SPECjEnterprise2010 workload emulates an automobile manufacturing company and its associated dealerships. Dealers interact with the system using web browsers (simulated by the driver) while the actual manufacturing process is accomplished via RMI and Web Services (also driven by the driver). This workload stresses the ability of Web and EJB containers to handle the complexities of memory management, connection pooling, passivation/activation, caching, etc. The SPECjEnterprise2010 Design Document includes a complete description of the workload and the application environment in which it is run. This section of the user's guide describes the software and hardware environment required to run the workload.
Although SPECjEnterprise2010 can be run on a single machine for testing purposes, compliance with the SPECjEnterprise2010 Run and Reporting Rules requires that the driver and supplier emulator be run on a machine outside the SUT. Therefore, a compliant hardware configuration must include a network and a minimum of two systems – one or more systems to run the components within the SUT and at least one system to run the driver and supplier emulator outside the SUT. A typical configuration is illustrated below.
SPECjEnterprise2010 is a Java EE 5.0 application that requires a Java EE 5 compatible application server as well as a Relational Database Management System (RDBMS) to run as part of the SUT. Outside the SUT, a Java EE 5 compatible application server is required for the supplier emulator, and a Java Runtime Environment (JRE) version 5 or later is required for the driver.
The SPECjEnterprise2010 kit is supplied as a setup.jar.
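Presumably it is invoked with the standard executable-jar mechanism; a sketch (consult the kit documentation for any additional options):

$ java -jar setup.jar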
Note that vendor-specific Ant files will not necessarily be available as part of the benchmark package.
After extracting the kit and running the Ant install scripts, the following directory structure will exist.
Note: For some DBMS products, DBMS-specific files are in schema/
All SPECjEnterprise2010 Java classes are located in the org.spec.jent package. The following lists the sub-packages of org.spec.jent:
The SPECjEnterprise benchmark application consists of three different domains: Order, Mfg, and Supplier. Each domain provides a set of business methods exposed via servlets, web services, or RMI/EJB. The driver interacts with the Order domain via HTTP requests. The Mfg domain is accessed by the driver using EJB/RMI and web service calls. Communication between the domains is realized using persistent JMS messages. The roles of the domains and the interactions between the domains and the driver are described further in the Design Document.
Each domain uses a distinct set of tables and calls business methods of its own domain only. Hence it is possible, but not required, to set up a distributed environment, e.g., using a dedicated (clustered) application server and a dedicated (clustered) database instance for each domain. Please refer to section 3.1.4 to learn more about database access and alternatives for configuring data sources.
There are several components required to build, deploy, test, and run SPECjEnterprise2010. These are:
Building and deploying the benchmark requires Ant 1.7.1, which is supplied as part of the SPECjEnterprise2010 kit.
Several steps must be accomplished prior to running the SPECjEnterprise2010 benchmark:
In order for Ant to run correctly, you must have the JAVA_HOME and ANT_HOME environment variables set. If you extracted Ant using the extractant.sh script provided with the kit, you can use the following:
ANT_HOME=KIT_HOME/ant/apache-ant-1.7.1
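A minimal Unix shell sketch (the JDK and kit locations below are hypothetical; substitute your actual paths):

# Hypothetical paths; adjust for your system.
export KIT_HOME=/opt/SPECjEnterprise2010
export JAVA_HOME=/usr/lib/jvm/jdk1.6.0
export ANT_HOME=$KIT_HOME/ant/apache-ant-1.7.1
export PATH=$ANT_HOME/bin:$PATH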
In the root directory of the benchmark kit, configure the build.properties file. The following properties need to be set for your environment:
appserver=<APPSERVER>
database=<DATABASE>
driver=faban
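For example (the values below are hypothetical; use the directory names actually present in your kit):

# Hypothetical values; must match directories shipped with or created in the kit.
appserver=glassfish
database=oracle
driver=faban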
The values defined dictate the directory which is used to find the vendor specific ant build targets. These directories are organized as follows:
Some of these directories and ant build targets are provided for select products, or you can use the templates to create your own.
The instructions for modifying the KIT_HOME/faban/driver.properties will be provided in a later section.
For the benchmark, you can create a single database that houses all the domains, or the domains can be distributed to separate databases. Standard SQL scripts for creating the database schema are provided in schema/sql. These are intended as a starting point for creating schemas for other database products. It is beyond the scope of this user's guide to provide guidance on the installation and configuration of the various RDBMSs available on the market. For convenience, database creation Ant targets are provided for some databases in the databases/$database directory.
The following commands, for both Unix and Windows, can be used to build the EJB JAR, Web Application ARchive (WAR), and Enterprise ARchive (EAR) files. We assume that the build environment has been configured as documented above. Only one Ant target needs to be called, as it will build the jar, the war, and the ear. The specj.ear target calls the vendor-specific appserver.specj.ear target, which allows vendors to replace or change, e.g., XML descriptors delivered with the kit (if needed). The second command will create the emulator.ear.
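Going by the Ant target table at the end of this guide, the two commands are:

$ ant specj.ear
$ ant emulator.ear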
These targets can be used to build a submission-compliant ear. Submission-compliant ears must use the class files as provided with the kit, so these targets do NOT recompile the source code. For research or other unofficial testing purposes, the code may be recompiled; targets for this are provided:
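Per the Ant target table at the end of this guide, the recompiling variants are:

$ ant specj.ear.withcompile
$ ant emulator.ear.withcompile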
If any errors occur at this step, it is generally a setup or directory mismatch problem. Please review all settings carefully prior to contacting SPEC for support.
Reference schema scripts for the different domains can be found in the schema/sql directory of the benchmark kit.
Optionally, after setting up the configuration in the KIT_HOME/databases/$database directory according to your JDBC driver's documentation, you can call the following Ant target to set up the database in a vendor-specific way:
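Going by the Ant target table at the end of this guide, this is presumably:

$ ant database.configure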
There are three different ways to populate the database tables: via the web-based database loader, via the standalone database loader, or via vendor-specific load scripts operating on generated flat files. These are described below.
The injection rate is a measure of how much data will be loaded for use during the benchmark run. A higher injection rate equates to more client load, and therefore more data is created accordingly. Details regarding the mathematical relationship between injection rate and actual table sizes can be found in the SPECjEnterprise2010 Run and Reporting Rules.
The web-based database loader is located in the "Benchmark Management" section of the Web UI at http://host:port/specj. After specifying the txRate and the (maximum) parallelism, you can start loading the database by pressing the "Start" button. If you want to generate flat files only, check "Generate Flat files ..." and choose a delimiter and directory before pressing "Start" (in this case database tables are not deleted, not truncated, and not loaded). Since loading the database happens in the background on the application server, the status can be refreshed by pressing the "Refresh" button. Loading can be stopped by pressing the "Cancel" button.
Database operations like deleting a table cannot be interrupted; hence deleting a huge table might take some time before stopping succeeds. If an exception occurred, the stack trace is displayed in the status section. The application server log contains the same status information shown in the database loader UI. The Java Logging API is used, so the log level can be changed via the standard Java logging configuration.
The properties in the property file KIT_HOME/databases/$database/database.properties are used by the standalone database loader to configure the data source used:
The following properties in the property file KIT_HOME/build.properties control the loading process:
The following target starts loading:
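Per the Ant target table at the end of this guide, this is:

$ ant load.database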
Flat files can be generated by the web-based database loader or by calling the following Ant target:
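Likewise, per the Ant target table, this is:

$ ant load.flatfiles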
In the latter case, the following properties in the property file KIT_HOME/build.properties should be configured:
Database vendor-specific load scripts can then be used to load these files into the database.
The Supplier Emulator must be deployed on a Java EE 5 Web Services compliant server on a separate machine outside of the defined SUT. A reference set of generated web services jars, created via the web services reference implementation of wsgen, is automatically included in the generated Emulator.ear file. Your particular application server may require these to be regenerated. In many cases the deployment of the Supplier Emulator is as simple as deploying the generated Emulator.ear file to the desired application server. Any other changes or requirements may be vendor-specific to the server implementation chosen.
The remaining steps to deploy the specj.ear file created in the target/jar directory depend on the application server you are using and whether or not you are using the extended deployment descriptors (DDs) provided by the vendor. Each application server is unique, and it is impossible to capture here the exact instructions to complete the deployment. Below is a list of steps that are required to complete the deployment if not using vendor-supplied DDs. Please refer to your application server documentation for help in accomplishing these steps:
While configuring the data sources, please comply with the SPECjEnterprise2010 Run and Reporting Rules if you are preparing for a submission (rather than using EAStress for scientific purposes). Remarks:
Running SPECjEnterprise2010 requires that the driver be configured for your environment and particular application server. Configuring the driver properly requires understanding how the driver works, which is described below.
The SPECjEnterprise2010 Driver consists of several Java programs and is designed to run on multiple client systems, using an arbitrary number of JVMs to ensure that the Driver has no inherent scalability limitations. Note that none of the client systems are part of the SUT. SPECjEnterprise2010 uses the Faban testing harness to control the driver agents.
The Driver consists of manufacturing and dealer agents.
The harness is based on Faban, the preferred means to schedule and automate benchmark runs. It also provides an embedded graphing tool that can map and compare run results. Additional information about the harness can be found at http://faban.sunsource.net. The harness runs as a web application hosted on a Tomcat server. It provides a web user interface and acts as a container for the driver components of SPECjEnterprise2010. The web user interface provided by the harness is used for scheduling runs, managing runs, viewing run logs (which update continuously at run time), and viewing the run results once the run is finished (including detailed graphs). There is also a command line interface that can be used to access the same features.

The harness, the driver, and all agents communicate using RMI. The harness reads the run configuration and starts the master driver and the driver agents, distributing load to the driver systems as appropriate. The driver also reads the run configuration and configures the driver agents appropriately. Each agent then runs a number of threads of its respective workload; for example, the DealerAgent runs DealerEntry threads. The number of threads is determined by the scaling rules of the specification and is equally distributed among all driver agents. Each thread runs independently, executing the workload according to the rules defined in the specification. When the run completes, the master driver coordinates with the agents to retrieve all the statistics and prints out the reports. The harness also post-processes the reports and presents the user with several charts showing the behavior during the run as well as other statistical validations.
There are a few vendor-specific initial context configuration parameters [ic_config_1 ... ic_config_5] in the run.xml which should be defined in the vendor-specific property file that serves as an override. These default to empty strings in the faban/driver.properties file.
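A hypothetical sketch of such overrides in faban/driver.properties, assuming each ic_config_n carries one JNDI key=value pair (the exact format and the factory class are vendor-specific; consult your application server documentation):

# Hypothetical values; the real settings depend on your application server.
ic_config_1=java.naming.factory.initial=com.example.VendorInitialContextFactory
ic_config_2=java.naming.provider.url=rmi://apphost:1099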
appserver.workorder.ws.uri -- this should be set to the URI of the Web Service endpoint. For example, this currently defaults to "WorkOrderSessionService/WorkOrderSession", which means that if the benchmark is deployed on host lifeboat and port 8000, the Web Service is available at http://lifeboat:8000/WorkOrderSessionService/WorkOrderSession.
workorder.web.services.wsdl.location -- this currently defaults to "http://ejb.workorderses.mfg.jent.spec.org/WorkOrderSessionService", and the driver uses the JAX-WS catalog service to find the physical location of the WSDL on the file system. The jax-ws-catalog.xml file is located under resources/driver/META-INF. As an alternative approach to finding the WSDL, the physical location of the WSDL file can also be specified for this property.
buyer.host and buyer.port specify the host and port information for the buyer web service.
supplier.host and supplier.port specify the host and port information for the supplier web service.
Both of the Web Services configurations discussed in 4.1.5.1 and 4.1.5.2 can be overridden in the harness user interface under the "Mfg Driver" tab.
By default the harness is installed in $KIT_HOME/faban/harness; this is done via the "install" Ant target (see section 3.1.2 above).
All the required binaries for the Faban harness are checked into the workspace under faban as faban-server.tar.gz. To install or re-install the harness on your system in a place other than the default location, you can:
You can start the harness (Tomcat) server using "ant faban.harness.start". At this point the harness has no deployed benchmark. For a deployed benchmark to work correctly, its client-side dependencies have to be satisfied. The client side depends on Java EE libraries such as javaee.jar and/or the web services runtime jars, etc.; some of these dependencies may be vendor-specific.
There are two ways these dependencies can be satisfied:
1. Bundle all the client-side jars into the jar that gets deployed on the harness. In this approach, the client-side dependencies are included in the jar file that gets deployed, and the harness is responsible for distributing these bits to the individual agent machines that are used to generate load. The jars included are specified via the harness.lib.jar property. This is the path to the jar file that contains all the dependencies that need to get bundled in specjdriverharness.jar. This jar file needs to be created once by each vendor and can be reused subsequently. When "ant faban.harness.jar" is invoked and the harness.lib.jar property is specified, the build process adds the appropriate jar files into specjdriverharness.jar.
Example of what needs to be in the harness.lib.jar file:
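Presumably something like the following, based on the dependencies named above (actual jar names are vendor-specific):

javaee.jar
<web services runtime jars of your application server>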
2. Use the classpath argument to the driver's JVM command. In this approach, the client-side dependencies are satisfied by specifying the classpath to the individual jar files. It is the responsibility of the deployer to make sure that the same classpath is available to the drivers on each agent system. The classpath is specified in the faban.agent.classpath property of the build.xml under faban/. This goes into the run.xml and can be overridden in the harness web user interface.
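A hypothetical example (the paths are placeholders; list whatever client-side jars your application server requires):

faban.agent.classpath=/opt/appserver/lib/javaee.jar:/opt/appserver/lib/webservices-rt.jar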
Once one of the above approaches is picked to satisfy the client side dependencies, to deploy SPECjEnterprise2010, do the following:
The Faban harness exposes its user interface through a web browser. Once the harness is set up, point your web browser to http://host:9980 to view the user interface. You will be asked to create a profile. Profiles allow you to schedule runs for multiple benchmarks from the same harness; at any time only one benchmark is run. The Faban harness in SPECjEnterprise2010 is completely integrated with the Manufacturing and Dealer drivers. Once you have created a profile, you can click on "Schedule Run" in the left-hand menu to start a run. Change the run parameters on the user interface as needed; these parameters are picked up from the run.xml and can be edited. Once edited, the run parameters persist across runs until you redeploy the benchmark with the "clear previous benchmark configuration" option selected.
There are currently five tabs available on the harness UI:
You should see descriptive text for each configuration parameter by mousing over its text box. Once all your edits are done, click on the "Ok" button to start the run. The run can be monitored by clicking on the run id presented after the run has been submitted. Alternatively, you can navigate to the run using the "view results" menu and selecting the appropriate run id (arranged from latest to oldest). Once a run is completed, you can see the summary results by clicking on "Summary Result". To view the various graphs, click on the "Detailed Results" link (which appears a short while after the run completes). To review the run configuration, click on "Run Configuration". System statistics can be viewed under "Statistics"; to see I/O and CPU statistics you need to specify the "tools" under the "Driver" tab, for example on Solaris these can be specified as "iostat; mpstat", etc.
The Faban CLI provides, from the command line, the same functionality that is available from the harness web interface. The Faban CLI is the recommended way of starting a benchmark run from the command line when using multiple agents on remote systems. The benchmark configuration is still picked up from the run.xml file. To run multiple agents, potentially on remote systems, make sure the number of agents under the driverconfig tab is set appropriately and that the <host> element under <hostConfig> is a space-separated list of driver hosts. The following Ant targets are provided for starting a run using the Faban CLI:
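Per the Ant target table at the end of this guide, these are (exactly how the optional <driver host> is passed to the target may be kit-specific):

$ ant faban.cli.run
$ ant faban.cli.killrun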
If no <driver host> is specified the default local host is used.
Live statistics can be used to monitor whether the benchmark run is progressing according to expectations. You can turn the live stats feature on as follows:

<runtimestats enabled="true">
    <interval>300</interval>
</runtimestats>
This will result in run statistics being printed to the run log at 300-second intervals from the start of the run (that is, from the point the log says "Ramp up started", so statistics are reported during ramp up as well). To get stats at a different interval, change the interval value. The output format is explained based on the example below:
INFO: 4500.00s - MfgDriver: CreateVehicleEJB/CreateVehicleWS CThru=15.533/13.867 OThru=14.821/14.822 CErr=0.000/0.000 CResp=0.018/0.015 OResp=0.010/0.013 CSD=0.034/0.022 OSD=0.014/0.009 C90%Resp=-/- O90%Resp=0.020/0.020
INFO: 4500.00s - DealerDriver: Purchase/Manage/Browse CThru=13.400/14.000/24.000 OThru=12.556/12.621/25.140 CErr=0.000/0.000/0.000 CResp=0.010/0.017/0.026 OResp=0.009/0.015/0.025 CSD=0.016/0.017/0.015 OSD=0.009/0.016/0.012 C90%Resp=-/-/- O90%Resp=0.020/0.030/0.040
These statistics are interpreted as follows: at 4500 seconds from the start of the run, the MfgDriver is reporting statistics for the two operations CreateVehicleEJB and CreateVehicleWS.
So, for example, the CreateVehicleEJB operation's current throughput (over the last collection interval of 300 seconds) was 15.533 txs/sec, and the CreateVehicleWS operation's current throughput was 13.867 txs/sec.
By default, SPECjEnterprise2010 comes with logging pre-configured at level INFO. To change the log level, edit the file ${harness.install.dir}/config/logging.properties. Log levels are defined by the Java logging facilities and are as follows: SEVERE, WARNING, INFO, CONFIG, FINE, FINER, FINEST. Setting the log level to FINER or FINEST for the whole system would generate a large amount of output, so it is recommended to enable such logging only for the specific subsystems that need more detail. For example, to enable logging at level FINER for the DealerDriver, add the following line to logging.properties:

org.spec.jent.driver.DealerDriver.level = FINER

For further information on logging configuration and the logging.properties file format, please refer to the Java Logging Overview at http://java.sun.com/javase/6/docs/technotes/guides/logging/overview.html.
During auditing, the driver executes some tests before the workload is started to verify that a few important run rules are met. If a test fails, the run is stopped.
For testing purposes, auditing can be skipped by setting the audit flag in faban/run.xml.template to false. Alternatively, by setting the stopIfAuditFailed flag in faban/run.xml.template to false, the run is not stopped if auditing fails.
In order to get a valid run, please reload the database and restart your application server before starting the driver.
The database has to be reloaded because the driver executes some checks during auditing, including checks of table sizes.
The reasons for restarting the application server are as follows:
The Submission File contains a detailed description of the SUT in text format that is used by the SPECjEnterprise2010 reporter to produce a report suitable for publication.
To create a SPECjEnterprise2010 Submission File:
SPECjEnterprise2010 includes a utility for generating a results page, called the reporter. The report contains a summary of the key results and a complete system description in a format designed to be consistent with other SPEC reporting pages. In addition to the submission file, a sample report is generated which simulates the appearance of the report on the SPEC results website.
To run the reporter by hand, use the following command:
$ java -classpath reporter.jar reporter [-a] [-r] <submission file>.txt* <result output directory>
Where:
Output:
An HTML file named <filename>.report.html is created by default.
The "-a" option will create a text report page named <filename>.report.txt and the "-r" option will write the output to the result output directory.
Once you have a successful run, you can submit the results to the SPEC OSG Java subcommittee for review. To submit a result to SPEC:
Every submission goes through a minimum two-week review process, starting on a scheduled SPEC OSG Java sub-committee conference call. During the review, members of the committee may ask for additional information or clarification of the submission. Once the result has been reviewed and accepted by the committee, it is displayed on the SPEC web site at http://www.spec.org/.
The following is a list of commonly used ant targets and a brief description of their purpose.
Unless otherwise noted – KIT_HOME refers to the location where the SPECjEnterprise2010 Kit was extracted.
Target | Location | Description |
---|---|---|
all | KIT_HOME/build.xml | Generates the emulator.ear, specj.ear, and driver.jar files |
appserver.specj.ear | KIT_HOME/appservers/appserver_type/build.xml | Runs any application server specific tooling on the precompiled and packaged specj.ear file. |
clean.all | KIT_HOME/build.xml | Removes all output generated by this build. For example, all jars in the target/jars directory will be deleted. |
clean.classes | KIT_HOME/build.xml | Removes all classes generated by this build |
database.configure | KIT_HOME/databases/db_type/build.xml | Configures the database by creating the tables |
deploy-on-harness | KIT_HOME/faban/build.xml | Deploys driver on the Faban harness |
driver.jar | KIT_HOME/faban/build.xml | Generates the driver.jar file |
driver.run | KIT_HOME/faban/build.xml | Runs the driver outside the Faban harness by starting Faban registry, driver agents and then the Faban master |
emulator.ear | KIT_HOME/build.xml | Generates the emulator.ear file with existing jar files |
emulator.ear.withcompile | KIT_HOME/build.xml | Generates the emulator.ear file after compiling any required classes |
extract.faban.libs | KIT_HOME/faban/build.xml | Extracts the lib directory from the faban server kit to the faban/lib directory |
faban.cli.killrun | KIT_HOME/faban/build.xml | Stops the benchmark using the faban command line interface |
faban.cli.run | KIT_HOME/faban/build.xml | Starts the benchmark using the faban command line interface |
faban.harness.install | KIT_HOME/faban/build.xml | Installs the faban harness on the current system. |
faban.harness.start | KIT_HOME/faban/build.xml | Starts the faban harness on the current system. |
faban.harness.stop | KIT_HOME/faban/build.xml | Stops the faban harness on the current system. |
generate.buyer-supplier.service | KIT_HOME/build.xml | Generates the buyer and supplier web service jar files |
generate.workorder.service | KIT_HOME/build.xml | Regenerates the WSDL for the WebService |
install | KIT_HOME/build.xml | Finishes installation; can also be used to retrieve the original class, jar, and ear files delivered with the kit |
kit | KIT_HOME/build.xml | Creates a jar that contains all of the bits needed to run the benchmark |
load.database | KIT_HOME/build.xml | Loads the database using the standalone database loader |
load.flatfiles | KIT_HOME/build.xml | Creates flat files to enable loading the database for the currently configured workload |
print-class-path | KIT_HOME/build.xml | Prints the classpath of the kit |
reporter.jar | KIT_HOME/build.xml | Creates the reporter.jar file in the target/jar directory |
reporter.run | KIT_HOME/build.xml | Runs the reporter using the results from the last run (driver or harness) |
specj.ear | KIT_HOME/build.xml | Generates the specj.ear file from existing jar files |
specj.ear.withcompile | KIT_HOME/build.xml | Generates the specj.ear file after compiling any required classes |
specj.jar | KIT_HOME/build.xml | Creates the specj.jar file in the target/jar directory |
specj.war | KIT_HOME/build.xml | Generates the specj.war file |
supplier.war | KIT_HOME/build.xml | Generates the supplier.war file |
Product and service names mentioned herein may be the trademarks of their respective owners.
Copyright © 2001-2012 Standard Performance Evaluation Corporation
All Rights Reserved