SPECmail2001 Submission Checklist
Version: 1.05
Last Update: 22 July 2002
This is the submission checklist for the SPECmail2001 benchmark. This checklist
is used by a SPEC Mail Server Subcommittee representative for all SPECmail2001
submissions.
Pre-submission Items
- Verify that the result was generated using version 1.00 or 1.01 of the
SPECmail2001 benchmark.
- Verify that the raw file contains an accurate System Under Test (SUT) description.
- Verify that the raw file contains an accurate description of the clients. Information
needed for the clients is documented in section 3.3.4 of the Run
and Reporting Rules.
- Verify that all the fields in the raw file are formatted correctly.
The SPEC server will reject submissions that are missing fields or include
incorrectly formatted information. The sample raw file contains descriptions
of the fields and formatting information.
- Verify that a description of the network configuration is included in the
notes section of the raw file.
- Verify the submission includes the configuration diagram. The configuration
diagram should be in PNG, JPEG or GIF format.
- Submissions should be mailed to submail2001@spec.org.
The raw file and the configuration diagram should be attached to a single
e-mail in that order. This e-mail alias automatically checks the results
and enters them into the SPEC system.
- Review the Run Rules.
Administrative Review Items
- Has the submitter made arrangements to pay the submission fee to the
SPEC office?
Submissions from non-SPEC members require a publication fee of $500 for
each result submitted to SPEC. This fee must be received by the SPEC
office prior to publication of results.
- Are the hardware and software generally available? If not, will they
be available within 3 months?
- Is this final release hardware and software? If not, will the submitter
retest with the final hardware/software?
Results on the final hardware/software must not degrade by more than 5%,
or the original results must be marked non-compliant (see section 2.3.2 of
the SPEC Open Systems Group Policies and Procedures
Document) and replacement results submitted.
- Are the vendor-specific options described in the raw file? If not,
can the submitter provide a text file listing the options (including
descriptions)?
- Will the submitter be able to join a con-call when we review the results?
- Can the submitter answer technical questions about these results? If
not, can a technical person who is knowledgeable about the results join the
con-call?
- Is there any corporate plan dependent on these results being published
by SPEC by a specific date?
The 2-week review period is a minimum.
The review period starts on the next subcommittee bi-weekly con-call (currently
Wednesdays). A review for a complex system or with unanswered questions may
take longer than 2 weeks to complete. For more information see the Guidelines
for Result Submission and Review in the SPEC
Open Systems Group Policies and Procedures Document.
Technical Review Items
- Is your data integrity protected against power failures?
For example: if your system uses disk write caches, power-failure protection
can be provided by a UPS or a 48-hour battery backup.
- Is your data integrity protected against system failures (software panics,
hardware faults)?
- When your application writes the data, does the operating system commit
the data to physical storage before completing the operation?
For example: in a UNIX system, the file systems must be mounted synchronously.
- Did you save your SMTP and POP3 logs for the review?
- Do your logs contain the minimum information dictated by section 2.5
of the Run and Reporting Rules?
- Does your System Under Test (SUT) use a TCP maximum segment size less than
or equal to 1460 bytes?
- Is your TIME_WAIT greater than or equal to 60 seconds?
- Are your TIME_WAIT entries dynamically allocated? If not, are they
unique (i.e. not reused) during the TIME_WAIT period?
Network configuration restrictions are spelled out in section 2.6 of the Run
and Reporting Rules.
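The MSS restriction above can be checked programmatically on the SUT. This is a minimal Python sketch under stated assumptions: the platform exposes the `TCP_MAXSEG` socket option (true on Linux; availability varies by OS), and `negotiated_mss` is a hypothetical helper, not a SPEC tool.

```python
import socket

# Minimal sketch (assumption: TCP_MAXSEG is available, e.g. on Linux).
# Connect a TCP socket to the given endpoint and read the maximum
# segment size in effect; the benchmark rules require at most 1460 bytes
# on the benchmark network.
def negotiated_mss(host: str, port: int) -> int:
    with socket.create_connection((host, port), timeout=5) as s:
        return s.getsockopt(socket.IPPROTO_TCP, socket.TCP_MAXSEG)
```

Note that a loopback connection will typically report a much larger MSS than the wire value; to be meaningful, the check should be run against the actual benchmark network interface.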
- Are all your optimizations appropriate (recommended and supported) for
a customer site?
Optimizations utilized must improve performance for a larger class of
workloads than just the ones defined by this benchmark. SPEC's philosophy
regarding optimizations is explained in section 1.1 of the Run and
Reporting Rules.
Copyright © 2002 Standard Performance Evaluation Corporation