
Standard Performance Evaluation Corporation

The FAQs (Frequently Asked Questions)
About SPEC/GPC's Pro/ENGINEER™ and
SolidWorks 98Plus™ Benchmarks

Q. Why did SPEC/GPC's Application Performance Characterization (APC) project group decide to develop Pro/ENGINEER™ and SolidWorks™ benchmarks?
A. Pro/ENGINEER and SolidWorks are two of the leading applications in the CAD/CAM/CAE marketplace. SPEC/GPC's Application Performance Characterization (APC) project group was formed to provide standardized graphics performance measurement tools based on these kinds of industry-leading applications. The popularity and graphics-intensive nature of Pro/ENGINEER and SolidWorks made these two applications among the APC project group's first priorities for its benchmarking efforts.

Q. Does the APC project group intend to pursue benchmarks for other leading CAD/CAM/CAE applications?
A.
The group would like to offer free benchmarks for all the leading CAD/CAM applications that include graphics-intensive operations. APC project group representatives have had discussions with most of the major CAD/CAM software vendors. A major obstacle is the availability of large-scale, realistic models that can be distributed freely to the public.

Q. Who provided the models for the Pro/ENGINEER and SolidWorks benchmarks?
A.
The vendors themselves provided the models. Parametric Technology Corporation provided a complete and realistic model of a hypothetical photocopy machine for the Pro/ENGINEER benchmark. That model contains approximately 370,000 triangles. SolidWorks provided the models for the SolidWorks 98Plus benchmark, the largest of which is 276,000 polygons.

Q. What tests are included within the Pro/E benchmark?
A.
The benchmark comprises 17 tests. Startup and initialization time is measured, but given no weight (0.0) within the composite score for the benchmark.

There are 16 graphics tests, each of which measures a different rendering mode or feature. The first three graphics tests measure wireframe performance using the entire model. The next four measure different aspects of shaded performance, using the same model. Each of these tests executes exactly the same sequence of 3D transformations to provide a direct comparison of different rendering modes.

The next four tests use a subassembly, and compare the two FASTHLR modes, the default shading mode, and shaded with edges. These tests also execute a common sequence of 3D transformations. The last five graphics tests use two different instances of the model: the first three without its outer skins (to illustrate the effect of FASTHLR and level-of-detail operations), and the last two to illustrate complex lighting modes and surface curvature display.

The last test is an aggregate of all time not accounted for by the previous 16 tests, and is a mix of CPU and graphics operations.

Q. What scores are provided for the Pro/ENGINEER benchmark?
A.
Scores are generated for all 17 tests. Composite numbers are provided for each set of graphics tests (shaded, sub-assembly, wireframe and other) and there is an overall composite score for graphics and CPU operations. Start-up and initialization time is not included in the composite score.

Q. Why does the Pro/E benchmark take so long to run?
A.
There are two reasons. First, the measurement accuracy of the timing mechanism is plus or minus one second. If very short test segments were used, the measurement error could easily obscure differences in the benchmarked systems. Second, the performance of graphics workstations is increasing rapidly. A shorter test could result in a meaningless benchmark a year from now.
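The timer-resolution argument above can be made concrete with a little arithmetic. The sketch below is illustrative only (it is not part of the benchmark): it shows how a plus-or-minus one-second timing resolution bounds the worst-case relative error for test segments of different lengths, which is why short segments cannot reliably distinguish similar systems.

```python
# Illustrative only: worst-case relative error from a +/-1 s timer resolution.

def worst_case_error_pct(true_seconds: float, resolution: float = 1.0) -> float:
    """Worst-case relative measurement error, as a percentage,
    for a run whose true duration is true_seconds."""
    return 100.0 * resolution / true_seconds

# A 10-second test segment can be mis-measured by up to 10%,
# easily swamping a few-percent difference between two workstations.
short_err = worst_case_error_pct(10.0)

# A 200-second segment is off by at most 0.5%, small enough that
# real performance differences remain visible.
long_err = worst_case_error_pct(200.0)

print(short_err, long_err)
```

As workstations get faster the same tests complete sooner, so a benchmark sized generously today still keeps this error margin acceptable a year from now.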

Q. Why produce a Pro/ENGINEER benchmark when Bench98 from Pro/E: The Magazine is already well accepted?
A.
The benchmarks have different content and goals. SPEC/GPC's benchmark uses a large model to measure graphics performance across a broad range of functionality. In contrast, Bench98 is oriented more towards a typical user session, in which graphics interaction plays a lesser role. Results from SPEC/GPC's Pro/ENGINEER benchmark and Bench98 are not comparable.

Another major difference is benchmark availability and frequency of reporting results. SPEC/GPC's Pro/ENGINEER benchmark has been made available to the public immediately upon approval from SPEC/GPC. New results are currently published every other month, with more frequent updates planned in the future.

Q. What is required to run the Pro/ENGINEER benchmark?
A.
A fully licensed, released version of Pro/ENGINEER Rev. 20 is required. If a floating license is used, the workstation's network must be configured as documented in the Pro/ENGINEER installation guide. In addition, the workstation being benchmarked must have a 3D graphics display device that is recognized by Pro/ENGINEER.

Q. How is SPEC/GPC's SolidWorks 98Plus benchmark different from SPEC/GPC's SolidWorks 98 benchmark?
A.
The two benchmarks use the same models and perform the same basic tests, but their results should not be compared, since they are based on different application versions. Registry files for SPEC/GPC's SolidWorks 98Plus have been changed to correspond with the latest version of the application, part and assembly files have been updated to correspond with new file formats, and the rotation of parts has been revised in the graphics performance tests.

As with SolidWorks 98, five tests are included in the new benchmark. I/O-intensive operations, CPU operations, and three different types of graphics operations are timed based on common user interaction with the models. A single number is derived from a weighted geometric mean of the normalized score for all five tests. Scores are also reported for each of the five individual tests and for the geometric mean of the three graphics tests. Results are normalized to a reference machine (300-MHz Pentium II processor; PERMEDIA 2 graphics processor) chosen by the APC project group.
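A weighted geometric mean of normalized scores, as described above, can be sketched as follows. This is a hypothetical illustration, not the APC project group's actual implementation: the test labels, scores, and equal weights below are invented for the example.

```python
import math

def weighted_geomean(scores, weights):
    """Weighted geometric mean of normalized scores
    (computed in log space for numerical stability)."""
    assert len(scores) == len(weights) and all(s > 0 for s in scores)
    total = sum(weights)
    return math.exp(sum(w * math.log(s) for s, w in zip(scores, weights)) / total)

# Hypothetical normalized scores for the five tests
# (I/O, CPU, and three graphics tests), relative to the
# reference machine; values and weights are made up.
scores = [2.0, 1.5, 1.8, 2.2, 1.2]
weights = [1.0, 1.0, 1.0, 1.0, 1.0]

composite = weighted_geomean(scores, weights)
print(composite)
```

A geometric mean (rather than an arithmetic one) is the usual choice for compositing normalized ratios, because it gives the same result regardless of which machine is used as the reference.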

Q. I've heard that SolidWorks Solutions magazine is creating a SolidWorks 99 benchmark. Will the APC project group continue updating its SolidWorks benchmark, and if so, how will it differ from that of the magazine?
A.
SPEC/GPC will continue to update its SolidWorks benchmark as long as there is vendor support and user interest. There is no way of knowing at this time how the APC project group's benchmark will differ from that under development by SolidWorks Solutions.

What will almost surely differ are availability, distribution and frequency of reporting results. New versions of SPEC/GPC's SolidWorks benchmark will be available to the public immediately upon approval by SPEC/GPC. They will be available without charge via the SPEC/GPC Web site to any vendor, user or publication that wishes to use them. Reporting of vendor performance results on the SPEC/GPC site would likely be more frequent than in a print publication.

Q. Who performs testing for the results published on the GPC News Web site?
A.
Benchmark testing is performed by the vendors themselves, according to rules established by the APC project group. Vendors are responsible for the accuracy of the results they report. All results are reviewed and approved by APC project group members before publication on the Web site.

Q. Where is more information about membership and application benchmarking available?
A.
Information is available through this Web site or through the APC project group�s e-mail alias: gpcapc@spec.org.