The SPECworkstation 3.1 Benchmark — A User's Story

By Chandra Sakthivel, SPECwpc Chair



Nikki is a procurement manager at a large digital media and marketing company that offers vertical solutions for a range of industries, including media and entertainment, life sciences and ecommerce. She was tasked with purchasing over 300 new workstations to satisfy the needs of power users across the company – while staying within a finite budget.

The need for new workstations arose as employees in software development, graphics modeling and entertainment production increasingly complained that the performance of their existing systems was significantly impacting their productivity, leading to slower release cycles, missed opportunities and frustrated co-workers and customers. The existing workstations typically had older-generation CPUs, entry-level GPUs, 8GB of DDR3 memory and 512GB of disk space. Nikki spent several days online reviewing the latest offerings from various vendors and reading reviews on several technology media sites to try to determine hardware configurations that would optimally balance performance and cost.

While her online research was helpful, Nikki was frustrated by the subjective nature of most of the reviews: systems "felt powerful" and "seemed fast" — and no review could give her any insight into how the new systems would compare to the company's existing workstations.

Most useful of all, the few articles that directly compared the performance of different vendors' solutions consistently referred to the SPECworkstation 3.1 benchmark. In the past, Nikki had resisted the temptation to rely on performance benchmarks because most of them were developed by vendors who used them to tout the performance of their own systems, so they couldn't be used to compare systems from different vendors. Nikki was excited when she found that SPEC benchmarks are developed with the participation of multiple vendors and that SPEC's mission is to produce fair, vendor-agnostic comparisons of computing performance.

Nikki visited the SPEC website to see who participated in the organization and read several articles confirming SPEC's reputation as a highly regarded source of reliable information. She also scanned the dozens of benchmarks offered by the organization to confirm that the SPECworkstation 3.1 benchmark best met her needs. The benchmark measures all key aspects of workstation performance based on diverse professional applications. It includes more than 30 workloads containing nearly 140 tests that exercise CPU, graphics, I/O and memory bandwidth. Version 3.1 is also relatively new, reflecting the most recent hardware options, including how workloads partition and distribute threads across multi-core CPUs. It also supports the latest generation of GPUs, as well as vectorization, which uses SIMD instructions on modern CPUs to process multiple data elements at once.
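To make that last point concrete, here is a generic C sketch of the kind of loop vectorization accelerates; it is purely illustrative, not code from the SPECworkstation benchmark itself:

    #include <stddef.h>

    /* A scalar loop over float arrays. With optimization enabled
     * (e.g., gcc -O3), the compiler can auto-vectorize this into
     * SIMD instructions such as SSE or AVX, multiplying several
     * array elements per instruction instead of one at a time. */
    void scale(float *dst, const float *src, float factor, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            dst[i] = src[i] * factor;
    }

Workloads built from this kind of data-parallel code run markedly faster on CPUs with wide SIMD units, which is why a benchmark that exercises vectorization matters when comparing modern hardware.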

The comprehensiveness of the SPECworkstation 3.1 benchmark was particularly important to Nikki because her company worked in multiple segments. The Media and Entertainment group relied on the Blender 3D content-creation suite and the HandBrake video transcoder, the Life Sciences group ran simulations using workloads similar to the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) and Nanoscale Molecular Dynamics (NAMD), and the Ecommerce group relied on several general-purpose use cases covered by the benchmark.

On the SPECworkstation 3.1 benchmark webpage, Nikki also found many published results from vendors who had measured the performance of their latest systems using the benchmark. This was extremely helpful, and Nikki used these results to develop a request for proposal (RFP) to send to three different vendors. The RFP included a request for demo systems at two different price points, one with mobile workstation specifications, the other with desktop-category hardware.

Once the RFP was issued, Nikki downloaded the benchmark, which would enable her to run it on both the test systems she would receive and some of the company's existing systems to confirm the promised performance of the new systems and the degree of improvement users would experience. Since her company was not a computer-related service provider, she qualified for a free license.

Once she received the test systems, installing the software on each was fast and painless, requiring no configuration. After running the SPECworkstation 3.1 benchmark on an array of test and existing systems, Nikki confirmed that the performance of the test systems at both price points was in line with the published results, and that the performance improvements over the company's existing systems were dramatic across the board. This enabled Nikki to confidently select the system that provided the maximum performance gain while staying within her budget.
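Her final decision reduces to simple arithmetic, sketched below in C with entirely hypothetical names, scores, prices and budget (none taken from real vendor results): among the candidates whose price fits the budget, keep the one with the highest benchmark score, then report its gain over the existing systems' baseline score.

    #include <stdio.h>

    /* Hypothetical systems with made-up composite scores (higher is
     * better, relative to a reference machine) and made-up prices. */
    struct candidate { const char *name; double score; double price; };

    int main(void)
    {
        const double baseline = 1.2;    /* assumed score of an existing workstation */
        const double budget   = 2500.0; /* assumed per-seat budget */
        const struct candidate systems[] = {
            { "Vendor A desktop", 3.4, 2400.0 },
            { "Vendor B desktop", 3.9, 2800.0 }, /* fastest, but over budget */
            { "Vendor C mobile",  2.6, 1900.0 },
        };
        const struct candidate *best = NULL;

        /* Keep the highest-scoring system whose price fits the budget. */
        for (size_t i = 0; i < sizeof systems / sizeof systems[0]; i++)
            if (systems[i].price <= budget && (!best || systems[i].score > best->score))
                best = &systems[i];

        if (best)
            printf("%s: %.1fx the baseline score\n", best->name, best->score / baseline);
        return 0;
    }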
