Demystifying Quantum Benchmarks

IEEE Computer Society Team
Published 02/26/2024

Quantum computing, with its potential to revolutionize fields from medicine to finance, is no longer science fiction. But like any complex system, accurately measuring its performance remains a challenge. Enter quantum benchmarks, the tools for gauging the true power of these machines.

IBM’s paper, “Defining Best Practices for Quantum Benchmarks,” challenges the quantum community to adopt consistent benchmarking approaches to evaluate and compare quantum devices. This work is built around a simple but vital question: can developing standardized scientific benchmarking guidelines increase clarity and objectivity in gauging quantum achievements?

To address this question, a team of IBM researchers — Mirko Amico, Helena Zhang, Petar Jurcevic, Lev S. Bishop, Paul Nation, Andrew Wack, and David C. McKay — outlined criteria that quantum benchmarks should follow and encouraged their widespread adoption so that all stakeholders can accurately track progress in the rapidly evolving quantum sphere.


The Benchmarking Conundrum


Unlike their classical counterparts, quantum computers lack a universal yardstick for performance. This ambiguity makes comparing different devices, let alone tracking progress over time, a daunting task. The IBM Quantum team in Yorktown Heights, New York, emphasized the need for standardized benchmarks with four key characteristics:

  • Randomized: eliminating biases and ensuring statistically significant results.
  • Well-defined: specified with clear implementation procedures, leaving no room for ambiguity.
  • Holistic: encompassing multiple aspects of device performance, not just specific strengths.
  • Device-independent: applicable across different hardware technologies, fostering inclusivity in the field.

The IBM Quantum team examined Quantum Volume (QV) as an example benchmark, exploring the nuances of using different success metrics to evaluate its performance. Choosing the right benchmark depends on the specific task and desired insights.
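To make the QV success criterion concrete, here is a minimal sketch of the heavy-output test at its core: outputs whose ideal probability exceeds the median are "heavy," and the protocol asks whether heavy outputs are observed more than two-thirds of the time. The distribution and shot counts below are hypothetical, purely for illustration:

```python
import numpy as np

def heavy_output_fraction(ideal_probs, measured_counts):
    """Fraction of measured shots landing on 'heavy' outputs:
    bitstrings whose ideal probability exceeds the median."""
    median = np.median(ideal_probs)
    heavy = set(np.flatnonzero(ideal_probs > median))
    shots = sum(measured_counts.values())
    hits = sum(c for bitstring, c in measured_counts.items() if bitstring in heavy)
    return hits / shots

# Hypothetical 2-qubit ideal distribution and measured shot counts.
ideal = np.array([0.40, 0.30, 0.20, 0.10])  # ideal probs for outputs 0..3
counts = {0: 500, 1: 300, 2: 150, 3: 50}
hof = heavy_output_fraction(ideal, counts)   # 0.8 here
# The QV protocol passes a circuit size when hof reliably exceeds 2/3.
```

Here the heavy outputs are 0 and 1 (ideal probability above the 0.25 median), and 800 of 1,000 shots land on them, comfortably above the 2/3 threshold.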


Beyond the Benchmark: The Power of Diagnostics


Not all tools are created equal, and the same applies to benchmarks and diagnostics. While benchmarks provide a holistic assessment of a device’s average performance, diagnostics pinpoint specific error sources or hardware components. The IBM Quantum team highlighted the role of application-oriented circuit libraries — collections of algorithms with diverse quantum circuit structures — in uncovering hardware quirks. However, the team cautioned against relying solely on a limited set of applications, as this can paint an incomplete picture.


A Reflection of True Potential


The researchers also examined a powerful technique called mirror circuits. A mirror circuit appends to a circuit its layer-by-layer inverse, so the ideal outcome is known in advance and success can be verified even at scales where classically simulating the circuit is infeasible. Mirror circuits can also expose subtle errors that might go unnoticed in traditional benchmarks.
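The mirror idea can be sketched numerically: apply a sequence of random gates, then their inverses in reverse order. Noiselessly, the state returns to its starting point, so the "survival" probability is 1 by construction, and any shortfall measured on hardware quantifies accumulated error. This single-qubit NumPy sketch is illustrative only, not the paper's full protocol:

```python
import numpy as np

rng = np.random.default_rng(7)

def random_unitary_2x2():
    """A random 2x2 unitary (single-qubit gate) via QR decomposition."""
    m = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(m)
    return q * (np.diag(r) / np.abs(np.diag(r)))  # normalize column phases

layers = [random_unitary_2x2() for _ in range(5)]

state = np.array([1.0, 0.0], dtype=complex)  # start in |0>
for u in layers:                              # forward half
    state = u @ state
for u in reversed(layers):                    # mirror half: U-dagger undoes U
    state = u.conj().T @ state

# Noiselessly the mirrored circuit returns to |0>, so survival == 1;
# on real hardware, 1 - survival measures accumulated error.
survival = abs(state[0]) ** 2
```

Because the ideal result is always "return to the initial state," no exponentially costly classical simulation is needed to score the benchmark, which is what makes the construction scale.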


Balancing Scale, Quality, and Speed


The IBM Quantum researchers also explored the art of fine-tuning benchmarks, balancing three key aspects:

  • Scale: the number of qubits and gates involved
  • Quality: the accuracy and reliability of the results
  • Speed: how quickly the benchmarks can be executed

There are inherent trade-offs between these factors. More complex benchmarks might offer deeper insights but take longer to run. Striking the right balance is key, and transparency is crucial. Disclosing the optimization techniques used ensures fair comparisons and fosters trust within the research community.


Showcasing the Impact


Using a suite of applications and mirror circuits, the authors illustrate the dramatic effects of error suppression and mitigation techniques on reported values. The results show that even basic techniques can significantly improve both application and mirror benchmarking circuits. The authors also examine more sophisticated error-mitigation techniques, revealing their potential to further enhance the quality of results.
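One widely used example of a more sophisticated mitigation technique is zero-noise extrapolation: the same circuit is run at deliberately amplified noise levels, and the observable is extrapolated back to the zero-noise limit. The sketch below is illustrative only; the noise scales, measured values, and linear fit are hypothetical assumptions, not results from the paper:

```python
import numpy as np

# Zero-noise extrapolation (ZNE): measure an observable at amplified
# noise levels, then extrapolate the trend back to zero noise.
noise_scales = np.array([1.0, 2.0, 3.0])   # amplification factors (hypothetical)
measured = np.array([0.81, 0.66, 0.53])    # hypothetical expectation values

# Linear least-squares fit in the noise scale, evaluated at scale 0.
coeffs = np.polyfit(noise_scales, measured, deg=1)
zero_noise_estimate = np.polyval(coeffs, 0.0)
# The extrapolated value exceeds every raw measurement, illustrating
# how mitigation can recover signal lost to noise.
```

The extrapolated estimate is higher than any raw measurement, which is precisely the kind of improvement in reported benchmark values that the authors argue must be disclosed transparently.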

By advocating for standardized, well-defined benchmarks and emphasizing the importance of transparency and optimization techniques, “Defining Best Practices for Quantum Benchmarks” equips researchers with valuable tools to navigate quantum benchmarking and accurately assess and advance the future of quantum computing. For a closer look at the research findings, download the full paper.
