The Current State of Quantum Machine Learning

IEEE Computer Society Team
Published 04/13/2024

Machine learning (ML) on classical computers has long been used by industry and researchers to tackle a wide variety of challenges. Quantum machine learning (QML), by contrast, is relatively new and remains underutilized in both industry and research.

Given the recent advances in quantum computing, researchers attending the 2023 IEEE International Conference on Quantum Computing and Engineering (QCE) investigated how to expand the number of use cases for QML in the workshop ‘Quantum Machine Learning: From Foundations to Applications.’

Researchers Volker Tresp, Steffen Udluft, Daniel Hein, and Werner Hauptmann of Siemens AG, Technology in Munich, Germany; Martin Leib of IQM in Munich; Christopher Mutschler and Daniel D. Scherer of Fraunhofer IIS in Nuremberg; and Wolfgang Mauerer of the Technical University of Applied Sciences Regensburg and Siemens AG, Technology in Regensburg/Munich outlined the purpose and outcomes of the workshop.

 

The Goals of the Workshop


The workshop brought together researchers and industry practitioners from AI, ML, software and systems engineering, physics, and other disciplines to discuss the challenges QML currently faces and how the field can be advanced into an effective tool with diverse applications.

 

The Current State of Quantum Machine Learning

Currently, QML shows the potential to learn from fewer data points than classical ML; quantum support vector machines are one example of this promise.
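To make the idea of a quantum kernel concrete, here is a minimal sketch of a quantum-kernel support vector machine, assuming PennyLane and scikit-learn are available; the angle-embedding fidelity kernel and the toy data are illustrative choices, not the specific method of any workshop contribution.

```python
import numpy as np
import pennylane as qml
from sklearn.svm import SVC

n_qubits = 2  # one qubit per input feature in this toy example
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def kernel_circuit(x1, x2):
    # Encode x1, then apply the inverse encoding of x2; the probability of
    # measuring the all-zeros state is the fidelity-style kernel value.
    qml.AngleEmbedding(x1, wires=range(n_qubits))
    qml.adjoint(qml.AngleEmbedding)(x2, wires=range(n_qubits))
    return qml.probs(wires=range(n_qubits))

def quantum_kernel(A, B):
    # Gram matrix between two sets of samples, in the form scikit-learn expects.
    return np.array([[kernel_circuit(a, b)[0] for b in B] for a in A])

rng = np.random.default_rng(0)
X = rng.uniform(0, np.pi, size=(20, n_qubits))   # hypothetical toy dataset
y = np.where(X.sum(axis=1) > np.pi, 1, -1)

clf = SVC(kernel=quantum_kernel).fit(X, y)
print(clf.predict(X[:5]))
```

Because the kernel is evaluated on a simulator here, the example says nothing about speedups; it only shows how a quantum kernel slots into an otherwise classical SVM pipeline.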

At the same time, subtle issues impose limitations. Many of the challenges stem from the fact that quantum computers are sensitive to noise, and because qubits can take on a continuum of states rather than just 0 or 1, errors are hard to detect and correct.
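For intuition on why such errors are continuous rather than discrete (standard textbook material, not a result from the workshop), a single qubit's pure state can be written as

|ψ⟩ = cos(θ/2)|0⟩ + e^{iφ} sin(θ/2)|1⟩,  with θ ∈ [0, π] and φ ∈ [0, 2π).

Because θ and φ vary continuously, noise can nudge a state by an arbitrarily small amount, so quantum error correction must cope with a continuum of possible errors rather than simple bit flips.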

To best address QML’s challenges, the researchers felt it was more productive to raise the topic within the quantum computing community than in the already crowded ML community. This made IEEE’s International Conference on Quantum Computing and Engineering (QCE) an ideal venue for the workshop.

 

A Call for Contributions


To bring more ideas to the table and attract a diverse, international array of voices, the team invited participants to submit four kinds of papers:

  1. Research papers
  2. System papers
  3. Experiments and analysis papers
  4. Application papers

The desired outcomes included:

  • Publishing, presenting, and discussing research
  • Allowing participants to form interest groups
  • Enhancing understanding of how to compare QML solutions holistically
  • Identifying research and application opportunities for QML
  • Building relationships between industry and academia around the potential of QML
  • Sharing ideas between the machine learning and quantum computing communities

The target audience for the workshop was broad: anyone interested in the challenges and opportunities of QML, whether they come from computer science, physics, or engineering.

 

Eight Papers Accepted


The team used a triple-anonymous peer review process that evaluated submissions according to their relevance, novelty, technical soundness, appropriateness, depth of literature coverage, and presentation.

The process resulted in eight papers being accepted. These addressed three primary topics:

  1. Applications of QML, such as sentiment analysis in the finance sector
  2. Quantum reinforcement learning, including improving sampling efficiency, optimizing circuit compilation, and distributing training across multiple quantum processing units (QPUs); a sketch of one such setup follows this list
  3. General quantum machine learning, including computing the principal components of a matrix, Betti numbers, and persistent Betti numbers
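To give a flavor of the quantum reinforcement learning setting referenced above, here is a minimal sketch of one common approach, a variational-circuit policy built with PennyLane; the circuit layout, hyperparameters, and toy observation are assumptions for illustration and do not reproduce any accepted paper.

```python
import numpy as np
import pennylane as qml

n_qubits = 2  # one qubit per observation feature in this toy setup
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def policy_circuit(obs, weights):
    # Encode the (suitably scaled) observation as single-qubit rotation angles.
    qml.AngleEmbedding(obs, wires=range(n_qubits))
    # Trainable entangling layers act as the policy's parameters.
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    # Measurement probabilities of qubit 0 define a two-action stochastic policy.
    return qml.probs(wires=0)

shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
rng = np.random.default_rng(1)
weights = rng.uniform(0, 2 * np.pi, size=shape)

obs = np.array([0.3, 1.2])              # hypothetical environment observation
action_probs = policy_circuit(obs, weights)
action = rng.choice(2, p=action_probs)  # sample an action from the policy
print(action_probs, action)
```

In a full agent, the weights would be trained with a policy-gradient or value-based method; the accepted papers in this category examine how such circuits affect sampling efficiency, how their compilation can be optimized, and how training can be distributed across multiple QPUs.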

 

Results: An Accurate Picture of the Current State of QML


Taken as a whole, the papers paint an accurate picture of the QML field. Demonstrating unconditional speedups in a QML setting remains very difficult, both because QML itself is not yet fully understood and because classical ML systems already draw on a wide range of highly developed approaches.

The small scale of existing quantum machines, together with their noise and imperfections, makes it difficult to extrapolate how QML will perform in larger-scale applications.

These difficulties are exacerbated by several factors. Classical ML research rests on abstract computational models and does not depend on particular physical hardware, while QML must be run and tested on physical devices because reliable error correction techniques are not yet available. In addition, the relatively small size of the QML community limits the number of people available to address these challenges, which slows the pace of progress.

At the same time, the workshop was a significant step toward overcoming these and other challenges. It surfaced papers that broadened the understanding of QML’s potential and united stakeholders from a variety of backgrounds around the advancement of QML. For a deeper dive into the workshop and the contributions of participants and authors, download the full paper.

 

Download Report

"*" indicates required fields

Name*
Are you interested in learning more about any of the following?
IEEE Privacy Policy*




Registration is now open for the IEEE International Conference on Quantum Computing and Engineering (QCE)


Join us 15-20 September at the Palais des Congrès in Montreal for a transformative week of quantum computing. Meet and learn from leading researchers, innovators, and peers in this dynamic field. Don’t just watch the future unfold: be a part of it. REGISTER NOW and mark your calendars for the IEEE International Conference on Quantum Computing and Engineering (QCE).