Zing Forum

Reading

Quantum Neural Networks for Binary Classification: In-depth Evaluation of Feedforward and Backpropagation Architectures

A study systematically evaluating the performance of quantum neural networks (QNNs) in binary classification tasks, comparing two architectures—Quantum Feedforward Neural Networks (QFNN) and Quantum Backpropagation Neural Networks (QBPNN)—with experimental validation on six classic datasets using the PennyLane framework.

quantum neural networks, binary classification, PennyLane, quantum machine learning, QFNN, QBPNN
Published 2026-05-09 13:22 · Recent activity 2026-05-09 13:29 · Estimated read 6 min

Section 01

In-depth Evaluation of Quantum Neural Network Architectures for Binary Classification: A Comparative Study of QFNN and QBPNN

A study systematically evaluating the performance of quantum neural networks (QNNs) in binary classification tasks, comparing two architectures—Quantum Feedforward Neural Networks (QFNN) and Quantum Backpropagation Neural Networks (QBPNN). Experimental validation was conducted on six classic datasets using the PennyLane framework, revealing the impact of different design choices on performance and providing empirical references for the practical application of quantum machine learning.


Section 02

Research Background and Motivation

The intersection of quantum computing and machine learning is developing rapidly. As an important branch, quantum neural networks (QNNs) are expected to outperform classical algorithms, but systematic comparisons of different architectures on real-world classification tasks are lacking. To address this gap, the Nepalese research team of Sahaj Raj Malla and Sudan Jha focused on binary classification, designed and implemented two architectures—QFNN and QBPNN—and conducted comprehensive evaluations on six representative datasets.


Section 03

Quantum Neural Network Architecture Design

Quantum Feedforward Neural Network (QFNN)

The QFNN is designed with three interference layers; information propagates unidirectionally through quantum gates. Each layer contains optimizable parameterized quantum gates, implemented with the PennyLane framework (which supports automatic differentiation).

Quantum Backpropagation Neural Network (QBPNN)

The QBPNN uses a six-layer structure with residual connections (to mitigate vanishing gradients) and allows bidirectional information flow. Drawing on experience from classical deep learning, it can in theory capture more complex feature relationships.


Section 04

Experimental Design and Dataset Selection

Six binary classification datasets with different characteristics were selected for testing:

  • Linear Blobs (linear separable benchmark)
  • XOR (classic nonlinear problem)
  • Circles (concentric circle distribution)
  • Moons (crescent-shaped nonlinear boundary)
  • Gaussian Quantiles (Gaussian quantile data)
  • Iris (2D) (dimensionality-reduced version of the Iris dataset)

These datasets span scenarios from simple linear to complex nonlinear, allowing a comprehensive evaluation of generalization ability.
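Five of the six datasets map directly onto scikit-learn generators; XOR and the Iris reduction need a few extra lines. The sketch below is one plausible way to build them (the XOR generator and the PCA-based Iris reduction are my assumptions; the paper may use different sample counts, noise levels, or a different reduction):

```python
import numpy as np
from sklearn.datasets import (load_iris, make_blobs, make_circles,
                              make_gaussian_quantiles, make_moons)
from sklearn.decomposition import PCA

def make_xor(n_samples=200, seed=0):
    # Hypothetical XOR generator: label 1 when the coordinates differ in sign
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1.0, 1.0, size=(n_samples, 2))
    y = (X[:, 0] * X[:, 1] < 0).astype(int)
    return X, y

iris = load_iris()
datasets = {
    "blobs": make_blobs(n_samples=200, centers=2, random_state=0),
    "xor": make_xor(),
    "circles": make_circles(n_samples=200, noise=0.05, factor=0.5, random_state=0),
    "moons": make_moons(n_samples=200, noise=0.1, random_state=0),
    "gauss_quantiles": make_gaussian_quantiles(n_samples=200, n_classes=2, random_state=0),
    # Binary Iris: keep the first two classes, reduce 4-D features to 2-D
    "iris_2d": (PCA(n_components=2).fit_transform(iris.data[:100]), iris.target[:100]),
}

for name, (X, y) in datasets.items():
    print(name, X.shape, np.unique(y))
```

Every dataset ends up as 2-D features with {0, 1} labels, matching the 2-qubit encoding used in the experiments.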

Section 05

Experimental Configuration and Evaluation Metrics

The experiments used classical simulation with 2 qubits, the Adam optimizer for training (batch size 32), and 5-fold cross-validation to ensure reliability. Three measurement configurations were compared:

  1. Phase and Measure (only phase gates and measurement)
  2. Interference and Measure (measurement after introducing interference)
  3. All (complete set of quantum gates)

Evaluation metrics include accuracy, precision, recall, and F1 score, measuring performance across multiple dimensions.

Section 06

Research Findings and Significance

The study provides empirical data on how quantum neural networks perform in practice and reveals the impact of different design choices: residual connections are effective in deep quantum networks, and the measurement strategies differ markedly in performance. More importantly, it demonstrates a key step from theory toward practical quantum machine learning, showing that current QNNs achieve acceptable performance on specific tasks and laying a foundation for applying them to more complex problems.


Section 07

Reproduction Guide and Extension Directions

Reproduction steps: after installing dependencies, run `python src/train.py`. Training supports an early stopping mechanism that monitors validation loss to prevent overfitting. Extension directions: explore additional quantum circuit designs, try different optimization strategies, and extend to multi-class tasks. The project has a clear code structure (core training logic in src/train.py, plus directories such as logs/, results/, and models/), complete documentation, and is open-source under the MIT license, making it suitable for beginners in quantum machine learning.
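The early-stopping mechanism mentioned above follows a standard pattern: track the best validation loss seen so far and stop once it has not improved for a set number of epochs. A minimal sketch (the `patience` and `min_delta` names are generic conventions, not taken from the project's code):

```python
class EarlyStopping:
    """Stop training when validation loss hasn't improved for `patience` epochs."""

    def __init__(self, patience=10, min_delta=0.0):
        self.patience = patience      # epochs to wait without improvement
        self.min_delta = min_delta    # minimum decrease that counts as progress
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        # Returns True when training should stop
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience

stopper = EarlyStopping(patience=3)
for epoch, loss in enumerate([1.0, 0.8, 0.81, 0.82, 0.83]):
    if stopper.step(loss):
        print(f"stopped at epoch {epoch}")  # -> stopped at epoch 4
        break
```

Because the validation loss stops improving after epoch 1, the monitor exhausts its patience three epochs later and halts training.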