Balancing Accuracy and Recall in Hebbian and Quantum-Inspired Learning Models
Abstract
Introduction
This study investigates integrating quantum-inspired learning models with traditional Hebbian learning within neural networks, comparing their performance in learning efficiency, generalization, stability, and robustness. Traditional Hebbian models are biologically plausible but often struggle with stability, scalability, and adaptability. In contrast, quantum-inspired models leverage principles of quantum mechanics, such as superposition and entanglement, with the potential to enhance neural network performance.
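To make the contrast concrete, the classic Hebbian rule strengthens a connection whenever pre- and post-synaptic neurons are active together. The following is a minimal R sketch of such an update with a weight-decay term, assuming simple binary activity; the function and variable names are illustrative and do not represent the study's implementation.

# Minimal sketch of a Hebbian weight update with decay (illustrative only;
# names and values are assumptions, not the model used in the study).
hebbian_update <- function(w, pre, post, learning_rate = 0.05, decay = 0.005) {
  # Strengthen weights where pre- and post-synaptic activity coincide,
  # and apply a small decay so weights do not grow without bound.
  w + learning_rate * outer(post, pre) - decay * w
}

set.seed(1)
w    <- matrix(0, nrow = 4, ncol = 4)   # toy 4-neuron weight matrix
pre  <- rbinom(4, 1, 0.5)               # presynaptic activity (0/1)
post <- rbinom(4, 1, 0.5)               # postsynaptic activity (0/1)
w    <- hebbian_update(w, pre, post)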
Methods
The simulations used a neural network of 1,000 neurons trained on 100 patterns across 10 instances. Key parameters included a fixed decay rate of 0.005, 80% excitatory neurons, and 10% fixed connectivity. Learning rates (0.01, 0.05, 0.1) and thresholds (0.3, 0.5, 0.7) were varied to assess different parameter settings. Performance was evaluated using accuracy, precision, recall, and F1-Score.
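The following R sketch illustrates how such a parameter sweep could be organized; the variable names and grid construction are assumptions for illustration and do not reproduce the supplementary code.

# Hypothetical setup of the fixed parameters and the varied parameter grid.
n_neurons    <- 1000    # network size
n_patterns   <- 100     # patterns presented per instance
n_instances  <- 10      # simulation instances
decay_rate   <- 0.005   # fixed synaptic decay
p_excitatory <- 0.80    # proportion of excitatory neurons
connectivity <- 0.10    # fixed connection probability

# Every combination of learning rate and threshold is evaluated.
param_grid <- expand.grid(
  learning_rate = c(0.01, 0.05, 0.1),
  threshold     = c(0.3, 0.5, 0.7)
)

# Each grid row would be scored on accuracy, precision, recall, and F1-Score.
print(param_grid)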
Results
The results showed that quantum-inspired models achieved significantly higher accuracy and precision, enhancing their reliability in class prediction and reducing false positives. Conversely, Hebbian models excelled in recall and F1-Score, effectively identifying positive cases and balancing precision and recall. Additionally, quantum-inspired models demonstrated greater stability, robustness, and consistent performance across varying parameters.
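For reference, the four reported metrics can be computed from a binary confusion matrix as in the short R sketch below; the counts shown are hypothetical and are not simulation output.

# Hypothetical confusion-matrix counts for a binary classification task.
tp <- 40; fp <- 5; fn <- 15; tn <- 40

accuracy  <- (tp + tn) / (tp + tn + fp + fn)
precision <- tp / (tp + fp)   # few false positives gives high precision
recall    <- tp / (tp + fn)   # few false negatives gives high recall
f1        <- 2 * precision * recall / (precision + recall)

# A model can show high precision yet lower recall (or vice versa),
# which is the trade-off observed between the two learning approaches.
round(c(accuracy = accuracy, precision = precision, recall = recall, f1 = f1), 3)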
Conclusion
Quantum-inspired models offer notable improvements in learning efficiency, generalization, stability, and robustness, while Hebbian models perform better in recall and F1-Score. These findings suggest the potential for hybrid models that combine the strengths of both approaches, aiming for more balanced and efficient learning systems. Future research should explore these hybrid models to enhance performance across diverse artificial intelligence applications. Supplementary materials include the complete R code used, enabling replication and further investigation of the results.