Machine Learning Applications of Parameterized Quantum Circuits

Publication Type: Thesis
Issue Date: 2023
With the development of near-term quantum devices, hybrid quantum-classical computing has been widely recognized as a promising framework for realizing near-term quantum advantage on important tasks, including chemistry, optimization, and machine learning. The performance of such frameworks relies heavily on the power of parameterized quantum circuits (PQCs). However, it remains challenging to design PQC architectures that exhibit quantum advantages on practical quantum machine learning tasks. In this thesis, we make progress in studying the power of PQCs in quantum classification and quantum natural language processing, and in exploring the limitations of PQCs in quantum data encoding. Specifically, we first propose variational shadow quantum learning for quantum classification, which uses local PQCs, inspired by classical shadows, to extract features of quantum data in a convolution-like fashion. We show that this method can avoid the notorious barren plateau issue and surpasses baselines in both accuracy and parameter count. Secondly, we propose a quantum self-attention neural network, where we introduce the self-attention mechanism into PQCs and employ a Gaussian projected quantum self-attention as a sensible quantum analogue of self-attention. We show that this approach outperforms 1) the best existing QNLP model based on syntactic analysis, and 2) a simple classical self-attention neural network on text classification tasks over public data sets. Lastly, we prove that, for PQC-based data encoding strategies, the average encoded state concentrates on the maximally mixed state at a rate exponential in the circuit depth. In conclusion, we propose two new quantum neural network (QNN) models for practical machine learning tasks, demonstrating the ability of QNNs to extract features and the potential of quantum machine learning in real-world applications.
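As a toy illustration of the PQC-based classification pipeline described above, consider a minimal single-qubit sketch (this is an illustrative assumption, not the variational shadow quantum learning model itself: the circuit layout, the Rx-encoding, and the sign-of-⟨Z⟩ readout are all hypothetical choices):

```python
import math

def bloch_after_circuit(x, theta):
    """Toy single-qubit PQC: start in |0>, encode the classical feature x
    with an Rx(x) rotation, apply a trainable Ry(theta), and read out the
    expectation value <Z>.  The state is tracked as a Bloch vector, so no
    complex matrices are needed."""
    rx, ry, rz = 0.0, 0.0, 1.0  # Bloch vector of |0><0|
    # Rx(x): rotate the Bloch vector about the x-axis by angle x
    ry, rz = (math.cos(x) * ry - math.sin(x) * rz,
              math.sin(x) * ry + math.cos(x) * rz)
    # Ry(theta): rotate about the y-axis by the trainable angle theta
    rz, rx = (math.cos(theta) * rz - math.sin(theta) * rx,
              math.sin(theta) * rz + math.cos(theta) * rx)
    return rz  # equals <Z> for this state

def classify(x, theta):
    """Binary label from the sign of the measured expectation <Z>."""
    return 1 if bloch_after_circuit(x, theta) >= 0 else -1
```

In a hybrid quantum-classical loop, a classical optimizer would adjust `theta` to minimize a loss over labeled data; the thesis's models replace this single rotation with structured multi-qubit PQCs.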
In addition, we reveal the concentration phenomenon in data encoding, which severely limits the performance of downstream quantum supervised learning tasks; this result may also guide practical data encoding design. All of these advances would benefit practical quantum machine learning.
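The concentration phenomenon can be illustrated with a minimal single-qubit sketch (the layer structure, the uniform angle ranges, and the Bloch-vector bookkeeping are illustrative assumptions, not the encoding strategies analyzed in the thesis): averaging each rotation over an independent random angle contracts the Bloch components orthogonal to its axis by a factor sinc(δ) < 1, so the average encoded state approaches the maximally mixed state exponentially fast in the number of layers.

```python
import math

def averaged_layer(r, delta):
    """One layer of Rz(theta) followed by Rx(phi), with theta and phi drawn
    independently and uniformly from [-delta, delta], averaged over the
    draws.  The averaged Rz contracts the x- and y-components of the Bloch
    vector by sinc(delta) = sin(delta)/delta; the averaged Rx likewise
    contracts the y- and z-components."""
    s = math.sin(delta) / delta
    rx, ry, rz = r
    rx, ry = s * rx, s * ry  # averaged Rz(theta)
    ry, rz = s * ry, s * rz  # averaged Rx(phi)
    return (rx, ry, rz)

def distance_to_maximally_mixed(depth, delta=0.5):
    """Trace distance between the depth-layer average encoded state and the
    maximally mixed state I/2, which equals half the Bloch vector norm."""
    r = (0.0, 0.0, 1.0)  # Bloch vector of the initial state |0><0|
    for _ in range(depth):
        r = averaged_layer(r, delta)
    return 0.5 * math.sqrt(sum(c * c for c in r))

for depth in (1, 5, 10, 20):
    print(depth, distance_to_maximally_mixed(depth))
```

Each added layer multiplies the remaining distance by the same factor below one, so the decay toward I/2 is geometric in depth, mirroring (in miniature) the exponential concentration proved for PQC-based encodings.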