
Quantum Physics

arXiv:2506.16938v2 (quant-ph)
[Submitted on 20 Jun 2025 (v1), last revised 2 Jul 2025 (this version, v2)]

Title: Enhancing Expressivity of Quantum Neural Networks Based on the SWAP test


Authors: Sebastian Nagies, Emiliano Tolotti, Davide Pastorello, Enrico Blanzieri
Abstract: Parameterized quantum circuits represent promising architectures for machine learning applications, yet many lack clear connections to classical models, potentially limiting their ability to translate the wide success of classical neural networks to the quantum realm. We examine a specific type of quantum neural network (QNN) built exclusively from SWAP test circuits, and discuss its mathematical equivalence, under amplitude encoding, to a classical two-layer feedforward network with quadratic activation functions. Our analysis across classical real-world and synthetic datasets reveals that while this architecture can successfully learn many practical tasks, it exhibits fundamental expressivity limitations because it falls outside the scope of the universal approximation theorem, failing in particular on harder problems such as the parity check function. To address this limitation, we introduce a circuit modification using generalized SWAP test circuits that effectively implements classical neural networks with product layers. This enhancement enables successful learning of parity check functions in arbitrary dimensions, which we argue analytically to be impossible for the original architecture beyond two dimensions regardless of network size. Our results establish a framework for enhancing QNN expressivity through classical task analysis and demonstrate that our SWAP test-based architecture offers broad representational capacity, suggesting promise for quantum learning tasks as well.
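The paper's code is not reproduced on this page; as a minimal numpy sketch (function names hypothetical), the identity behind the stated equivalence can be checked directly: for amplitude-encoded real unit vectors x and w, the SWAP test's probability of measuring the ancilla in |0⟩ equals (1 + (w·x)²)/2, i.e. a quadratic activation applied to the linear preactivation w·x of a classical neuron.

```python
import numpy as np

def swap_test_p0(x, w):
    """Ancilla-|0> probability of a SWAP test on amplitude-encoded
    unit vectors x and w. After H, controlled-SWAP, H, the |0> branch
    of the statevector is (x (x) w + w (x) x) / 2, so P(0) is its
    squared norm."""
    x = x / np.linalg.norm(x)
    w = w / np.linalg.norm(w)
    branch0 = (np.kron(x, w) + np.kron(w, x)) / 2
    return float(np.vdot(branch0, branch0).real)

rng = np.random.default_rng(0)
x = rng.normal(size=4)
w = rng.normal(size=4)
x /= np.linalg.norm(x)
w /= np.linalg.norm(w)

p0 = swap_test_p0(x, w)
# Classical two-layer view: a quadratic activation of the preactivation w.x
quadratic = (1 + np.dot(w, x) ** 2) / 2
assert np.isclose(p0, quadratic)
```

The shift and scaling in (1 + (w·x)²)/2 are fixed by the circuit, so up to that affine map each SWAP-test "neuron" computes exactly the quadratic activation (w·x)², which is what ties the architecture to the classical model discussed in the abstract.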
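The generalized SWAP test circuits themselves are not shown on this page; as a purely classical, hypothetical illustration of why a product layer matters for parity, note that the parity of ±1-encoded bits is exactly the product of linear units that each read one coordinate — a degree-n multilinear function, which a fixed sum of quadratic-activation units cannot represent for n > 2.

```python
import numpy as np
from itertools import product

def product_layer(x, W):
    """Classical analogue of a product layer: multiply the outputs of
    the linear units given by the rows of W applied to input x."""
    return float(np.prod(W @ x))

n = 4
W = np.eye(n)  # each linear unit reads a single input coordinate
for bits in product([-1.0, 1.0], repeat=n):
    x = np.array(bits)
    parity = float(np.prod(bits))  # parity in the +/-1 encoding
    assert product_layer(x, W) == parity
```

This is only the classical picture of the modification the abstract describes: the product layer realizes parity exactly in any dimension, whereas the original sum-of-quadratics network cannot beyond two dimensions.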
Comments: 15 pages, 7 figures, added code availability statement
Subjects: Quantum Physics (quant-ph); Emerging Technologies (cs.ET); Machine Learning (cs.LG)
Cite as: arXiv:2506.16938 [quant-ph]
  (or arXiv:2506.16938v2 [quant-ph] for this version)
  https://doi.org/10.48550/arXiv.2506.16938
arXiv-issued DOI via DataCite

Submission history

From: Sebastian Nagies
[v1] Fri, 20 Jun 2025 12:05:31 UTC (953 KB)
[v2] Wed, 2 Jul 2025 13:44:48 UTC (953 KB)