
Computer Science > Cryptography and Security

arXiv:2311.00579v1 (cs)
[Submitted on 1 Nov 2023 (this version), latest version 6 May 2025 (v2)]

Title: Revealing CNN Architectures via Side-Channel Analysis in Dataflow-based Inference Accelerators


Authors: Hansika Weerasena, Prabhat Mishra
Abstract: Convolutional Neural Networks (CNNs) are widely used in various domains. Recent advances in dataflow-based CNN accelerators have enabled CNN inference on resource-constrained edge devices. These dataflow accelerators exploit the inherent data reuse of convolution layers to process CNN models efficiently. Concealing the architecture of CNN models is critical for privacy and security. This paper evaluates memory-based side-channel information to recover CNN architectures from dataflow-based CNN inference accelerators. The proposed attack exploits the spatial and temporal data reuse of the dataflow mapping on CNN accelerators, together with architectural hints, to recover the structure of CNN models. Experimental results demonstrate that our proposed side-channel attack can recover the structures of popular CNN models, namely Lenet, Alexnet, and VGGnet16.
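The attack described in the abstract hinges on a simple relation: under a fixed dataflow mapping, a convolution layer's hyperparameters determine its memory-traffic volumes, so observed traffic can be inverted to recover those hyperparameters. The following is a minimal sketch of that inversion step, assuming a hypothetical weight-stationary dataflow in which weights and output feature maps each cross the memory bus exactly once; the function and parameter names are illustrative, not taken from the paper:

```python
# Illustrative sketch (not the paper's actual attack): given observed
# off-chip traffic for a conv layer's weights and output feature map,
# enumerate the (filters K, kernel R, stride S) configurations that
# are arithmetically consistent with that traffic.

def candidate_configs(weight_bytes, ofmap_bytes, in_h, in_c, elem=2):
    """Return (K, R, S) tuples consistent with the observed volumes.

    Assumes square inputs/kernels, no padding, and that weights
    (K*R*R*in_c elements) and the output feature map (K*E*E elements,
    E = (in_h - R) // S + 1) are each transferred once.
    """
    weights = weight_bytes // elem   # total weight elements
    ofmap = ofmap_bytes // elem      # total output elements
    matches = []
    for r in range(1, 12):           # candidate kernel sizes
        for s in range(1, 5):        # candidate strides
            e = (in_h - r) // s + 1  # output spatial dimension
            if e <= 0:
                continue
            if weights % (r * r * in_c):
                continue             # K would not be an integer
            k = weights // (r * r * in_c)
            if k * e * e == ofmap:
                matches.append((k, r, s))
    return matches

# LeNet-like first layer: 6 filters of 5x5 on a 32x32x1 input,
# 16-bit elements -> 300 weight bytes, 9408 output bytes.
print(candidate_configs(300, 9408, 32, 1))  # -> [(6, 5, 1)]
```

In this toy setting the traffic volumes pin down the layer uniquely; the paper's contribution is making such inference work on real dataflow accelerators, where spatial and temporal reuse make the traffic-to-hyperparameter relation far less direct.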
Subjects: Cryptography and Security (cs.CR); Hardware Architecture (cs.AR); Machine Learning (cs.LG)
Cite as: arXiv:2311.00579 [cs.CR]
  (or arXiv:2311.00579v1 [cs.CR] for this version)
  https://doi.org/10.48550/arXiv.2311.00579
arXiv-issued DOI via DataCite

Submission history

From: Hansika Weerasena [view email]
[v1] Wed, 1 Nov 2023 15:23:04 UTC (17,928 KB)
[v2] Tue, 6 May 2025 10:47:10 UTC (2,242 KB)

