Statistics > Machine Learning

arXiv:2502.21009 (stat)
[Submitted on 28 Feb 2025 (v1), last revised 26 May 2025 (this version, v2)]

Title: Position: Solve Layerwise Linear Models First to Understand Neural Dynamical Phenomena (Neural Collapse, Emergence, Lazy/Rich Regime, and Grokking)


Authors: Yoonsoo Nam, Seok Hyeong Lee, Clementine C J Domine, Yeachan Park, Charles London, Wonyl Choi, Niclas Goring, Seungjai Lee
Abstract: In physics, complex systems are often simplified into minimal, solvable models that retain only the core principles. In machine learning, layerwise linear models (e.g., linear neural networks) act as simplified representations of neural network dynamics. These models follow the dynamical feedback principle, which describes how layers mutually govern and amplify each other's evolution. This principle extends beyond the simplified models, successfully explaining a wide range of dynamical phenomena in deep neural networks, including neural collapse, emergence, lazy and rich regimes, and grokking. In this position paper, we call for the use of layerwise linear models retaining the core principles of neural dynamical phenomena to accelerate the science of deep learning.
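
A minimal sketch of the dynamical feedback principle described in the abstract (illustrative only, not the authors' code; the dimensions, initialization scale, and learning rate are assumptions): for a two-layer linear network with loss L = 0.5 * ||W2 W1 - A||_F^2, each layer's gradient is gated by the other layer's weights, so the layers mutually govern and amplify each other's growth.

```python
import numpy as np

# Sketch (not the paper's code): two-layer linear network f(x) = W2 @ W1 @ x
# fit to a fixed target map A under whitened inputs, so the loss is
# L = 0.5 * ||W2 @ W1 - A||_F^2.
rng = np.random.default_rng(0)
d, h = 8, 8                        # input/output width and hidden width (assumed)
A = rng.standard_normal((d, d))    # target linear map ("teacher")
scale = 1e-3                       # small init: slow start, then rapid growth
W1 = scale * rng.standard_normal((h, d))
W2 = scale * rng.standard_normal((d, h))
lr, steps = 0.02, 8000

for t in range(steps):
    E = W2 @ W1 - A      # residual error
    g1 = W2.T @ E        # dL/dW1 is gated by W2 ...
    g2 = E @ W1.T        # ... and dL/dW2 is gated by W1: mutual feedback
    W1 -= lr * g1
    W2 -= lr * g2
    if t % 1000 == 0:
        print(f"step {t:5d}  loss {0.5 * np.sum(E * E):.6f}")
```

Starting from small weights, the loss stays nearly flat (each layer's gradient is proportional to the other, still-tiny layer), then drops sharply once the layers co-amplify; this plateau-then-jump curve is the kind of dynamics the abstract connects to emergence and grokking.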
Comments: Accepted to ICML 2025 position track
Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG); Data Analysis, Statistics and Probability (physics.data-an)
Cite as: arXiv:2502.21009 [stat.ML]
  (or arXiv:2502.21009v2 [stat.ML] for this version)
  https://doi.org/10.48550/arXiv.2502.21009
arXiv-issued DOI via DataCite

Submission history

From: Yoonsoo Nam
[v1] Fri, 28 Feb 2025 12:52:11 UTC (5,023 KB)
[v2] Mon, 26 May 2025 13:30:50 UTC (5,049 KB)
Full-text links:

Access Paper:
  • View PDF
  • HTML (experimental)
  • TeX Source