Statistics > Machine Learning

arXiv:2506.00270v1 (stat)
[Submitted on 30 May 2025]

Title: Bayesian Data Sketching for Varying Coefficient Regression Models

Authors: Rajarshi Guhaniyogi, Laura Baracaldo, Sudipto Banerjee
Abstract: Varying coefficient models are popular for estimating nonlinear regression functions in functional data models. Their Bayesian variants have received limited attention in large data applications, primarily due to prohibitively slow posterior computations using Markov chain Monte Carlo (MCMC) algorithms. We introduce Bayesian data sketching for varying coefficient models to obviate computational challenges presented by large sample sizes. To address the challenges of analyzing large data, we compress the functional response vector and predictor matrix by a random linear transformation to achieve dimension reduction and conduct inference on the compressed data. Our approach distinguishes itself from several existing methods for analyzing large functional data in that it requires neither the development of new models or algorithms nor any specialized computational hardware, while delivering fully model-based Bayesian inference. Well-established methods and algorithms for varying coefficient regression models can be applied to the compressed data.
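
To make the compression step concrete, here is a minimal, self-contained sketch in Python/NumPy. It is not the authors' implementation: the Gaussian sketching matrix, the polynomial basis standing in for splines, the conjugate Gaussian prior, and all sizes (n, m, p, k) are illustrative assumptions. It only demonstrates the idea stated in the abstract: the response and the expanded design are compressed by a random linear map, and a standard Bayesian linear-model computation is then run on the compressed data alone.

import numpy as np

rng = np.random.default_rng(0)

# ----- toy varying coefficient data:  y_i = sum_j x_ij * beta_j(t_i) + eps_i -----
n, p, k = 10_000, 2, 4                  # sample size, predictors, basis functions (illustrative)
t = rng.uniform(0.0, 1.0, n)            # functional index (e.g., time)
x = rng.normal(size=(n, p))             # scalar predictors
B = np.vander(t, k, increasing=True)    # simple polynomial basis as a stand-in for a spline basis
Z = (x[:, :, None] * B[:, None, :]).reshape(n, p * k)   # expanded design, n x (p*k)

true_coef = rng.normal(size=p * k)      # basis coefficients of the varying coefficients
sigma = 0.5                             # noise s.d. (treated as known here for simplicity)
y = Z @ true_coef + rng.normal(scale=sigma, size=n)

# ----- data sketching: compress y and Z with a random linear map Phi, m << n -----
m = 500
Phi = rng.normal(size=(m, n)) / np.sqrt(n)   # rows roughly orthonormal, so compressed errors stay ~i.i.d.
y_c, Z_c = Phi @ y, Phi @ Z                  # all subsequent inference uses only (y_c, Z_c)

# ----- standard conjugate Bayesian linear regression on the compressed data -----
# (any well-established varying coefficient model or sampler could be used here instead)
tau2 = 10.0                                  # prior variance of the basis coefficients
V = np.linalg.inv(Z_c.T @ Z_c / sigma**2 + np.eye(p * k) / tau2)   # posterior covariance
post_mean = V @ (Z_c.T @ y_c) / sigma**2                           # posterior mean

# compare the estimated first varying coefficient curve beta_1(t) with the truth on a grid
grid = np.linspace(0.0, 1.0, 50)
Bg = np.vander(grid, k, increasing=True)
print("max |beta_1_hat(t) - beta_1(t)| on the grid:",
      np.max(np.abs(Bg @ post_mean[:k] - Bg @ true_coef[:k])))

The 1/sqrt(n) scaling keeps the rows of Phi approximately orthonormal, so the compressed errors behave approximately like the original i.i.d. errors; this is the sense in which, as the abstract notes, well-established methods and algorithms for varying coefficient regression can be applied to the compressed data without modification.
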
Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG); Methodology (stat.ME)
Cite as: arXiv:2506.00270 [stat.ML]
  (or arXiv:2506.00270v1 [stat.ML] for this version)
  https://doi.org/10.48550/arXiv.2506.00270
arXiv-issued DOI via DataCite

Submission history

From: Sudipto Banerjee
[v1] Fri, 30 May 2025 22:09:06 UTC (1,252 KB)