arXiv:2509.02344 (math)
[Submitted on 2 Sep 2025]

Title: Probabilistic well-posedness of dispersive PDEs beyond variance blowup I: Benjamin-Bona-Mahony equation


Authors: Guopeng Li, Jiawei Li, Tadahiro Oh, Nikolay Tzvetkov
Abstract: We investigate a possible extension of the probabilistic well-posedness theory for nonlinear dispersive PDEs with random initial data beyond variance blowup. As a model equation, we study the Benjamin-Bona-Mahony equation (BBM) with Gaussian random initial data. By introducing a suitable vanishing multiplicative renormalization constant on the initial data, we show that solutions to BBM with the renormalized Gaussian random initial data beyond variance blowup converge in law to a solution to the stochastic BBM forced by the derivative of a spatial white noise. By considering an alternative renormalization, we show that solutions to the renormalized BBM with the frequency-truncated Gaussian initial data converge in law to a solution to the linear stochastic BBM with the full Gaussian initial data, forced by the derivative of a spatial white noise. This latter result holds for Gaussian random initial data of arbitrarily low regularity. We also establish analogous results for the stochastic BBM forced by a fractional derivative of a space-time white noise.
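For reference, the deterministic BBM equation is commonly written in the following standard form (this form is not stated on this abstract page and is included only as background; the paper's precise setup, domain, and normalization may differ):

```latex
% Benjamin-Bona-Mahony (BBM) equation, standard form:
\partial_t u - \partial_t \partial_x^2 u + \partial_x u + u\, \partial_x u = 0
```

The stochastic BBM discussed in the abstract would then carry an additional forcing term, such as the derivative of a spatial white noise, on the right-hand side.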
Comments: 45 pages
Subjects: Analysis of PDEs (math.AP); Probability (math.PR)
MSC classes: 35Q35, 35R60, 60H15, 60H30
Cite as: arXiv:2509.02344 [math.AP]
  (or arXiv:2509.02344v1 [math.AP] for this version)
  https://doi.org/10.48550/arXiv.2509.02344
arXiv-issued DOI via DataCite

Submission history

From: Tadahiro Oh [view email]
[v1] Tue, 2 Sep 2025 14:11:42 UTC (47 KB)