

arXiv:2509.15969 (eess)
[Submitted on 19 Sep 2025]

Title: VoXtream: Full-Stream Text-to-Speech with Extremely Low Latency

Authors: Nikita Torgashov, Gustav Eje Henter, Gabriel Skantze
Abstract: We present VoXtream, a fully autoregressive, zero-shot streaming text-to-speech (TTS) system for real-time use that begins speaking from the first word. VoXtream directly maps incoming phonemes to audio tokens using a monotonic alignment scheme and a dynamic look-ahead that does not delay onset. Built around an incremental phoneme transformer, a temporal transformer predicting semantic and duration tokens, and a depth transformer producing acoustic tokens, VoXtream achieves, to our knowledge, the lowest initial delay among publicly available streaming TTS: 102 ms on GPU. Despite being trained on a mid-scale 9k-hour corpus, it matches or surpasses larger baselines on several metrics, while delivering competitive quality in both output- and full-streaming settings. Demo and code are available at https://herimor.github.io/voxtream.
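The abstract describes a three-stage autoregressive pipeline: an incremental phoneme transformer with a small dynamic look-ahead, a temporal transformer that predicts semantic and duration tokens, and a depth transformer that emits acoustic tokens. The following is a minimal, illustrative Python sketch of such a full-stream loop; the component functions, the look-ahead budget, and the token shapes are assumptions made for illustration only and are not the authors' implementation or API.

# Hypothetical sketch of a full-streaming TTS loop in the spirit of the abstract:
# phonemes arrive incrementally, synthesis starts once a small look-ahead window
# is filled (not after the full sentence), and each step yields acoustic-token
# frames. All names and values here are illustrative stand-ins.
from dataclasses import dataclass, field
from typing import Iterator, List, Tuple

LOOKAHEAD_PHONEMES = 2  # assumed look-ahead budget (illustrative)

@dataclass
class StreamState:
    phoneme_buffer: List[str] = field(default_factory=list)
    emitted_frames: int = 0

def incremental_phoneme_encoder(buffer: List[str]) -> List[float]:
    """Stand-in for an incremental phoneme encoder: returns a toy context vector."""
    return [float(len(p)) for p in buffer]

def temporal_model(context: List[float]) -> Tuple[int, int]:
    """Stand-in for a temporal model: predicts (semantic_token, duration_in_frames)."""
    return (int(sum(context)) % 100, 2)

def depth_model(semantic_token: int) -> List[int]:
    """Stand-in for a depth model: expands one frame into acoustic codec tokens."""
    return [(semantic_token * k) % 1024 for k in range(1, 9)]

def full_stream_tts(phoneme_stream: Iterator[str]) -> Iterator[List[int]]:
    """Yield acoustic-token frames as soon as the look-ahead window is filled,
    so audio generation can begin on the first word of the incoming text."""
    state = StreamState()
    for phoneme in phoneme_stream:
        state.phoneme_buffer.append(phoneme)
        # Wait only for the look-ahead window, never for the end of the text.
        if len(state.phoneme_buffer) < LOOKAHEAD_PHONEMES + 1:
            continue
        context = incremental_phoneme_encoder(state.phoneme_buffer)
        semantic_token, duration = temporal_model(context)
        for _ in range(duration):
            yield depth_model(semantic_token)
            state.emitted_frames += 1

if __name__ == "__main__":
    phonemes = iter(["HH", "AH", "L", "OW", "W", "ER", "L", "D"])
    for frame in full_stream_tts(phonemes):
        print(frame)  # each frame would be decoded to audio by a codec decoder

In a real system each frame of acoustic tokens would be passed to a streaming codec decoder so that audio playback starts while later text is still being processed; this is what keeps the onset delay low.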
Comments: 5 pages, 1 figure, submitted to IEEE ICASSP 2026
Subjects: Audio and Speech Processing (eess.AS); Computation and Language (cs.CL); Human-Computer Interaction (cs.HC); Machine Learning (cs.LG); Sound (cs.SD)
Cite as: arXiv:2509.15969 [eess.AS]
  (or arXiv:2509.15969v1 [eess.AS] for this version)
  https://doi.org/10.48550/arXiv.2509.15969
arXiv-issued DOI via DataCite (pending registration)

Submission history

From: Nikita Torgashov
[v1] Fri, 19 Sep 2025 13:26:46 UTC (1,750 KB)
