
Computer Science > Machine Learning

arXiv:2503.05775 (cs)
[Submitted on 26 Feb 2025]

Title: Evaluation of Missing Data Imputation for Time Series Without Ground Truth


Authors: Rania Farjallah, Bassant Selim, Brigitte Jaumard, Samr Ali, Georges Kaddoum
Abstract: The challenge of handling missing data in time series is critical for maintaining the accuracy and reliability of machine learning (ML) models in applications like fifth-generation mobile communication (5G) network management. Traditional methods for validating imputation rely on ground truth data, which is inherently unavailable. This paper addresses this limitation by introducing two statistical metrics, the Wasserstein distance (WD) and Jensen-Shannon divergence (JSD), to evaluate imputation quality without requiring ground truth. These metrics assess the alignment between the distributions of imputed and original data, providing a robust method for evaluating imputation performance based on internal structure and data consistency. We apply and test these metrics across several imputation techniques. Results demonstrate that WD and JSD are effective metrics for assessing the quality of missing data imputation, particularly in scenarios where ground truth data is unavailable.
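As a rough illustration (not taken from the paper itself), the two metrics could be computed with SciPy by comparing the empirical distribution of originally observed values against the distribution of the values filled in by an imputer. The function name, masking convention, and histogram binning below are assumptions made for this sketch, not the authors' implementation.

```python
# Minimal sketch: distribution-level comparison of observed vs. imputed values,
# using Wasserstein distance (WD) and Jensen-Shannon divergence (JSD).
import numpy as np
from scipy.stats import wasserstein_distance
from scipy.spatial.distance import jensenshannon

def imputation_distribution_scores(series, mask_missing, imputed, n_bins=50):
    """Return (wd, jsd) between observed and imputed value distributions.

    series       : 1-D array with NaNs at missing positions
    mask_missing : boolean array, True where values were missing
    imputed      : 1-D array, the series after imputation
    Lower values indicate that imputed values follow the observed distribution
    more closely.
    """
    observed_vals = series[~mask_missing]
    imputed_vals = imputed[mask_missing]

    # Wasserstein distance operates directly on the two empirical samples.
    wd = wasserstein_distance(observed_vals, imputed_vals)

    # Jensen-Shannon divergence needs discrete probability vectors, so both
    # samples are histogrammed over a shared support (jensenshannon normalizes
    # the vectors internally).
    lo = min(observed_vals.min(), imputed_vals.min())
    hi = max(observed_vals.max(), imputed_vals.max())
    bins = np.linspace(lo, hi, n_bins + 1)
    p, _ = np.histogram(observed_vals, bins=bins, density=True)
    q, _ = np.histogram(imputed_vals, bins=bins, density=True)
    jsd = jensenshannon(p, q, base=2) ** 2  # square the JS distance to get the divergence

    return wd, jsd
```

Both scores are bounded below by zero, and JSD with base-2 logarithms is additionally bounded above by one, which makes them convenient for ranking imputation methods when no ground truth is available for the missing entries.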
Comments: Accepted for publication in IEEE ICC 2025 (International Conference on Communications). The paper consists of 6 pages, including references, and contains 5 figures.
Subjects: Machine Learning (cs.LG); Machine Learning (stat.ML)
Cite as: arXiv:2503.05775 [cs.LG]
  (or arXiv:2503.05775v1 [cs.LG] for this version)
  https://doi.org/10.48550/arXiv.2503.05775
arXiv-issued DOI via DataCite

Submission history

From: Rania Farjallah
[v1] Wed, 26 Feb 2025 01:02:16 UTC (3,391 KB)