Quantitative Biology > Quantitative Methods

arXiv:2509.02626v1 (q-bio)
[Submitted on 1 Sep 2025]

Title: A Wrist-Worn Multimodal Reaction Time Monitoring Device for Ecologically-Valid Cognitive Assessment


Authors: Abhigyan Sarkar, Boris Rubinsky
Abstract: Reaction time (RT) is a fundamental measure in cognitive and neurophysiological assessment, yet most existing RT systems require active user engagement and controlled environments, limiting their use in real-world settings. This paper introduces a low-cost, wrist-worn instrumentation platform designed to capture human reaction times across auditory, visual, and haptic modalities with millisecond latency in real-world conditions. The device integrates synchronized stimulus delivery and event detection within a compact microcontroller-based system, eliminating the need for user focus or examiner supervision. With an emphasis on measurement fidelity, we detail the hardware architecture, timing control algorithms, and calibration methodology used to ensure consistent latency handling across modalities. A proof-of-concept study with six adult participants compares this system against a benchmark computer-based RT tool across five experimental conditions. The results confirm that the device achieves statistically comparable RT measurements with strong modality consistency, supporting its potential as a novel tool for non-obtrusive cognitive monitoring. Contributions include a validated design for time-critical behavioral measurement and a demonstration of its robustness in unconstrained, ambient-noise environments. The device offers a powerful new tool for continuous, real-world cognitive monitoring and has significant potential for both research and clinical applications.
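The abstract describes a microcontroller loop that delivers a stimulus in one of three modalities and timestamps the response with millisecond latency. As a rough illustration of that general pattern only (the paper's firmware is not published here; the pin assignments, stimulus hardware, modality selection, and polling scheme below are assumptions, not the authors' design), an Arduino-style sketch might look like:

// Illustrative only: pins and hardware choices are hypothetical.
const int LED_PIN    = 2;   // visual stimulus
const int BUZZER_PIN = 3;   // auditory stimulus
const int MOTOR_PIN  = 4;   // haptic stimulus (vibration motor)
const int BUTTON_PIN = 5;   // response button, wired active-low

void setup() {
  pinMode(LED_PIN, OUTPUT);
  pinMode(BUZZER_PIN, OUTPUT);
  pinMode(MOTOR_PIN, OUTPUT);
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  Serial.begin(115200);
  randomSeed(analogRead(A0));           // seed the inter-trial jitter
}

void loop() {
  // Randomized fore-period so stimulus onset cannot be anticipated.
  delay(random(2000, 5000));

  // Pick one of the three modalities at random for this trial.
  int stimulusPin = LED_PIN;
  long m = random(3);
  if (m == 1) stimulusPin = BUZZER_PIN;
  else if (m == 2) stimulusPin = MOTOR_PIN;

  digitalWrite(stimulusPin, HIGH);      // stimulus onset
  unsigned long onset = micros();       // microsecond-resolution timestamp

  // Poll for the button press; busy-waiting keeps detection latency low.
  while (digitalRead(BUTTON_PIN) == HIGH) { /* wait */ }
  unsigned long response = micros();

  digitalWrite(stimulusPin, LOW);       // stimulus offset

  // Report the reaction time in milliseconds over the serial link.
  Serial.print("RT_ms=");
  Serial.println((response - onset) / 1000.0, 3);

  delay(500);                           // simple debounce / inter-trial gap
}

In practice the paper emphasizes calibrating the different electromechanical latencies of each stimulus channel so that the recorded onset timestamp reflects the actual stimulus, which a naive sketch like this does not account for.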
Subjects: Quantitative Methods (q-bio.QM)
Cite as: arXiv:2509.02626 [q-bio.QM]
  (or arXiv:2509.02626v1 [q-bio.QM] for this version)
  https://doi.org/10.48550/arXiv.2509.02626
arXiv-issued DOI via DataCite

Submission history

From: Boris Rubinsky
[v1] Mon, 1 Sep 2025 15:21:09 UTC (855 KB)