Evaluating Visual-Spatiotemporal Co-Registration of a Physics-Based Virtual Reality Haptic Interface

Abstract

This study evaluated the visual-spatiotemporal co-registration of real and virtual objects' movement dynamics using a low-cost, physics-based virtual reality (VR) system that provides actual cutaneous and kinesthetic haptic feedback from a physical object rather than computer-generated haptic feedback. Twelve healthy participants performed three human-robot collaborative (HRC) sequential pick-and-place lifting tasks while a motion capture system and the VR system simultaneously traced the movements of the real and virtual objects, respectively. We used an iterative closest point (ICP) algorithm to transform and align the 3D coordinates of the VR point clouds with those of the motion capture system, and we introduced a new method to calculate and analyze the precision of visual and spatiotemporal co-registration between the virtual and real objects. Results showed a high correlation (r > 0.96) between the real and virtual objects' movement dynamics, with linear and angular co-registration errors of less than 5 cm and 8°, respectively. Temporal registration error was low (< 12 ms) and occurred only along the vertical axis. The visual registration data indicated that using real objects to provide cutaneous and kinesthetic haptics in the VR setting enhanced the users' overall proprioception and visuomotor function.
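The abstract's alignment step — using ICP to register VR point clouds to motion-capture coordinates — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it alternates brute-force nearest-neighbor matching with an SVD (Kabsch) least-squares rigid fit, whereas the study's actual pipeline and parameters are not specified here.

```python
import numpy as np

def best_fit_transform(A, B):
    """Least-squares rigid transform (R, t) mapping cloud A onto B via SVD."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cB - R @ cA
    return R, t

def icp(source, target, iters=50, tol=1e-8):
    """Iteratively align source to target; returns aligned cloud and (R, t)."""
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        # Brute-force nearest neighbors (fine for small clouds).
        d = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        nn = target[d.argmin(axis=1)]
        R, t = best_fit_transform(src, nn)
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = d.min(axis=1).mean()
        if abs(prev_err - err) < tol:  # converged
            break
        prev_err = err
    return src, R_total, t_total
```

In practice, a production pipeline would use a k-d tree for the neighbor search and outlier rejection for partial overlap; the brute-force version above is only meant to show the structure of the algorithm.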

Description

© 2024 IEEE. Licensed under CC BY-NC-ND.

Keywords

accuracy, Co-registration, haptics technology, high-fidelity, human-robot collaboration, virtual reality

Citation

Mubarrat, S.T., Chowdhury, S.K., & Fernandes, A.S. 2024. Evaluating Visual-Spatiotemporal Co-Registration of a Physics-Based Virtual Reality Haptic Interface. IEEE Access, 12. https://doi.org/10.1109/ACCESS.2024.3391186
