Investigating the effect of sensor data visualization variances in virtual reality
- Publisher:
- ACM
- Publication Type:
- Conference Proceeding
- Citation:
- Proceedings of the ACM Symposium on Virtual Reality Software and Technology, VRST, 2021, pp. 1-5
- Issue Date:
- 2021-12-08
Closed Access
Filename | Description | Size
---|---|---
3489849.3489877.pdf | Published version | 2.03 MB
This item is closed access and not available.
This paper investigates the effect of real-time sensor data variances on humans performing straightforward assembly tasks in a Virtual Reality-based (VR-based) training system. A VR-based training system was developed that transfers color and depth images and constructs colored point-cloud data to represent objects in real time. Parameters that affect sensor data acquisition and visualization for remotely operated robots in the real world are varied, and the associated task performance is observed. Experimental results from 12 participants, who performed a total of 95 VR-guided puzzle assembly tasks, demonstrated that the combination of low resolution and uncolored points has the most significant effect on participants' performance. Participants reported that they had to rely on tactile feedback when perceptual feedback was minimal. The least significant parameter was the resolution of the data representations, which, when varied within the experimental bounds, produced only a 5% average change in completion time. Participants also indicated in surveys that they felt their performance improved and their frustration was reduced when color information about the scene was provided.
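The paper's exact point-cloud pipeline is not described in this abstract, but constructing a colored point cloud from paired color and depth images is conventionally done by back-projecting each depth pixel through a pinhole camera model and attaching the corresponding RGB value. The sketch below assumes aligned depth and color frames and hypothetical camera intrinsics (`fx`, `fy`, `cx`, `cy`); it is illustrative only, not the authors' implementation.

```python
import numpy as np

def depth_to_colored_points(depth, rgb, fx, fy, cx, cy):
    """Back-project an aligned RGB-D frame to a colored point cloud.

    depth: (H, W) array of depths in meters (0 marks invalid pixels)
    rgb:   (H, W, 3) array of per-pixel colors
    Returns an (N, 6) array of [x, y, z, r, g, b] rows, one per valid pixel.
    """
    h, w = depth.shape
    # Pixel coordinate grids: u indexes columns, v indexes rows.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0
    z = depth[valid]
    # Standard pinhole back-projection of each valid pixel.
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    colors = rgb[valid]
    return np.column_stack([x, y, z, colors])
```

Degrading the resolution or dropping the color channel, as in the experiments described above, would correspond to subsampling `depth`/`rgb` or discarding the `colors` columns before rendering.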