Overview
This research leverages the asymmetry of time and reality in human-robot interaction to create a new remote operation experience. In traditional remote robot operation, a human gives step-by-step instructions that the robot faithfully replicates, creating a synchronous relationship. This approach, however, confines the human and robot to a one-on-one interaction, making it difficult to parallelize tasks efficiently.
The remote operation approach discussed in this research aims to solve this problem. The human provides only the final goal; the robot autonomously plans and executes the specific movements needed to achieve it. As a result, the human can issue new instructions one after another without waiting for the robot to complete each task.
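As a rough illustration of this interaction pattern (a minimal sketch, not the authors' implementation), the asynchronous relationship can be modeled as a goal queue: the operator appends high-level goals without blocking, while the robot consumes and executes them independently.

```python
# Hypothetical sketch of asynchronous goal assignment: the operator enqueues
# goals without waiting, while a robot worker plans and executes them.
import queue
import threading
import time

goal_queue: "queue.Queue[str]" = queue.Queue()

def robot_worker() -> None:
    """Consume goals one by one; planning/execution are placeholders."""
    while True:
        goal = goal_queue.get()
        if goal is None:          # sentinel: shut down
            break
        print(f"[robot] planning and executing: {goal}")
        time.sleep(1.0)           # stands in for autonomous execution
        print(f"[robot] completed: {goal}")
        goal_queue.task_done()

worker = threading.Thread(target=robot_worker, daemon=True)
worker.start()

# The operator assigns new goals immediately, without waiting for completion.
for goal in ["stack the red blocks", "sort parts by size", "clear the table"]:
    goal_queue.put(goal)
    print(f"[operator] assigned: {goal}")

goal_queue.join()                 # block only when all goals must be finished
goal_queue.put(None)
```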
While previous studies have examined asynchronous remote control mechanisms, we focus on the design of the interface. To take advantage of this asynchrony, the user must be able to monitor multiple goals given to the robot and track their progress simultaneously. We therefore developed two UI paradigms for visualizing the status of past instructions, which could both be driven by the same underlying record of goal states, as sketched below. One is the “Integrated View,” which consolidates goal states from different times into a single 3D space. The other is the “Timeline View,” in which users check goal states from thumbnails of multiple 3D scenes arranged along a timeline.
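The following data model is a hypothetical illustration (the names and fields are assumptions, not taken from the paper): both views render the same history of goal-state snapshots, with the Integrated View consolidating the latest state per goal and the Timeline View ordering snapshots chronologically.

```python
# Hypothetical data model that either view could render: the Integrated View
# consolidates the latest snapshot per goal into one 3D space, while the
# Timeline View lays all snapshots out along a timeline.
from dataclasses import dataclass, field
from enum import Enum, auto

class GoalStatus(Enum):
    ASSIGNED = auto()
    IN_PROGRESS = auto()
    COMPLETED = auto()
    FAILED = auto()

@dataclass
class GoalSnapshot:
    goal_id: int
    timestamp: float          # when this state was recorded
    status: GoalStatus
    scene: object = None      # placeholder for a 3D scene representation

@dataclass
class GoalHistory:
    snapshots: list = field(default_factory=list)

    def add(self, snap: GoalSnapshot) -> None:
        self.snapshots.append(snap)

    def latest_per_goal(self) -> dict:
        """Integrated View: one consolidated (latest) state per goal."""
        latest = {}
        for s in sorted(self.snapshots, key=lambda s: s.timestamp):
            latest[s.goal_id] = s
        return latest

    def chronological(self) -> list:
        """Timeline View: all snapshots ordered along a timeline."""
        return sorted(self.snapshots, key=lambda s: s.timestamp)
```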
User studies showed that the Integrated View excelled in work efficiency, while the Timeline View was preferred subjectively. This difference likely reflects the distinct characteristics of each UI.
The insights gained from this research can be applied not only to remote robot operations but also to asynchronous collaboration among humans. UI designs based on the asymmetry of time and reality are likely to become essential concepts supporting remote work in the post-COVID era.
Asynchronous remote operation represents an innovative paradigm that fundamentally changes the relationship between humans and robots. By re-examining UI design, this research significantly expands the potential for collaboration between humans and robots.
External Evaluation
The papers resulting from this research have been accepted at the following international conferences.
- Aoyama, S., Liu, J.-S., Wang, P., Jain, S., Wang, X., Xu, J., Song, S., Tversky, B., & Feiner, S. (2024). Asynchronously Assigning, Monitoring, and Managing Assembly Goals in Virtual Reality for High-Level Robot Teleoperation. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). (Main research paper, acceptance rate 28.7%)
- Wang, P., Jain, S., Li, M., Aoyama, S., Wang, X., Song, S., Liu, J.-S., & Feiner, S. (2023). Built to Order: A Virtual Reality Interface for Assigning High-Level Assembly Goals to Remote Robots. Proceedings of the 2023 ACM Symposium on Spatial User Interaction. (Poster presentation)
- This research was supported by National Science Foundation Grant CMMI-2037101.
- Research conducted at FMRG until October 2023
- To be presented at IEEE VR 2024