From https://hci.stanford.edu/publications/2022/Beyond_Being_Real.pdf:
Let’s focus on the highlighted ones for now.
Translation
“I’m a Giant: Walking in Large Virtual Environments at High Speed Gains” - CHI 2019
- Advances in tracking technology and wireless headsets have made walking a feasible means of locomotion in virtual reality (VR). When exploring virtual environments larger than the physical room, it is desirable to increase the user’s effective walking speed. This study investigates three methods: (1) Ground-Level Scaling - enlarging the user’s avatar so each step covers more distance, (2) Eye-Level Scaling - keeping a street-level view while walking through a miniature world, and (3) Seven-League Boots - amplifying the user’s movement. Comparative studies showed that users reported the strongest sense of presence and an increased stride length with Ground-Level Scaling. Unlike the other methods, Seven-League Boots led to a decrease in positional accuracy at high gains, prompting users to modify their walking behavior to compensate for the reduced control. The strengths, weaknesses, and applicable scenarios of each technique are discussed.
- scaling + translation?
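- A rough sketch of the two kinds of gain described above (my own illustration; the gain value and the y-up convention are assumptions, not the paper’s parameters):

```python
import numpy as np

def seven_league_boots(prev_phys, curr_phys, prev_virt, gain=7.0):
    """Amplify the user's horizontal head movement by `gain` (illustrative value).

    prev_phys, curr_phys: tracked head positions (x, y, z) from consecutive frames.
    prev_virt: virtual head position from the previous frame.
    Assumes a y-up coordinate system; vertical motion stays 1:1.
    """
    delta = np.asarray(curr_phys, float) - np.asarray(prev_phys, float)
    delta[1] = 0.0                              # do not amplify head bobbing / crouching
    return np.asarray(prev_virt, float) + gain * delta

def ground_level_scaling(phys_pos, scale=7.0, origin=np.zeros(3)):
    """'I'm a giant': scale the whole tracked pose about a point on the ground,
    so eye height and stride length grow together."""
    return origin + scale * (np.asarray(phys_pos, float) - origin)
```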
“BalloonProbe: Reducing Occlusion in 3D Using Interactive Space Distortion” - VRST 2005
- Visualizing information in 3D virtual environments shows promise but often leads to a “can’t see the forest for the trees” situation: parts of the visualization occlude others, reducing efficiency and accuracy, and users may need to change their viewpoint significantly to reach hidden objects. This paper introduces an interaction technique called BalloonProbe, based on distorting space: on command, the user inflates a spherical force field around a 3D cursor that pushes occluding objects onto the surface of the sphere. Smooth animations show the inflation and deflation and display traces of the moved objects. Initial tests suggest that BalloonProbe is a robust means of occlusion management in 3D visualization.
- Distorting space by inflating it to make hidden parts visible.
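- A minimal sketch of that space distortion (my own reading; radial projection onto the sphere surface is an assumed displacement rule, and the animation details are omitted):

```python
import numpy as np

def balloon_probe(object_positions, cursor, radius):
    """Push every object inside the inflated sphere onto its surface.

    object_positions: (N, 3) array of object centres.
    cursor: (3,) centre of the spherical force field (the 3D cursor).
    radius: current inflation radius of the probe.
    Objects outside the sphere are left untouched; the radial projection used
    here is an assumption, not necessarily the paper's exact displacement.
    """
    pos = np.asarray(object_positions, float)
    cursor = np.asarray(cursor, float)
    offsets = pos - cursor
    dist = np.linalg.norm(offsets, axis=1)
    inside = (dist < radius) & (dist > 1e-6)
    displaced = pos.copy()
    displaced[inside] = cursor + offsets[inside] * (radius / dist[inside])[:, None]
    return displaced
```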
“Spherical World in Miniature: Exploring the Tiny Planets Metaphor for Discrete Locomotion in Virtual Reality” - IEEE VR 2021
- This paper explores discrete locomotion in VR using a Spherical World in Miniature (SWIM). SWIM wraps a flat miniature world onto a sphere, creating a metaphor of a small planet that can be interacted with. The scale is set by the distance to the sphere, rotating the sphere moves the current viewpoint, and teleportation is triggered by looking at the sphere and then stopping. In an experiment with 20 participants, SWIM outperformed a flat WIM in task completion time (TCT), accuracy, and subjective ratings.
- This is what we saw.
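- One plausible way to wrap a flat WIM onto a tiny planet (illustrative only; the projection the paper actually uses may differ):

```python
import math

def plane_to_tiny_planet(u, v, plane_w, plane_h, sphere_r):
    """Map a point (u, v) on a flat world-in-miniature of size plane_w x plane_h
    onto a sphere of radius sphere_r (the 'tiny planet').

    u is mapped to longitude and v to latitude, so the centre of the plane ends
    up on the front of the sphere.  This equirectangular-style wrapping is only
    one plausible choice, not necessarily the paper's projection.
    """
    lon = (u / plane_w - 0.5) * 2.0 * math.pi   # -pi .. pi
    lat = (v / plane_h - 0.5) * math.pi         # -pi/2 .. pi/2
    x = sphere_r * math.cos(lat) * math.sin(lon)
    y = sphere_r * math.sin(lat)
    z = sphere_r * math.cos(lat) * math.cos(lon)
    return (x, y, z)
```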
“Design and Evaluation of a Free-Hand VR-based Authoring Environment for Automated Vehicle Testing” - IEEE VR 2021
- VR is increasingly being used for the safe evaluation and validation of automated vehicles. However, designing and creating virtual testing environments is a complex process. This study proposes a VR authoring environment that speeds up design iterations by allowing scene creation entirely within VR. It introduces 3D interaction techniques for designing road networks and traffic scenarios, demonstrating superior accuracy and task completion time compared to other methods in a user study.
- Indirect interface, whiteboard, similar to WIM (blu3mo).
“Rapid, Continuous Movement Between Nodes as an Accessible Virtual Reality Locomotion Technique” - IEEE VR 2018
- The impact of player movement in immersive virtual reality on the vestibulo-ocular reflex is a major cause of motion sickness. Continuous movement, especially for stationary users, is therefore problematic, which is why teleportation has become a common locomotion method. However, teleportation can disrupt the sense of direction and reduce presence in VR environments. This paper proposes an alternative technique that provides accessible movement while maintaining presence: a node-based navigation system in which players move rapidly but continuously along straight lines between defined nodes. An evaluation compared this technique with common teleportation-based and continuous-walking approaches: using PlayStation VR to navigate a virtual house, motion sickness and presence were measured for each technique. Counterintuitively, the rapid movement speed reduced motion sickness compared to continuous movement at normal walking speed.
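- A minimal sketch of node-to-node movement that is rapid yet continuous (the speed and frame time are illustrative values, not the paper’s):

```python
import numpy as np

def node_jump_path(start, target, speed=20.0, dt=1.0 / 90.0):
    """Yield per-frame camera positions for a rapid but continuous move
    between two navigation nodes.

    Unlike an instant teleport, the camera traverses the whole segment, but at
    `speed` (m/s) well above walking pace the transition lasts only a fraction
    of a second.  Speed and frame time are illustrative, not the paper's values.
    """
    start = np.asarray(start, float)
    target = np.asarray(target, float)
    total = float(np.linalg.norm(target - start))
    steps = max(1, int(round(total / (speed * dt))))
    for i in range(1, steps + 1):
        yield start + (target - start) * (i / steps)
```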
“Evaluating Automatic Parameter Control Methods for Locomotion in Multiscale Virtual Environments” - VRST 2020
- Virtual environments spanning a wide range of scales are becoming more common in VR applications, and automatically controlling locomotion parameters can help users explore them. In multiscale virtual environments, point-and-teleport with distance control could be competitive with flying interfaces, but automatic distance control for point-and-teleport had not previously been studied in such environments. A new point-and-teleport approach with automatic distance control is introduced. A first user study compared three methods (point-and-teleport with automatic distance control, point-and-teleport with manual distance control, and flying with automatic speed control) in a solar-system environment. Automatic control significantly reduced overshooting compared to manual control, but users preferred flying with automatic speed control because of the discontinuity of teleportation. A second study compared two versions of the automatic-distance-control teleport (one adding optical-flow cues and bimanual interaction) against flying with automatic speed control; the version with optical-flow cues was more accurate than flying and was preferred equally to the version without cues.
“Combining bimanual interaction and teleportation for 3D manipulation on multi-touch wall-sized displays” - VRST 2016
- Multi-touch devices have become part of everyday life and keep growing larger; wall-sized displays now come with multi-touch capability and are expected to become common in public spaces and meeting rooms. They offer an interesting opportunity for interacting with 3D virtual environments: the large display surface provides a strong sense of immersion, and multi-touch makes interacting with 3D content accessible to the general public. This paper explores touch-based 3D interaction in situations where users are immersed in a 3D virtual environment and move in front of a vertical wall-sized display. We designed a bimanual touch-based technique called In(SITE) and compared it with a standard 3D interaction technique for 6-degree-of-freedom tasks on a wall-sized display. Two controlled experiments showed that participants using In(SITE) achieved equivalent task completion times and better precision for fine adjustments. Combining the technique with object teleportation also suggested improvements in usability, fatigue, and user preference.
“Automatic speed and direction control along constrained navigation paths” - IEEE VR 2017
- In many virtual reality applications, precomputed fly-through paths are the de facto standard navigation method: they are convenient for users and ensure coverage of the important areas of a scene. Traditional applications either use a constant camera speed, allow fully user-controlled manual speed adjustment, or adjust speed automatically based on scene heuristics. We introduce a new method that uses the natural direction of the user’s head so that off-axis parts of the scene can be examined naturally: head tracking captures the user’s focal area and the camera speed is adjusted accordingly. This is extended to automatic camera navigation along the precomputed path, eliminating the need for any navigation input. The technique applies to any scene with precomputed navigation paths and is suitable for medical applications such as virtual colonoscopy, coronary artery fly-throughs, virtual vascular examination, and graph navigation. Comparing the traditional methods (constant speed and manual speed adjustment) with our two methods (automatic speed adjustment and automatic speed/direction control), we examine the impact of speed adjustment on usability, mental workload, performance, and accuracy. The evaluation showed no negative impact from automatic navigation; users performed equivalently to manual navigation.
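- A guess at how gaze-dependent speed control along a path could look (the linear blend and the parameter values are my assumptions, not the paper’s formula):

```python
import numpy as np

def auto_speed(gaze_dir, path_tangent, v_min=0.1, v_max=1.0):
    """Scale fly-through speed by how well the user's gaze aligns with the path.

    Looking along the path -> cruise at v_max; looking off-axis at a side
    structure -> slow toward v_min so it can be examined.  The linear blend and
    the speed limits are assumptions, not the paper's exact mapping.
    """
    gaze = np.asarray(gaze_dir, float)
    tang = np.asarray(path_tangent, float)
    gaze = gaze / np.linalg.norm(gaze)
    tang = tang / np.linalg.norm(tang)
    alignment = max(0.0, float(np.dot(gaze, tang)))   # 0 = sideways, 1 = straight ahead
    return v_min + (v_max - v_min) * alignment
```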
“Large scale cut plane: an occlusion management technique for immersive dense 3D reconstructions” - VRST 2016
- Dense 3D reconstructions of real-world environments are widely used and are expected to serve as databases for solving real-world problems such as remote inspection. Beyond merely displaying such scenes, it must also be possible to interact with them, for example by selecting user-defined parts of the reconstruction for later use. However, due to scene geometry and reconstruction artifacts, large-scale occlusion in dense 3D reconstructions is unavoidable, and previous work lacks occlusion-management approaches for environments composed of one or more (large) continuous surfaces. The paper therefore proposes a new technique, Large Scale Cut Planes, which enables the segmentation and selection of visible and partially occluded patches within large 3D reconstructions, even at a distance, to support understanding of the 3D scene and natural interaction. Combined with an immersive virtual reality setting, it aims to enhance the user experience. A user study comparing the performance and usability of the technique against a baseline shows that Large Scale Cut Planes excel in speed and accuracy, while suggesting improvements to the user interface. To the best of the authors’ knowledge, this problem has not been addressed in previous work.
“Evaluating snapping-to-photos virtual travel interfaces for 3D reconstructed visual reality” - VRST 2017
- Navigating virtual 3D reconstructed scenes is crucial in many applications. A common approach is to virtually move between the photos used to reconstruct the scene, commonly known as “snapping-to-photos” virtual travel interfaces. Prior work has used either fully constrained interfaces (always snapped to a photo) or minimally constrained interfaces (free-flight navigation). This paper introduces a new snap-to-photo interface that lies between these two extremes: it snaps the view to photos based on 3D viewpoint similarity and optionally on the user’s mouse cursor or tap position. Experiments showed that the proposed interface is preferred over the baseline fully constrained interface in both indoor and outdoor reconstructions. Differences were also observed between indoor and outdoor scenes, with click-to-snap point-of-interest snapping making it easier to reach the target photo than automatic point-of-view snapping.
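- A sketch of viewpoint-similarity snapping (the weighted position/angle cost and the threshold are assumed, not the paper’s scoring):

```python
import numpy as np

def best_photo_to_snap(cam_pos, cam_dir, photos, w_pos=1.0, w_ang=2.0, max_cost=3.0):
    """Pick the reconstruction photo most similar to the current viewpoint.

    photos: list of (position, view_direction) pairs from the reconstruction.
    The weighted sum of positional distance (metres) and angular difference
    (radians) is an assumed similarity measure; the paper's scoring may differ.
    Returns the index of the best photo, or None if nothing is similar enough.
    """
    cam_pos = np.asarray(cam_pos, float)
    cam_dir = np.asarray(cam_dir, float)
    best, best_cost = None, float("inf")
    for i, (p_pos, p_dir) in enumerate(photos):
        p_pos = np.asarray(p_pos, float)
        p_dir = np.asarray(p_dir, float)
        d_pos = float(np.linalg.norm(cam_pos - p_pos))
        cos = float(np.dot(cam_dir, p_dir) /
                    (np.linalg.norm(cam_dir) * np.linalg.norm(p_dir)))
        d_ang = float(np.arccos(np.clip(cos, -1.0, 1.0)))
        cost = w_pos * d_pos + w_ang * d_ang
        if cost < best_cost:
            best, best_cost = i, cost
    return best if best_cost < max_cost else None
```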
“Telewalk: Towards Free and Endless Walking in Room-Scale Virtual Reality” - CHI 2020
- Natural navigation in VR is challenging because of spatial constraints. Teleportation enables navigation within very small physical spaces and avoids motion sickness, but can reduce presence and spatial awareness. Redirected Walking (RDW), in contrast, lets users walk naturally through very large virtual environments while remaining in a limited physical area, but typically requires a large tracking space. Telewalk is a new locomotion approach that combines the perceptible curvature and translation gains known from RDW research; this combination makes Telewalk usable even in a 3m x 3m physical space. Using head rotation as an input, Telewalk keeps the user on the optimal circular path in the real world while they walk freely in the virtual world. A user study reported motion sickness symptoms in participants sensitive to it, but Telewalk enhanced presence and immersion and felt more natural than teleportation.
- Well, is it better than teleportation? (blu3mo)
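- A rough sketch of combining a translation gain with a perceptible curvature gain as described above (gain and radius values are illustrative, not Telewalk’s calibrated ones):

```python
import math

def telewalk_step(virt_pos, walk_dir, real_step,
                  translation_gain=2.5, real_radius=1.3):
    """One walking frame with a translation gain and a curvature gain (sketch).

    real_step: distance walked in the real room this frame, in metres.
    walk_dir:  the user's intended virtual walking direction, in radians.

    Translation gain: the virtual step is `translation_gain` times the real one.
    Curvature gain: the virtual scene is additionally rotated by
    real_step / real_radius radians, so that while the user walks what feels
    like a straight virtual line, their corrected real-world path bends onto a
    circle of radius `real_radius` - small enough to fit a 3m x 3m room.
    All parameter values here are illustrative assumptions.
    """
    virt_step = translation_gain * real_step
    new_pos = (virt_pos[0] + virt_step * math.cos(walk_dir),
               virt_pos[1],
               virt_pos[2] + virt_step * math.sin(walk_dir))
    scene_rotation = real_step / real_radius   # yaw to inject into the virtual scene
    return new_pos, scene_rotation
```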
“Nonlinear interactive motion control techniques for virtual space navigation” - VRAIS 1993
- This paper proposes nonlinear motion control techniques for quickly and intuitively controlling the viewpoint and hand position within a virtual workspace. The operational range of the physical input devices is divided into several subspaces, and the device parameters are mapped into virtual space with a different mapping function in each. The key requirements for such techniques are a consistent cognitive model and smooth transitions between subspaces. Applying these nonlinear motion control techniques to Cosmic Explorer, a virtual reality visualization system for space exploration, showed that users adapt immediately and respond very positively.
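- A minimal example of a piecewise (nonlinear) device-to-virtual mapping with a smooth transition between subspaces (the ranges and gains are made-up values):

```python
def piecewise_gain(displacement, fine_range=0.15, fine_gain=1.0, coarse_gain=6.0):
    """Map a physical input displacement (metres from a rest position) to a
    virtual displacement using two subspaces with different gains.

    Inside `fine_range` the mapping is nearly 1:1 for precise control; beyond
    it, movement is amplified for fast, coarse travel.  The offset term keeps
    the function continuous at the boundary, which is one way to realise the
    "smooth transition between subspaces" the paper asks for.  Parameter values
    are illustrative.
    """
    sign = 1.0 if displacement >= 0 else -1.0
    d = abs(displacement)
    if d <= fine_range:
        return sign * fine_gain * d
    return sign * (fine_gain * fine_range + coarse_gain * (d - fine_range))
```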
“Dynamic decomposition and integration of degrees of freedom for 3-D positioning” - VRST 2010
- This paper introduces a new interaction technique based on the decomposition of degrees of freedom (DoF) for accurate positioning in virtual reality environments. This technique, called DIOD (Decomposition and Integration Of Degrees of freedom), is based on the Adaptive Two-Component Model and provides two different control levels: one for integrating and one for separating the manipulation of DoFs. The hypothesis is that each control level is suitable for different phases of the positioning task. In the ballistic phase, users manipulate all dimensions of the task simultaneously, while in the control phase, users try to manipulate specific dimensions individually. Results from our preliminary research suggest that the DIOD technique is more efficient than existing techniques.
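- A sketch of the integrate-vs-separate idea (the dominant-axis rule for the control phase is my simplification, not necessarily DIOD’s actual behavior):

```python
import numpy as np

def diod_update(obj_pos, hand_delta, phase):
    """Apply a hand movement to the manipulated object, DIOD-style (sketch).

    phase == "ballistic": all three DoFs are integrated - the object follows
    the hand directly.
    phase == "control": the DoFs are separated and only the dominant axis of
    the hand movement is applied, so one dimension can be fine-tuned at a time.
    The dominant-axis rule is an assumed, simple way of 'separating' DoFs.
    """
    obj_pos = np.asarray(obj_pos, float)
    hand_delta = np.asarray(hand_delta, float)
    if phase == "ballistic":
        return obj_pos + hand_delta
    filtered = np.zeros(3)
    axis = int(np.argmax(np.abs(hand_delta)))
    filtered[axis] = hand_delta[axis]
    return obj_pos + filtered
```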
“Podoportation: Foot-Based Locomotion in Virtual Reality” - CHI 2020
- Virtual reality (VR) offers expansive environments, but physical movement within them is always constrained by the limitations of the real world; this mismatch between the physical and virtual dimensions makes direct real-world-style locomotion impossible. To alleviate this constraint, various artificial locomotion concepts such as teleportation, treadmills, and redirected walking have been proposed, but they occupy the user’s hands or require complex hardware and extensive physical space. This paper proposes nine VR locomotion concepts that use the 3D position of the user’s feet and the pressure applied to the soles of the feet as input modalities. A controlled experiment with 20 participants compared the concepts to state-of-the-art point-and-teleport; the results confirmed that the approach is suitable for engaging, foot-based locomotion. Based on these findings, a wireless hardware prototype is proposed.
“Third-person navigation of whole-planet terrain in a head-tracked stereoscopic environment” - IEEE VR 1999
- Navigation and interaction in a virtual environment with a stereoscopic head-tracked display present several challenges compared to small datasets or simple displays. In addition to zooming by approaching or retreating from targets, scale must be integrated as a seventh degree of freedom. The interface must maintain stereo image pairs that the user can fuse into a single 3D image, minimize depth-cue loss from incorrect occlusion of the stereo images by the screen frame, provide maximum depth information, and keep objects at the optimal manipulation distance. Finally, the navigation interface must work at whatever scale the environment is displayed. This paper addresses these issues for navigation in a particular large-scale virtual environment: a high-resolution terrain database covering the entire planet.
“Multi-Ray Jumping: Comprehensible Group Navigation for Collocated Users in Immersive Virtual Reality” - IEEE VR 2019
- Collaborative exploration of virtual environments benefits from group navigation features. This paper focuses on the design and evaluation of short-range teleportation (jumping) for groups of collocated users wearing head-mounted displays. A pilot study with expert users tested three simple group-jumping approaches and derived requirements for comprehensible group jumping. A new Multi-Ray Jumping technique that meets these requirements is proposed, and two formal user studies are reported: one investigates the effect of passive jumping on simulator sickness symptoms (N=20), the other examines the advantages of the new technique over simple group jumping (N=22). The results indicate that Multi-Ray Jumping reduces passengers’ spatial confusion, improves navigators’ planning accuracy, and lowers cognitive load for both.
“Object Manipulation in Virtual Reality Under Increasing Levels of Translational Gain” - CHI 2018
- Room-scale virtual reality (VR) has become a consumer reality for applications ranging from entertainment to productivity, but the limited physical space available in typical homes and offices is a significant challenge. To address this, translational gain - a mapping that amplifies physical movements in the virtual space - can be used to effectively extend the physical space. While amplified movement has been used in VR from the beginning, how it affects reach-based interaction with virtual objects, a standard feature of current consumer VR, is not well understood. This paper therefore explores, for the first time, pick-and-place of virtual objects under different levels of translational gain, from a 1x gain (a 3.5m x 3.5m virtual space) up to a 3x gain (a 10.5m x 10.5m virtual space) within a 3.5m x 3.5m physical space. The results showed that reach accuracy is maintained up to a 2x gain; beyond that, accuracy drops and simulator sickness and cognitive load increase. Gain levels between 1.5x and 1.75x are proposed as usable without compromising the usability of VR tasks while significantly expanding the boundaries of interactive room-scale VR.
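- A minimal sketch of the translational-gain mapping (the room-centre origin and scaling only x and z are assumptions of this sketch; 10.5 / 3.5 = 3 gives the upper 3x gain):

```python
def apply_translational_gain(phys_pos, gain, room_center=(0.0, 0.0)):
    """Map a tracked physical position to a virtual one via a horizontal
    translational gain about the centre of the play area.

    A 1x gain keeps the 3.5m x 3.5m room at 3.5m x 3.5m virtually; a 3x gain
    stretches it to 10.5m x 10.5m (10.5 / 3.5 = 3).  Scaling only x and z and
    anchoring at the room centre are assumptions of this sketch.
    """
    x, y, z = phys_pos
    cx, cz = room_center
    return (cx + gain * (x - cx), y, cz + gain * (z - cz))
```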
“JumpVR: Jump-Based Locomotion Augmentation for Virtual Reality” - CHI 2020
- One significant advantage of virtual reality (VR) is the ability to implement functionality beyond reality. Common “unrealistic” locomotion techniques (such as teleportation) can bypass the spatial constraints of tracking, but give up the potential benefits of more realistic techniques (e.g., walking). As an alternative that combines realistic physical movement with unrealistic virtual outcomes, this paper proposes a jump-based locomotion augmentation technique called JumpVR. A user study (N=28) showed that jumping in VR significantly increases presence, motivation, and immersion compared to teleportation, regardless of how the jump is scaled. Most scaled-jump variants were also rated higher in immersion and motivation than forward jumps. The work demonstrates the feasibility and benefits of jumping in VR, explores suitable parameters for its unrealistic scaling, and discusses design considerations for VR experiences and research.
Scaling
“I’m a Giant: Walking in Large Virtual Environments at High Speed Gains” - CHI 2019
“GiAnt: Stereoscopic-Compliant Multi-Scale Navigation in VEs” - VRST 2016
“Dense and Dynamic 3D Selection for Game-Based Virtual Environments” - TVCG 2012
“The Magic Barrier Tape: A Novel Metaphor for Infinite Navigation in Virtual Worlds with a Restricted Walking Workspace” - VRST 2009
The following research papers and projects focus on various aspects of virtual reality (VR) technology:
- “Embodied interaction using non-planar projections in immersive virtual reality” - Presented at VRST 2015
- “BalloonProbe: reducing occlusion in 3D using interactive space distortion” - Presented at VRST 2005
- “Spherical World in Miniature: Exploring the Tiny Planets Metaphor for Discrete Locomotion in Virtual Reality” - Presented at IEEE VR 2021
- “Design and Evaluation of a Free-Hand VR-based Authoring Environment for Automated Vehicle Testing” - Presented at IEEE VR 2021
- “Design and Evaluation of Navigation Techniques for Multiscale Virtual Environments” - Presented at IEEE VR 2006
- “Evaluating Automatic Parameter Control Methods for Locomotion in Multiscale Virtual Environments” - Presented at VRST 2020
- “vMirror: Enhancing the Interaction with Occluded or Distant Objects in VR with Virtual Mirrors” - Presented at CHI 2021
- Provides virtual mirrors at convenient angles
- “Combining bimanual interaction and teleportation for 3D manipulation on multi-touch wall-sized displays” - Presented at VRST 2016
- “Drift-Correction Techniques for Scale-Adaptive VR Navigation” - Presented at UIST 2019
- “NaviFields: Relevance fields for adaptive VR navigation” - Presented at UIST 2017
- “Slicing-Volume: Hybrid 3D/2D Multi-target Selection Technique for Dense Virtual Environments” - Presented at IEEE VR 2020
- “Navigation with place representations and visible landmarks” - Presented at IEEE VR 2004
- “Poros: Configurable Proxies for Distant Interactions in VR” - Presented at CHI 2021
- Involves creating fixed mappings
- “Nonlinear interactive motion control techniques for virtual space navigation” - Presented at VRAIS 1993
- “Your Place and Mine: Designing a Shared VR Experience for Remotely Located Users” - Presented at DIS 2018
- “Impossible Spaces: Maximizing Natural Walking in Virtual Environments with Self-Overlapping Architecture” - Presented at TVCG 2012
- “Dynamic decomposition and integration of degrees of freedom for 3-D positioning” - Presented at VRST 2010
- “CrOS: a touch screen interaction technique for cursor manipulation on 2-manifolds” - Presented at VRST 2012
- “Podoportation: Foot-Based Locomotion in Virtual Reality” - Presented at CHI 2020
- “Third-person navigation of whole-planet terrain in a head-tracked stereoscopic environment” - Presented at IEEE VR 1999
- “JumpVR: Jump-Based Locomotion Augmentation for Virtual Reality” - Presented at CHI 2020
- “Spacetime: Enabling Fluid Individual and Collaborative Editing in Virtual Reality” - Presented at UIST 2018
- For instance, users can scale down the environment to get an overview and manipulate large virtual objects more easily
- “Magnoramas: Magnifying Dioramas for Precise Annotations in Asymmetric 3D Teleconsultation” - Presented at IEEE VR 2021
Additionally, some papers discuss similar topics or concepts, such as:
- “Visual feedback techniques for virtual pointing on stereoscopic displays” - Presented at VRST 2009
- “Navigation aids for multi-floor virtual buildings: a comparative evaluation of two approaches” - Presented at VRST 2006
- “Interactive Exploration Assistance for Immersive Virtual Environments Based on Object Visibility and Viewpoint Quality” - Presented at IEEE VR 2018
- “Exploration of Large Omnidirectional Images in Immersive Environments” - Presented at IEEE VR 2019
Other projects and papers explore various interaction methods and visualization techniques in virtual environments, such as:
- “The virtual magic lantern: an interaction metaphor for enhanced medical data inspection” - Presented at VRST 2009
- “Worlds-in-Wedges: Combining Worlds-in-Miniature and Portals to Support Comparative Immersive Visualization of Forestry Data” - Presented at IEEE VR 2019
- “Virtual Projection Planes for the Visual Comparison of Photogrammetric 3D Reconstructions with Photo Footage” - Presented at VRST 2020
- “FaceWidgets: Exploring Tangible Interaction on Face with Head-Mounted Displays” - Presented at UIST 2019
- “Slice of Light: Transparent and Integrative Transition Among Realities in a Multi-HMD-User Environment” - Presented at UIST 2020
Furthermore, some projects focus on specific aspects of virtual reality navigation and interaction techniques, such as:
- “Virtual Navigation considering User Workspace: Automatic and Manual Positioning before Teleportation” - Presented at VRST 2020