Looking for research similar to Asymmetric Reality.
2024/01~
- I feel like I’ve looked through quite a wide range of studies, so I want to organize this.
- Categorization:
- What is asymmetric?
- Spatial arrangement of objects, temporal arrangement of events, ...
- What kind of asymmetry?
2023/11~ Following up on The Two-User Responsive Workbench: Support for Collaboration Through Individual Views of a Shared Space, which Steve introduced me to.
Searching for papers citing The Two-User Responsive Workbench: Support for Collaboration Through Individual Views of a Shared Space:
- Survey type:
- MR-based co-design in manufacturing
- Survey on Collaborative Work in Augmented Reality
- Design considerations for collaborative visual analytics
- A paper written by the same author about 10 years ago
- Doesn’t seem to specifically mention subjective, specialized reality
- Enveloping users and computers in a collaborative 3D augmented reality
- Lessons Learned from Employing Multiple Perspectives in a Collaborative Virtual Environment for Visualizing Scientific Data
- This seems quite relevant
- Spacetime: Enabling Fluid Individual and Collaborative Editing in Virtual Reality
- This as well: “There is No Need to Always Be in Sync”
- Lenticular Objects: 3D Printed Objects with Lenticular Lens Surfaces That Can Change their Appearance Depending on the Viewpoint
- This is more of a study focused on fabrication technology, but the phenomenon it produces (viewpoint-dependent appearance) is similar
- Collaborative Visual Analysis with Multi-Level Information Sharing Using a Wall-size Display and See-Through HMDs
- Several studies on “Collaboration With Specialized Views” are introduced in this paper’s related work:
- The Two-User Responsive Workbench: Support for Collaboration Through Individual Views of a Shared Space
- Observations of record-keeping in co-located collaborative analysis
- Collaborative coupling over tabletop displays
- Branch-explore-merge: Facilitating real-time revision control in collaborative visual exploration
- Spidereyes: Designing attention- and proximity-aware collaborative interfaces for wall-sized displays
- Many of these focus on convenient individual optimization features (like filters) rather than exploring interesting cases akin to Asymmetric Reality
- It’s important to consider what makes a case interesting
- It might be more intriguing when the reality itself differs, rather than just the modifications or overlaid information (blu3mo)
- This relates to the differences between X and Y in All Reality: Virtual, Augmented, Mixed (X), Mediated (X,Y), and Multimediated Reality
- Democratizing Rendering for Multiple Viewers in Surround VR Systems
- Methods for using CAVE-like environments with multiple users; somewhat technically interesting and structurally similar
- Effects of Layer Partitioning in Collaborative 3D Visualizations
- The work done here is based on the original paper
- It deals with “View-Dependent Co-located Visualization” in the real world, a narrower scope than Asymmetric Reality
- Exploring Stereoscopic Multi-user Interaction with Individual Views
- This seems quite relevant
- “When users are interacting in a collaborative virtual environment it can neither be guaranteed that every user has the same input device nor that they have access to the same information.”
Papers citing Lessons Learned from Employing Multiple Perspectives in a Collaborative Virtual Environment for Visualizing Scientific Data:
- ShiSha: Enabling Shared Perspective With Face-to-Face Collaboration Using Redirected Avatars in Virtual Reality
- This is quite similar to a recent conversation I had with Steve
- Really like this (blu3mo)
- Exploring the Impact of Asymmetrical Interfaces on Presence and Group Awareness in Mixed Reality Collaborative Environments
Vicarious: Context-aware Viewpoints Selection for Mixed Reality Collaboration
Partially Blended Realities: Aligning Dissimilar Spaces for Distributed Mixed Reality Meetings
- Citing:
Placement Retargeting of Virtual Avatars to Dissimilar Indoor Environments
- Citing:
- Many involve remapping avatar movements
- Spatial relationship preserving character motion adaptation
- Relationship descriptors for interactive motion adaptation
- Harmonic parameterization by electrostatics
- Works by Sung-Hee Lee
- Interested in studying the techniques used in these
- Cited by:
- Re-locations: Augmenting Personal and Shared Workspaces to Support Remote Collaboration in Incongruent Spaces
- Partially Blended Realities: Aligning Dissimilar Spaces for Distributed Mixed Reality Meetings
- Real-time Retargeting of Deictic Motion to Virtual Avatars for Augmented Reality Telepresence
- A Mixed Reality Telepresence System for Dissimilar Spaces Using Full-Body Avatar
- An older study by the same author as Placement Retargeting of Virtual Avatars to Dissimilar Indoor Environments
- Synthesizing Novel Spaces for Remote Telepresence Experiences
- Table2Table: Merging Similar Workspaces and Supporting Adaptive Telepresence Demonstration Guidance
- Sung-Hee Lee’s lab at KAIST has conducted numerous studies on handling avatars across dissimilar spaces
- Perhaps one of the first to delve into the finer movements of avatars?
— Read up to here on 2024/01/15
## SpaceTime: Adaptive Control of the Teleported Avatar for Improved AR Tele-conference Experience
This seems to be a relatively old related study.
Citations (unfamiliar ones):
- Predict-and-Drive: Avatar Motion Adaption in Room-Scale Augmented Reality Telepresence with Heterogeneous Spaces
- AnyPlace: Automatic Gaze Alignment of the Teleported Avatar for MR Collaborative Environments
- Deictic Gesture Retargeting for Telepresence Avatars in Dissimilar Object and User Arrangements
- Interaction motion retargeting to highly dissimilar furniture environment
Predict-and-Drive: Avatar Motion Adaption in Room-Scale Augmented Reality Telepresence with Heterogeneous Spaces
Citations:
- Creating Automatically Aligned Consensus Realities for AR Videoconferencing
- Optimization and manipulation of contextual mutual spaces for multi-user virtual and augmented reality interaction
- Adjusting relative translation gains according to space size in redirected walking for mixed reality mutual space generation
- Virtual agent positioning driven by scene semantics in mixed reality
- Although not directly related, its mechanism for understanding scene semantics could be a useful reference.
Search for more on ResearchRabbit:
- Mini-Me: An Adaptive Avatar for Mixed Reality Remote Collaboration
- A Multi-Objective Optimization Framework for Redirecting Pointing Gestures in Remote-Local Mixed/Augmented Reality