Do You See What I See? Bring Live Pedestrians into an Outdoor Collaborative Mixed Reality Experience
Published September 27, 2025 in UIST '25: Proceedings of the 38th Annual ACM Symposium on User Interface Software and Technology by Jingyi Zhang, Ziwen Lu, Changrui Zhu, Simon Julier, Anthony Steed
Abstract
Collaborative use of mixed reality (MR) devices is blurring the line between virtual and physical worlds. A remote virtual reality (VR) user immersed in a virtual replica of a real-world environment can interact in real-time with an augmented reality (AR) user who is physically present in that location. One challenge with such a setting is that the virtual world experienced by the remote users often lacks the richness of the real world, particularly in outdoor settings where dynamic elements, such as pedestrians, are missing. The first contribution of this paper is to report findings from focus group sessions on an example collaborative outdoor mixed reality system. Participants noted that lack of synchronisation between the AR and VR worlds diminishes the VR user’s sense of having visited the real-world location together with the AR user. To address this, our second contribution is a system that brings live dynamics into a collaborative MR experience using pedestrians as an example. We conducted a user study using a tour-guide scenario, where an in-situ guide using AR interacts with a remote participant in VR. Results showed that in this scenario, most participants perceived the virtual avatars they saw as representations of real humans in situ.
@inproceedings{10.1145/3746059.3747667,
author = {Zhang, Jingyi and Lu, Ziwen and Zhu, Changrui and Julier, Simon and Steed, Anthony},
title = {Do You See What I See? Bring Live Pedestrians into an Outdoor Collaborative Mixed Reality Experience},
year = {2025},
isbn = {9798400720376},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3746059.3747667},
doi = {10.1145/3746059.3747667},
abstract = {Collaborative use of mixed reality (MR) devices is blurring the line between virtual and physical worlds. A remote virtual reality (VR) user immersed in a virtual replica of a real-world environment can interact in real-time with an augmented reality (AR) user who is physically present in that location. One challenge with such a setting is that the virtual world experienced by the remote users often lacks the richness of the real world, particularly in outdoor settings where dynamic elements, such as pedestrians, are missing. The first contribution of this paper is to report findings from focus group sessions on an example collaborative outdoor mixed reality system. Participants noted that lack of synchronisation between the AR and VR worlds diminishes the VR user’s sense of having visited the real-world location together with the AR user. To address this, our second contribution is a system that brings live dynamics into a collaborative MR experience using pedestrians as an example. We conducted a user study using a tour-guide scenario, where an in-situ guide using AR interacts with a remote participant in VR. Results showed that in this scenario, most participants perceived the virtual avatars they saw as representations of real humans in situ.},
booktitle = {Proceedings of the 38th Annual ACM Symposium on User Interface Software and Technology},
articleno = {171},
numpages = {14},
keywords = {Virtual Reality, Augmented Reality, Mixed Reality Collaboration},
series = {UIST '25}
}