Ubiq Tutorial at IEEE VR 2022
Posted February 22, 2022 by Anthony Steed, Sebastian Friston, Ben Congdon, Lisa Izzouzi, Klara Brandstätter, Nels Numan ‐ 5 min read
Learn how to build your own social virtual reality with Ubiq at the IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR) 2022!
One of the most promising applications of consumer virtual reality technology is its use for remote collaboration. A wide variety of social virtual reality (SVR) applications are now available, from competitive games among small numbers of players through to conference-like setups supporting dozens of visitors. Indeed, many participants at IEEE Virtual Reality 2022 will be experiencing at least some of the conference through an SVR application. The implementation strategies of different SVR applications are very diverse, with few standards or conventions to follow. There is an urgent need for researchers to be able to develop and deploy test systems so as to facilitate a range of research, from new protocols and interaction techniques for SVRs through to multi-participant experiments on the impact of avatar appearance. This tutorial will explain the key concepts behind SVR software and introduce Ubiq, an open-source (Apache licence) platform for developing your own SVR applications.
The schedule of this tutorial is as follows.
- Part 1: Sunday, March 13, 2022, 8:00 - 9:30 NZDT (UTC+13)
- Part 2: Sunday, March 13, 2022, 10:00 - 11:30 NZDT (UTC+13)
During the tutorial, we will cover the following topics.
- Introduction and Overview (all presenters)
- General Overview of Social VR Platforms (Steed)
- Overview of Ubiq’s Structure (Friston)
- Your First Ubiq Application (Congdon)
- Where to Go Next (Izzouzi, Brandstätter, Friston, Congdon, Numan)
- Q&A (all presenters)
Try a demo during the tutorial by running Ubiq as described below!
Launch in WebXR: click the button below or go to nexus.cs.ucl.ac.uk.
Download .apk: download an .apk to install on your Meta Quest by clicking the button below. Sideloading requires a developer account and a tool such as SideQuest.
Upon launch, you will automatically be connected to a lobby with available space and linked to other users via voice chat.
Please contact us over the tutorial’s Discord channel if there are any issues using or installing the apps.
Desktop controls
- WASD keys to move around
- Hold right click and move the mouse to look around
- Left click to interact with the UI
- Middle mouse button to grab items in the scene
- Left click to ‘use’ the items
Meta Quest controls
- Left stick to fly
- Hold and release primary button to teleport
- Swipe stick to snap-turn
- Point and use the trigger to interact with the UI
- Grip to grab items in the scene
- Trigger to use items
Anthony Steed is Head of the Virtual Environments and Computer Graphics group in the Department of Computer Science at University College London. He has over 25 years’ experience in developing effective immersive experiences. While his early work focussed on the engineering of displays and software, more recently it has focussed on user engagement in collaborative and telepresent scenarios. He received the IEEE VGTC’s 2016 Virtual Reality Technical Achievement Award. Recently he was a Visiting Researcher at Microsoft Research, Redmond and an Erskine Fellow at the Human Interface Technology Laboratory in Christchurch, New Zealand.
Sebastian Friston received his EngD from University College London in 2017. He is currently a Research Associate in the Virtual Environments and Computer Graphics group at University College London. His work received the IEEE VR Best Dissertation Award (2018). His research interests are in how to build high-fidelity virtual worlds, specifically the problems of rendering and networking. His most recent work is on sharing immersive physical simulations and the social VR platform Ubiq.
Ben Congdon is a research associate with the VECG group at UCL. He received an MEng in Computer Science, also from UCL. His PhD topic was redirected walking in obstacle-rich virtual environments. Ben has worked as a software engineer in Formula One and a researcher in telecommunications. His current work is on open-source software to improve access to mixed reality development.
Lisa Izzouzi is a Ph.D. student and Marie Curie Research Fellow working under the supervision of Anthony Steed in the VECG group at University College London. Prior to this, she obtained an engineering master’s degree in Electronics and Numerical Technologies from Polytech Nantes, and a research master’s degree in Management of 3D Interactive Technologies from Arts et Metiers ParisTech. Her research interest lies in the evaluation of social interactions in virtual environments. She aims to find new ways to enhance communication, collaboration, and trust between multiple people sharing the same virtual environment.
Klara Brandstätter is a Ph.D. student and Marie Curie Trainee at the VECG group at UCL, supervised by Anthony Steed. She received an MSc in Visual Computing and a BSc in Media Informatics and Visual Computing from TU Wien. Her goal is to gather insights from social scenes in VR, use these insights to teach virtual agents believable and adequate social behaviours, and eventually create lively and interactive virtual environments that are inhabited by real and virtual humans alike.
Nels Numan is a Ph.D. student at the VECG group at UCL, where he is supervised by Anthony Steed and Simon Julier. Previously, he worked as a data scientist at Microsoft and IBM in the Netherlands. He received an MSc in Computer Science from Delft University of Technology and a BSc in Computer Science from Leiden University. His current research interests lie at the intersection of immersive technology, human-computer interaction, and machine learning; with the ultimate goal to learn about how machines can best serve people.