Qi Sun, Anjul Patney, Li-Yi Wei, Omer Shapira, Jingwan Lu, Paul Asente, Suwen Zhu, Morgan McGuire, David Luebke, Arie Kaufman
SIGGRAPH 2018
We present an end-to-end VR redirected walking system based on saccadic suppression and GPU-based dynamic path planning.
Redirected walking techniques can enhance the immersion and visual-vestibular comfort of virtual reality (VR) navigation, but are often limited by the size, shape, and content of the physical environments. We propose a redirected walking technique that works in small physical environments with static or dynamic obstacles. Using a head- and eye-tracking VR headset, our method detects saccadic suppression and redirects the user during the resulting temporary blindness. Our dynamic path planning runs in real time on a GPU, and can thus avoid static and dynamic obstacles, including walls, furniture, and other VR users sharing the same physical space. To further enhance saccadic redirection, we propose subtle gaze direction methods tailored for VR perception. We demonstrate that saccades can significantly increase rotation gains during redirection without introducing visual distortions or simulator sickness. Our method can thus map large open virtual spaces onto small physical environments for room-scale VR. We evaluate our system via numerical simulations and real user studies.
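As a rough illustration of the core idea (a sketch, not the paper's implementation), the snippet below gates a small camera-yaw offset on a simple velocity-threshold saccade detector: when gaze speed exceeds a threshold, vision is assumed to be suppressed and a redirection rotation is injected. The threshold value, gain value, and all function names here are assumptions for illustration, not the paper's calibrated parameters.

```python
import math

# Illustrative constants; the paper's calibrated values are not reproduced here.
SACCADE_SPEED_DEG_PER_S = 180.0  # assumed gaze-speed threshold for detecting a saccade
GAIN_DEG_PER_FRAME = 0.5         # assumed extra camera yaw injected while vision is suppressed

def gaze_speed_deg_per_s(prev_dir, cur_dir, dt):
    """Angular speed between two unit gaze-direction vectors over dt seconds."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(prev_dir, cur_dir))))
    return math.degrees(math.acos(dot)) / dt

def redirected_yaw(camera_yaw_deg, prev_dir, cur_dir, dt, steer_sign):
    """Add a small yaw offset only during detected saccadic suppression.

    steer_sign is +1 or -1, chosen each frame by a path planner to steer
    the user away from physical obstacles and other users.
    """
    if gaze_speed_deg_per_s(prev_dir, cur_dir, dt) > SACCADE_SPEED_DEG_PER_S:
        camera_yaw_deg += steer_sign * GAIN_DEG_PER_FRAME
    return camera_yaw_deg
```

In the full system, a dynamic path planner (GPU-based in the paper) would supply the steering direction each frame, closing the loop between obstacle avoidance and saccade-gated rotation.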
[paper] [Morgan's page] [fast-forward (youtube)] [video (youtube)] [two-minute papers] [GTC 2018 demo]
We would like to thank Michael Stengel for early discussions, Rachel Albert for conducting initial vision studies, Mavey Ma for video dubbing, Jonghyun Kim, Joohwan Kim, Erik Lindholm, Alexander Majercik, and Ward Lopes for helping with our GTC live demo, and the anonymous reviewers for their valuable suggestions. This project is partially supported by National Science Foundation grants CNS-1302246, NRT-1633299, and CNS-1650499, and a gift from Adobe.