Yung-Ta Lin, Yi-Chi Liao, Shan-Yuan Teng, Yi-Ju Chung, Liwei Chan, and Bing-Yu Chen. 2017. Full Paper, UIST 2017, Quebec City, QC, Canada.
360-degree video captures the full field of the surrounding environment. However, when browsing these videos, whether on a screen or through a head-mounted display (HMD), users see only a subset of the full field at any moment, as in natural viewing. This creates a search problem: a region of interest (ROI) may lie outside the current field of view (FOV), or users may end up searching for ROIs that do not exist.
We propose Outside-In, a visualization technique that re-introduces off-screen ROIs into the main screen as spatial picture-in-picture (PIP) previews. The geometry of each preview window further encodes the ROI's location relative to the main screen view, allowing for effective navigation. In an 18-participant study, we compared Outside-In with traditional arrow-based guidance on three types of 360-degree video. Results show that Outside-In outperforms arrow-based guidance in understanding spatial relationships and the storyline of the content, as well as in overall preference. We demonstrate two applications of Outside-In: 360-degree video navigation on touchscreens and live telepresence.
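To give a concrete sense of the idea, the sketch below shows one plausible way to map an off-screen ROI's direction to an anchor point on the border of the main view, where a spatial PIP preview could be drawn. This is an illustrative assumption for readers, not the geometry or code used in the paper; all function names and parameters here are hypothetical.

```python
import math

def pip_anchor(roi_yaw_deg, roi_pitch_deg, view_yaw_deg, view_pitch_deg,
               half_fov_x_deg=45.0, half_fov_y_deg=30.0):
    """Map a ROI direction to a normalized anchor point for a PIP preview.

    Angles are in degrees; yaw/pitch describe the ROI and the current view
    center. Returns (x, y, off_screen) with (x, y) in [-1, 1]^2, clamped to
    the screen border when the ROI is off-screen. Illustrative only.
    """
    # Angular offset of the ROI from the current view center,
    # wrapped to (-180, 180] so left/right offsets are symmetric.
    d_yaw = (roi_yaw_deg - view_yaw_deg + 180.0) % 360.0 - 180.0
    d_pitch = roi_pitch_deg - view_pitch_deg

    # Normalize by the half field of view; |value| > 1 means off-screen.
    nx = d_yaw / half_fov_x_deg
    ny = d_pitch / half_fov_y_deg

    if abs(nx) <= 1.0 and abs(ny) <= 1.0:
        return nx, ny, False  # ROI is already visible in the main view

    # Push the anchor to the screen border along the ROI's direction,
    # so the preview's placement hints at where the viewer should turn.
    scale = max(abs(nx), abs(ny))
    return nx / scale, ny / scale, True


if __name__ == "__main__":
    # A ROI 120 degrees to the viewer's right: anchored at the right edge.
    print(pip_anchor(roi_yaw_deg=120, roi_pitch_deg=10,
                     view_yaw_deg=0, view_pitch_deg=0))
```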
This work is my master's thesis, and I am the first author. I was responsible for leading the team and implementing the core algorithm.