Open Access
ARTICLE
An Efficient Viewport-Dependent 360 VR System Based on Adaptive Tiled Streaming
1 Department of Computer Engineering, Gachon University, Seongnam, 13120, Korea
2 Department of Computer Education, Sungkyunkwan University, Seoul, 03063, Korea
* Corresponding Author: Eun-Seok Ryu. Email:
Computers, Materials & Continua 2021, 66(3), 2627-2643. https://doi.org/10.32604/cmc.2021.013399
Received 05 August 2020; Accepted 31 August 2020; Issue published 28 December 2020
Abstract
Recent advances in 360 video streaming technologies have enhanced the immersive experience of video streaming services. In particular, 360 video encoding formats have immense potential for building highly immersive virtual reality (VR) systems. However, 360 video streaming requires considerable bandwidth, and its performance depends on several factors. Consequently, optimizing the 360 video bitstream according to the texture visible in the user's viewport is crucial. Therefore, we propose an adaptive solution for VR systems using viewport-dependent tiled 360 video streaming. To increase users' freedom of movement, the Moving Picture Experts Group (MPEG) recently defined three degrees of freedom plus (3DoF+) and six degrees of freedom (6DoF) to support free user movement within camera-captured scenes. The proposed method supports 6DoF to allow users to move their heads freely. Herein, we propose viewport-dependent tiled 360 video streaming based on users' head movements. The proposed system generates an adaptive bitstream using tile sets that are selected according to a parameter set describing the user's viewport area. The extracted bitstream is then transmitted to the user's computer. After decoding, the user's viewport is generated and rendered on a VR head-mounted display (HMD). Furthermore, we introduce several approaches to reduce motion-to-photon latency. The experimental results demonstrate that, compared with non-tiled streaming, the proposed method achieves high-performance 360 video streaming for VR systems, with a 25.89% BD-rate saving for Y-PSNR and a 61.16% saving in decoding time.
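To illustrate the general idea of viewport-dependent tile selection described in the abstract, the following minimal Python sketch maps a viewer's yaw/pitch and field of view onto an equirectangular tile grid. The 6×4 grid, the 90° field of view, and the angular bounding-box approximation of the viewport are illustrative assumptions, not the configuration used in the paper.

```python
# Minimal sketch (not the authors' implementation): selecting the equirectangular
# tiles that overlap a viewer's viewport, assuming the viewport can be
# approximated by a yaw/pitch bounding box. Grid size, FOV, and wrap-around
# handling are illustrative assumptions.

def tiles_for_viewport(yaw_deg, pitch_deg, hfov_deg=90.0, vfov_deg=90.0,
                       tile_cols=6, tile_rows=4):
    """Return (row, col) indices of tiles intersecting the viewport."""
    tile_w = 360.0 / tile_cols          # tile width in degrees of longitude
    tile_h = 180.0 / tile_rows          # tile height in degrees of latitude

    # Approximate the viewport by its angular bounding box.
    lon_min, lon_max = yaw_deg - hfov_deg / 2, yaw_deg + hfov_deg / 2
    lat_min = max(-90.0, pitch_deg - vfov_deg / 2)
    lat_max = min(90.0, pitch_deg + vfov_deg / 2)

    selected = set()
    for row in range(tile_rows):
        t_lat_min = 90.0 - (row + 1) * tile_h
        t_lat_max = 90.0 - row * tile_h
        if t_lat_max < lat_min or t_lat_min > lat_max:
            continue                     # tile row is entirely outside the viewport
        for col in range(tile_cols):
            t_lon_min = -180.0 + col * tile_w
            t_lon_max = t_lon_min + tile_w
            # Handle longitude wrap-around by testing shifted copies of the tile.
            for shift in (-360.0, 0.0, 360.0):
                if t_lon_max + shift >= lon_min and t_lon_min + shift <= lon_max:
                    selected.add((row, col))
                    break
    return sorted(selected)

if __name__ == "__main__":
    # Viewer looking slightly up and to the right of the front direction.
    print(tiles_for_viewport(yaw_deg=30.0, pitch_deg=15.0))
```

Only the selected tiles would be included in the extracted bitstream sent to the client; the remaining tiles can be omitted or delivered at lower quality, which is the source of the bitrate and decoding-time savings reported above.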
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.