Meet Emily. She and her family have all been working remotely from home for the last 14 months, including several lockdown periods. Finally, they’ve all had their COVID-19 vaccines and Emily is looking for a sunny holiday destination for the late summer. She usually prefers vacationing by the sea under the blue Mediterranean sky.
Emily uses a virtual tour app to check out the hotels and their facilities. She finds the one she likes, and it’s got a strong recommendation record as well. Emily puts on her head-mounted display and zaps into the virtual tour provided by the hotel. The 360-degree video is enriched with 2D overlays providing her the additional information she needs, such as an introductory video from the hotel manager, comment videos from other visitors and links to 360-degree videos from different areas in the hotel. It is as if she was already there, by the beach enjoying the warm summer breeze.
These kinds of immersive experiences are becoming a reality, thanks to the high-bandwidth and low-latency features of 5G. When combined with efficient and versatile video codecs and emerging media delivery standards, streaming high-quality 360-degree video will be no different from streaming 2D video from your favorite streaming service.
Let’s break down the components of such a system and examine a little deeper how it works.
Omnidirectional Media Format paves the way for immersive experiences
Delivering a vivid holiday experience to Emily requires capturing the 360-degree environment at a very high resolution and frame rate. The captured raw video then needs to be efficiently compressed with standardized video codecs such as Advanced Video Coding (AVC/H.264), High Efficiency Video Coding (HEVC/H.265) and Versatile Video Coding (VVC/H.266). Finally, a dedicated 360-degree media carriage and streaming format is needed to deliver the compressed video as an enriched experience to Emily’s virtual tour app.
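To get a feel for why efficient compression is essential here, consider a back-of-the-envelope estimate of the raw data rate of an 8K 360-degree capture versus plausible compressed bitrates. The resolution, frame rate and compression ratios below are illustrative assumptions for this sketch, not figures from any standard:

```python
# Back-of-the-envelope bitrate estimate for 8K 360-degree video.
# All parameters below are illustrative assumptions.

WIDTH, HEIGHT = 7680, 4320      # 8K equirectangular frame
FPS = 60                        # frames per second
BITS_PER_PIXEL = 12             # 8-bit samples with 4:2:0 chroma subsampling

raw_bps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS
print(f"Raw capture: {raw_bps / 1e9:.1f} Gbit/s")

# Rough, content-dependent compression ratios (hypothetical values):
ratios = {"AVC/H.264": 100, "HEVC/H.265": 200, "VVC/H.266": 330}
for codec, r in ratios.items():
    print(f"{codec}: ~{raw_bps / r / 1e6:.0f} Mbit/s")
```

Even under these rough assumptions the raw stream is on the order of tens of gigabits per second, which is why modern codecs and smart delivery formats are both needed before 5G bandwidth alone can make 360-degree streaming practical.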
At this point, a new standard for 360-degree media called MPEG Omnidirectional Media Format (OMAF) comes to the rescue. OMAF (ISO/IEC 23090-2) supports a wide range of omnidirectional media delivery features, such as overlays, which enable 2D videos and images to be embedded into the 360-degree video, and hotspots, which let the viewer interact with the content and navigate to 360-degree videos from other physical locations.
OMAF also has visual quality optimization features, such as maximizing the quality and resolution of the video in the direction the viewer is looking (the viewport). This feature is called viewport-dependent delivery (VDD), and it can reduce the streaming bandwidth requirements per viewer by up to 50%.
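A common way to realize viewport-dependent delivery is to split the 360-degree frame into tiles and stream only the tiles inside the current viewport at high quality, with a low-quality fallback for the rest. The tile counts and per-tile bitrates below are illustrative assumptions chosen to show how a saving of roughly 50% can arise, not values taken from the OMAF specification:

```python
# Sketch of the bandwidth saving from viewport-dependent delivery (VDD).
# Tile grid size and per-tile bitrates are hypothetical.

TILES = 6 * 4                 # 6x4 tile grid covering the full sphere
VIEWPORT_TILES = 8            # tiles covered by the current viewport
HIGH_MBPS_PER_TILE = 2.0      # high-quality tile bitrate (assumed)
LOW_MBPS_PER_TILE = 0.5       # low-quality fallback bitrate (assumed)

uniform = TILES * HIGH_MBPS_PER_TILE
vdd = (VIEWPORT_TILES * HIGH_MBPS_PER_TILE
       + (TILES - VIEWPORT_TILES) * LOW_MBPS_PER_TILE)
saving = 1 - vdd / uniform

print(f"Uniform quality:    {uniform:.1f} Mbit/s")
print(f"Viewport-dependent: {vdd:.1f} Mbit/s ({saving:.0%} saving)")
```

The low-quality fallback tiles matter: when the viewer turns their head, something is always on screen while the newly visible tiles are re-fetched at high quality.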
The MPEG organization recently finalized the standardization of the OMAF second edition, whose features include overlays, viewpoints and content streaming in up to 8K resolution. When combined with the high-bitrate and low-latency features of 5G, the standard will power streaming services for 360-degree content in the not-so-distant future.
Virtual tours, VR concerts and events, 360-degree videos for education, and employee orientation and training in VR are just a few of the use cases and experiences that MPEG OMAF and its standards ecosystem enable. Nokia is a leader and an active contributor to the MPEG OMAF standard and its related ecosystem of standards that pave the way to future immersive media experiences. Nokia has also released the OMAF source code and made it publicly available on GitHub to further support the adoption and implementation of the standard.
Spatial audio and 360-degree video coming together for immersive conferencing
Let’s now leave Emily with her dream vacation plans and turn our attention to Tom. He supervises a team of field workers in an open-pit mine. During a routine field check, two of his colleagues have identified an issue that requires Tom’s immediate attention. He is far away from the physical location and his crew wants to show him the surroundings, so they decide to set up an immersive video call. With the help of a 5G-connected 360-degree camera, Tom can have a real-time conversation with his team and examine what they are pointing at. This saves a lot of time and effort for the crew, letting them focus on what’s most important: communicating and sharing information for operational situational awareness and decision support.
The underlying technology and protocol stack for the conversational, immersive service that helped Tom and his team is quite different from what Emily used for her virtual tour experience. Right now, a new video services standard called Immersive Teleconferencing and Telepresence for Remote Terminals (ITT4RT) is being developed with Nokia’s active involvement and leadership in the 3GPP standardization partnership. When finalized, ITT4RT will enable real-time, immersive conversational services that utilize 360-degree video and spatial audio. ITT4RT also provides a standardized way to enrich communication with additional media, including 2D overlays (e.g. a presentation or a helmet camera feed). Moreover, real-time visual quality adaptation further enhances the user experience, thanks to the high bitrates and low latency provided by 5G.
Emre B. Aksu is a research manager at Nokia Technologies with over 20 years of experience in multimedia systems technologies. His team’s research focuses on immersive multimedia systems, cloud-based and AI-enabled media processing technologies and their standardization. As a standardization delegate of Nokia, he actively contributes as an editor to several multimedia systems standards, including in the File Format Subgroup under ISO/IEC JTC 1/SC 29/WG 3, which won the Technology & Engineering Emmy Award for 2020.