Sep 2017
360° VR Live-Streaming Workflow
The hype around VR is dwindling, and the industry is now faced with the reality of delivering on the promise of giving consumers compelling content and experiences. 360° VR live streaming has gained some traction over the past year as the production process has improved. Until recently, capturing 360° video was a difficult, often convoluted process, and panoramic camera systems combined a morass of different technologies based on work-arounds and compromises.
Today, cameras have increased in resolution and features, the production hardware has become more powerful, and the software is easier to use, making 360° VR video content production more accessible and efficient. VR production isn’t plug-and-play yet, but it’s getting closer.
Streaming 360° has become a much more refined process, allowing creativity to reign supreme for content creators instead of burdening them with technical challenges and limitations. There are still some hurdles to overcome, such as the need to improve frame rates, resolution, compression, and bandwidth, but those problems will be solved in time. In addition, much of the required equipment for VR production and delivery remains too costly and unreliable to engender widespread adoption and growth to match the interest in VR on the consumer end.
There is currently a large number of manufacturers drowning creators in a sea of confusing options. To help VR content creators contend with the deluge of information and product announcements, this article will focus on VR streaming workflows and toolkits that have worked well together in various VR/360 productions I’ve done.
But let’s start with a quick overview of camera options and other components before we look at how they fit together in different VR streaming setups.
Cameras
VR/360 cameras now come in all shapes and sizes. You’ll find them configured in rigs utilizing an array of cameras in a premade or custom harness, or as a complete camera unit featuring built-in stitching.
For video streaming, you can use either type of camera setup, depending on your particular project needs or budget. You must determine which camera fits your intended purpose, since each has its own strengths and weaknesses. For example, you may want a rig that is small and compact to cover an outdoor sporting event, as opposed to a larger rig that has greater resolution and coverage.
There are many uses for streaming VR. Some current implementations include product launches, brand activations, and concerts. Choosing the right camera can make or break your production. A good, general-purpose camera rig might consist of four to eight GoPro HERO5 Blacks with a Freedom360 or 360RIZE array (Figure 1, below). Alternatively, you might build a 360 rig with Blackmagic Design Micro Studio Cameras in a harness also available from 360RIZE.
Figure 1. GoPro HERO5 Black cameras in a 360RIZE harness
Two things to be aware of with GoPros are their tendency to overheat and their limited battery life. If you’re going the GoPro route, I recommend adding a fan and a battery-based USB power source to keep your rig running reliably longer.
Right now, all-in-one cameras are gaining cachet, since they add simplicity, synchronization, and automation to the capture process. Here, too, each system configuration has its strengths and weaknesses. I recommend the ImagineVision Z Cam S1 (Figure 2, below), featured in Case 2, since it delivers high-quality capture overall, with excellent optics and reliability, while offering end-to-end features that make production easier.
Figure 2. The ImagineVision Z Cam S1
The S1 is a professional VR camera with four high-resolution sensors that have synced auto white balance and exposure. The camera can shoot 6K video at 30 fps and 4K video at 60 fps. In addition, it captures video files directly to the camera’s four SD cards and can record up to 120 minutes on a full battery. It is also equipped with four HDMI outputs and an Ethernet port for file transfer and control. The camera has a rugged, full-metal body embedded with four directional microphones.
My company has used this solid performer with great success, and I would absolutely trust it on future productions. For now, if it were my choice, I would exclusively use the Z Cam line of cameras over any other VR camera systems we’ve tested. It rivals most others that cost up to 10 times as much.
Live Stitching
360° video production requires stitching together footage from multiple sources, since each camera outputs a separate video stream that represents only one segment of the 360° image. Stitching is processor-intensive, and requires a powerful workstation and graphics card to run the stitching software.
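To see why stitching is so demanding, consider what the software has to do for every frame: warp each camera's image into a shared equirectangular canvas and blend the overlapping regions. The Python sketch below is purely illustrative, not any vendor's actual pipeline; the calibration data is faked to stand in for a real lens/rig template.

# A minimal sketch of per-frame live stitching: remap each camera's image into
# equirectangular space and feather-blend the overlaps. The remap tables and
# masks would normally come from a one-time calibration; here they are fake.
import numpy as np
import cv2

OUT_W, OUT_H = 3840, 1920          # equirectangular output canvas (2:1 aspect)
CAM_W, CAM_H, NUM_CAMS = 1920, 1440, 4

def fake_calibration(num_cams):
    """Stand-in for real calibration data: per-camera remap tables plus blend masks."""
    rng = np.random.default_rng(0)
    calib = []
    for _ in range(num_cams):
        # map_x/map_y say, for every output pixel, which source pixel to sample
        map_x = rng.uniform(0, CAM_W - 1, (OUT_H, OUT_W)).astype(np.float32)
        map_y = rng.uniform(0, CAM_H - 1, (OUT_H, OUT_W)).astype(np.float32)
        mask = rng.uniform(0, 1, (OUT_H, OUT_W, 1)).astype(np.float32)
        calib.append((map_x, map_y, mask))
    # normalize the masks so overlapping cameras blend to full brightness
    total = sum(m for _, _, m in calib)
    return [(mx, my, m / total) for mx, my, m in calib]

def stitch_frame(frames, calib):
    """Warp every camera frame into the panorama and blend them together."""
    pano = np.zeros((OUT_H, OUT_W, 3), np.float32)
    for frame, (map_x, map_y, mask) in zip(frames, calib):
        warped = cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR)
        pano += warped.astype(np.float32) * mask
    return pano.astype(np.uint8)

calib = fake_calibration(NUM_CAMS)
frames = [np.zeros((CAM_H, CAM_W, 3), np.uint8) for _ in range(NUM_CAMS)]
pano = stitch_frame(frames, calib)   # in a live stream, this runs ~30 times every second

Doing that per-pixel remap and blend 30 times a second, at panorama resolutions, is what makes a capable GPU and workstation non-negotiable.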
The Z Cam Controller WonderLive software (Figure 3, below), which is included with the Z Cam cameras, enables real-time stitching. It also offers an external audio input and nadir logo patching, which lets you insert a logo for branding at the bottom of the sphere. It can even output the stitched feed via SDI to feed a separate encoder if needed.
Figure 3. Z Cam Controller WonderLive stitching software
Hardware system requirements and key supported features (such as 4K output and Facebook Live integration) for WonderLive can be found at z-cam.com/wonderlive.
If you decide to use cameras from manufacturers that don't supply their own stitching app, then VideoStitch Vahana VR is the go-to piece of software. Vahana VR (Figure 4, below) is the only standalone PC application capable of stitching feeds from separate cameras into VR video in real time for live streaming.
Figure 4. Vahana VR stitching software
I recommend incorporating four-input video cards from either Magewell or AJA, which let you bring feeds from multiple cameras, whether HDMI or HD-SDI, into the workstation so they can be stitched into a single 360° video and streamed out via RTMP.
Vahana VR system requirements and key supported features (such as 8K output and Facebook 360 Live integration) are available here: go2sm.com/43.
A viable stitching/streaming alternative is the Teradek Sphere (Figure 5, below), which is a unique hardware and software solution that allows for live preview and live stitching of panoramic video. The Sphere, which retails for $3,000, enables you to combine four HDMI feeds from various cameras (up to eight when networked), such as the GoPro Hero or Blackmagic Design Micro Studio Cameras, and stitch and stream them in real time from an iPad app.
Figure 5. Teradek Sphere
Teradek charges a one-time license fee for live streaming with the Sphere. It's also available in an SDI version, which costs an additional $400.
Encoding/Streaming Hardware
The next stage after stitching is encoding, either via software or dedicated hardware encoders from reputable manufacturers such as Harmonic, Haivision, or Elemental. Streams should be delivered at multiple bitrates to cater to viewers with different internet speeds.
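As a concrete example, here is one way to sketch out a bitrate ladder in Python. The rungs below are illustrative numbers loosely aligned with the bitrates discussed later in this article, not a published recommendation; adjust them to your content, codec, and CDN.

# A hedged sketch of a multi-bitrate ladder for an equirectangular 360° stream.
from dataclasses import dataclass

@dataclass
class Rung:
    width: int
    height: int
    fps: int
    video_kbps: int
    audio_kbps: int = 128

LADDER = [
    Rung(3840, 2160, 30, 16000),   # top rung, roughly the 4K range cited below
    Rung(2560, 1440, 30, 8000),
    Rung(1920, 1080, 30, 4500),
    Rung(1280, 720, 30, 2500),     # fallback for constrained connections
]

for r in LADDER:
    total_mbps = (r.video_kbps + r.audio_kbps) / 1000
    print(f"{r.width}x{r.height} @ {r.fps} fps -> ~{total_mbps:.1f} Mbps total")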
Encoding VR content requires more processing power than typical HD or UHD content. A high-end workstation or a higher-performing hardware encoder is critical. I recommend investing in as much CPU and GPU processing power as you can afford to prevent any bottlenecks and to future-proof your system as well.
Although streaming VR from your encoder to your CDN over RTMP, MPEG-DASH, or HLS works the same as in standard live-streaming workflows, VR video, especially at 4K, requires large amounts of bandwidth, often 20+Mbps with the H.264 codec at higher frame rates. Once HEVC/H.265 is widely adopted with increased hardware support on mobile devices and desktops, it will bring much more efficiency with lower overhead, thereby allowing for better-quality streams.
You can optimize your upstream by using HEVC encoding as a mezzanine encode. HEVC is roughly twice as efficient as H.264. Generally, a 4K stream in H.264 might need 12–15Mbps of bandwidth. With HEVC, you can stream video with the same quality at around 6–8Mbps.
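To turn that into an upstream budget, a quick back-of-the-envelope calculation helps. The headroom factor in the sketch below is my own habit for absorbing bitrate spikes, not a codec or CDN requirement.

def upload_budget(h264_mbps: float, use_hevc: bool, headroom: float = 1.5) -> float:
    """Upstream bandwidth (Mbps) to reserve for one outgoing stream."""
    # Apply the rough "HEVC needs about half the bits of H.264" rule of thumb.
    stream_mbps = h264_mbps * (0.5 if use_hevc else 1.0)
    # Assumed 1.5x headroom to absorb bitrate spikes and network jitter.
    return stream_mbps * headroom

print(upload_budget(15, use_hevc=False))   # 22.5 Mbps reserved for a 15Mbps 4K H.264 stream
print(upload_budget(15, use_hevc=True))    # 11.25 Mbps reserved for the HEVC equivalent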
Something else to consider is Open Broadcaster Software (OBS), a powerful, versatile, open-source switching, encoding, and recording solution for live streaming on Windows, Mac, or Linux. I recommend using OBS in conjunction with a high-end PC as an alternative to "Big Iron" encoders, since the software is free and feature-rich. It has served us well on our productions and can be a critical component for VR streaming events. Even if OBS isn't your primary encoder, I suggest keeping it on hand as a backup.
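For a scripted safety net alongside OBS, a plain FFmpeg push can cover the encode-and-deliver stage. The sketch below assumes the stitched equirectangular feed is already reachable as a local RTMP source; the source URL and stream key are placeholders, and the bitrate and keyframe settings are illustrative rather than platform requirements.

# A minimal fallback encoder: pull the stitched feed and push it to an RTMP ingest.
import subprocess

SOURCE = "rtmp://127.0.0.1/live/stitched"                    # placeholder stitched feed
INGEST = "rtmp://a.rtmp.youtube.com/live2/YOUR-STREAM-KEY"   # placeholder ingest URL/stream key

cmd = [
    "ffmpeg",
    "-i", SOURCE,
    "-c:v", "libx264",          # H.264; swap in an HEVC encoder if your target supports it
    "-preset", "veryfast",      # favor real-time speed over compression efficiency
    "-b:v", "16M", "-maxrate", "16M", "-bufsize", "32M",
    "-g", "60",                 # 2-second keyframe interval at 30 fps
    "-c:a", "aac", "-b:a", "128k",
    "-f", "flv", INGEST,        # RTMP ingest expects an FLV container
]
subprocess.run(cmd, check=True)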
Case 1: Single-Rig Live VR Streaming With Vahana VR or Teradek Sphere
Figure 6 (below) maps out a single-camera VR workflow I assembled through trial and error to use in relatively simple VR productions.
Figure 6. A single-rig live VR streaming setup using Vahana VR or Teradek Sphere
At the heart of this setup is a VR camera rig with HDMI/HD-SDI outputs and a workstation PC for live stitching and encoding.
If you’re planning on building your VR workflow around Vahana VR, here are a few things to note:
• For Vahana VR, software stitching is sufficient, but for better stitching results, consider importing a PTGui template.
• Vahana VR can be buggy at times, so I highly recommend using a separate encoder.
• As of August 2017, Vahana VR is the only software available that offers live view through Oculus headsets.
Case 2: Z Cam S1-Based Complete IP Workflow
The setup diagrammed in Case 2 (Figure 7, below) shows a complete IP workflow achieved with a Z Cam S1. To assemble this kit, you need only one Ethernet cable between the Z Cam S1 and a machine with WonderLive installed. The Z Cam Controller, WonderStitch, and WonderLive applications used in this setup are sold as bundled software. The caveat is that one unique license is required per camera.
Figure 7. A Z Cam S1-based complete IP workflow setup for VR/360 streaming
As in Case 1, software stitching is sufficient for baseline use, but for better stitching results, import a PTGui template.
Streaming Platforms
For 360° livestreaming to the general public, I recommend encoding directly to platforms such as Facebook or YouTube. You can also stream to other CDNs or services such as Bitmovin, Wowza, or Visbit when you need more robust or customized solutions. Overall, using Facebook or YouTube is great for standard 4K, since they are both easy to configure and provide comprehensive dashboards to work with.
YouTube has continuously refined its live platform and is very straightforward in its approach to panoramic live streaming. In our projects, it has proven the most consistent and stable platform while at the same time delivering good, high-resolution 4K streams.
YouTube supports VR streaming at up to 4K at 60 fps, but depending on your bandwidth, I suggest using more conservative specs, such as 2560×1440 at 30 fps with an upload bitrate of 6–10Mbps, or 3840×2160 at 30 fps with an upload bitrate of 13–20Mbps.
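If you want to make that choice systematically, a small helper can map your measured upload speed to one of the presets above. The 50 percent safety margin in this sketch is my own assumption, not a YouTube requirement.

def pick_youtube_preset(upload_mbps: float) -> str:
    """Map a measured upload speed to one of the conservative presets above."""
    usable = upload_mbps * 0.5          # assumed: keep half the pipe in reserve
    if usable >= 13:
        return "3840x2160 @ 30 fps, 13-20 Mbps"
    if usable >= 6:
        return "2560x1440 @ 30 fps, 6-10 Mbps"
    return "below the recommended floor; drop resolution or find a better connection"

print(pick_youtube_preset(35.0))        # -> 3840x2160 @ 30 fps, 13-20 Mbps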
Facebook, with its Live 360 video platform, is making major investments in its ecosystem and community to bring content creators and consumers together. Facebook now supports live-streamed 360° video at up to 4K resolution. The streams are viewable in 4K/360° on the Samsung Gear VR, and Facebook intends to offer 4K/360° live video for Oculus Rift in the near future.
You can find upload specs and recommendations for Facebook Live 360 on Facebook's website.
Live Preview
Being able to see a live preview on set allows the director, clients, or production crew members to get an instant visualization of what the stitched camera view is capturing. The Teradek Sphere allows for live preview and live stitching of panoramic video while on set. This feature enhances overall production value and lets you troubleshoot stitching errors that would diminish the quality of the stream.
Vahana VR software also provides a live preview in real time. It has an equirectangular preview mode for viewing the full panoramic video on a connected display, as well as an interactive mode for a first-person view in a VR headset.
Developing Your Own VR/360 Streaming Workflow
As mentioned previously, there are many choices in VR production, and every company or individual producer will have their own methodologies for accomplishing their goals. What I have presented are general workflows that have served as a basis for my company in streaming 360° live events, including fashion shows, music concerts, corporate presentations, and product launches. Hopefully this will serve as a template that you can build on and expand based on your own or your client's specific needs.
360° live streaming is still in its infancy and has yet to reach its full potential. Only when the content improves and captures the hearts and imaginations of viewers will it become an easier sell to a broader swath of consumers and make inroads into the mainstream. Until then, it will remain a niche market, albeit one with great potential.