Stitching VR with chips
13 September 2016
Stitching video from multiple cameras or sensors to create a 360-degree panoramic movie on the fly is one of the main bottlenecks of virtual reality and 360-media production. A demonstration of Argon360 in the IBC Future Zone could be the answer.
Its hardware-based approach means that stitching happens on a chip in the camera. Users can view a panorama in real time, or upload it directly to YouTube, Facebook or other 360-video serving websites.
Argon360 operates with very low latency between the incoming video from the sensors and the outgoing stitched video. It uses ‘multi-band blending techniques’ to avoid ‘visual discontinuities’ while still preserving detail across a stitch line. Depth-dependent parallax correction, which is under development, will mitigate artefacts that can occur when objects near the camera span a stitching seam.
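Multi-band blending is a standard technique for hiding seams: the overlapping images are split into frequency bands via Laplacian pyramids, each band is mixed with a seam mask blurred to a matching scale, and the bands are recombined, so low frequencies transition gradually while fine detail stays sharp. The sketch below is a minimal, generic illustration of that idea in NumPy; it is not Argon360's implementation, and the pyramid depth, box-filter downsampling and function names are all assumptions made for brevity.

```python
import numpy as np

def downsample(img):
    # Crop to even dimensions, then average 2x2 blocks (a crude blur-and-decimate).
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w]
    return (img[0::2, 0::2] + img[1::2, 0::2] +
            img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

def upsample(img, shape):
    # Nearest-neighbour upsample, edge-padded/cropped to the target shape.
    up = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
    ph, pw = shape[0] - up.shape[0], shape[1] - up.shape[1]
    if ph > 0 or pw > 0:
        up = np.pad(up, ((0, max(ph, 0)), (0, max(pw, 0))), mode="edge")
    return up[:shape[0], :shape[1]]

def multiband_blend(a, b, mask, levels=4):
    """Blend float images a and b; mask is 1.0 where a should win, 0.0 for b."""
    # Gaussian pyramids of both images and of the seam mask.
    ga, gb, gm = [a], [b], [mask]
    for _ in range(levels):
        ga.append(downsample(ga[-1]))
        gb.append(downsample(gb[-1]))
        gm.append(downsample(gm[-1]))
    # Laplacian pyramids: per-level detail, plus the coarsest Gaussian level.
    la = [ga[i] - upsample(ga[i + 1], ga[i].shape) for i in range(levels)] + [ga[-1]]
    lb = [gb[i] - upsample(gb[i + 1], gb[i].shape) for i in range(levels)] + [gb[-1]]
    # Blend each frequency band with the matching-resolution mask.
    blended = [gm[i] * la[i] + (1 - gm[i]) * lb[i] for i in range(levels + 1)]
    # Collapse the pyramid from coarse to fine.
    out = blended[-1]
    for lvl in reversed(blended[:-1]):
        out = upsample(out, lvl.shape) + lvl
    return out
```

Because the mask is blended at every pyramid scale, the seam is feathered over a wide region at low frequencies but only a narrow one at high frequencies, which is what preserves detail across the stitch line.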
“Real-time stitching is essential for live streaming of 360-degree video and a huge advantage in many other circumstances,” stated the developer.
“There are software-based solutions that can deliver live streaming of 360-degree content, but they require a powerful PC platform. Wouldn’t it be great to produce high-quality immersive 360-degree video on a compact, battery-powered mobile platform? With Argon360 you can.”