Ncam demonstrates integration with Unreal Engine at NAB 2016

March 30, 2016 Meriam Khan

Collaboration with Epic Games for next-generation, photorealistic augmented reality, including real-time relighting and depth demos

Ncam, the global leader in augmented reality for television and film production, will be demonstrating ground-breaking advances in creative capabilities at NAB 2016 (18 – 21 April, Las Vegas Convention Center). By combining Ncam’s unrivalled technological capabilities with Epic Games’ Unreal Engine, the company is taking augmented reality a massive leap forward into photorealism.

“It is clear from discussions with customers, studios and productions, especially around episodic television, that there is demand in augmented reality for set extensions, virtual environments, previsualization and finished visual effects,” said Nic Hatch, CEO of Ncam. “To do that convincingly calls for photorealistic graphic elements that audiences believe to be real. And to get photorealism in real-time means tapping into the power of games engine technology.

“That is why we are working with Epic Games’ Unreal Engine and can now demonstrate real-time photorealism from multiple, freely-moving cameras,” he said. “Our demonstrations at NAB this year are going to be absolutely stunning, and a real game-changer for creative television and movie-making.”

The Ncam booth (C10345) will feature a live demonstration of the photorealistic augmented reality solution based around the Unreal Engine by Epic Games. For episodic television, it creates production efficiencies by slashing post-production time and budget, because many or all of the graphics elements can be added in real time. For movies, the ability to have photorealistic previsualization will give directors and DoPs much more confidence in camera moves and cast placements.

A separate demonstration area will show the work that Ncam is doing on matching natural lighting. Ncam’s Relight measures and models the light in a scene, allowing virtual elements to cast shadows on actual objects and to respond to lighting changes based on the surrounding environment in real time.
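As a rough illustration of the idea of responding to measured lighting in real time, the sketch below estimates a single dominant light from sampled scene luminance and re-shades a virtual surface each frame, so the virtual element brightens or darkens as the real-world light changes. This is a generic, purely illustrative approximation, not Ncam’s Relight implementation; all names and structures here are hypothetical.

```cpp
// Illustrative sketch only (not Ncam's Relight): estimate a dominant light
// from sampled scene luminance, then shade a virtual surface with it so the
// surface responds when the real-world lighting changes.
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

struct LightSample {
    Vec3  direction;  // direction from the scene centre towards the sample
    float luminance;  // measured brightness in that direction
};

// Luminance-weighted average of sample directions gives a crude estimate of
// the dominant light direction and overall intensity in the real scene.
static void EstimateDominantLight(const std::vector<LightSample>& samples,
                                  Vec3& lightDir, float& lightIntensity)
{
    Vec3 sum{0.f, 0.f, 0.f};
    float total = 0.f;
    for (const LightSample& s : samples) {
        sum.x += s.direction.x * s.luminance;
        sum.y += s.direction.y * s.luminance;
        sum.z += s.direction.z * s.luminance;
        total += s.luminance;
    }
    const float len = std::sqrt(sum.x * sum.x + sum.y * sum.y + sum.z * sum.z);
    lightDir = (len > 0.f) ? Vec3{sum.x / len, sum.y / len, sum.z / len}
                           : Vec3{0.f, 1.f, 0.f};
    lightIntensity = samples.empty() ? 0.f : total / samples.size();
}

// Simple Lambertian term, re-evaluated every frame: the virtual surface
// brightens or darkens as the measured real-world light changes.
static float ShadeVirtualSurface(const Vec3& surfaceNormal,
                                 const Vec3& lightDir, float lightIntensity)
{
    const float nDotL = surfaceNormal.x * lightDir.x +
                        surfaceNormal.y * lightDir.y +
                        surfaceNormal.z * lightDir.z;
    return lightIntensity * std::max(nDotL, 0.f);
}
```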

A third presentation will show the latest addition to the data output of the Ncam camera tracking system: depth data. Armed with this data, broadcast augmented reality graphics, such as sports analysis, will allow presenters to walk around and through virtual graphics.
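As a rough illustration of how per-pixel depth data enables that kind of interaction, the sketch below composites a virtual graphic over the live camera image only where the graphic is nearer to the camera than the real scene, so a presenter standing in front of a graphic occludes it. It is a hypothetical example under assumed data structures, not Ncam’s actual pipeline.

```cpp
// Illustrative sketch only: per-pixel depth compositing of a virtual graphic
// over a live camera image, assuming a per-pixel depth map for the real scene.
// All names and structures are hypothetical, not Ncam's API.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Pixel { uint8_t r, g, b, a; };

// Keep the virtual layer only where it is nearer to the camera than the real
// scene; elsewhere the live video shows through, so a presenter can walk
// around and through the graphic.
void CompositeWithDepth(std::vector<Pixel>&       output,       // final frame
                        const std::vector<Pixel>& cameraImage,  // live video
                        const std::vector<Pixel>& virtualLayer, // rendered AR graphics
                        const std::vector<float>& sceneDepth,   // metres, real scene
                        const std::vector<float>& virtualDepth) // metres, from the render
{
    const std::size_t n = cameraImage.size();
    output.resize(n);
    for (std::size_t i = 0; i < n; ++i) {
        const bool graphicIsNearer    = virtualDepth[i] < sceneDepth[i];
        const bool graphicCoversPixel = virtualLayer[i].a > 0;
        output[i] = (graphicIsNearer && graphicCoversPixel) ? virtualLayer[i]
                                                            : cameraImage[i];
    }
}
```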

“So far, Ncam has been recognised for its camera tracking expertise, which is still the most accurate and fastest way to track any camera, anywhere,” said Hatch. “That remains our foundation, and what is really exciting is that we are now combining our camera tracking with Ncam’s new relighting and depth technology, delivering something no one else can do: photorealistic augmented reality.”

Ncam can be seen on booth C10345 at NAB 2016. Demonstrations of Ncam camera tracking technology can also be seen in conjunction with graphics from Vizrt (booth SL2417), Orad (part of Avid, booth SU902) and Brainstorm (booth SL4617).