Basic Realtime AR in Unreal 5.4

1 min read

Learn how to take camera tracking data and marry it with an incoming video feed in Unreal Engine 5.4 to create some simple realtime AR. We also look at some of the pitfalls and techniques you might need, such as masking, translucency and depth.

Auto-generated summary

In this video, I walk you through setting up basic real-time augmented reality (AR) in Unreal Engine 5.4, focusing on live virtual production rather than handheld AR. You'll learn how to enable essential plugins like Blackmagic for video input, Live Link for camera tracking, and the camera calibration plugin for lens distortion correction. I demonstrate importing lens files, configuring media sources and Media IO for live video feeds, and setting up genlock so that tracking and video stay perfectly in sync.
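To give a feel for what a lens file encodes, here is a minimal sketch of the Brown–Conrady radial distortion model commonly used in lens calibration. The function name and the coefficient values are illustrative, not taken from any real calibration or from Unreal's API:

```python
def distort(x, y, k1, k2):
    """Apply Brown-Conrady radial distortion to a normalized image point.

    (x, y) are normalized coordinates relative to the optical centre;
    k1, k2 are radial distortion coefficients (illustrative values).
    """
    r2 = x * x + y * y                      # squared distance from centre
    scale = 1.0 + k1 * r2 + k2 * r2 * r2    # radial scaling term
    return x * scale, y * scale

# The optical centre is unchanged; off-centre points are pulled inward
# (barrel distortion) when k1 is negative, as r grows.
print(distort(0.0, 0.0, -0.1, 0.01))  # (0.0, 0.0)
print(distort(0.5, 0.0, -0.1, 0.01))
```

In practice the camera calibration plugin applies this correction (and its inverse) for you once the lens file is imported; the sketch only shows why CG rendered with an ideal pinhole camera won't line up with a real lens until distortion is accounted for.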

You'll see how to set up a minimal Unreal level with a cine camera parented to a target point that matches your physical camera's height. I show you how to apply the lens calibration data so the virtual camera accurately matches the real camera's characteristics. Then, I guide you through basic compositing steps using a post-process material to combine the live video feed with virtual objects, carefully handling alpha channels and explaining issues around bloom and premultiplied alpha. By the end, you'll understand the fundamental workflow to integrate digital elements in real time with live camera footage for virtual production or AR set extensions.
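The alpha handling above comes down to the classic "over" operator, which has a different formula for straight and premultiplied alpha. A minimal single-pixel sketch (function names and values are illustrative) shows why mixing up the two conventions over-brightens the composite:

```python
def over_straight(fg, fg_a, bg):
    """Composite a straight-alpha foreground colour over an opaque background."""
    return fg * fg_a + bg * (1.0 - fg_a)

def over_premultiplied(fg_premul, fg_a, bg):
    """Composite a premultiplied foreground (colour already scaled by alpha)
    over an opaque background."""
    return fg_premul + bg * (1.0 - fg_a)

# A 50%-opaque white pixel over a black background:
print(over_straight(1.0, 0.5, 0.0))             # 0.5
print(over_premultiplied(1.0 * 0.5, 0.5, 0.0))  # 0.5  (same result)

# Feeding a straight colour into the premultiplied formula is wrong:
print(over_premultiplied(1.0, 0.5, 0.0))        # 1.0  (over-bright)
```

This is also why effects like bloom complicate compositing: bloom adds colour energy in regions where the scene alpha is low, so the renderer's output behaves like premultiplied data and must be composited with the matching formula.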