Network Video into Unreal 5


This post walks through how to bring NDI video into Unreal Engine virtual sets using the built-in NDI plugin, Media Bundles, and a holdout-style material so the feed stays visually accurate. It then shows how to use NDI Router (plus tools like OBS, Screen Capture HX, hardware encoders, and NDI Remote) to flexibly switch and route multiple live sources into Unreal without ever touching your level setup.

In a previous video we looked at how to get NDI out of Unreal. This time, let’s flip things around and bring NDI into Unreal so we can use it as a flexible video source inside virtual sets.

NDI is fantastic for moving video around over the network, but it does have some important limitations you need to understand before you try to build an entire tracked virtual studio on it.


What NDI Is (and Isn’t) Good For

NDI has its own timing structure. It doesn't use PTP clocking the way SMPTE ST 2110 does, which means it's not ideal for timing-critical work where you need frame-accurate alignment between video and tracking data.

That means:

  • You generally can’t rely on a straight NDI camera feed plus separate tracking data for:

    • AR graphics

    • XR workflows

    • Fully tracked virtual sets (unless you’re embedding tracking data inside the NDI stream somehow)

Where NDI does shine is as a video transport layer for your virtual set:

  • Great for feeding lots of video sources into Unreal

  • Not limited by a fixed number of SDI inputs

  • Perfect for screens, monitors, projection surfaces, and “video walls” in your scene

If you use NDI HX, the GPU handles the decode work, leaving your CPU free and letting you run a large number of simultaneous feeds inside Unreal.
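To illustrate why HX scales so well, here's a back-of-the-envelope feed count on a gigabit link. The bitrates are approximate ballpark figures commonly quoted for 1080p60 (full-bandwidth NDI around 125 Mbps, NDI HX around 10 Mbps), not measurements from this setup:

```python
# Back-of-the-envelope feed count on a gigabit link.
# Bitrates are approximate, illustrative figures, not measurements:
# full-bandwidth NDI at 1080p60 is commonly quoted around 125 Mbps,
# NDI HX around 10 Mbps.

LINK_MBPS = 1000        # 1 GbE, ignoring protocol overhead
FULL_NDI_MBPS = 125     # assumption: full-bandwidth NDI, 1080p60
NDI_HX_MBPS = 10        # assumption: NDI HX, 1080p60

def max_feeds(link_mbps: float, feed_mbps: float) -> int:
    """How many feeds of a given bitrate fit on the link."""
    return int(link_mbps // feed_mbps)

print(max_feeds(LINK_MBPS, FULL_NDI_MBPS))  # full NDI: 8 feeds
print(max_feeds(LINK_MBPS, NDI_HX_MBPS))    # NDI HX: 100 feeds
```

In practice you'll also budget for audio, metadata, and other network traffic, but the order-of-magnitude difference is the point.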


Example Virtual Set Setup

In the demo setup, the virtual set uses:

  • A cine camera with Live Link and a lens component

  • A loaded lens file for accurate optics

  • A green screen workflow using a Blackmagic URSA Broadcast G2 and a Mo-Sys StarTracker

  • A plane in the scene acting as a 16:9 screen, with a simple wooden frame around it

Nothing fancy on the art side—just a clean surface where we’ll project our NDI feed.


Required Unreal Plugins & “Holdout” Material

  1. Enable the NDI plugin that ships with Unreal (the Epic one).

  2. If you have other media gear (e.g. Blackmagic output to an Ultimatte), make sure your Media framework is already configured.

  3. Use Composite Core instead of the deprecated “Holdout” system.

The key idea is to apply a holdout-style material to the plane so Unreal knows to:

  • Skip motion blur

  • Skip depth of field

  • Skip tone mapping and other post effects

That way, what comes out of Unreal looks exactly like what went in via NDI, which is crucial when you’re feeding it into downstream compositors or switchers.
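The effect of skipping post processing can be shown with a toy example. Here Reinhard tone mapping stands in for Unreal's filmic tonemapper (this is purely a conceptual sketch, not Unreal code): with the post chain bypassed, the feed's pixel values pass through untouched; without the bypass, they're altered before leaving the engine.

```python
# Why a holdout-style material matters: any post effect applied to the
# screen plane would alter the NDI pixels on their way out of Unreal.
# Reinhard tone mapping is used here only as a stand-in for Unreal's
# tonemapper, to show what happens when the bypass is missing.

def reinhard(x: float) -> float:
    """Simple Reinhard tone map, standing in for the engine's post chain."""
    return x / (1.0 + x)

def render(pixel: float, bypass_post: bool) -> float:
    """Output value for one pixel of the screen plane."""
    return pixel if bypass_post else reinhard(pixel)

feed = 0.5                                   # incoming NDI pixel value
print(render(feed, bypass_post=True))        # unchanged: 0.5
print(render(feed, bypass_post=False))       # altered by post processing
```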


Creating an NDI Media Source in Unreal

Inside your Media setup:

  1. Create (or reuse) a Media Input.

  2. Set its type to NDI.

  3. Under Capture, enable Video.

  4. Under NDI Source, pick something simple to start with, like the NDI Test Pattern generator.

Then:

  1. Go to your Media Bundle.

  2. Use the built-in Media Texture and material that ship with the bundle (or a custom one if you prefer).

  3. Apply that material to the 16:9 plane in your virtual set.

To start playback, you have two options:

  • Drag the Media Bundle actor into the level (which auto-starts it), or

  • Use the Media Playback tool to select one or more bundles and hit Play.

Once you hit Capture on your main camera, you should now see the NDI feed sitting inside your virtual set as if it were a screen or monitor on the set.


The Big Upgrade: NDI Router for Flexible Switching

Manually changing the NDI source in the Media Source asset is painful—especially if you’re on air. That’s where NDI Router (part of NewTek’s NDI Tools) comes in.

NDI Router lets you:

  • Define multiple Sources (e.g. test patterns, OBS outputs, laptops, screen captures)

  • Define one or more Destinations (e.g. “Unreal Source”, “Studio Screen #1”)

  • Route any source to any destination with a click (matrix switching)

Example setup:

  • Sources:

    • Test Pattern from Machine A

    • Test Pattern from Laptop

    • Screen Capture HX from a PC

  • Destinations:

    • Unreal Source (UI)

    • Test Pattern 1

    • Test Pattern 2

    • PC Share

Then in Unreal:

  1. Open your NDI Media Source.

  2. Instead of picking a direct device (like “Test Pattern”), pick Unreal Source from the Router section.

  3. Click Apply and Save.

Now Unreal is “listening” to whatever the router sends on Unreal Source. Changing what appears in your virtual set is as easy as clicking a different source in the Router UI—no touching Unreal at all.


More Ways to Feed NDI into Unreal

Once you’re routing via NDI, a lot of fun options open up:

  • OBS as a Source
    Use OBS to build a whole mini show (graphics, camera cuts, overlays) and send its NDI output into Unreal as one source.

  • NDI Screen Capture HX
    Capture a presenter’s laptop, PowerPoint, or browser and put it on a screen in your virtual set.

  • Hardware NDI Encoders
    Devices from companies like BirdDog let you convert HDMI → NDI without installing software on the source machine.

  • NDI Remote
    Let a guest share their screen via a web browser; it appears on your network as an NDI source you can route straight into Unreal—no software install required, very “Google Meet”-like in workflow.

You can also scale this up: multiple planes in Unreal, each listening to a different Router destination, all controlled from a small touchscreen or a separate PC.


Wrapping Up

NDI won’t replace SDI + genlock + proper tracking for high-end AR/XR work, but it’s a fantastic glue layer for getting lots of video into Unreal—especially for virtual sets with multiple screens, remote presenters, or complex program feeds. With the built-in NDI plugin, Media Bundles, and the NDI Router, you get a surprisingly powerful and flexible “video matrix” sitting inside your Unreal scene.