Engineering Guide  |  XR / Virtual Production

UNREAL ENGINE & LED VOLUME

How to Match Real Cameras in Unreal Engine 5

Master the process of **matching real cameras in Unreal Engine** for XR and virtual production, ensuring your CineCamera filmback aligns with your physical lens.

Getting this right is essential for avoiding perspective mismatch in LED volumes. To find your exact settings, use our XR FOV calculator.

Why Camera Matching Matters in XR

In-camera visual effects (ICVFX) on LED volumes rely on a fundamental principle: the virtual camera in Unreal Engine must see exactly the same view as the physical camera. If the field of view is even slightly different — even by half a degree — the virtual scene will appear to float or slide relative to the foreground as the camera moves, immediately breaking the illusion.

This alignment must be achieved across three dimensions: field of view (matching lens and sensor), spatial position (matching camera tracking), and time (compensating tracking latency). This guide covers all three.

Step 1: Get Your Camera's Exact Sensor Dimensions

The most important data to collect before any UE5 configuration is your camera's active sensor area in the specific recording format (crop mode) you will be shooting in. This is not necessarily the full sensor size — many cameras use different crop modes that change the active area significantly.

Key Sensor Sizes

Common Cinema Camera Sensor Data

  • ARRI ALEXA 35 (Open Gate): 27.99 × 19.22 mm
  • ARRI ALEXA Mini LF (16:9 LF): 31.68 × 17.82 mm
  • ARRI ALEXA Mini (4:3 2.8K S35): 23.76 × 17.82 mm
  • Sony VENICE 2 (Full Frame 16:9): 35.9 × 20.2 mm
  • Sony VENICE 2 (S35 Mode): 23.76 × 13.365 mm
  • RED KOMODO 6K (6K 17:9 S35): 27.03 × 14.26 mm
  • Canon EOS R5C (Full Frame 8K): 36.0 × 24.0 mm
  • Blackmagic PYXIS 6K (Full Frame): 35.9 × 24.0 mm

Always verify these dimensions against the manufacturer's sensor data sheet for the exact recording format you are using, as values vary between firmware versions and recording modes.
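As a sanity check on any table like this, the active height of a 16:9 crop follows directly from the active width. A minimal sketch (the helper name is illustrative, not part of any vendor SDK), using the VENICE 2 S35 figures above:

```python
def sensor_height_mm(sensor_width_mm: float, aspect_w: int, aspect_h: int) -> float:
    """Derive the active sensor height from the active width and the
    recording format's aspect ratio (e.g. 16:9)."""
    return sensor_width_mm * aspect_h / aspect_w

# Sony VENICE 2 in S35 16:9 mode: 23.76 mm active width
print(round(sensor_height_mm(23.76, 16, 9), 3))  # 13.365 mm, matching the table
```

If a data sheet's width and height disagree with the recording format's aspect ratio, one of the numbers is for a different crop mode.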

Step 2: Calculate the Field of View

The horizontal field of view for a given sensor and lens is calculated using the thin-lens formula:

HFOV = 2 × arctan(Sensor Width mm ÷ (2 × Focal Length mm))

Example: Sony VENICE in S35 mode (23.76 mm sensor width) with a 35mm lens:

HFOV = 2 × arctan(23.76 ÷ 70) = 2 × arctan(0.3394) = 2 × 18.75° = 37.5°
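The same calculation as a short script, assuming only the formula above (the function name is illustrative):

```python
import math

def hfov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal field of view in degrees:
    HFOV = 2 * arctan(sensor_width / (2 * focal_length))."""
    return 2.0 * math.degrees(math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Sony VENICE in S35 mode (23.76 mm wide) with a 35 mm lens
print(round(hfov_deg(23.76, 35.0), 1))  # ~37.5 degrees
```

Substituting the sensor height gives the vertical FOV the same way.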

Use the XR FOV Calculator to compute this automatically and get the precise values to enter in Unreal Engine.

Step 3: Configure Unreal Engine 5 CineCamera

In Unreal Engine 5, the CineCamera actor is the correct camera type for virtual production. Its Filmback settings map directly to physical camera properties.

Step 3a

Set the Filmback (Sensor Dimensions)

  1. In the UE5 Editor, place or select a CineCameraActor in your level.
  2. In the Details panel, find the Filmback group.
  3. Set Sensor Width to your camera's active sensor width in mm (e.g., 23.76).
  4. Set Sensor Height to the active sensor height in mm (e.g., 13.365).
  5. Ignore the "Presets" dropdown unless your exact camera format is listed — always verify preset values against the data sheet.

Step 3b

Set the Focal Length

  1. Under Current Focal Length, enter the focal length of the physical lens in mm.
  2. If using a zoom lens, this must be updated in real time when the focal length changes. A LiveLink tracking system that reports zoom data can drive this value dynamically.
  3. The camera's Field of View will update automatically once Filmback and Focal Length are both set correctly. Cross-reference it against your FOV calculator output.
Important: Unreal Engine displays the FOV angle as the horizontal FOV by default. In nDisplay's inner frustum configuration, the frustum is defined by the CineCamera's Filmback and Focal Length — not a raw FOV angle. Ensure your CineCamera is the one driving the inner frustum render, not the default PlayerCamera.

Step 4: Configure nDisplay for LED Volume

For in-camera VFX on a physical LED volume, nDisplay is the Unreal plugin that distributes the render across multiple screens and maps the camera's frustum to the LED panels.

Step 4a

Inner Frustum (In-Camera View)

The inner frustum is what the camera actually "sees" through the LED wall — it must perfectly match the physical camera's FOV. Configure it by linking the nDisplay viewport to the CineCamera actor you configured in Steps 2–3. Unreal's nDisplay will extract the FOV directly from the CineCamera Filmback settings.

Step 4b

Outer Frustum (LED Wall Peripheral)

The outer frustum covers the rest of the LED panels beyond the camera's direct view. These panels render the wider scene environment for reflections, ambient light on the talent, and physical spill. The outer frustum is typically 20–40% wider than the inner frustum angle. Calculate the outer frustum using the LED Wall Planner to determine the angular coverage of your physical LED volume at the camera position.
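A rough way to sanity-check that margin: measure the angle the wall subtends at the camera position and compare it to the inner HFOV. The sketch below assumes a flat wall and illustrative helper names; a curved volume needs per-panel treatment in the planner:

```python
import math

def wall_coverage_deg(wall_width_m: float, camera_distance_m: float) -> float:
    """Horizontal angle a flat LED wall subtends at the camera position."""
    return 2.0 * math.degrees(math.atan(wall_width_m / (2.0 * camera_distance_m)))

def has_outer_margin(wall_deg: float, inner_hfov_deg: float) -> bool:
    """True if the wall provides at least the typical 20% margin
    over the inner frustum angle."""
    return wall_deg >= 1.2 * inner_hfov_deg

# Example: 10 m wide wall, camera 5 m back, inner HFOV of 37.5 degrees
coverage = wall_coverage_deg(10.0, 5.0)
print(round(coverage, 1), has_outer_margin(coverage, 37.5))  # 90.0 True
```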

Step 5: Set Up Camera Tracking

Camera tracking data feeds the CineCamera's world transform in real time — this is what keeps the virtual scene locked to the physical world as the camera moves. Common tracking vendors for UE5 include Mo-Sys, Ncam, OptiTrack (via LiveLink plugin), and Stype.

Step 5a

LiveLink Setup

  1. Open the LiveLink panel (Window → Live Link).
  2. Add a LiveLink source for your tracking vendor (most provide a dedicated UE5 plugin).
  3. In the CineCamera actor's Details, add a Live Link Controller component.
  4. Set the Live Link Subject to the tracking source name and enable Transform control.
  5. Test by moving the physical camera and verifying the virtual camera's position and orientation update in real time.

Step 6: Compensate for Tracking Latency

Even with a fast tracking system, there is always some latency between the physical camera moving and the virtual scene updating on the LED wall. If uncompensated, moving the camera will cause the virtual background to "swim" — visibly lagging behind the physical world. This is the most common technical failure mode on LED volumes.

Total system latency accumulates from three stages: the tracking system (sensor capture and data transmission), the Unreal Engine render (typically at least one frame), and the LED processing chain (processor and panel scan-out).

Step 6a

Apply Latency Compensation in LiveLink

  1. Use the Tracking Latency Calculator to compute your total system latency in ms and frames.
  2. In the LiveLink panel, select your tracking source and open its properties.
  3. Set Source → Time Offset to the negative of your total latency (e.g., −0.050 for 50 ms).
  4. Test by making a sharp camera rotation and observing whether the virtual scene's edge aligns with the LED panel edge without swimming.
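The conversions in steps 1 and 3 amount to two small calculations; a minimal sketch (helper names are illustrative):

```python
def time_offset_s(latency_ms: float) -> float:
    """LiveLink Time Offset: the negative of total system latency, in seconds."""
    return -latency_ms / 1000.0

def latency_in_frames(latency_ms: float, fps: float) -> float:
    """Express latency in frames at the project frame rate."""
    return latency_ms / 1000.0 * fps

print(time_offset_s(50))                    # -0.05 (enter as the Time Offset)
print(round(latency_in_frames(50, 24), 2))  # 1.2 frames at 24 fps
```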
Pro Tip: The Genlock "Phase" Trap
Even with Genlock, you may see a slight "shimmer" on the LED wall through the camera. This is often a phase-shift issue. Use your LED processor (Brompton/Megapixel) to adjust the "Genlock Phase" in fine increments until the camera's shutter opening perfectly aligns with the LED refresh cycle.
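Assuming the processor expresses phase as degrees of one refresh cycle (check your processor's documentation for its actual convention — this is an assumption), the time shift a given phase offset represents is a simple fraction of the frame period:

```python
def phase_shift_us(phase_deg: float, refresh_hz: float) -> float:
    """Time shift in microseconds for a genlock phase offset, where
    360 degrees equals one full refresh period (assumed convention)."""
    return phase_deg / 360.0 * 1_000_000.0 / refresh_hz

print(round(phase_shift_us(10.0, 24.0), 1))  # ~1157.4 us per 10 degrees at 24 Hz
```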

Step 7: Verify and Calibrate

After configuration, systematic verification is essential before the camera department arrives on set. Check:

  • The CineCamera's FOV matches your calculator output for the active sensor mode and lens.
  • The virtual camera's position and orientation follow the physical camera with no visible offset.
  • A sharp pan or tilt produces no swimming once the latency offset is applied.
  • Genlock is locked and the phase is adjusted so no shimmer is visible through the lens.
