
Virtual Production FAQ

Questions on how to work with ARRI cameras in a virtual production environment. 

Virtual production is a powerful instrument within the filmmaker's toolbox. The following will provide insight into these setups in relation to ARRI cameras.

  • Modern LED walls demand the high dynamic range of ARRI's sensors, available in ALEXA and AMIRA cameras, to capture the wall's full dynamic range. Our sensor's color reproduction combined with its Optical Low Pass Filter (OLPF) has been a proven combination since the first ALEXA in 2010, and this pairing perfectly suits the needs of LED wall setups.

    Of course color reproduction is only one part of the equation: ARRI's equipment offers the needed reliability and robustness to service shoots in extreme conditions as well as long running shows in studios.

  • The three main artefact types are listed below:

    Sync artefacts (frame rate and timing mismatches with other devices in the signal chain)

    • Check whether the LED wall’s signal processor allows the framerates needed for production.
    • Processor implementations may cause trouble in low lighting environments.

    Color artefacts (colors of the LED wall may not be accurately perceived by the eye)

    • We provide a color calibration through our solutions team. Please contact: mrps@arri.de

    Moiré artefacts (disturbances caused by regular patterns overlapping each other, e.g. photosites on the sensor and LEDs of the LED wall.)

    • We urge you to perform your own tests to get a sense for this new style of shooting. A long list of variables including camera and screen type, low pass filter, lens characteristics, aperture, focus and more influence the severity of moiré when it emerges.
  • As stated in the question on typical LED wall artefacts, many factors influence the occurrence of moiré on the wall. In our experience, however, the majority of moiré patterns disappear at a distance of about 3 meters when using our ALEXA Mini LF sensor. Please perform your own tests to verify our findings for your setup.
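A rough geometric intuition for the distance effect is how many sensor photosites one LED wall pixel covers once projected through the lens. The sketch below uses a thin-lens magnification approximation; the pixel pitch, focal length, and photosite pitch values are illustrative assumptions, not measured ARRI specifications, and geometry alone cannot predict moiré (OLPF, focus, aperture, and the wall's subpixel layout dominate in practice), which is why the testing advised above is essential.

```python
# Rough geometric estimate: how many photosites does one LED wall pixel
# span on the sensor? Moiré risk is typically worst when this ratio
# approaches the sensor's sampling period (roughly 1-2 photosites).
# This ignores defocus, OLPF, and aperture, so treat it as a starting
# point for tests, not a prediction.

def led_pixels_per_photosite(pitch_mm: float, focal_mm: float,
                             distance_mm: float, photosite_um: float) -> float:
    """Projected LED pixel pitch, expressed in photosite pitches."""
    magnification = focal_mm / (distance_mm - focal_mm)  # thin-lens approx.
    projected_um = pitch_mm * 1000.0 * magnification     # pitch on the sensor
    return projected_um / photosite_um

# Example (assumed values): 2.6 mm pitch wall, 50 mm lens, 3 m distance,
# ~8.25 um photosite pitch as a large-format placeholder.
ratio = led_pixels_per_photosite(2.6, 50.0, 3000.0, 8.25)
print(f"one LED pixel spans ~{ratio:.1f} photosites")
```

Moving the camera back shrinks the projected pitch toward the sensor's sampling period and eventually below it, which is one reason the interference pattern changes with distance.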
     

  • Generally, we are seeing very good color space and tone mapping reproduction for SDR and HDR on the LED walls and processors used today in virtual production, but please verify the system's calibration.
     

  • Currently the majority of virtual production setups use LED fixtures to help illuminate the actors in front of the LED wall. ARRI's LED lighting products are an ideal match for a virtual production setup.

    Please keep the following in mind:

    • LED walls produce a very narrow-bandwidth light spectrum, which can result in slightly different colors being perceived by the human eye compared to what ARRI cameras will capture.
    • The majority of LED walls are calibrated to about 6500 Kelvin.
  • ALEXA and AMIRA cameras offer static as well as dynamic metadata. Static metadata changes on a per-clip basis, e.g. the clip name or the clip's frame rate. Dynamic metadata, on the other hand, changes on a per-frame basis: a new set of metadata is generated for every frame (even if the internal interfaces offer much faster read-outs, the rate is linked to the sensor frame rate, as this is the fastest stream of information).

    Please have a look at our Metadata White Paper for all available metadata from the cameras and ways to access them.

    Live metadata with ALEXA 35 or UMC-4
    For real-time lens metadata, please refer to our ARRI Live Link Metadata Plug-in for Unreal Engine (using our UMC-4 or ALEXA 35), or stream the metadata yourself from the SDI ancillary data.

  • Measured directly at the camera's SDI outputs, the metadata has less than one frame of delay from its associated frame. Additional delay depends on the way you capture and decode the signal.

  • Our quick internal signal processing introduces only a sub-one-frame metadata delay and is not tied to the frame rate (neither sensor nor SDI).
     

  • Metadata from recorded clips can be accessed easily using the ARRI Reference Tool or ARRI Meta Extract (legacy).

    Real-time metadata can be accessed by extracting the ARRI V3 header from the ancillary data stream of the SDI, or by using our custom real-time metadata solution via the UMC-4 and ALEXA 35: the ARRI Live Link Metadata Plug-in for Unreal Engine.

    See "What types of (real-time) metadata do ARRI cameras provide?" for more.
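To illustrate the ancillary-data route, here is a heavily simplified, byte-oriented sketch of locating a packet by its DID/SDID identifiers and slicing out its payload. Real SDI ancillary data uses 10-bit words with parity and a checksum (SMPTE ST 291), and the DID/SDID values below are placeholders, not ARRI's actual identifiers; consult ARRI's metadata documentation for the real V3 header layout.

```python
# Simplified sketch: find an ancillary-style packet by DID/SDID in a
# byte stream and return its data-count-sized payload. Placeholder
# identifiers only -- not ARRI's real DID/SDID values.

def find_anc_payload(words: bytes, did: int, sdid: int):
    """Scan for DID/SDID; return the payload bytes, or None if absent."""
    i = 0
    while i + 3 <= len(words):
        if words[i] == did and words[i + 1] == sdid:
            dc = words[i + 2]                     # data count
            return bytes(words[i + 3:i + 3 + dc])  # user data words
        i += 1
    return None

# Hypothetical stream: DID=0x51, SDID=0x01, DC=4, then 4 payload bytes.
stream = bytes([0x00, 0x51, 0x01, 0x04, 0xDE, 0xAD, 0xBE, 0xEF, 0x00])
print(find_anc_payload(stream, 0x51, 0x01))  # b'\xde\xad\xbe\xef'
```

A production implementation would work on the 10-bit word stream from the capture card, validate parity and the packet checksum, and then parse the ARRI V3 header out of the user data words.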


  • We are currently testing different setups. So far, we have had good results with video cards from Rohde & Schwarz/DVS (Atomix LT) and the Blackmagic Design UltraStudio Mini Recorder (Thunderbolt).

    We will update this answer once new information comes in.
     

  • Yes, we offer a free plug-in for Unreal Engine with support for metadata streaming using the Universal Motor Controller UMC-4 or ALEXA 35 camera: ARRI Live Link Metadata

    This solution supports camera-agnostic live metadata.


  • The ALEXA 35, ALEXA Mini, Mini LF, and AMIRA cameras can directly accept, and be synchronized to, a tri-level or black burst genlock signal. Once synchronized, an offset to the incoming genlock (shift sync) can also be applied in-camera.

    Our document on Sync Shift & Genlock further explains why you need to apply a shift to the synchronization.

    For other ALEXA Classic models, such as the ALEXA SXT, ALEXA LF, and ALEXA 65, the camera can be synchronized to an incoming LTC signal. To ensure synchronization between camera and a virtual set environment, please make sure you use a master clock that simultaneously provides LTC for the camera and genlock for the virtual set environment. It is possible to apply an offset to the LTC signal either in the device supplying TC (if supported) or by connecting an analog audio delay device in-line with the LTC signal being supplied to the camera.

    We have compiled some Technical Information on how to sync ARRI cameras to third-party devices.

    Please consult the LED wall processor vendor to find out which framerates they are capable of synchronizing to.
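When using the audio-delay approach to offset the LTC feed described above, the delay device is usually dialed in milliseconds, so the desired frame offset has to be converted. A minimal sketch, using exact fractions for the non-integer SMPTE rates so 23.976 and 29.97 do not accumulate float error:

```python
# Convert a desired timecode offset in frames to the delay in ms you
# would dial into an in-line audio delay on the LTC feed. Standard
# SMPTE LTC rates only; 23.976 and 29.97 are kept as exact fractions.
from fractions import Fraction

LTC_RATES = {
    "23.976": Fraction(24000, 1001),
    "24": Fraction(24),
    "25": Fraction(25),
    "29.97": Fraction(30000, 1001),
    "30": Fraction(30),
}

def offset_ms(frames: int, rate: str) -> float:
    """Milliseconds of delay for a given frame offset at an LTC rate."""
    return float(frames * 1000 / LTC_RATES[rate])

print(f"{offset_ms(2, '25'):.2f} ms")  # 2 frames at 25 fps -> 80.00 ms
```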

  • The usable framerate is determined by the camera’s ability to sync. ALEXA Mini, Mini LF, AMIRA and ALEXA 35 cameras are capable of synchronizing to all standard tri-level rates (23.976, 24, 25, 29.97, 30, 48, 50, 59.94, and 60).

    Since the ALEXA Classic camera models utilize LTC as their method of synchronization, only standard SMPTE Timecode frame rates of 23.976, 24, 25, 29.97, and 30 can be utilized. Under-cranking and over-cranking with frame rates such as 18 fps and 33 fps, respectively, will prevent synchronization.
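The two rate lists above lend themselves to a simple pre-flight check before committing to a project frame rate. The helper below is only a convenience sketch; the rate sets are taken directly from the answer above.

```python
# Pre-flight check: can the chosen project frame rate be synchronized
# by the camera family in use? Rate sets follow the answer above:
# tri-level genlock cameras vs. LTC-only ALEXA Classic models.

TRI_LEVEL_RATES = {23.976, 24, 25, 29.97, 30, 48, 50, 59.94, 60}
CLASSIC_LTC_RATES = {23.976, 24, 25, 29.97, 30}

def can_sync(fps: float, classic: bool = False) -> bool:
    """True if fps is a standard sync rate for the camera family."""
    return fps in (CLASSIC_LTC_RATES if classic else TRI_LEVEL_RATES)

print(can_sync(48))                # True  (tri-level cameras)
print(can_sync(48, classic=True))  # False (LTC-only classic models)
print(can_sync(33))                # False (over-cranking, no sync)
```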

  • Very short exposure times can cause imaging artefacts due to the LED wall processor's implementation. Conversely, an exposure time that is too long can reveal artefacts such as image overlapping. We have found that a shutter angle of around 180 degrees gives the best artefact-free camera-to-wall exposure times.
    ARRI is working in collaboration with LED wall display and processor manufacturers to solve these issues.
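The exposure time implied by a shutter angle follows the rotary-shutter formula t = (angle / 360) / fps, so the roughly 180-degree sweet spot mentioned above corresponds to 1/48 s at 24 fps:

```python
# Exposure time from shutter angle and frame rate:
#   t = (angle / 360) / fps
# At 24 fps, 180 degrees gives 1/48 s (~20.83 ms).

def exposure_time_ms(shutter_angle_deg: float, fps: float) -> float:
    """Exposure time in milliseconds for a rotary-shutter model."""
    return (shutter_angle_deg / 360.0) / fps * 1000.0

print(f"{exposure_time_ms(180, 24):.2f} ms")  # ~180-degree sweet spot
print(f"{exposure_time_ms(90, 24):.2f} ms")   # shorter exposure, more
                                              # prone to processor artefacts
```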
     

  • Once synchronization is achieved and the correct synchronization shift is applied in camera, rolling shutter effects are not a problem.
    However, combining very high frame rates with long exposures can produce undesirable effects such as image overlapping.
     

  • Generally speaking, yes; this maintains synchronization between the camera and the wall. It is also possible to run the camera at a multiple of the frame rate being sent to the LED wall.
    However, manufacturers are experimenting with various methods, e.g. oversampling the LED screen to enable two or more cameras to shoot the background from different angles.
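Checking whether a camera rate is a clean integer multiple of the wall rate is slightly fiddly with the fractional SMPTE rates (23.976 is really 24000/1001), so exact fractions help. A small sketch:

```python
# Check whether the camera frame rate is an integer multiple of the
# rate driven to the LED wall. Exact fractions keep fractional SMPTE
# rates (e.g. 23.976 and its double 47.952) comparing cleanly.
from fractions import Fraction

def is_multiple(cam_fps: float, wall_fps: float) -> bool:
    """True if cam_fps = n * wall_fps for a whole number n >= 1."""
    cam = Fraction(cam_fps).limit_denominator(100000)
    wall = Fraction(wall_fps).limit_denominator(100000)
    ratio = cam / wall
    return ratio.denominator == 1 and ratio >= 1

print(is_multiple(48, 24))          # True  (2x)
print(is_multiple(50, 24))          # False
print(is_multiple(47.952, 23.976))  # True  (2x)
```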

  • Yes, both analog and digital camera systems can be used in a virtual set, provided synchronization can be achieved; the ARRIFLEX 416, for example, offers genlock.