How To: Implement MixCast SDK in Unreal

The MixCast SDK for Unreal enables you and your users to create powerful and memorable content from your VR application in minutes. Installation is designed to be painless and is testable without leaving the editor. MixCast also provides significant access to its internal functions, giving developers unprecedented power to customize how their users’ experiences are displayed.

This document contains a number of resources to help you with all aspects of the MixCast SDK.

If you’re having issues with your integration, see if our troubleshooting page has the answer! If not, you can try our community discussion on Discord or open a support ticket.

Technical Requirements

Note: The MixCast SDK for Unreal is designed for certain Unreal project configurations, but should generally still compile when the project is set to build for another configuration. Please let us know on our Discord if this isn’t the case for your project!

MixCast will be fully functional when your project meets all of the following requirements:

  • Running in Unreal 4.17 – 4.24
  • Building for Windows Standalone (x64)
  • If the primary user is in VR: using the SteamVR or Oculus plugin.

Integrating The SDK

Note: For a general primer on installing and using plugins, review Epic’s Unreal Engine 4 plugin system documentation.

Download the SDK Plugin

The latest stable SDK release can be found here.

Import the SDK Plugin

Install the plugin by copying the MixCast folder into Plugins at the root of your UE4 project (you may have to create this directory).
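
For reference, a typical project layout after installation looks like this (the project name and descriptor file name are illustrative):

YourProject/
    YourProject.uproject
    Content/
    Plugins/
        MixCast/
            MixCast.uplugin
            Source/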

Apply Required Project Settings

Open your project in Unreal. In the Project Settings window, select the Engine > Rendering section and ensure that the Support global clip plane for Planar Reflections option is checked.

Unreal will prompt you to restart the editor, which you need to do before proceeding.
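
If you prefer editing configuration files directly, this checkbox corresponds to the standard Unreal rendering setting below in your project’s Config/DefaultEngine.ini (verify against your engine version):

[/Script/Engine.RendererSettings]
r.AllowGlobalClipPlane=True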

Add the MixCast Actor to your Scene

Find the class named MixCast SDK Actor in the Modes > Place panel, under the All Classes list (the name may vary slightly in the UI depending on Unreal version), and add one to each of your scenes. This Actor handles initializing and managing all MixCast functionality in the SDK.


MixCast has now been integrated into your project. If you’d like, you can now test your integration or customize your project’s MixCast settings!

Testing Your Integration

To test that you’ve successfully integrated MixCast into your project, configure your own MixCast setup (don’t worry if you don’t have a physical setup for background removal; output quality isn’t a concern here), then run your application, either in-editor in VR Mode or as a standalone VR build.
By default, MixCast is configured (after setup) to run automatically, so simply running your application should cause MixCast to begin displaying to the desktop. If MixCast isn’t already running (its icon appears in the system tray when it is), start it from the Start Menu entry “MixCast/Run MixCast”.

Note: If running your project in the Unreal Engine Editor with MixCast, it’s recommended to disable CPU performance throttling for consistent results. To apply this setting, open the Editor Preferences window, select the General > Performance section, and uncheck the toggle labelled Use less CPU when in Background.

Accessing Project Settings

You can access your project’s MixCast Settings by selecting the Plugins > MixCast section in the Unreal Project Settings menu. Don’t forget to use File > Save All if you’ve modified a setting value!

Customization

Camera Class: Allows you to override the default Virtual Camera Actor that the SDK will instantiate for the user based on their configuration.

Video Input Class: Allows you to override the default Video Input Actor that the SDK will instantiate for the user based on their configuration.

Viewfinder Class: Allows you to override the default Viewfinder Actor that the SDK will instantiate for the user based on their configuration.

Rendering

Clip Mode: This option defines how the foreground layer of the experience is generated. Per Pixel means that MixCast can supply a depth value for each pixel of the output and cut off the background precisely based on that data. Planar uses the Global Clip Plane functionality in Unreal to cut off at a uniform depth value. More info here.

Translucency Mode: This option defines how the foreground layer’s translucent objects are handled. Information about how to configure this option can be found here.

Can Render Opaque BG: If enabled, tells MixCast that your experience can render a skybox or similar backdrop that completely fills the background of the scene. This is usually the case for VR experiences.

Can Render Transparent BG: If enabled, tells MixCast that your experience doesn’t have a skybox (or can hide it), so that MixCast can populate the background of the scene from another source (typically a physical video input feed). This allows for what could be described as an ‘AR’ effect within MixCast, since only select virtual props remain visible in the output.

Build

Require Command Line Flag: When enabled, MixCast remains inactive unless the application is launched with an explicit MixCast command line argument (“-mixcast”). This setting has no effect in the editor.
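
For example, with this setting enabled, a packaged build would only activate MixCast when launched with the flag (the executable name here is illustrative):

MyGame.exe -mixcast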

Enabling Accurate Transparency

The MixCast SDK for Unreal offers several options for dealing with transparent object blending in your experience, accessible within your Project Settings under the Translucency Mode option:

  • None: Uses a key color that represents the foreground cutoff in the scene’s render (usually pure black to avoid being affected by color grading) to determine which pixels to blend during compositing. Partial transparency is not supported.
  • Force Additive: Blends the scene additively with the physical world during compositing. This is only recommended for experiences designed for AR devices such as the HoloLens and Magic Leap since it emulates their approach to blending the digital and physical world.
  • From Alpha: Renders the scene’s alpha channel and samples it to perform blending during compositing. An important note on performance is that this requires an extra capture of the scene due to engine limitations, as well as some attention paid to what values your transparent shaders are writing to the alpha channel (see the note below).

Note: In order for the From Alpha Translucency Mode to achieve accurate transparency of foreground objects, your application must use Premultiplied Alpha in its materials rather than standard alpha blending (as standard alpha blending operations are not associative). This change can be made solely in the material graphs of any translucent materials by changing the Blend Mode and multiplying your colour channels by your alpha channel.
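
To see why, compare the two blend operations for a translucent source pixel (src) over a destination pixel (dst), written here as a simplified sketch:

Standard alpha blending:  out.rgb = src.rgb * src.a + dst.rgb * (1 - src.a)
Premultiplied alpha:      out.rgb = src.rgb + dst.rgb * (1 - src.a)

Because the colour is multiplied by alpha before blending, premultiplied layers compose correctly when MixCast later composites the captured foreground over another background; with standard blending, the src.a factor can’t be recovered at compositing time.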

[Image: Unreal material graph set up for premultiplied alpha]

Alternatively, you can premultiply your translucent textures by their alpha channel directly in the texture assets, in which case only the Blend Mode needs to be set in the material graph.

Configuring Per-Camera Visibility

There are sometimes cases where you want to display imagery to the VR user without affecting the results captured by MixCast. MixCast provides the ability to flag the Actors representing these objects so that they are excluded from the scene when MixCast renders it. These Actors can be flagged via a component, a Blueprint node, or a C++ function call at the start of each Actor’s lifecycle.

Using Components

  1. Add the “MixCast Actor Visibility” component to the Actor whose visibility you want to control.
  2. Configure the fields in the Details pane as desired.

Using Blueprints

  1. Add the “Set Actor Visibility for MixCast” node to your Blueprint.
  2. Connect the Execution and Target pins. The Target should point to the Actor whose visibility you want to control.
  3. Configure the fields in the node as desired.

Using C++

Call the following function with the desired values. Your AActor reference should point to the Actor whose visibility you want to control.

void UMixCastBPFunctionLibrary::SetActorVisibilityForMixCast(
    AActor* Target, 
    bool ShowForRegularCameras, 
    bool ShowForMixCastCameras, 
    EMixCastCameraMode MixCastCameraMode, 
    ERenderTypeCondition RenderTypeCondition, 
    EPerspectiveCondition PerspectiveCondition
)
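
As a sketch, such a call might be made from an Actor’s BeginPlay. The class, header name, and enum values below are illustrative placeholders; check the SDK headers for the actual member names:

#include "MixCastBPFunctionLibrary.h" // assumed header name

void APasswordPanel::BeginPlay() // hypothetical Actor subclass
{
    Super::BeginPlay();

    // Keep this Actor visible to the VR user but hide it from MixCast cameras.
    UMixCastBPFunctionLibrary::SetActorVisibilityForMixCast(
        this,
        /*ShowForRegularCameras=*/ true,
        /*ShowForMixCastCameras=*/ false,
        EMixCastCameraMode::All,        // placeholder enum value
        ERenderTypeCondition::Always,   // placeholder enum value
        EPerspectiveCondition::Always); // placeholder enum value
}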

Here are some examples where this could be applied:

  • Login Screen with Password Input (User only – Hidden from MixCast)
  • Heads-up Display such as Health Bars (User or First Person MixCast Camera)
  • Player Accessories such as Hats, Weapons, etc (Third Person MixCast Camera)
  • Avatar (Virtual+Third Person MixCast Camera)

Further Customization

The basic MixCast SDK integration allows end users to create engaging media however they interact with the experience, but the MixCast SDK also supports more complex behaviour, provided you’re working with us directly to tap into that functionality. Contact us if you think this is something you’d like to explore!

Raising Experience Events

When working with MixCast to create a more tailored content creation process (contact us for details!), you’ll likely need to supply the MixCast client with additional information about what’s occurring in your experience. The MixCast SDK provides a simple, one-step interface for both Blueprints and C++ to send an “Experience Event” from your title to the MixCast Client for processing. Firing these events has virtually no performance impact, and can be done without concern for the current status of the user’s MixCast system (such as checking for the MixCast Client’s presence).

Blueprints

This example sends a simple event with no additional information aside from a unique Event ID of your choice.

This example sends an event with an argument: the amount of damage taken.

C++

Call the following function:

void UMixCastBPFunctionLibrary::SendExperienceEvent(FString eventStr)

If the event has no arguments, the eventStr is just a unique Event ID of your choice. If this event has arguments, the eventStr should have the format: eventId(arg1Val,arg2Val,arg3Val).
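
For example, mirroring the Blueprint examples above, an event with an argument and an event without one might be sent like this (the Event IDs and surrounding class are hypothetical):

#include "MixCastBPFunctionLibrary.h" // assumed header name

void AMyCharacter::HandleDamage(float DamageAmount)
{
    // Event with one argument, using the eventId(arg1Val) format.
    UMixCastBPFunctionLibrary::SendExperienceEvent(
        FString::Printf(TEXT("player_damaged(%.1f)"), DamageAmount));

    // Event with no arguments: just a unique Event ID.
    UMixCastBPFunctionLibrary::SendExperienceEvent(TEXT("boss_fight_started"));
}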

Note that however you send your events, nothing will happen at runtime until MixCast is also running and we’ve worked with you on your MixCast Moment sequencing logic (contact us for more info!).

Adding Custom Tracking

Understanding the locations of certain objects within your experience is sometimes helpful to MixCast’s operations (ex: specifying where a virtual security camera should be positioned for rendering). The MixCast SDK for Unreal makes it easy to expose the position and rotation of a specified Actor/SceneComponent with a number of options.

Using Components

  1. Add the “Custom Tracked Object” component to the Actor whose SceneComponent you want to track.
  2. (Optional) If you aren’t tracking the Actor’s root component, make the Custom Tracked Object component a child of the SceneComponent to track.
  3. Configure the fields based on discussions with the MixCast team.

Additional Information

This section contains additional information that developers may find helpful in implementing and using the MixCast SDK for Unreal.

Updating The SDK

When updating to a new version of the MixCast SDK plugin for Unreal, close the Unreal Editor first, then delete the old Plugins/MixCast folder and copy in the new one. You may also need to close Visual Studio.

Known Issues & Limitations

Known Issues

  •  SteamVR:
    • Prior to Unreal Engine 4.20, MixCast doesn’t support setting your VR tracking space to Eye Level.
  •  Oculus:
    • MixCast currently requires that SteamVR is running for compositing/tracking to function correctly, even if the Oculus plugin is active and not the SteamVR plugin. (Resolved in 2.4.0 – currently in Beta)