How To: Implement MixCast SDK in Unreal

The MixCast SDK for Unreal enables you and your users to create powerful and memorable content from your VR application in minutes. Installation is designed to be painless and is testable without leaving the editor. MixCast also provides significant access to its internal functions, giving developers unprecedented power to customize how their users’ experiences are displayed.

This document covers all aspects of working with the MixCast SDK, from installation through customization.

If you’re having issues with your integration, see if our troubleshooting page has the answer! If not, you can try our community discussion on Discord or open a support ticket.

Technical Requirements

Note: The MixCast SDK for Unreal is designed for certain Unreal project configurations, but should generally still compile when the project is set to build for another configuration. Please let us know on our Discord if this isn’t the case for your project!

MixCast will be fully functional when your project meets all of the following requirements:

  • Running in Unreal 4.17 – 4.24
  • Building for Windows Standalone (x64)
  • If primary user is in VR: using the SteamVR or Oculus plugins.

Integrating The SDK

Note: For a general primer on installing and using plugins, review Epic’s Unreal Engine 4 plugin system documentation.

Download the SDK Plugin

The latest stable SDK release can be found here.

Import the SDK Plugin

Install the plugin by copying the MixCast folder into Plugins at the root of your UE4 project (you may have to create this directory).

Apply Required Project Settings

Open your project in Unreal. In the Project Settings window, select the Engine > Rendering section and ensure that the Support global clip plane for Planar Reflections option is checked.

Unreal will prompt you to restart the editor, which you need to do before proceeding.

Add the MixCast Actor to your Scene

Find the class named MixCast SDK Actor in the Modes > Place panel, under the All Classes list (the exact name may vary slightly depending on your Unreal version), and add one to each of your scenes. This Actor initializes and manages all MixCast functionality in the SDK.


MixCast has now been integrated into your project. If you’d like, you can now test your integration or customize your project’s MixCast settings!

Testing Your Integration

To test that you’ve successfully integrated MixCast into your project, configure your own MixCast setup (don’t worry if you don’t have a physical setup for background removal; output quality isn’t important for this test), then run your application, either in-editor in VR Mode or as a standalone VR build.
By default, MixCast is configured (after setup) to run automatically, so simply running your application should cause MixCast to begin displaying to the desktop. If MixCast isn’t already running (its icon visible in the system tray), start it from the Start Menu entry “MixCast/Run MixCast”.

Note: If running your project in the Unreal Engine Editor with MixCast, it’s recommended to disable CPU performance throttling for consistent results. To apply this setting, open the Editor Preferences window, select the General > Performance section, and uncheck the toggle labelled Use less CPU when in Background.

Accessing Project Settings

You can access your project’s MixCast Settings by selecting the Plugins > MixCast section in the Unreal Project Settings menu. Don’t forget to use File > Save All if you’ve modified a setting value!

Here is a breakdown of the options made available by section:

Editor / Build

Require Command Line Flag

When this setting is enabled, MixCast remains inactive unless the application is launched with the explicit MixCast command line argument “-mixcast”. This setting has no effect in the editor.

Enabling Accurate Transparency

In order to properly handle transparency of foreground objects, your application must use Premultiplied Alpha in its materials rather than standard alpha blending (as standard alpha blending operations are not associative). This change can be made solely in the material graphs of any translucent materials by changing the Blend Mode and multiplying your colour channels by your alpha channel.

(Image: Unreal premultiplied alpha material)

Alternatively, your translucent assets can be premultiplied by their alpha channel directly in your image assets and then only the blend mode can be set in the material graph.

MixCast SDK for Unreal 2.0.1+

This update adds a feature for titles built to run as AR applications: Forced Additive Blending. This Project Setting causes MixCast to ignore the alpha channel of your scene and blend the scene with the camera feed using Additive Blending throughout.

Configuring Per-Camera Visibility

There are sometimes cases where you want to display imagery to the VR user without affecting the results captured by MixCast (for example: a HUD that would be distracting when drawn over the user’s face in mixed reality). MixCast provides the ability to flag the Actors representing these objects so that they are excluded from the scene when MixCast renders it. To flag these Actors, call either a Blueprint or C++ function at the start of each Actor’s lifecycle.

SDK Version 2.0.0 and lower

Using Blueprints

Add the Set Actor Visibility for MixCast node to a Blueprint of your choosing, then assign the Actor pin to the Actor you’d like hidden for MixCast viewpoints.

Using C++

Call the following function, providing your AActor reference and visible set to false:

void UMixCastBPFunctionLibrary::SetActorVisibilityForMixCast(AActor* actor, bool visible)
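For context, a minimal sketch of calling this from an Actor’s BeginPlay (the function is from the SDK as shown above; the surrounding class name is a placeholder of ours, and this fragment only compiles inside an Unreal project with the plugin installed):

```cpp
// Sketch: hide this Actor from MixCast's rendered viewpoints as soon
// as it spawns. AMyHudActor is a placeholder class name.
#include "MixCastBPFunctionLibrary.h"

void AMyHudActor::BeginPlay()
{
    Super::BeginPlay();
    UMixCastBPFunctionLibrary::SetActorVisibilityForMixCast(this, /*visible=*/ false);
}
```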

SDK Version 2.0.1 and up

This update expands on the options described above to give you full control over the types of cameras that an Actor should be rendered by. Here are some examples where this could be applied:

  • Regular Camera Only: Login Screen with Password Input
  • Regular Camera || First Person MixCast Cameras: Heads-up Display such as Health Bars
  • Third Person MixCast Cameras: Player Accessories such as Hats, Weapons, etc
  • Third Person && Virtual MixCast Cameras: Avatar Body+Head

Using Components

  1. Add the “MixCast Actor Visibility” component to the Actor whose visibility you want to control.
  2. Configure the fields in the Details pane as desired.

Using Blueprints

  1. Add the “Set Actor Visibility for MixCast” node to your Blueprint.
  2. Connect the Execution and Target pins. The Target should point to the Actor whose visibility you want to control.
  3. Configure the fields in the node as desired.

Using C++

Call the following function with the desired values. Your AActor reference should point to the Actor whose visibility you want to control.

void UMixCastBPFunctionLibrary::SetActorVisibilityForMixCast(
    AActor* Target,
    bool ShowForRegularCameras,
    bool ShowForMixCastCameras,
    EMixCastCameraMode MixCastCameraMode,
    ERenderTypeCondition RenderTypeCondition,
    EPerspectiveCondition PerspectiveCondition)
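As a hedged sketch of how a call might look in an Actor’s BeginPlay, for the HUD case from the examples above (the function and parameter names come from the signature above, but the class name and the enum member names are placeholders of ours; consult the MixCast plugin headers for the actual EMixCastCameraMode, ERenderTypeCondition, and EPerspectiveCondition values, and note this fragment only compiles inside an Unreal project with the plugin installed):

```cpp
// Illustrative sketch only: keep this Actor visible to the player's
// regular camera but hide it from all MixCast cameras (e.g. a HUD).
#include "MixCastBPFunctionLibrary.h"

void AMyHudActor::BeginPlay()
{
    Super::BeginPlay();

    UMixCastBPFunctionLibrary::SetActorVisibilityForMixCast(
        this,
        /*ShowForRegularCameras=*/ true,
        /*ShowForMixCastCameras=*/ false,
        EMixCastCameraMode::All,          // placeholder member name
        ERenderTypeCondition::Always,     // placeholder member name
        EPerspectiveCondition::Always);   // placeholder member name
}
```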

Further Customization

The basic MixCast SDK integration will allow end users to create engaging media however they’re interacting with the experience, but the MixCast SDK also allows for more complex behaviour to be configured as well, provided you’re working with us directly on tapping into that functionality. Contact us if you think this is something you’d like to explore!

Raising Experience Events

When working with MixCast to create a more tailored content creation process (contact us for details!), you’ll likely need to supply the MixCast client with additional information about what’s occurring in your experience. The MixCast SDK provides a simple, one-step interface for both Blueprints and C++ to send an “Experience Event” from your title to the MixCast Client for processing. Firing these events has virtually no performance impact, and can be done without concern for the current status of the user’s MixCast system (such as checking for the MixCast Client’s presence).

Using Blueprints

The examples below illustrate two cases: sending a simple event with no additional information aside from a unique Event ID of your choice, and sending an event with an argument (the amount of damage taken).

Using C++

Call the following function:

void UMixCastBPFunctionLibrary::SendExperienceEvent(FString eventStr)

If the event has no arguments, the eventStr is just a unique Event ID of your choice. If this event has arguments, the eventStr should have the format: eventId(arg1Val,arg2Val,arg3Val).

Note that no matter how you send your events, nothing will happen at runtime unless MixCast is also running and we’ve worked with you on your MixCast Moment sequencing logic (contact us for more info!).

Adding Custom Tracking

Understanding the locations of certain objects within your experience is sometimes helpful to MixCast’s operations (ex: specifying where a virtual security camera should be positioned for rendering). The MixCast SDK for Unreal makes it easy to expose the position and rotation of a specified Actor/SceneComponent with a number of options.

Using Components

  1. Add the “Custom Tracked Object” component to the Actor whose SceneComponent you want to track.
  2. (Optional) Make the Custom Tracked Object a child of the SceneComponent to track if not tracking the Actor’s root component.
  3. Configure the fields based on discussions with the MixCast team.

Additional Information

This section contains additional information that developers may find helpful in implementing and using the MixCast SDK for Unreal.

Updating The SDK

When updating to a new version of the MixCast SDK plugin for Unreal, close the Unreal Editor first, then delete the existing Plugins/MixCast folder and copy in the new one. You may also need to close Visual Studio.

Known Issues & Limitations

Known Issues

  •  SteamVR:
    • Prior to Unreal Engine 4.20, MixCast won’t support having your VR tracking space set to Eye Level.
  •  Oculus:
    • MixCast currently requires SteamVR to be running for compositing/tracking to function correctly, even when your project uses the Oculus plugin rather than the SteamVR plugin.