How To: Implement MixCast SDK for Unity

The MixCast SDK for Unity enables you and your users to create powerful and memorable content from your VR application in minutes. Installation is designed to be painless and is testable without leaving the editor. MixCast also provides significant access to its internal functions, giving developers unprecedented power to customize how their users’ experiences are displayed.

This document contains a number of resources to help you with all aspects of the MixCast SDK:

If you’re having issues with your integration, see if our troubleshooting page has the answer! If not, you can try our community discussion on Discord or open a support ticket.

Technical Requirements

Note: The MixCast SDK for Unity is designed for certain Unity project configurations, but should generally still compile when the project is set to build for another configuration. Please let us know on our Discord if this isn’t the case for your project!

MixCast will be fully functional under the following scenarios (all conditions must be true):

  • Unity version: 5.4.0 or greater
  • Build target: Windows platform x86_64 – Editor or Standalone
  • Graphics API:
    • DirectX 11
    • DirectX 12 (supported in MixCast 2.4.1+)
  • Render Pipeline: Default / LWRP (v6.9) / URP (v7.2)
  • Scripting Backend: Mono
  • API Compatibility Level: .NET 2.0 / .NET 3.5 / .NET 4.x
  • One or more of the following platforms in use:
    • SteamVR
    • Oculus
    • OpenXR (supported in MixCast 2.4.1+)

Integrating the SDK

Integrating MixCast into your project shouldn’t require any changes to existing code, and only takes a few minutes!

Download the SDK Package

The latest SDK release can be found here.

Import the SDK Package

Open your Unity project, then open the SDK package from either the File Explorer or the Unity Editor. When the Import screen opens, select Import, then wait for Unity to finish importing and compiling the project.

Add the MixCast SDK Prefab to your Scene

The only modification required to your project in order to integrate the MixCast SDK is to add a single prefab to your first scene. The prefab is called “MixCast SDK” and can be found under Assets/MixCast/Prefabs. Drag the prefab into the root of your scene and hit CTRL+S to save it. That’s all there is to it!


MixCast has now been integrated into your project. Take a moment to read about whether your build process needs to account for MixCast, and then, if you’d like, you can test your integration or customize your project’s MixCast settings!

Building Your Project

For most developers, the process of building your project in Unity will be identical before and after installing the MixCast SDK. However, if you have created custom steps in your build pipeline, please read the section below!

Note: MixCast automatically deactivates itself for builds that target unsupported platforms; you should not need to remove the files from the project if the project can build to multiple platforms.

Project Scripting Defines

MixCast makes use of Unity’s Scripting Defines system to activate code for compatibility with the SteamVR and Oculus SDK plugins. By default, MixCast will automatically add these defines (either MIXCAST_STEAMVR or MIXCAST_OCULUS) to the project settings when the presence of those plugins is detected under the project’s Assets folder.

If your build pipeline overwrites the set of Scripting Defines used for the build (typically via a call to PlayerSettings.SetScriptingDefineSymbolsForGroup), you need to ensure the proper MixCast flags are included as well. This can be achieved in a few ways:

  • Calling BlueprintReality.MixCast.ScriptDefineManager.EnforceAppropriateScriptDefines() after you’ve called PlayerSettings.SetScriptingDefineSymbolsForGroup. This will append MixCast’s flags to the existing list you just set.
  • Adding the flags to your own custom list of defines manually, using the following logic:
    • MIXCAST_STEAMVR – if you have the SteamVR plugin added to your project
    • MIXCAST_OCULUS – if you have the Oculus Utilities plugin added to your project
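The first approach might look like the following sketch of an editor build script (the class name CustomBuildPreprocessor and the define MY_GAME_DEFINE are hypothetical; ScriptDefineManager is the MixCast SDK class named above):

```csharp
using UnityEditor;
using BlueprintReality.MixCast;

// Hypothetical custom build step that overwrites the project's defines,
// then restores MixCast's flags as described above.
public static class CustomBuildPreprocessor
{
    public static void ApplyBuildDefines()
    {
        // Overwriting the define list removes any MixCast flags that were set...
        PlayerSettings.SetScriptingDefineSymbolsForGroup(
            BuildTargetGroup.Standalone,
            "MY_GAME_DEFINE;ANOTHER_DEFINE");

        // ...so re-append the appropriate MixCast flags afterward.
        ScriptDefineManager.EnforceAppropriateScriptDefines();
    }
}
```

Calling EnforceAppropriateScriptDefines() last ensures the MixCast flags survive regardless of what your pipeline set before it.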

Testing your Integration

To test that you’ve successfully integrated MixCast into your project, you should configure your own MixCast setup (don’t worry if you don’t have a physical setup for background removal, output quality isn’t an issue) and then run your application. A standalone build is recommended for testing, although MixCast can also be run in the Unity Editor!

If you don’t have MixCast already running (icon visible in the system tray), start it from the Start Menu by typing “Run MixCast”. With both the MixCast process and your application running, you should now see the MixCast output appear on the desktop as well!

Accessing Project Settings

You can access your project’s MixCast Settings by selecting the menu item labeled MixCast/Open Project Settings.

This selects the Project Settings asset in the project so the settings are displayed in the Inspector window. Here is a breakdown of the options made available by section:


Override Quality Settings AA

If true, causes MixCast to use the next option (Anti Aliasing) to determine how much anti-aliasing to apply to the MixCast output, using multisample anti-aliasing (MSAA). If false, causes MixCast to use the anti-aliasing setting shared with the first person view, found under Edit/Project Settings/Quality.

Note: MSAA is only supported by Unity in the Forward render path and is ignored otherwise.

Anti Aliasing

If set to a value other than Disabled, will enable Anti-aliasing on the MixCast output, so long as the render path for the MixCast camera is Forward rendering.


Using PMA

If true, this enables MixCast to use Pre-Multiplied Alpha rendering to represent transparent objects accurately in the foreground during Buffered Mode compositing. This option requires some additional work by the developer as described here. If false, MixCast uses the standard transparency model.

Grab Unfiltered Alpha

If true, this causes MixCast to store the transparency information for the scene from before post-processing is applied. This can help in the case that post-processing isn’t respecting the alpha channel of the color render buffer. If false, MixCast takes the final transparency information.


Specify Lights Manually

Enable this option to completely avoid memory allocations from built-in Unity light-gathering calls and optimize your project for the best broadcasting experience. Add a MixCastLight component to each light you want to contribute to relighting.

Directional Light Power

Value that multiplies into the power calculation for how directional lights affect the subject during relighting.

Point Light Power

Value that multiplies into the power calculation for how point lights affect the subject during relighting.


Visualize Subject in Scene View

If true, when running your application with MixCast in the Unity Editor, you will be able to see the subject in the Scene View window for debugging.

Apply SDK Flags Automatically

If true, MixCast will automatically detect which additional SDKs are present in the project to support, such as the SteamVR or Oculus SDK, and define Project Script Define flags (MIXCAST_STEAMVR, MIXCAST_OCULUS, etc) to enable specific code within MixCast.

If false, these support flags can be manually toggled via the following controls.

Enabling Accurate Transparency

In order to properly handle transparency of foreground objects, your application must use Premultiplied Alpha (PMA) in its transparent materials rather than standard alpha blending (as standard alpha blending operations are not associative).

Thankfully MixCast supplies an easy solution to produce fully accurate transparency rendering in your project in mixed reality!

  1. Flagging PMA Blending in MixCast
    1. Select the menu item MixCast/Open Project Settings to open the MixCast Project Settings
    2. Set the Using PMA checkbox to True
  2. Enforce PMA Blending in Project Shaders/Materials
    1. Select the menu item MixCast/Fix Shaders to open the MixCast Shader Wizard
    2. Select the Generate Shaders and Update Materials button
    3. Select the menu item File/Save Project in Unity

Note: This process only affects the mixed reality output; graphics should appear unchanged to the user 🙂

Note: The second set of steps needs to be repeated any time a transparent shader is added, modified, or used with a new material.

Configuring Per-Camera Visibility

When your 3D scene is being rendered from multiple cameras, the need sometimes arises to display some elements to only one type of camera and not others. MixCast offers a number of methods to ensure that only the desired objects end up being seen by the VR user and MixCast cameras.

Layer Culling Mask

A method of filtering visibility of objects that most developers are familiar with is using the Culling Mask field on a Camera component to exclude objects on certain layers from rendering. By default, MixCast cameras are set to copy the Culling Mask from the Main Camera (the headset view), but if you need finer control over the mask, here’s how to configure it.

  1. Override the Layer Cam Prefab as described on this page
  2. Uncheck the Culling Mask field on the SetCameraParametersFromMainCamera component from the previous instructions.
  3. Set the Culling Mask field on the Camera component to whichever layers you want rendered to the MixCast camera’s view.

Per-Renderer Masking

The other main method of displaying or hiding objects from the MixCast camera is by using a provided script called SetRenderingForMixCast. This component handles ensuring that the specified Renderer components (MeshRenderer, ParticleRenderer, etc) are only drawn for the configured camera.

  1. Attach SetRenderingForMixCast to the GameObject you want to hide.
  2. (Optional) Manually specify the Renderer components to control by adding them to the Targets list. If this list is empty, it will be filled on enable by all the Renderer components in the GameObject or its children.
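If you prefer to attach the component from code rather than in the Inspector, a minimal sketch (assuming SetRenderingForMixCast lives in the BlueprintReality.MixCast namespace, consistent with the SDK class referenced earlier; the wrapper class name is hypothetical):

```csharp
using UnityEngine;
using BlueprintReality.MixCast;

// Hypothetical helper that hides this GameObject's Renderers from MixCast cameras.
public class HideFromMixCast : MonoBehaviour
{
    void Awake()
    {
        // With its Targets list left empty, the component gathers all Renderer
        // components on this GameObject and its children when it is enabled.
        gameObject.AddComponent<SetRenderingForMixCast>();
    }
}
```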

Additional Notes

Controllers: Since the user in mixed reality footage is already shown holding their physical controllers by the real camera, the virtual controller models are at best redundant and at worst distracting. To rectify this, employ the visibility techniques mentioned above to exclude the virtual controller models from rendering. If you’re using the SteamVR-provided SteamVR_RenderModel component to render your user’s controllers, MixCast provides the more tailored component SetRenderingControllerForMixCast, which should be attached to the same GameObject as that component.

Applying Image Post-Processing

If your application employs Post-Processing techniques such as Bloom or Color Grading to enhance the user’s view of the experience, you probably also want MixCast’s output to reflect that visual state. To configure this:


  1. Create a new Prefab asset in your project and open it for editing
  2. Add a Camera component and any other components required for this pass of rendering. You can add the supplied SetCameraParametersFromMainCamera component if you want to have some of the camera’s parameters copied from your app’s main HMD camera.
  3. Open the MixCast Project Settings by selecting MixCast -> Open Project Settings in the Unity menu.
  4. Set the Layer Cam Prefab to your new Prefab. The Layer Cam applies its Post-Processing to the individual scene layers (background and foreground) as they’re rendered, so this camera renders twice per MixCast camera frame. It can access Depth information from the scene, so most effects are supported, although effects on the foreground may produce unexpected results, especially if they rely on scene depth information.

Further Customization

The basic MixCast SDK integration allows end users to create engaging media however they interact with your experience, but the MixCast SDK also allows more complex behaviour to be configured. At the moment, these features are only available when working with us directly, so contact us if this is something you’d like to explore!

Raising Experience Events

When working with MixCast to create a more tailored content creation process (contact us for details!), you’ll likely need to supply the MixCast client with additional information about what’s occurring in your experience. The MixCast SDK provides a simple, one-step C# interface to send an “Experience Event” from your title to the MixCast Client for processing. Firing these events has virtually no performance impact, and can be done without concern for the current status of the user’s MixCast system (such as checking for the MixCast Client’s presence).

Using C#

Call the following function:

void MixCastSdk.SendCustomEvent(string eventStr)

If your event doesn’t have any variables, your eventStr should just be a unique Event ID of your choice. Example (with a hypothetical Event ID):

MixCastSdk.SendCustomEvent("levelStarted");

If your event has variables, your eventStr should have the format: eventId(arg1Val,arg2Val,arg3Val). Example:

MixCastSdk.SendCustomEvent(string.Format("damageTaken({0})", amountOfDamageTaken));
//ex: MixCastSdk.SendCustomEvent("damageTaken(12.5)");

Adding Custom Tracking

Understanding the locations of certain objects within your experience is sometimes helpful to MixCast’s operations (ex: specifying where a virtual security camera should be positioned for rendering). The MixCast SDK for Unity makes it easy to expose the position and rotation of a specified GameObject/Transform with a number of options.

Using Components

  1. Add the MixCast > Custom Tracked Object component to the GameObject whose Transform you want to track
    2. Configure the component’s fields based on discussions with the MixCast team.

Additional Information

This section contains additional information that developers may find helpful in implementing and using the MixCast SDK for Unity.

Updating the SDK

When updating to a new version of the MixCast SDK for Unity, it’s best to first delete the old installation in order to avoid issues with renamed/removed folders and files. Since Unity keeps project DLL libraries loaded in the Editor as long as it’s open, you should close the Unity Editor and Visual Studio before deleting the old installation folders.

The following folders should be deleted prior to the import of the MixCast SDK:

  • Assets/MixCast
  • Assets/BlueprintReality (if updating from a much older version)

Note that changes you’ve made to the MixCast Project Settings (as described here) won’t be lost: the settings file housing this information is created at Assets/Resources/MixCast_ProjectSettings.asset, outside the SDK folders, for exactly this reason.

Example Projects

We’ve integrated the MixCast SDK for Unity into some example Unity projects so you can see how a project is intended to look post-integration. Take a look for the one most closely matching your project’s setup below. Note that the projects differ very little from one another, since MixCast SDK setup barely changes between configurations.

MixCast SDK v2.3.4