Quick Start

Last updated: March 05, 2023


There are multiple downloads for the RealtimeMeshComponent, depending on the Unreal Engine version you’re using.

  Download v5 for UE 5.0+ (Current)

  All other releases for older engine versions can be found here


Installation from Marketplace

  1. Like any other plugin from the Marketplace, you will need to install it to the engine first through the Marketplace in the Epic Games Launcher.

  2. For Blueprint usage, you’re basically ready to go!

  3. For C++ usage, follow the steps below to get C++ access!

Installation from GitHub

  1. You will need to start with a code project. This does mean you’ll need all the build dependencies necessary to create C++ projects in Unreal. However, this doesn’t mean you’ll need to use C++ at all, just that the engine needs to be able to build the plugin. To turn a Blueprint project into a code project, you can go to “Add New C++ Class” in the editor and it should set everything up for you!

  2. You can download the RMC from one of the links above.

  3. Navigate to your project folder and, if it doesn’t already exist, create a ‘Plugins’ folder in the root of your project.

  4. Within the Plugins folder, create a folder called RealtimeMeshComponent and copy the contents of the version you downloaded into it, making sure not to nest subfolders: all the files/folders alongside the .uplugin file should sit directly in your newly created /Plugins/RealtimeMeshComponent folder.

  5. At this point you should be able to relaunch the project and the editor should detect the new plugin and compile it for you!

  6. You should be able to start using the RMC!

Using RMC From Code

  1. To make the RMC available in your C++ project, you must first install it from either the marketplace or GitHub as described above.

  2. Next, in your project’s build file ‘YourProjectName.Build.cs’, you must add the RealtimeMeshComponent module to your module dependencies.

  3. Now you should be able to #include the headers like any other plugin and use the RMC from C++!
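Step 2 above refers to the dependency list in your Build.cs. A minimal sketch, assuming your project started from the standard template (your existing module list may differ):

```csharp
// YourProjectName.Build.cs
PublicDependencyModuleNames.AddRange(new string[] {
    "Core", "CoreUObject", "Engine",
    "RealtimeMeshComponent" // makes the RMC headers available to your module
});
```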

Using RMC from Blueprint

Using the RMC from Blueprint is simple, once the plugin is installed to the engine or project it’s ready to go!

Mesh Basics

How Meshes are Represented

Meshes are represented through a vertex buffer and an index buffer, which work in tandem. The vertex buffer contains a list of all the unique vertices in the mesh, including their position, normal, tangent, color, and texture coordinates. The index buffer, in the case of the RMC, is a triangle list: a contiguous list of indices into the vertex buffer, taken 3 at a time, with each group of 3 representing the 3 points of a triangle.

Mesh Structure Example

Representation of a triangle and its arrays

Vertex Buffer Layout

The vertex buffer is a list of all the unique vertices in the mesh, along with their corresponding data. There are several options, which can affect the memory consumption, rendering performance, and visual quality of the mesh.

The elements of the vertex buffer are:

  1. Position: This contains the position in local space of the vertex. This is an FVector, as it contains an (X, Y, Z) position.
  2. Normal/Tangent: This contains the normal and tangent at this vertex and is used for lighting and materials. Normals can be in either normal or high precision format. High precision corresponds to the engine’s high precision tangent basis, so this option is really only useful when the engine is configured to use high precision tangents. The tangents take either 8 bytes of memory at normal precision or 16 bytes at high precision. The normal points away from the triangle, towards the camera when the triangle is visible, and the tangent points towards the increasing U coordinate of the UVs. The bitangent (sometimes called the binormal or cotangent) can be calculated from those two and points towards the increasing V coordinate. Normals also determine whether you get hard shading (where you can see individual triangles) or soft shading (the usual case). For hard shading, each vertex of a triangle needs that triangle’s face normal, so vertices cannot be shared and your vertex buffer ends up around the same length as your index buffer.
  3. Color: This contains a single color value for this vertex; it is up to the material if and how it is used. It can be used for vertex painting, or just for passing extra data through to the material for other effects. This is always represented as an FColor, which packs 4 bytes: B, G, R, A.
  4. Texture Coordinates: Within UE, a mesh can have 1 to 8 channels of texture coordinates. Most meshes will only use 1, but you can have up to 8. Beware: UV channel 2 is often used for lightmaps, so you’ll need to configure that correctly. There are two options here: the channel count, and normal vs. high precision UVs. The datatype switches between FVector2DHalf (normal) and FVector2D (high precision), so each channel uses either 4 or 8 bytes of memory. The texture coordinates as a whole can therefore range from 4 bytes for a single normal precision channel up to 64 bytes for all 8 channels at high precision.

Depending on the configuration above, the total vertex size can range from 28 bytes (normal precision tangents, a single normal precision texture channel) to as much as 96 bytes (all high precision components with 8 texture coordinate channels).

Index Buffer Layout

The index buffer maps the vertices into triangles. Within the RMC this takes the form of a triangle list, so indices come in groups of 3, one after the other, to define the 3 points of each triangle; each group of 3 is independent of the others. That said, the order of the vertices within a triangle does matter for culling (see Winding Order below), and the order of triangles within the buffer can have performance effects through things like cache coherency, locality, the transform cache, and overdraw.

The index buffer can use either a 16-bit or 32-bit integer per element, so it will consume 2 or 4 bytes per element. 16-bit is the default, as it can handle meshes of up to 2^16 or 65,536 vertices.

Winding Order

Winding order refers to the order in which a triangle’s vertices are referenced. It is important because one of the most common ways of improving performance is backface culling, which looks at the direction the triangle winds to determine whether it is a front or back face; back faces are culled in hardware. By default Unreal Engine culls clockwise triangles, so the vertices referenced in the index list should wind counter-clockwise when viewed from the visible side.

It is possible to disable backface culling in Unreal by using two sided materials. If your mesh appears to render inside out, then you need to reverse the winding order.

Mesh Hierarchy/LOD

Each RealtimeMeshComponent can have 1 to 8 Levels of Detail (LODs), each of which can have any number of section groups, each of which in turn can have any number of sections. Each LOD is separate from the others, and so can have different numbers of sections and different materials bound to those sections.

Each LOD has a ScreenSize associated with it. This is the portion of the screen the bounding volume has to cover before this LOD is rendered.

Material Slots

Unlike the ProceduralMeshComponent and the old RMC, materials are handled similarly to how StaticMesh handles them. URealtimeMesh holds a number of material slots, set up by SetupMaterialSlot; each has an index, name, and material. You can look these slots up by index or name, and each mesh section can be assigned to any slot.

The RealtimeMeshComponent, like the StaticMeshComponent, has override materials (which was how RMC and PMC previously handled materials). These materials override the slots by index, and allow different components to bind different materials even when they share the same underlying mesh data.
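A hedged sketch of the slot workflow in Unreal C++ (not standalone-compilable: SetupMaterialSlot is the call named above, SetMaterial is the standard component override, and the function and variable names are illustrative):

```cpp
// Sketch only: assumes the mesh and component already exist.
void ConfigureMaterials(URealtimeMesh* RealtimeMesh,
                        URealtimeMeshComponent* Component,
                        UMaterialInterface* BaseMaterial,
                        UMaterialInterface* OverrideMaterial)
{
    // Register slot 0 with a name and default material; mesh sections can
    // then be assigned to this slot by index or name.
    RealtimeMesh->SetupMaterialSlot(0, FName("PrimaryMaterial"), BaseMaterial);

    // Per-component override: this component draws slot 0 with a different
    // material, while other components sharing the mesh keep BaseMaterial.
    Component->SetMaterial(0, OverrideMaterial);
}
```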


Collision

Collision in the RMC has a few different parts, including basic settings and two collision types, each of which has its own benefits and limitations.

Simple Collision

Simple collision is made up of Boxes, Spheres, Cylinders, and Convex Elements. These are the basis of simple collision, which is required for movable objects that interact with the environment and perform overlap tests. All of these shapes are convex (any line segment between two points in the volume lies entirely within the volume), because concave collision detection is far more complex. You can have none, one, or multiple of each of these 4 shapes, but you must have at least one to have any form of simple collision/physics simulation.

Convex Elements are convex mesh objects. These can be generated directly by you, or through a process known as convex decomposition, where you take a source mesh and generate one or more convex shapes to represent it. This is how the engine handles arbitrary shapes for collision. Convex elements are slower to compute than the other primitives, and are limited to 256 vertices.

Note: Collision doesn’t have to be perfect, since the primitives are invisible. Sacrificing accuracy for performance is fine: avoid convexes where a simpler primitive will do.

Complex collision

Complex collision is made from a triangle mesh. This can be either your renderable mesh data or a custom simplified mesh just for collision. Usually the latter is better for collision performance, but takes extra effort to generate. Two complex collision shapes cannot perform collision tests against each other, so objects with only complex collision are not allowed to simulate physics. A line trace set to trace complex will return the complex collision mesh’s triangle index on hit.


Collision Settings

The collision settings object is used to set the simple collision shapes, as well as some basic collision cooking settings.

Collision cooking is required for convex elements and complex meshes. This process builds internal structures for performing high-performance collision detection, but it takes a non-trivial amount of time on complex meshes. To help with this, the RMC supports async cooking, which can be turned on or off through the flag bUseAsyncCooking.

Complex-as-simple collision is where a shape has no simple collision, and the complex collision is used for things such as line traces that would normally use the simple collision. This setting is controlled by bUseComplexAsSimple on the collision settings object.

CookingMode can be set to either CollisionPerformance or CookingPerformance. The first prioritizes efficient collision detection at the cost of a little more effort in cooking to build optimal collision structures. The second prioritizes quickly cooking meshes over having optimal collision structures.
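A hedged sketch of applying these settings in Unreal C++ (not standalone-compilable; the flag names come from the text, but the type, enum, and setter names are assumptions that may differ between RMC versions):

```cpp
// Sketch only: type and setter names are assumptions based on the text above.
FRealtimeMeshCollisionSettings CollisionSettings;
CollisionSettings.bUseAsyncCooking = true;     // cook collision off the game thread
CollisionSettings.bUseComplexAsSimple = false; // keep dedicated simple shapes
CollisionSettings.CookingMode = ERealtimeMeshCollisionCookingMode::CollisionPerformance;
RealtimeMesh->SetCollisionSettings(CollisionSettings);
```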

Component Structure

The Realtime Mesh Component is made up of several distinct parts, each of which provides a portion of the overall functionality. Together, the combination can be extremely extensible, or very simple. If you’re familiar with how UStaticMesh and UStaticMeshComponent work, this will be a similar configuration, with several things built on top.


URealtimeMeshComponent

URealtimeMeshComponent is the main component that allows you to place a URealtimeMesh in the scene and interact with it exactly like a UStaticMeshComponent. It’s possible to have multiple URealtimeMeshComponents all sharing a single URealtimeMesh, meaning a single copy of the GPU buffers can be shared among several individual components that all act independently while drawing the same mesh. This is not the same as instancing, which draws multiple copies of the mesh at different locations/rotations within the same component, but the two features work together.


URealtimeMesh

URealtimeMesh is the data carrier, and is abstract on its own as it relies on a concrete implementation to fully function. It is responsible for owning the GPU buffers and the physics object, which can then be used by one or more URealtimeMeshComponents. When you update the mesh data of a URealtimeMesh, all the linked URealtimeMeshComponents receive the mesh update together and start rendering the new mesh. The base URealtimeMesh does not store any data in main memory; it simply sends the data to VRAM to be used by the rendering pipeline. Collision works similarly: the concrete implementation of URealtimeMesh decides where it gets the data and whether to store it.


URealtimeMeshSimple

URealtimeMeshSimple is the most direct and simple-to-use implementation of URealtimeMesh. It works much like the ProceduralMeshComponent, with added features. You set up your LODs, section groups, and sections, provide them mesh data, and URealtimeMeshSimple takes it from there, storing all the mesh data for subsequent re-use so you can forget about it from then on.


URealtimeDynamicMesh

URealtimeDynamicMesh is the next implementation of URealtimeMesh; it provides a way of rendering UDynamicMesh objects through the RMC. This lets you use the powerful geometry scripting systems found in Unreal Engine 5, while also getting the improved rendering performance and additional features of the RMC.


URealtimeMeshComposable

URealtimeMeshComposable is the third implementation of URealtimeMesh; it uses a provider stack to get its mesh data. You can chain providers together to compartmentalize logic, and URealtimeMeshComposable will handle requesting the mesh data as it’s needed. It doesn’t store the mesh data itself, so for example a caching provider that pages to disk can pull the data in on the rare occasions it’s needed without keeping it resident in memory. This is similar to the older RMC’s provider stack, except that a provider chain can be shared among several URealtimeMeshComposables to support a factory-style setup that can create multiple components.