You can get very close to your Unity “RenderTexture + compute + blit chain” in Unreal – you just have to think in terms of Render Targets, Materials, and RDG/global shaders instead of CommandBuffers.
I’ll break it into:
- The most straightforward, Unity-style pipeline (no engine hacking, can be mostly Blueprint).
- The “real” compute path using RDG + global shaders (C++).
- A terminology / search cheat-sheet at the end.
Your Unity setup:
- RenderTexture A → RenderTexture B → RenderTexture C → final frame
via Graphics.Blit and/or compute shaders.
In Unreal, the most direct analogue without dropping into RHI/RDG is:
UTextureRenderTarget2D A → UTextureRenderTarget2D B → etc.
evaluated by Materials via Draw Material to Render Target (Blueprint node).
That gives you a pure GPU chain of passes, just like a series of Blits.
In Unreal, the basic “GPU framebuffer resource” you can pass around is a Render Target:
- Asset types:
  - Texture Render Target 2D (most common)
  - Texture Render Target 2D Array, Texture Render Target Volume, Render Target Cube if needed.
You can:
- Create them as assets in the Content Browser (right-click → Materials & Textures → Render Target).
- Or dynamically in Blueprint with Create Render Target 2D.
These are your Unity RenderTextures.
Docs and examples use Blueprint functions like Draw Material to Render Target to fill them.
In Unity your blit pass is typically a full-screen quad material:
```hlsl
fixed4 frag (v2f i) : SV_Target {
    fixed4 col = tex2D(_Source, i.uv);
    // ...process col...
    return col;
}
```
In Unreal, you create a Material that:
- Has Material Domain: Surface (or Post Process for camera effects).
- Shading Model: Unlit (you’re just writing color).
- Uses TextureObject or TextureSample nodes for texture parameters.
- Writes result into Emissive Color (since Unlit).
Crucial part: expose your input RT as a Texture Parameter:
- Add a TextureSampleParameter2D node named e.g. SourceRT.
- Blueprint will set this parameter to one of your Render Targets.
This is your “Blit shader” equivalent.
In Unity you call Graphics.Blit(src, dst, mat).
In Unreal you do the same with:
- UKismetRenderingLibrary::DrawMaterialToRenderTarget in C++,
- or the Blueprint node Draw Material to Render Target (Kismet Rendering Library).
Typical Blueprint flow:
- Have references to RT_A, RT_B, RT_Final (Render Targets) and a Material asset M_Blit01.
- Create a Dynamic Material Instance of M_Blit01: Create Dynamic Material Instance.
- Set its texture parameter: Set Texture Parameter Value → Parameter Name = SourceRT, Value = RT_A.
- Call Draw Material to Render Target with Material = your dynamic instance and Texture Render Target = RT_B.
Now RT_B is the output of a full-screen pass that sampled RT_A. Exactly like Blit(A, B, mat).
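The same flow in C++ is only a few calls. A hedged sketch (AMyActor / BlitAtoB are placeholder names; the engine calls are standard UKismetRenderingLibrary and UMaterialInstanceDynamic APIs, but verify signatures against your engine version):

```cpp
// Sketch only -- must live in an Unreal module, and assumes the material
// has a texture parameter named "SourceRT".
#include "Kismet/KismetRenderingLibrary.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Engine/TextureRenderTarget2D.h"

void AMyActor::BlitAtoB(UMaterialInterface* BlitMaterial,
                        UTextureRenderTarget2D* RT_A,
                        UTextureRenderTarget2D* RT_B)
{
    // Equivalent of Create Dynamic Material Instance
    UMaterialInstanceDynamic* MID = UMaterialInstanceDynamic::Create(BlitMaterial, this);

    // Equivalent of Set Texture Parameter Value
    MID->SetTextureParameterValue(TEXT("SourceRT"), RT_A);

    // Equivalent of Graphics.Blit(RT_A, RT_B, mat)
    UKismetRenderingLibrary::DrawMaterialToRenderTarget(this, RT_B, MID);
}
```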
You can chain this as needed:
- Pass 1: RT_Src → (M_Pass1) → RT_A
- Pass 2: RT_A → (M_Pass2) → RT_B
- Pass 3: RT_B → (M_Pass3) → RT_Final
Use two render targets and ping-pong if you need feedback (RT0 → RT1 → RT0 → …), just like Unity.
To get the “scene color” or a custom view into a RT:
- Add a SceneCapture2D actor.
- Set its Texture Target to RT_Scene.
- Configure capture properties (only depth, only color, etc).
Then your first GPU pass can treat RT_Scene as input to a chain of DrawMaterialToRenderTarget calls.
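Wiring the capture component in C++ is a few property assignments. A hedged sketch (CaptureComp and RT_Scene are placeholder names; the member names are real USceneCaptureComponent2D properties, but double-check against your engine version):

```cpp
// Sketch only -- assumes an actor that owns a USceneCaptureComponent2D.
CaptureComp->TextureTarget = RT_Scene;                               // write into the RT
CaptureComp->CaptureSource = ESceneCaptureSource::SCS_FinalColorLDR; // color only
CaptureComp->bCaptureEveryFrame = true;                              // re-capture per frame
```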
Alternatively, for screen-space post-effects applied directly to the camera output, use Post Process Materials (Material Domain: Post Process) and apply them via a PostProcessVolume or Camera’s Post Process Materials. That’s more “integrated” but slightly less like your explicit chain.
Anywhere you’d use a Texture2D, you can plug in a Render Target:
- In a Mesh Material: TextureObject / TextureSample → set it to your RT in Blueprint via a Dynamic Material Instance.
- In UMG widgets: use Brush From Texture with your RT to draw it in UI.
This mirrors how you’d bind RenderTextures to materials in Unity.
This “render target + draw material” pipeline works well for:
- Fullscreen 2D image processing (feedback loops, blur, stylization).
- GPGPU-ish things that can be expressed as pure pixel shader ops on 2D grids.
- Simple simulation buffers (heightmaps, flow maps, etc.).
If your Unity project mainly did Blit chains with simple compute, this already gets you 80–90% there and you can stay mostly in Blueprint + Materials.
If you need arbitrary compute kernels, structured buffers, non-image data, or tight control over dispatch sizes, you’ll want Unreal’s global shaders + RDG.
Unreal’s equivalent to Unity .compute shaders and low-level DispatchCompute is:
- Global shaders (HLSL in .usf files), wrapped in C++ with FGlobalShader,
- running in RDG (Render Dependency Graph), using FRDGBuilder, FRDGTextureRef / FRDGBufferRef, etc.
In UE 5.5, RDG and async compute got more love, but the pattern is the same.
- HLSL file – your compute kernel (.usf).
- C++ FGlobalShader wrapper – defines parameters and boilerplate.
- Register an RDG pass in an appropriate rendering hook.
- Bind textures/buffers as RDG resources and dispatch the compute shader.
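Step 2, the FGlobalShader wrapper, usually looks like this. A hedged sketch with assumed names (FMyComputeShader, the virtual shader path) that you'd adapt; the macros are the engine's standard shader-declaration macros, but check the exact forms in your engine version:

```cpp
// MyComputeShader.h -- sketch; the parameter struct mirrors the .usf kernel.
class FMyComputeShader : public FGlobalShader
{
public:
    DECLARE_GLOBAL_SHADER(FMyComputeShader);
    SHADER_USE_PARAMETER_STRUCT(FMyComputeShader, FGlobalShader);

    BEGIN_SHADER_PARAMETER_STRUCT(FParameters, )
        SHADER_PARAMETER_RDG_TEXTURE(Texture2D, InputTexture)
        SHADER_PARAMETER_RDG_TEXTURE_UAV(RWTexture2D<float4>, OutputTexture)
        SHADER_PARAMETER_SAMPLER(SamplerState, LinearClampSampler)
        SHADER_PARAMETER(FVector2f, Resolution)
    END_SHADER_PARAMETER_STRUCT()
};

// MyComputeShader.cpp -- binds the class to the .usf entry point.
IMPLEMENT_GLOBAL_SHADER(FMyComputeShader,
    "/Plugin/MyShaders/MyComputeShader.usf", "MainCS", SF_Compute);
```

The virtual path ("/Plugin/MyShaders/...") must be mapped with AddShaderSourceDirectoryMapping in your module's startup before any shader compiles.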
Conceptually:
```cpp
FRDGBuilder& GraphBuilder = ...;

// Convert game thread resources to RDG:
FRDGTextureRef Input  = GraphBuilder.RegisterExternalTexture(...);
FRDGTextureRef Output = GraphBuilder.CreateTexture(...);

// Parameters (mirrors your HLSL struct):
FMyComputeShader::FParameters* PassParameters =
    GraphBuilder.AllocParameters<FMyComputeShader::FParameters>();
PassParameters->InputTexture  = Input;
PassParameters->OutputTexture = GraphBuilder.CreateUAV(Output); // UAV for RWTexture2D

GraphBuilder.AddPass(
    RDG_EVENT_NAME("MyComputePass"),
    PassParameters,
    ERDGPassFlags::Compute,
    [PassParameters](FRHICommandList& RHICmdList)
    {
        TShaderMapRef<FMyComputeShader> ComputeShader(GetGlobalShaderMap(GMaxRHIFeatureLevel));
        FComputeShaderUtils::Dispatch(RHICmdList, ComputeShader, *PassParameters,
            FIntVector(GroupCountX, GroupCountY, 1));
    });
```
The HLSL looks very familiar to Unity compute:
```hlsl
RWTexture2D<float4> OutputTexture;
Texture2D<float4>   InputTexture;
SamplerState        LinearClampSampler;
float2              Resolution;

[numthreads(8, 8, 1)]
void MainCS(uint3 DTid : SV_DispatchThreadID)
{
    float2 uv  = (DTid.xy + 0.5) / Resolution; // +0.5 to sample texel centers
    float4 col = InputTexture.SampleLevel(LinearClampSampler, uv, 0);
    OutputTexture[DTid.xy] = col;
}
```
You’ll find many tutorials that walk exactly this pattern for UE5 (global compute shader + RDG pass).
- Unity .compute shader → Unreal Global Shader (.usf).
- Unity Graphics.SetRenderTarget / CommandBuffer → Unreal RDG builder (FRDGBuilder) + RDG passes.
- Unity RenderTexture used as RWTexture2D → Unreal FRDGTextureRef derived from UTextureRenderTarget2D or internal textures.
You do lose the “just call it from anywhere” simplicity: in Unreal you generally plug your passes into the rendering pipeline (e.g. via a Scene View Extension, or a custom subsystem that has access to the viewport). But functionally it’s the same GPU pipeline you’re used to.
Use the RDG/global shader route if you:
- Need non-screen-space grids (3D volumes, structured buffers, arbitrary data layouts).
- Need fine control over barriers, async compute, or tight perf.
- Want to integrate directly into the main frame’s render graph rather than side-band Blueprint calls.
If your goal is “I need my fancy Unity compute-based simulation”, RDG/global shaders is the “real” equivalent.
These might or might not fit your use case, but worth knowing as a Unity dev:
- Niagara + Grid2D/Grid3D + Simulation Stages – more high-level, great for particle-like or grid simulations that need artist tooling. Uses compute under the hood.
- PCG / Compute Framework plugins – specialized systems for procedural generation / compute, if you find docs referencing them.
Here’s a quick mapping between Unity and Unreal terms, plus how to Google for them effectively.
| Unity term / thing | Unreal equivalent / keyword | Good search query examples |
|---|---|---|
| RenderTexture | Texture Render Target 2D (UTextureRenderTarget2D) | unreal draw material to render target, unreal texture render target 2d tutorial |
| Graphics.Blit(src, dst, mat) | Draw Material to Render Target (Blueprint / C++) | unreal DrawMaterialToRenderTarget example, blueprint chain render targets |
| Camera target texture | SceneCapture2D + Texture Target | unreal scenecapture2d render target setup |
| Post-process material (Blit to screen) | Post Process Materials (Material Domain: Post Process) | unreal post process material tutorial, unreal postprocess material chain |
| ShaderLab surface shader | UE Material (graph) | unreal material editor basics |
| HLSL fragment shader (full-screen) | UE Unlit Surface Material output to Emissive, or PixelShader in global shader | unreal fullscreen pass rdg, unreal custom postprocess hlsl |
| CommandBuffer | RDG (Render Dependency Graph) passes (FRDGBuilder::AddPass) | unreal render dependency graph tutorial, FRDGBuilder example |
| Unity term / thing | Unreal equivalent / keyword | Good search query examples |
|---|---|---|
| .compute compute shader file | .usf global shader with compute entry point (FGlobalShader) | unreal custom compute shader usf global shader, ue5 fglobalshader compute example |
| ComputeShader + Dispatch | FGlobalShader + FComputeShaderUtils::Dispatch in an RDG pass | unreal FComputeShaderUtils Dispatch tutorial |
| RWTexture2D, StructuredBuffer | FRDGTextureRef, FRDGBufferRef, FStructuredBufferRHIRef | unreal rdg structured buffer, unreal compute shader structured buffer |
| Custom render pipeline / SRP | Engine’s built-in render pipeline extended via RDG & scene extensions | unreal scene view extension render pass, unreal extend rendering pipeline rdg |
| Unity | Unreal | Search queries |
|---|---|---|
| GameObject with components | Actor with Components | unreal actor component overview |
| MonoBehaviour script | Actor Component / Actor C++ class | unreal c++ actor tutorial |
| MaterialPropertyBlock / runtime material params | Dynamic Material Instance + SetScalarParameterValue / SetTextureParameterValue | unreal dynamic material instance set texture parameter |
| Need / concept | Unreal way / keyword | Search queries |
|---|---|---|
| “Draw many quads to a RT” | BeginDrawCanvasToRenderTarget / EndDrawCanvasToRenderTarget | unreal canvas to render target draw |
| GPGPU via particles or grids | Niagara Grid2D/3D Simulation Stages | unreal niagara grid 2d simulation stage compute |
| Multi-pass frame exports / AOVs | Movie Render Graph (for offline / high-quality renders) | unreal movie render graph layers, ue5.4 movie render graph |
- “Unity Graphics.Blit chain to Unreal” → search: unreal chain render targets DrawMaterialToRenderTarget, unreal ping pong render targets feedback.
- “Unity compute shader → Unreal” → search: unreal custom compute shader usf, ue5 rdg compute shader tutorial, Fast and Clean custom compute shader ue5 (great blog).
- “Full screen post-effect like my Unity image effect” → search: unreal post process material tutorial, unreal custom post process hlsl.
If you tell me what your Unity chain was actually doing (e.g. fluid sim, reaction-diffusion, GPGPU particles), I can sketch a direct UE 5.5 layout: which Render Targets, which Blueprint nodes, and where you might want to drop into RDG/global shaders.