Rendering out movies in Unreal Engine is somewhat counterintuitive. We can render movies using Render to Video in Sequencer, which gives us the rudimentary options, but with it we miss out on the ability to queue longer renders and we can't use advanced anti-aliasing. Enter the Movie Render Queue plugin.

You need to enable the Movie Render Queue plugin under Plugins in order to access the settings. If you need additional render passes, make sure you also enable the Movie Render Queue Additional Render Passes plugin.

Enable the Movie Render Queue plugin and restart the editor first.

It is also now possible to render out ProRes with the Movie Render Queue by enabling the Apple ProRes Media plugin. If you intend to use ProRes 422 HQ, for example, enable this plugin.

With the plugins installed, you can then access the Movie Render Queue from the Window > Cinematics menu.

Opening the window gives you a clean slate that can be a bit confusing at first.

Clicking the green + Render button lets us add level sequences to the queue. Each level sequence then has its own settings that we can customize, or we can choose an existing preset.

Clicking the Unsaved Config* preset will allow us to change the render settings.

Here I have opened the settings dialog of the default preset and switched to .EXR output instead of the default 8-bit .jpg. It is surprising that these .EXR files are 16-bit instead of the 32-bit files the standard Sequencer outputs; I wonder why that is. We can also change the output to ProRes or change the render method, so we can render out path-traced animations, for example.

Some of the settings in the Movie Render Queue plugin. Having access to the anti-aliasing settings is especially useful here.

What's great about this is that we have access to anti-aliasing options here. What I have found works decently well is a Spatial Sample Count of 8 or 16 combined with some Temporal Sample Count. Spatial samples increase the quality of each frame, and temporal samples help motion blur, reducing glittering pixels especially on shiny surfaces. I am still experimenting with what works best for quality while keeping the render time decent.
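The two sample counts multiply, which is why render times climb quickly. Here is a rough sketch of the arithmetic; the function name is mine, just for illustration:

```python
def samples_per_frame(spatial: int, temporal: int) -> int:
    """Movie Render Queue renders each output frame multiple times:
    spatial samples are accumulated per sub-frame, and temporal samples
    spread sub-frames across the shutter interval for motion blur."""
    return spatial * temporal

# Spatial Sample Count 8 with Temporal Sample Count 2 means every
# output frame is rendered 16 times, so render time scales roughly
# by that factor.
print(samples_per_frame(8, 2))  # 16
```

This is also why I lean toward moderate values on both sliders rather than maxing out either one.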

Lastly, we can save the presets. If we want each sequence output to point to a different folder, for example, we can also do that here.

Definitely use the Movie Render Queue if you can; it can save a lot of time and give you higher-quality animations.

See the official documentation of the feature over here.

What Apple just announced is definitely fascinating. There is a ton to unpack here, but let me put my thoughts together from the point of view of an ex-Adobe demo artist who had to run Unreal on a laptop in front of corporate clients.

Running heavy 3D software on a notebook (such as Unreal or Substance Painter with something actually in the viewport) has so far meant immediately finding an outlet and plugging in kilograms worth of power brick. That, or letting the performance die. I can't stress enough how huge the difference is between plugged in and unplugged. To put it simply: without a plug, no 3D, period.

I am quite happy with my 17-inch Zephyrus with 32 GB of memory and a 2080. It works well as long as it is plugged in, and if I set the performance profile to Quiet it is also decently quiet, not bad at all. It can even run games in silent mode well enough. But compiling shaders in Unreal Engine is painfully slow compared to my AMD tower. If I were traveling and needed to respond to client work, this would be nearly undoable, as an average shader compile for a new project can take three times as long as it does on my desktop, with the computer unusable during that time.

The same goes for my 2019 MacBook Pro, by the way, or make that four times the wait. If a Mac could do the same job on battery power, that would be fantastic.

The unified memory is fascinating as well. If Substance Painter were optimized to take advantage of the shared GPU memory, this could potentially challenge an RTX 3090 machine. I don't know how feasible that is, but I am a dreamer.

However, at this point it is hard to say how useful the new Mac would be for someone running Substance or Unreal. Key Unreal features such as hardware ray tracing and Nanite depend on GPU capabilities that Apple hardware does not currently offer, and I doubt there will be a workaround anytime soon.

No matter how attractive the Apple system is, this would be the deal breaker for me: so many workflows are tied to CUDA or NVIDIA ray tracing. Add to this the software still missing on the Mac, and features like the delighting tool in Substance Sampler.

Cinema 4D and, recently, Redshift are fascinating though; both reportedly run well on M1. If Unreal ran natively on M1 with ray tracing and Nanite, that would be very interesting.

UE5 Early Access 2 is fantastic to have as a testing ground. According to Epic it is not supposed to be used as a production tool, and things are likely to change somewhat before the final release.

I wanted to list the most annoying problems I have had so far with UE5 EA2, and I plan to update this with fixes as they come. This is written as of October 17th, 2021.

  1. The High Resolution Screenshot tool crashes... a lot. At this time it is nearly impossible to capture anything higher than 1X this way.
  2. Subsurface doesn't work. (Subsurface Profile works, however.)
  3. Lumen reflections are often weird and soft (as this is still being developed).
  4. Lumen interior lighting is weird and kind of broken.
  5. No tessellation. The only way to tessellate terrain is to use Virtual Heightfields. This can be a deal breaker.
  6. No Cascade, the legacy particle system.
  7. Splotches on meshes when using ray tracing. Use this to fix it: r.RayTracing.NormalBias 32
  8. Wrong textures on Nanite meshes. The fix: set the Skylight to Movable.
  9. VR Preview crashes the editor when playing.
  10. Some 4.27 assets will not work, and Niagara systems from 4.26 will crash the editor.

All this said, this is still early access, and we will likely see most of these addressed in the coming months. I also wish the versioning of the Early Access builds were more obvious in the launcher; it is not obvious at all whether you are using EA1 or EA2 and so on.

One of the less intuitive things in Unreal's World Outliner is the way you hide lights and other objects. Disabling them by clicking the eye icon in the World Outliner turns them off in the viewport, but reloading the scene re-enables them, causing confusion. The same goes for everything else.

A better way: for lights, toggle Affects World to disable/enable them without deleting the actor or component.

For meshes, toggle Visible (hides it completely) and/or Hidden in Game (hides it in the game view but not in the editor view).

This is definitely a head-scratcher. I think the better approach is to not use the eye icon at all.

Seeing splotches on ray-traced objects in Unreal Engine with an HDRI environment? Try this console command:

r.RayTracing.NormalBias 32
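If the command fixes your project, one way to make it stick between sessions is the engine's ConsoleVariables.ini. The path and section below assume a default installation; treat this as a sketch rather than gospel:

```ini
; Engine/Config/ConsoleVariables.ini
[Startup]
r.RayTracing.NormalBias=32
```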

Go to Edit > Plugins and disable SteamVR.

Recently I have had issues with 4.27 loading a project. The engine would crash on start without even giving me an error. The culprit seemed to be SteamVR; disabling it removed the problem.

To stop SteamVR from loading automatically, all you need to do is disable it under Edit > Plugins. Uncheck "Enabled", restart, and it should no longer load.

Some users have also been able to stop the plugin from loading by editing this file in the Unreal installation:

\Engine\Plugins\Runtime\Steam\SteamVR\SteamVR.uplugin

and making sure this is set to false:

“EnabledByDefault” : false,
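For reference, that key lives at the top level of the .uplugin descriptor, next to fields like the ones below. This is a trimmed sketch from memory, not the full file; the surrounding fields and their values may differ in your engine version:

```json
{
    "FileVersion": 3,
    "FriendlyName": "SteamVR",
    "EnabledByDefault": false
}
```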

I think I am among many who feel SteamVR should not be enabled by default, though that would mean HMDs don't work out of the box. However, as of 4.27 the editor won't even load properly with SteamVR enabled, so there is no choice for me here.

MetaHuman Creator is a breakthrough in creating realistic human characters.
MetaHuman created with a face rig, ready to be animated.

MetaHuman Creator was easily one of the biggest things to happen in 2021 so far. It is hard to describe in words how incredibly fast it is to create completely believable humans, and this includes very advanced facial and body rigs. To call it a game changer feels like an understatement. This is the result of decades' worth of work by companies like 3Lateral and Cubic Motion that are now under the wing of Epic Games.

The way the process goes as of today is that you sign up for the MetaHuman Creator Early Access and then get access to the online site where the design and processing of the characters happen. This means we don't need a super-spec PC, as most of the magic happens in the cloud.

Creating a MetaHuman is very intuitive and easy; it works by blending ready-made templates. It seems limiting at first, but there is real depth and freedom to it, especially when diving into the deeper levels. There are several hairstyles, beards, and so on to choose from. And the template animation, which shows the character posing in front of a mirror, is so real it is kind of creepy.

It is amazing how MetaHumans are able to avoid the pitfall of the uncanny valley. Character Creator 3 is great software, but even it has issues, especially with the out-of-the-box expressions looking cheap. You need deep knowledge of facial animation and rigging to get good results with it. MetaHuman, on the other hand, delivers very believable and natural results instantly.

When the character is finished in MetaHuman Creator, it is as easy as launching Bridge (formerly Quixel Bridge) and downloading the MetaHuman from there. All of your MetaHumans appear there as downloadable files.

We need to set up the Bridge plugin from the export settings. It is also possible to get MetaHumans working in Maya, and we can export texture maps as large as 8K.

MetaHuman with custom clothes I rigged in Maya.

Now, what goes on under the hood is really remarkable stuff. We get believable shaders, absolutely fantastic groom-based hair that reacts to physics, and a full facial and body rig. The facial rig is something you would expect in a multi-million-dollar Hollywood production, and it is also pretty easy to use.

Furthermore, what's amazing is that we can do high-quality facial animation of MetaHuman characters with iOS Live Link, which takes advantage of the iPhone's ARKit. We can drive facial animation with the Live Link Face app, available here. It is quite easy to set up, and it is not difficult to imagine the possibilities.

Rigging custom clothes from Marvelous Designer and ZBrush is relatively straightforward, and you can use the template body that comes with a MetaHuman to transfer skin weights, making weighting of the rig faster. The bodies come in several templates and are named logically with a consistent naming convention.

Now, there are limitations, of course, one being the extremely small selection of garments (the existing garments do look good, though). I also feel the shoulder rig could be better; when it comes to body rigs, CC3 feels more stable. In its current form, ray tracing does not work well with the eyes under a simple HDRI, for example, producing a horrible black-eye effect that looks like something from the Ring franchise. It is easy to work around this somewhat by lighting the eyes separately or doing a second render pass just for the eyes, but needless to say that is cumbersome. I suspect these issues will be fixed in the near future, so they are not deal breakers.

If you are into making characters, I definitely recommend checking out MetaHuman Creator. It is free and fun. I will be creating an in-depth tutorial on the topic soon.

Here is a scene I created in Unreal Engine 4.26 using a MetaHuman.
https://www.artstation.com/artwork/XnL0Jw

If you are new to the Material Editor in Unreal, check out this small tutorial on using material functions. It is easy and fun.

Material functions in Unreal Engine sound scary, and I remember avoiding them as they sounded like something a programmer would use. None of my concern... I thought. Well, I was wrong. They are super easy to create and use.

The concept of a material function is very simple. They are like containers or building blocks that have outputs, and sometimes inputs as well.

Creating material functions in Unreal Engine is easy

You can create a Material Function by right-clicking an empty area in the Content Browser and diving into Materials & Textures. Choose Material Function.

What you will see is a lonely Output Result node. Let's build something.

Empty material functions in Unreal Engine

Let's say we want to create a reusable roughness-adjusting function. This is a very simple example, but you can see how the technique could be used to make very advanced adjustable materials.

Let's start by adding a scalar input. Right-click on the canvas, type "input", and choose Function Input. By default it is a Vector3, but we only need a single value, so choose Function Input Scalar from the Input Type dropdown on the left. If you want to create a material function dealing with RGB color, choose the 3-way vector instead.

Let's call this RoughnessInput. You can name it whatever you like in the Input Name field.

Next, press M and left-click on an empty area of the canvas; this creates a Multiply node. Connect the newly created RoughnessInput to the A input of the Multiply. Right-click on B and select "Promote to parameter".

Let’s name this parameter RoughnessStrength and give it a default value of 0.5. We are almost done.

Just drag a line from Multiply to the Output Result node. Our Material Function is now complete.

A completed material function

It accepts roughness (well, actually any scalar value) as input, and we have created a parameter that multiplies it. Any parameters or switches we create along the way can be adjusted in the resulting child materials. Let's test it.
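Before testing it in a material, it may help to see what the graph actually computes. In plain code the whole function is just one multiply; this Python sketch mirrors our node names and is purely for intuition, not any Unreal API:

```python
def roughness_function(roughness_input: float,
                       roughness_strength: float = 0.5) -> float:
    """Equivalent of the material function graph: a single Multiply node
    whose B input was promoted to the RoughnessStrength parameter."""
    return roughness_input * roughness_strength

# Feeding in a 0.5 roughness with the default strength gives 0.25
# at the Output Result node.
print(roughness_function(0.5))  # 0.25
```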

Here I have a lovely pink material. It's very simple, just a color and a 0.5 roughness value. Let's drag and drop the function we created and connect it as follows.

Now we have this. In order to see and adjust the parameter, let's create a material instance of this material by right-clicking the material in the Content Browser.

Now we have a new Global Scalar Parameter in the material instance that controls the roughness strength.

Of course, this is more handy when dealing with more complex materials.

I have created a special FoliageSet function that contains several aspects of an entire foliage material, including a wind effect. It is fed into a master material, from which all foliage material instances in a level are created. This means I had to do the work only once, and now several hundred different foliage assets can get their values from this one function.

This is what the FoliageSet function contains:

It's messy and I should have tidied it up a bit more, but it is still way better than having all of this in every single material. I bet it is faster too. So let's take a closer look.

For example, T_Basecolor is a texture parameter, in other words an empty slot that shows up in the final material instances. So what we are making here is a template for the assets. Parameters we define in material functions will show up in the material instances derived from the master materials.

Here is another example.

This is a World Position Offset-style wind effect that is cheap to calculate. You can see that it refers to the F_TreeWind material function, sets several parameters and a switch parameter, and finally feeds into an output called OutputWPO. So material functions can have a hierarchy and can be built from blocks like this. This really helps with organization and encourages the right mindset of creating materials modularly.

Please also check out the tip on adjusting normal map strength I posted earlier.

Often we need the ability to adjust normal map strength in Unreal after importing the maps. A normal map is a special map that stores vectors as color values, so it cannot simply be multiplied by itself to increase strength; that would cause issues.

We need to leave the blue channel unchanged while changing the intensity of R and G. Here is how we can do this and expose a parameter for ease of use.

Now, I know there is a normal flatten node, but in my experience it doesn't work as well, as it tends to also change the blue channel, so I do this instead.

Here are the nodes to adjust normal map strength in Unreal.

So drag lines from the R and G outputs of the normal Texture Sample. If you type "mul" you can find Multiply. You can also press M while left-clicking to create Multiply nodes quickly. Make sure the line from R is connected to the A input of the Multiply node. Then right-click the B input and choose "Promote to parameter".

You have now made a parameter. Name it NormalStrength, for example, and give it a default value of 1.0.

Then do the same for the green (G) channel. Next, right-click on an empty area and create a MakeFloat3 node; this will combine the channels back into an RGB value. Connect the two Multiply outputs to its first two inputs and the original B channel straight to the third, then connect the MakeFloat3 to the Normal input of the material.

Finally, save the material.
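To make the node steps concrete, here is the same operation as plain code. It assumes the sampled normal is already in the [-1, 1] range, which is how Unreal hands normal map samples to the graph; the function is an illustration, not an Unreal API:

```python
def adjust_normal_strength(normal, strength=1.0):
    """Scale only the X (R) and Y (G) components of a tangent-space
    normal; the Z (B) component passes through unchanged, which is
    what the two Multiply nodes plus MakeFloat3 accomplish."""
    x, y, z = normal
    return (x * strength, y * strength, z)

# strength > 1 exaggerates surface detail, strength = 0 flattens it.
print(adjust_normal_strength((0.2, -0.1, 0.97), strength=2.0))
```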

Congratulations, you have now created a material with a parameter in it. Yay!

Right-click the newly created material and select "Create Material Instance". This will be a child material of the master. You can now see the following Global Scalar Parameter in it.

Here is how to adjust normal map strength in Unreal using a material instance.
The newly created Material Instance now has a Global Scalar Parameter Values section with the new parameter to adjust normal strength.

In this way you can adjust the strength of the normal map anytime, or even create several variations of it.

In the next tutorial I will show how to turn this into a material function, which helps clean up the graph and can be reused easily.

Here is useful documentation on creating normal maps.

Don't forget to uncheck sRGB when using RGB channels as masks

I occasionally come across materials that are incorrectly set up in Unreal Engine. This often manifests as roughness values seeming off and materials appearing too glossy.

When using PBR maps, it is necessary to put them in linear space by unchecking the sRGB texture setting. This is especially important when using RGB masks, such as in the video.
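The reason an sRGB-flagged mask looks wrong is that the engine decodes the stored values to linear before use. The standard sRGB-to-linear transfer function shows how far a mid-gray mask value drifts:

```python
def srgb_to_linear(c: float) -> float:
    """Standard sRGB electro-optical transfer function (IEC 61966-2-1)."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# A 0.5 roughness painted into a mask decodes to roughly 0.21 if the
# texture is left as sRGB, which is why materials end up too glossy.
print(round(srgb_to_linear(0.5), 3))  # 0.214
```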

I am showing up a little late to this party, but Nvidia has finally released an official version of the Nvidia RTX GI plugin on the Unreal Engine Marketplace. Well, a link to an external site rather, which leads to Nvidia's site right here. You will need an NVIDIA account to grab it, but if you are using an NVIDIA card, chances are you already have one.

Unreal Engine 4.27 RTX GI plugin in the Unreal Engine Marketplace.
Clicking the External Link will take you to the page where you can grab the plugin.

Now, the premise of this is fascinating to say the least. Unlike Lumen in UE5, this is purely hardware-based, so it's fast. It requires capable hardware, as expected, but the results should be similar to, or even crisper than, what we see with UE5. Finally we get realistic real-time GI: colorful surfaces bleeding light onto others, emissive light sources actually being emissive, and so on. This gives us that sweet baked-light look, but without the baking.

Installation

Let's take a look at how to get up and running with this. Head to Nvidia's site here and grab the plugin. You need to log in and click "Agree to the Terms of the NVIDIA RTX SDKs License Agreement", and then you should be able to download RTXGI_UE4_Binary_Plugin_v1.140.zip.

Here you can download the Nvidia RTX GI plugin.
The Nvidia RTXGI plugin can be downloaded from NVIDIA.

This requires a DXR-capable GPU, Windows 10, and Unreal Engine 4.27. Please make sure you have those.

Make sure you have closed Unreal Engine before proceeding. Then unzip the archive and copy the complete RTXGI folder to Epic Games\UE_4.27\Engine\Plugins\Runtime\Nvidia.

Now when you open Unreal Engine, you should get a popup at the bottom right about new plugins. Click "Manage Plugins" and make sure the plugin is marked as Enabled. If it is, the plugin is installed correctly. Before the fun begins, also make sure the default RHI is set to DirectX 12 and that ray tracing is enabled. If this is a new project, a restart and a sometimes lengthy shader compile are necessary.
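If you prefer editing config files over the Project Settings UI, these are the settings that, to the best of my knowledge, correspond to the DirectX 12 RHI and ray tracing toggles in 4.27. Verify against your own project's Config/DefaultEngine.ini before relying on them:

```ini
[/Script/WindowsTargetSettings]
DefaultGraphicsRHI=DefaultGraphicsRHI_DX12

[/Script/Engine.RendererSettings]
r.SkinCache.CompileShaders=True
r.RayTracing=True
```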

Using the Nvidia RTX GI Plugin

Next we need to fire up the console and throw the following commands at it:

r.GlobalIllumination.ExperimentalPlugin 1 to enable global illumination plugins (this can also be set in .ini files or in blueprints).

r.RTXGI.DDGI 1 to enable RTXGI (set in .ini files, on the console, or in blueprints).

It is good to enable the log view from the Window > Developer menu; this way we can confirm whether GI is really enabled.

Now the engine should be ready for some fantastic RTX GI action. Yay!

Placing a DDGIVolume in the scene works a bit like a PostProcessVolume. It contains a grid of probe points that RTXGI updates using ray tracing, along with related settings that can be adjusted.

Here is what I was able to come up with after five minutes of playing with it.

Nvidia RTX GI Plugin in action.

Neat, I say! I placed a spotlight in the middle, shining at the junction of the purple and white walls. You can see that the bounced light now casts colored light onto the floor on the right.

At the furthest point you can see an emissive cube casting light, and shadows behaving in a believable way. Also, there are no seams. Really, seriously: no seams between modular walls. There used to be a lot of tweaking required when baking modular walls to get rid of seams, even when the modules were correctly snapped to the grid. I welcome this.

If you used a template with a static directional light, sun, or skylight and want to change the scene to a fully dynamic setup, it may be necessary to build lighting once to be able to see the new settings in the scene.

It is worth mentioning that RTXGI lighting currently does not work with UE4's other ray-traced effects (for example, ray-traced reflections). So the Lumen implementation may have an edge here, although from what I have seen, its reflections are not quite there yet either.

We can adjust the probe counts as well as many other useful settings in the DDGIVolume. I am planning to return to this topic.

Nvidia RTX GI Plugin
In this example I increased the probe count to 13, 13, 13, which helped remove some of the splotching in the corners of the room.

With probe counts, it is better to keep the count reasonable; too high will cause an instant crash, while too low can lead to splotches or artifacts in the corners.
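Probe counts grow cubically, which is why it is so easy to go overboard. A quick illustration of the arithmetic:

```python
def total_probes(x: int, y: int, z: int) -> int:
    """A DDGIVolume places a 3D grid of probes, so the total cost grows
    with the product of the per-axis counts."""
    return x * y * z

print(total_probes(13, 13, 13))  # 2197 probes
# Doubling every axis multiplies the probe count by 8:
print(total_probes(26, 26, 26))  # 17576 probes
```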

I will make a more in-depth tutorial on the DDGIVolume settings in the near future. At this point I think very few people know what each of them means. But let's keep digging!

The video tutorial is now up on YouTube.