Comparing the Cinematic Workflows of UE and Unity

While trying to come up with content for my blog, I figured I’d recreate in UE a cinematic I had already made in Unity. When the starting point is just making a cinematic, with no game to consider, UE would nowadays be the more likely choice for the job thanks to its superiority in photorealistic lighting. However, I’ve worked on cinematics with both engines, and I wanted to get into the weeds of their strengths and weaknesses regarding the basic tools that making a cinematic revolves around.

For some background, I used to do 3D modeling, textures and some concept art for games, and in recent years, I’ve continued that with a game project I’ve been slowly trudging along with, alongside freelance work making game trailers.

Below is a split-screen render of the two versions of the cinematic I made in the engines.

Split screen with both renders combined; here are the Unity and UE versions separately. Music by Olli Oja.

A simple cinematic in a dimly lit, foggy setup like this can be made to look pretty much the same. However, digging deeper topic by topic reveals more of the limitations and possibilities of the engines.

Versions I used:

  • Unreal Engine 5.5
  • Unity 6.0 with HDRP

The topics covered are determined by their relevance to cinematics, and to some extent, by what I found interesting to explore. No doubt there are features I don’t get into that are also important for cinematics, like terrains, UE’s MetaHuman tools and assets, and particle systems, to name a few.

  1. Structure of the sequence
  2. Cameras
  3. Animating characters
  4. Lighting
  5. Testing reflections
  6. Post processing effects
  7. Playing particle effects from timeline
  8. Rendering
  9. Working with assets
  10. Conclusion

1. Structure of the sequence

To start with the basics, the shots and animation tracks can be arranged in a similar way in UE’s “Sequencer” tool and Unity’s “Timeline” tool. Animation tracks can be placed either on the main timeline where you have your shots, or inside the nested timelines of the shots. Looking closer, however, there are some differences in details like track types and how you access the keyframes.

Track types

List of track types:

UE                      | Unity
------------------------|-----------------------------
Shots                   | Control Track
Folder                  | Track Group
Object Binding Track    | Activation Track
                        | Animation Track
Event Track             | Signal Track
Audio Track             | Audio Track
                        | Playable Track
                        | Visual Effect Control Track
Camera Cut Track        |
Subsequence Track       |
Time Dilation Track     |
Fade Track              |
Level Visibility Track  |
Data Layer Track        |
Media Track             |
The track types available out of the box, with corresponding ones on the same row. The list can be expanded with plugins. On the UE side, the list doesn’t include “sub-tracks” that would be added inside the Object Binding Track.

Clarifications:

Object Binding Track (UE)

Can be “Spawnable” (the object exists only when needed in the sequence) or “Possessable” (the object also exists in the level outside the sequence). Contains both the activation and animation keyframes of that object.
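As a side note, these bindings can also be created through the editor’s Python scripting if you ever need to set them up in bulk. Below is a minimal sketch, assuming an existing Level Sequence asset at a made-up path; the extension functions are from UE’s sequencer scripting API and may differ slightly between versions.

```python
import unreal

# Load an existing Level Sequence asset (hypothetical path).
seq = unreal.load_asset("/Game/Cinematics/Seq_Guard")

# Possessable binding: the actor already lives in the level,
# the sequence only takes control of it while playing.
actor_subsystem = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
selected = actor_subsystem.get_selected_level_actors()
if selected:
    possessable = seq.add_possessable(selected[0])

# Spawnable binding: the object is owned by the sequence and
# only exists while the sequence needs it.
spawnable = seq.add_spawnable_from_class(unreal.CineCameraActor)

# Give the spawnable a transform track with a section spanning
# the playback range, ready for keyframes.
track = spawnable.add_track(unreal.MovieScene3DTransformTrack)
section = track.add_section()
section.set_range(seq.get_playback_start(), seq.get_playback_end())
```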

Event Track (UE) / Signal Track (Unity)

Fires events from specified blueprint (UE) or script (Unity).

Control Track (Unity)

Nested timelines are placed in a Control Track, so it’s the equivalent of the “Shots” track in UE, which is added automatically when you create the sequence.

Camera Cut Track (UE)

An alternative way of handling camera cuts (instead of having the cameras in their nested clips). You can have cameras that exist in the sequence the whole time and cut between them. Instead of hard cuts, they can also be made smooth transitions between the cameras. The same thing can be done with Unity’s free Cinemachine plugin.

Fade Track (UE)

Fades the entire screen to any solid color.

Time Dilation Track (UE)

You can add keyframes for the playback speed of the cinematic.

Subsequence Track (UE)

A sequence can contain nested sequences, which allows several artists to work on the same sequence.

Level Visibility Track (UE)

You can control which level (scene) is visible at a given time, but all the levels you switch between have to stay loaded.

Data Layer Track (UE)

Switches the visibility of data layers.

Media Track (UE)

Used for showing or controlling the playback of images or videos from the sequence.

Playable Track (Unity)

You can create a custom “track type” that affects some variables and behaviors of your choosing, saving you some time compared to animating the normal way with an animation track.

Visual Effect Control Track (Unity)

Allows you to play an effect made using the VFX Graph through the timeline (so that it works also when previewing).

Differences:

  • UE’s Object Binding Track combines Unity’s Activation and Animation tracks, allowing you to adjust the time span of the object being spawned, and the keyframes of its properties, from the same place.
  • UE has many more track types, which I’m sure can be handy in some very specific situations.
  • Unity’s Playable track allows you to craft your own track type by coding, so that you could easily animate the things you want with it from the inspector. I didn’t try it myself though.

Nested timelines

Both engines can have separate timelines nested inside a master timeline. In UE, the UI pretty much handles adding some of them by default, but in Unity, you’d need to know to build that hierarchy yourself by adding a Control Track and dragging other Timeline objects containing the separate shots into it.

Dividing the tracks (especially the cameras) to separate timelines helps with keeping things tidy and saves some time.

When I was working on the original cinematic in Unity, I only worked with one big master timeline, and ended up having to fix some “flash frames” where an object was enabled or disabled, or a character teleported, slightly out of sync with the cut.


Just an example of a Shots track (UE) and a Control track (Unity), and a nested clip in both engines.

Both engines also allow you to group tracks in their folders:

Accessing the keyframes

In UE, you can simply expand the animation track of an object to reveal the animated properties and their keyframes.

In Unity, after recording some keyframes, you have to right-click the recorded track and select “Edit in Animation Window”. This opens a separate panel where you can adjust the keyframes by hand.

Accessing the keyframes to change their timing or properties.

2. Cameras

Unreal Engine seems to have more advanced controls for animating cameras out of the box. In Unity, to reach a similar range of options, the free Cinemachine package would be needed. With my simple cinematic, the tools that come with Unity were sufficient, but I’ve also explored the options enabled by Cinemachine in this section.

Moving the camera

In both engines, you can move around in the scene by holding the right mouse button and using WASD. In Unreal Engine, you can also lock the camera to the scene view to move it. In Unity, you’d position the scene view how you want it, select the camera, and choose Align With View from the GameObject menu (or use the shortcut Ctrl + Shift + F).

It’s a small difference, but there’s quite a lot of tweaking and animating camera angles involved with cinematics. I think not having to really think about it in UE lets you focus more on what you’re trying to do.

From the user experience standpoint, I think there’s also another detail about the viewport that in my opinion makes working in UE smoother: You can select objects directly from the camera view you use to preview your cinematic. In Unity, you’d have to switch between the Scene view you use to select objects, and the Game view you use to preview the cinematic. In many cases, it can be handy to have those be separate views on the screen, but talking about just cinematics, I think you usually want to use all your available resolution on one viewport.

Camera motion beyond the basics

If you import the free Cinemachine package for Unity, using its cameras on the Timeline sequence instead of regular ones enables these features (also available in UE without plugins, though without the damping controls):

  • Setting the camera to target a moving object with damping controls
  • Constraining the camera’s position or rotation to a moving transform, also with controls for damping
  • Adding position and rotation transitions between cameras by having a bit of overlap on their timing on the track.
  • Animating a camera along a “dolly track” (which just makes controlling the speed easier than by using regular position keyframes)
Adding cameras and transitions between them with the Cinemachine package in Unity.
Using Dolly Camera as it’s called in Cinemachine, or Camera Rig Rail in UE, as well as Look At, in both engines.

Depth of Field

In both engines, depth of field can be adjusted either from the camera itself or the post processing volume. Using the latter, there’s more freedom to add more blur than would be physically possible.

In UE, there’s a handy sampler tool for adjusting the focus distance from the camera, as shown below.

Adjusting focus distance in UE.

Judging by the test scene below, at first glance the blur looks great in both engines (in Unity, you just have to set the quality to High). However, looking closer, there are differences.

In the foreground blur of the Unity screenshot below, on the top right, you can see kind of a “halo” of blur around the closest pressure tank, while in Unreal Engine, the border between the blurred object on the foreground and the wall structures in focus behind it looks natural.

Also, it doesn’t really show in the screenshots very well, but the DOF in Unity has a consistent amount of blur in the near and far out-of-focus areas, no matter how far the objects are from the focus distance. In UE, there’s a natural gradation in the levels of blur depending on the distance.

3. Animating characters

Animating characters using existing animation clips works similarly in UE and Unity, allowing you to add compatible clips on the timeline easily. To add transitions, in both engines you simply drag the clips so that they overlap.

Adding clips and transitions in both engines.

Layering Animations

Both Unity and UE allow you to extend “base” animation with additional motion on the timeline, but the methods and what you can achieve with them are a bit different.

In short, Unity lets you define a part of your character to receive its movement from another animation clip, while UE allows you to add a separate animation layer where you can animate any bones you want with keyframes, without canceling out the motion the bones already had.

In the video below, I froze the arms of the walking guard in Unity by adding an Override Track with an Avatar Mask restricting it to only affect the arms. On the override track, I placed an animation clip with the carrying pose.

Adding an Override Track in Unity.

To wholly replace the motion of some bones in a similar way in UE, you’d have to bake the keyframes and modify the separate keyframe tracks of the bones (at least according to my testing and what I’ve been able to find).

On the other hand, UE lets you add additive motion, which Unity doesn’t currently seem to support in the Timeline (although additive layers can be used in Animator Controllers, which are used to animate characters in-game).

Here you can see how I made the walking guard turn its upper body and head while walking in UE, without canceling out the slight bobbing motion of the upper body.

The “Layered Control Rig” features that make this possible were introduced in UE 5.4. This video (by the product owner of Sequencer) explains those features in depth. Especially the Ground Alignment Layer described from 7:00 onwards (though it would require using an actual UE control rig) and the summary of the new features shown by examples at the end are worth checking out.

Baking keyframes (UE)

In UE, you can “bake” the track of animation clips on your timeline, to adjust the keyframes separately by hand.

For me, the resulting track is called FKControlRig (which is some default name from UE). It bakes the keyframes even for the weirder bones in the bird feet that are definitely not part of a standard humanoid rig.

I baked the whole sequence of animations on the main character to fix some clipping issues and jerky motions caused by some inaccurately placed transitions. The video below shows some of the process of one of those fixes.

However, baking the keyframes is a bit of a destructive workflow that makes it difficult to go back to adjust the sequence of animation clips. When doing the bake, the separate clips are stored on a hidden track for safekeeping, but if you want to change their order or timing later on, you’d have to abandon the changes you’ve done on the baked keyframes.

On the Unity side, baking out the keyframes from character animation clips on a single track isn’t possible without plugins. It might be possible to record all the character movement on a single track with Unity Recorder, but I didn’t try it out, as in most cases, just modifying the separate animation clips in something like Blender to achieve what you want would be much handier.

Root motion

Root motion means that the animation itself moves the character from A to B. This can help avoid the foot sliding that easily happens when trying to match the movement speed to the animation.

Both the Sequencer of Unreal Engine and Timeline of Unity have tools to adjust the offsets between root motion clips so that the next animation starts where the previous one ends.

Since I don’t normally animate with root motion, I put my character through the automatic rigging of Mixamo and downloaded some animation clips with root motion from its selection. With those, I made the same sequence in both engines.

Using root motion clips in UE

In UE, there’s an option that tries to match the position and rotation of any bone you choose to what it was at the end of the previous clip to achieve a seamless transition. In a walking or running loop for example, the foot on the ground during the transition would be a good choice. It doesn’t seem to work every time but there are also manual offsets you can add to make corrections.

Building a sequence of root motion animation clips in UE.
Using root motion clips in Unity

In Unity’s timeline you can move and rotate the next clip to start at the right place by hand (or use numeric offsets). Although it doesn’t calculate them automatically, it’s still pretty quick to do.

Unity is better at handling looping animations with root motion on the timeline. Assuming you have the correct settings for root motion in the animation clip, the next loop starts where the previous one ends without having to add offsets as in UE.

Building a sequence of root motion animation clips in Unity.

Animation retargeting

With Unity’s Humanoid Rig, you can pretty easily set up your rigged character asset so that it can use any humanoid animations in the project (which also have to be set up to use the Humanoid Rig), even if the naming of the bones is different between the rigs.

In Unreal Engine, instead of having animations be interchangeable through one “basic humanoid” rig defined by the engine, you can retarget specific animation clips to work with a different rig. As a downside, it results in a lot of files, but as an upside, the same retargeting can be done for non-humanoid rigs as well.

The process of retargeting seems to have many more steps (and pitfalls for a beginner like me) in UE. That’s something that seems to be pretty recurring across different features in the engine.

Retargeting animations in UE

From UE 5.4 onwards, there are two ways to do this: a newer, quicker one, and the older one based on IK Rig and IK Retargeter assets.

The new way didn’t work for me, probably because my character rig is too different from the ones UE recognizes. Using the old way, I got retargeting to work after some trial and error (caused by having exported the character at the wrong scale at first), with just one small problem left unsolved: the feet and hands (not part of the retargeted bone chains) somehow seem to get scaled slightly bigger by the animations.

Retargeting animations of the smaller character to be used by the bigger one in UE, using an IK Rig and IK Retargeter asset for each.
Retargeting animations in Unity

In Unity, it’s as easy as setting up both character rigs to be Humanoid rigs, so they can use all Humanoid animations in the project.

Control Rig (UE)

Unreal Engine offers the possibility of adding a control rig with easy handles for animating the character in-engine with IK.

UE 5.4 introduced something called “Modular Control Rig”, which in the optimal case allows you to add the control rig quickly by drag and drop. In my case, I had to solve some scale problems before I got it to work (and on some exports, the bones twisted in weird ways so it seems to be quite picky about the rig you use). Anyway, even if the setup work takes time, being able to animate a character in engine can definitely come in handy.

Below are timelapses of animating a backflip for both the UE mannequin and the seagull character. For the seagull, I used an old way of attaching it to the existing control rig of the UE mannequin, not the new Modular Control Rig (since I didn’t know about it at the time).

I couldn’t animate the bird’s toes since they’re extra bones added to the human skeleton. That would’ve probably required delving into the blueprint (or might’ve been easy to do with the Modular Control Rig).

Animating with a control rig in UE, using both the UE mannequin and a custom character model.

Things I like about the UE Control Rig:

  • The IK/FK switch in each limb is a checkbox, and when you switch it on or off in the middle of animating, the pose of the limb stays the same.
  • There’s an IK toggle for the spine, which allows you to animate the whole spine by rotating just the chest bone.
  • Similarly, the head also has an IK toggle, which allows you to move the head without stretching the neck or having to rotate it separately.
  • The IK hint objects of the limbs mostly stay where you need them to be (but their positions can be adjusted and animated at will).
Trying the same with Unity’s Animation Rigging package

The Animation Rigging package contains the basic constraints you’d use to set up a control rig in another program, like 2-bone IK and ones that allow you to copy transforms from another bone.

At first I didn’t write about it, since I thought the constraints only work in Play mode. However, after someone on Reddit reminded me of its existence, I looked into it more and read that they don’t require Play mode if you have the Animation panel open and Preview toggled on.

I set up a control rig to test it, adding IK for arms and legs with their control bones. However, when I moved the pelvis bone, it quickly snapped back to its original position in the next frame.

It might be something wrong with my setup, but I also couldn’t find any tutorials on YouTube about making a whole animation for a 3D character from scratch using the constraints, which would indicate that the package isn’t meant for making a full control rig. I think it’s mostly for adding small tweaks to existing animations, and for procedural animations controlled from code.

4. Lighting

Direct lighting is less technically complex, I guess, so the differences between the engines mainly have to do with the alternative ways of handling Global Illumination, which also takes into account indirect lighting (light bouncing off lit surfaces).

Both engines have these options:

  • Baking lightmaps (which helps with GPU performance with the cost of disk space and memory, and the time it takes to bake them)
  • Screen Space Global Illumination / SSGI (a post processing effect that doesn’t take into account the light bounced from objects out of view)

What sets Unreal Engine ahead is its Lumen lighting and reflection system, which can provide very realistic looking indirect lighting in real time without being overly resource intensive.

Unity also has an option to bake what the engine calls “realtime lightmaps” (using a middleware feature called Enlighten): tiny lightmaps that allow the engine to calculate indirect light in real time. This video by UGuruz is great at showing what it does. The geometry covered by the lightmaps has to be static. Apparently the feature was introduced to Unity already in 2015, and Unity has stated that the current Unity 6 is the last version that will support Enlighten. Having only learned about the feature recently, I really like the option to alter the lighting in real time with GI (which even Ray Tracing doesn’t seem to provide very well yet in Unity), so I’ll keep my fingers crossed that there’ll be a replacement in version 7.

I made a simple test scene (using models from this asset store pack) to test different lighting methods in both engines.

Unity:

  • SSGI = Screen Space Global Illumination
  • SSAO = Screen Space Ambient Occlusion
  • SSR = Screen Space Reflections

Note that with Path Tracing, rendering a single frame takes minutes, so it’s unsuitable for any real-time use in either engine.

Someone on Reddit pointed out that I didn’t take into account the effect of Shadow Filtering Quality, which would have softened the real-time shadows around the lamps. Here’s a comparison between Medium (which I had on in the screenshots above) and High:

This video by Sakura Rabbit shows a superb looking lighting setup using Adaptive Probe Volume, SSAO and SSGI (probably Ray Traced since the lighting gets updated with a bit of a delay).

UE:

I wasn’t able to bake lightmaps in UE, despite spending a few hours trying to problem solve it.

A video of NVIDIA’s “Zorah” demo using Lumen.

Testing realtime global illumination

I wanted to test how Unity’s “Enlighten” (“realtime lightmaps”) would compare to Lumen in a cavern-like environment with a lot of indirect light. The way Lumen illuminates the surfaces with bounced light looks more even, but I don’t think the difference is night and day. Lumen also has other upsides, though, like the fact that the light source can cast soft shadows, and that the geometry receiving the global illumination can be movable, since nothing is baked.

In the video below, I also included Unity’s Screen Space Global Illumination with Ray Tracing. As you can see, it takes quite a while to adjust to the lighting changes.

Light Types

UE                 | Unity
-------------------|------------------
Directional Light  | Directional Light
Point Light        | Point Light
Spot Light         | Spot Light
Rect Light         | Area Light
Sky Light          |

The absence of a Sky Light in Unity is the main difference. In Unity, ambient lighting is added from the Environment tab of the Lighting window, by adding a Volume Profile to its slot to act as the source of the skylight, and generating the lighting. The intensity can then be adjusted from a different place (the Volume component). For years I didn’t know this and stumbled in the dark trying to achieve the appearance of ambient lighting with just the settings in the Volume component and reflection probes.

Comparing a nighttime and daytime lighting setup

Real-time volumetric lighting in this kind of a night-time scenario without much indirect light looks pretty similar in both engines.

I tried to get the Unity version to look as similar as possible to the UE one, so the lighting is a bit different from the video at the top of the post.

In the daylight screenshots below, the main difference is how much more bounced lighting you can see on the feathers in shadow with UE’s Lumen.

Screenshots in daytime real time lighting in both engines. The only thing baked is a reflection probe in the Unity screenshot.

I suppose there are much better setups to show Lumen in all its glory than this scene. However, below is a video of the global illumination switched on, helping you see what Lumen does for the lighting in this case.

Switching on Lumen’s global illumination
I tried using baked lighting and Adaptive Probe Volume in Unity, which add some indirect light on the walls, ground and the character.
Testing a video texture as a light source

This teaser video of UE’s MegaLights feature shows video textures radiating light, and a bunch of YouTube tutorials released in its wake have shown how to pull that off, so I also had to try it out.

Sure enough, applying a video as a texture in a Rect Light works without much hassle with Lumen, without even having to switch on hardware ray tracing.

In Unity’s HDRP, it also seems possible to link a Render Texture as the “Cookie” (a mask) of an Area Light, and play a video on it through the Video Player component. Apparently it would require Path Tracing for it to affect the volumetric fog, though.

A guard watching the trailer of Gust of Wind in both engines.

5. Testing reflections

For some reason, Lumen seems to have added some brightly lit spots in the reflection. However, comparing the screen space reflections, in UE they seem to work better with the volumetric fog (even though I had volumetric fog enabled for reflections in Unity as well).

6. Post processing effects

Without delving deeper into them, both engines have pretty much the same post processing options available. In UE, there’s a list of effects as expandable foldouts, while in Unity, you’d add what you need in your Volume component as Overrides.

The color grading options of UE seem to have a vast amount of controls, even though Unity also has more settings than I’d know what to do with, not knowing much about color grading beyond the basics. If you know exactly what you’re after, I suppose it might be easier to get there in UE.

Though I’ve expanded only one, UE seems to have an infinite number of color wheels one can adjust.

7. Playing particle effects from timeline

Particles made with UE’s Niagara system and Unity’s VFX Graph (the newer particle system) both show up when previewing the cinematic in the timeline. However, with Unity’s old particle system, you’d have to enter Play mode to see the particles.

8. Rendering the sequence

In UE, you can render with an export window that’s included out of the box.

In Unity, you’d have to install the free Recorder package from the Package Manager and add a Recorder track on the Timeline, but after that, it works the same way.

Once you’ve set everything up, rendering is easy and takes about the same amount of time in both engines. For this cinematic in UHD resolution, it took me about half an hour.
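As a side note, the UE render can also be queued through the editor’s Python scripting via the Movie Render Queue. Below is a rough sketch based on the MRQ scripting API; the asset paths and output folder are placeholders, and class or property names may vary between versions.

```python
import unreal

# Get the Movie Render Queue subsystem and its job queue.
subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()

# One job = one sequence rendered in one map (placeholder asset paths).
job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
job.sequence = unreal.SoftObjectPath("/Game/Cinematics/Seq_Guard.Seq_Guard")
job.map = unreal.SoftObjectPath("/Game/Maps/Hall.Hall")

# Configure output: UHD resolution, PNG image sequence, deferred rendering pass.
config = job.get_configuration()
output = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
output.output_resolution = unreal.IntPoint(3840, 2160)
output.output_directory = unreal.DirectoryPath("D:/Renders/Guard")  # placeholder
config.find_or_add_setting_by_class(unreal.MoviePipelineDeferredPassBase)
config.find_or_add_setting_by_class(unreal.MoviePipelineImageSequenceOutput_PNG)

# Start rendering inside the editor (Play In Editor executor).
subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)
```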

Rendering transparency

In both engines, it’s also possible to render on a transparent background. Unfortunately, with Unity’s Recorder plugin, it seemed to be up to chance whether it worked or not. On most renders, there was either the sky or the view of a previous camera filling the background, and changing the range of frames to render seemed to randomly yield different results.

In both engines, getting the background transparent required hiding some effects that affected the foreground as well, so if I were to use these renders for something, I’d take only the alpha from the transparent render, and the colors from a normal one.

9. Working with assets

Though I have a bit of a bias from having much more experience with Unity, I think importing models, changing materials and just navigating in the program is much faster and more convenient in Unity.

Imported 3D assets

UE requires assets to be set up a certain way for the import (and especially reimport) to happen tidily. It seems to be very particular about how you should work in the 3D software you export your models from. Unity has its peculiarities too, but in general it seems a lot less picky about the FBX files you feed it.

Since most of this is information that most people don’t need, below are some foldouts covering different topics of the workflow between Blender and the game engines.

Avoiding scale and rotation pitfalls for FBX (Blender -> Engine)

When exporting from Blender (which just happens to be what I’m familiar with), most of the settings can be handled by saving the export settings to a preset, but there are some annoying steps you have to take for each asset that you export for each engine, to make sure the model is imported in the right scale (1, 1, 1) and rotation (0, 0, 0).

For Unity, when exported with 0 rotations from Blender, an X-rotation of -90 is added to all the objects at the root level in the FBX hierarchy. I’ve always fixed it by adding a rotation offset in Blender to reverse it, but this tutorial shows that alternatively, there’s a checkbox you can tick in the import settings in Unity to fix it.

These steps (from this tutorial) seem to work for UE:

  • Set units to metric and Unit Scale to 0.01
  • Scale the model 100 times bigger, and apply scale
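If you’d rather script the export than click through the dialog every time, the same settings can be driven from Blender’s Python API. A minimal sketch, with placeholder paths and options you may need to adjust for your own assets:

```python
import bpy

# Export for Unity. Blender's FBX exporter gives root objects a -90° X rotation,
# which can be countered with a rotation offset in Blender or the axis conversion
# checkbox in Unity's import settings (mentioned above).
bpy.ops.export_scene.fbx(
    filepath="//exports/prop_unity.fbx",   # placeholder path
    use_selection=True,
    apply_unit_scale=True,
    apply_scale_options='FBX_SCALE_UNITS',
    object_types={'MESH', 'ARMATURE', 'EMPTY'},
    bake_space_transform=True,             # "Apply Transform": bakes the axis difference into the data (experimental)
)

# Export for UE, following the steps above: metric units with Unit Scale 0.01,
# and the model scaled up 100x with its scale applied beforehand.
bpy.context.scene.unit_settings.system = 'METRIC'
bpy.context.scene.unit_settings.scale_length = 0.01
bpy.ops.export_scene.fbx(
    filepath="//exports/prop_ue.fbx",      # placeholder path
    use_selection=True,
    apply_unit_scale=True,
    object_types={'MESH', 'ARMATURE', 'EMPTY'},
    add_leaf_bones=False,                  # avoids extra end bones UE doesn't need
)
```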
Importing characters and animations

The way I usually work in Blender is to make many animation clips in the same scene, and then export it with all the animations. Using the FBX format for that kind of export results in a list of unnamed animations when imported into Unreal Engine.

FBX file exported “wrong” for UE (with many animations contained in the same file). Renaming the animation clips would be another mistake, since after that, you can’t reimport them.

As a side note, with GLB format, it seems like the animation clips appear with the correct names in UE as well, even if there are several in the same file (and can be reimported after modifications in the source file).

According to this video, the correct workflow to get characters from Blender to UE is this:

  1. Export each animation as a separate FBX, and the character rig itself as a separate FBX as well.
  2. Have those FBX files somewhere outside the UE project and import by drag and dropping to the Content Browser.
Exported and imported correctly in UE, the animation clips appear with the correct names.
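If you still want to author all the clips in one Blender scene, the “one FBX per animation” step can be scripted too. A rough bpy sketch, assuming the armature is the selected, active object and each action should become its own file:

```python
import bpy
import os

export_dir = bpy.path.abspath("//exports/ue_anims")  # placeholder folder
os.makedirs(export_dir, exist_ok=True)

armature = bpy.context.object  # assumes the armature is the active object

for action in bpy.data.actions:
    # Assign one action at a time so each FBX bakes only that clip.
    armature.animation_data.action = action
    bpy.context.scene.frame_start = int(action.frame_range[0])
    bpy.context.scene.frame_end = int(action.frame_range[1])

    bpy.ops.export_scene.fbx(
        filepath=os.path.join(export_dir, f"{action.name}.fbx"),
        use_selection=True,
        bake_anim=True,
        bake_anim_use_all_actions=False,  # bake only the assigned action
        bake_anim_use_nla_strips=False,
        add_leaf_bones=False,
    )
```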

For Unity, you can include as many animation clips in the same FBX file as you want. They can be easily organized and renamed in the import settings.

In Unity, animation clips in the same FBX file are listed in the import settings.
Importing environment assets

In Unreal Engine, an FBX containing many objects can be imported either as combined or with the meshes separated. In the latter case, they’d show as individual files in your Content Browser, without the object hierarchy you had in the FBX.

However, there’s a plugin named Datasmith, which can be used to import a 3D scene with the hierarchy intact. I followed this tutorial to do that with the hall environment. The imported scene is a group of objects you can just drag and drop to a scene. Beware though, changing it in one scene will change it for all scenes it’s placed in.

With Unity, the object hierarchy of imported FBX files is preserved without having to use external plugins.

Morph targets (AKA Shape Keys or BlendShapes)

Morph targets are a way to save many shapes of the model (without adding or removing vertices), which can be switched on and off smoothly with sliders.

In Unity, after simply ticking the Import BlendShapes checkbox in the import settings of your character model, they appear as sliders in the Skinned Mesh Renderer. The adjustments you make to the model in the scene, or in the prefab, will be in effect in the game or cinematic as well (assuming you don’t have animations overriding them).

BlendShape sliders are very simple to adjust in Unity.

In UE, you can only preview the effect of the BlendShapes with sliders by opening the skeletal mesh. To have them “stick” in the game or cinematic as well, you’d have to add them as nodes in a blueprint, as explained in this tutorial.

Previewing morph targets in the “skeletal mesh” asset in UE. The changes made there aren’t applied so that they’d show up in the cinematic.
The node graphs needed in UE to get the values of two morph targets set in play mode and editor.

Browsing and making changes

In my experience, browsing assets and making changes to them is much easier in Unity, thanks to the Inspector panel and the preview window. The latter shows an image of the prefab, model or material, and animations can be played on it. If you select many assets, it gets divided into a tile grid showing them. The Inspector panel shows the settings of the selected asset, and in most cases, you can make changes there without further hassle.

In Unreal Engine, if you want to make changes to a material, for example, you have to double-click it open. That doesn’t sound like a big hurdle, but if you’re starting a project, have just imported a bunch of assets and maybe need to make the same change to many materials, it definitely feels like there’s some friction preventing me from working as quickly as I could.

Ease of problem solving

When I get stuck using Unity, the help is usually a quick Google search away. The Unity community is huge and very active on forums. Pretty much everything has already been asked and answered before.

In UE, settings affecting the same thing can be found in many places, and whatever you’re trying to do, there can be a setting somewhere else preventing it. In many cases, finding the solution to something simple I was trying to do took hours.

10. Conclusion

From the standpoint of visual quality alone, UE is pretty clearly ahead, thanks to Lumen enabling good-looking real-time lighting with pretty good performance, and to better-looking depth of field and reflections. It also has other nice features I didn’t get into, like Nanite and MegaLights allowing a huge amount of mesh detail and light sources in a scene without much performance cost, more control over volumetric clouds, MetaHuman, and no doubt a lot that I don’t know about.

Reading some materials and watching videos while making this blog post, I gathered that there seems to be a consensus that UE is better with animations. I’m not so sure about that myself, at least when it comes to use cases outside gameplay. If you do the painstaking work of setting things up correctly in UE, you can work more within the engine and rely less on external animation software. However, Unity has some perks that save a lot of time (especially with retargeting animations).

I think from the standpoint of ease and convenience of working in the engine in general, Unity takes the points in my books. There seems to be less “friction” slowing down the work, and more freedom to set things up the way you want. UE seems to box you into a very specific way of setting things up for them to work, like with character rigs.

Looking at the cinematic tools alone, though, I’d say UE is a bit more convenient for animating things, and it allows you to do more without external plugins.