Term1 Project (Final) – Go Bear Go
Format: Video (less than 1 minute), can be developed into a game (Third-person character)
Tools: Unreal, Blender, Nuke, 3DEqualizer, Mixamo, Treeit, CapCut
VFX Focus: Niagara, Groom, Visual Effects with 2D and 3D production, Compositing, MetaHuman, PCG Generation, Animation
Pre-production Goal
This is a series of short videos that I designed in two parts. Overall, I aim to tell a story that is half realistic and half dramatic.
The first part shows the character going through four scenes to mimic human progression. The second part will carry a more sophisticated and dramatic narrative. Along the way, I am exploring different production techniques and workflows to optimise subsequent projects.

Final Output
To keep the Term 1 project within one minute, I rewrote the storyline to show more detail and set up cliffhangers. Several camera angles were changed, and only the first two scenes are shown. This gives a sense of how the story will unfold and how the character begins his journey.

I explored techniques and visual effects to understand the differences between Unreal Engine and Blender’s particle systems. My workflow encompassed 3D modelling, scene construction, animation, rendering and finally compositing with Nuke.

VFX Breakdown Video
Significant Milestones
1. The character’s hair and textures were finished in Blender and then converted to a Groom in Unreal. This process ensured smooth rendering of the fur.

2. More detailed camera angles and techniques were explored in Unreal, focusing on particle systems, MetaHuman and faster animation compositing.

3. 3D scenes and cameras were successfully transferred between 3DEqualizer, Blender, Unreal, and Nuke, with Unreal sequences rendered and composited in Nuke to enhance the video’s visual effects.
Progress & Workflows
This production contains important developments for my workflow and techniques.
The Niagara system in Unreal performs better than Blender for creating and rendering visual effects. It takes five times as long to create the same smoke or effects in Blender as it does in Unreal, and the process of adjusting effects in Niagara is smoother and more intuitive. This makes it easier to create effects like fluids, particles, smoke and fire.
I expect Houdini particles to be better for this in the long run, but Unreal’s effects have already exceeded my expectations for this project. My workflow for this project is outlined below. I think it will transfer well to other projects, especially the plugins and methods I used and validated this time.
1. 3D modelling and exporting
I create objects, characters and scenes in Blender and export GLB files to Unreal for development. For this project, I did the grooming and re-created the shaders in Unreal, so exporting glTF instead of FBX is faster to process, and the files can also be reused for web development.
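Because GLB is a binary container, a quick way to confirm an export succeeded before importing it into Unreal is to check the fixed 12-byte header (the magic bytes "glTF", the version, and the total file length). A minimal sketch in Python; the file path in the usage comment is only an example:

```python
import struct

def read_glb_header(data: bytes):
    """Parse the 12-byte GLB container header: magic, version, total length."""
    magic, version, length = struct.unpack("<III", data[:12])
    if magic != 0x46546C67:  # little-endian ASCII "glTF"
        raise ValueError("not a binary glTF (.glb) file")
    return version, length

# Usage with a real export (path is an example):
# with open("bear.glb", "rb") as f:
#     print(read_glb_header(f.read(12)))
```

If the header does not parse, the export was truncated or the file is the JSON `.gltf` variant rather than binary GLB.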

2. Animation: Blender, Mixamo and Unreal (Plugins used)
(1) I created my bear and rigged it in Blender
(2) I exported my rigged character from Blender to Unreal and adjusted the character and rigging.
The only plugin that worked: Auto Rig Pro (Blender)
(3) Imported animations from Mixamo to Unreal, applied the animation to the character in Unreal and modified it.

This is the fastest way to use animation assets from Mixamo; all animations can be ready for Unreal to apply in seconds.
Tool: mixamo_converter, inside Terribilis_studio_Launcher

3. Tracking – 3DEqualizer and Nuke, exported to Blender and Unreal for the 3D scene
Nuke and 3DEqualizer can both do camera tracking for 3D scenes, but I use them for different purposes.
Exporting animations with suitable camera angles: 3DEqualizer -> Blender -> Unreal


In this project, I used 3DEqualizer to export camera and scene data to Blender, then exported the 3D scene from Blender to Unreal in USD format. The animated footage of the bear coming out of the portal had to be created in Unreal rather than Blender, because the bear is covered in fur and this only rendered smoothly in Unreal. To speed up compositing in Nuke, I rendered the animation from Unreal at the matching camera angles and added it to the scene, rather than adjusting the character and animation angles in Nuke.
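One check that helps when moving a tracked camera between 3DEqualizer, Blender and Unreal is confirming the field of view survives the trip: 3DE and Blender describe the camera by focal length and filmback, while Unreal’s CineCamera also shows the equivalent FOV. A small sketch of the standard pinhole conversion; the numbers in the comment are illustrative, not from this shot:

```python
import math

def horizontal_fov_deg(focal_mm: float, filmback_width_mm: float) -> float:
    # Standard pinhole relation: FOV = 2 * atan(filmback / (2 * focal)).
    return math.degrees(2 * math.atan(filmback_width_mm / (2 * focal_mm)))

# e.g. an 18 mm lens on a 36 mm-wide filmback gives a 90-degree horizontal FOV
```

If the FOV computed from the 3DE solve does not match what Unreal displays after import, the filmback was likely rescaled somewhere along the way.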
To add effects and stabilised objects, I set up 3D camera tracking, maps and objects in Nuke.

4. Rendering and compositing workflow – combining Blender, Unreal and Nuke.
I did the compositing in Nuke: I imported all the Unreal footage, analysed the timelines, and then designed the displays and effects. After confirming the frame ranges and effects needed, I created the 2D animations in Blender at the corresponding time frames. Then I added the EXR sequences, textures and 3D objects onto cards and other objects in the 3D camera scene I had set up in Nuke. This workflow avoids wasting time adjusting gaps and mismatches in Nuke.
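When lining up EXR sequences from Unreal against the Nuke timeline, missing frames are a common cause of gaps, so it is worth checking the frame range before building Read nodes. A hypothetical helper; the `name.####.exr` filename pattern is an assumption about the render output:

```python
import re

def frame_range(filenames):
    """Return (first, last, missing) for an EXR sequence named like shot.0001.exr."""
    frames = sorted(int(m.group(1)) for f in filenames
                    if (m := re.search(r"\.(\d+)\.exr$", f)))
    missing = sorted(set(range(frames[0], frames[-1] + 1)) - set(frames))
    return frames[0], frames[-1], missing

# frame_range(["s.0001.exr", "s.0002.exr", "s.0004.exr"]) reports frame 3 as missing
```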


Valuable Explorations & Improvements Needed
Valuable Explorations
1. Character Making – Groom and Gameplay
Making fur and hair in Blender was slow. I asked Manos and Emily about fur creation and was advised to try Houdini later, but I still wanted to test how effective Unreal could be, so I researched methods and workflows from Blender to Unreal. In the end, I successfully made a fluffy bear character using the two applications together. I also replaced Unreal’s third-person character with my bear and tested him running in the levels I designed, which also prepares the project to be developed into a game.
I created my character, used the hair assets in Blender, and exported them to Unreal for Groom editing. (Image reference: Wombat)


2. VFX Production in Unreal
Unreal’s real-time rendering helps me create particles, smoke and fluids quickly. As mentioned above, Niagara effects in Unreal outperform Blender’s particle system. I experimented with the fluid on the glass house, the particles that lead the bear to the portal, and the smoking portal. I also added small particles that follow my character and tested performance while driving my bear around in Play mode.
3. MetaHuman Animation
I photographed myself with Polycam and then used the 3D model to create my own MetaHuman. The process is simple and convenient, but the result is not detailed enough.
I then generated the facial animation and discovered that generating animation from an audio file only works in Unreal 5.5.
Audio was generated using AI – ElevenLabs

4. PCG Generation with Trees and Landscape – Gaea, Heightmap, and Treeit
Having used PCG to generate landscapes for the Wood Cabin project last time, I know the rendering difficulties caused by generating too much detailed grass and foliage. So this time I researched how to use Treeit to create different trees from leaf maps. This method not only lets me create an individual tree quickly but also lets me swap the leaf shapes (a simple drawing in Photoshop) quickly in Unreal’s shader slot.



Another advantage of PCG is that I can easily create paved roads along lines and keep them conforming to my landscape.

Improvements Needed
1. Character Physics and Blueprints in Unreal
This is my first time exploring character blueprints and physics in Unreal. I still need more knowledge and practice to understand how physics affects the character, as this time I had rendering problems that produced different results between the Movie Render Queue and the engine viewport.

2. Material shader differences between Blender and Unreal
Shaders vary between programs, and Unreal has multiple material systems. Substrate materials are more complex than standard Unreal materials and will require more time to explore.
3. Compositing techniques with Nuke
More time is needed to research and practise 3D scene techniques in Nuke. From what I have tried so far, Nuke is more fluid and powerful than After Effects.


The train is an asset from Sketchfab.

Thanks @gildermanvladimir089 https://skfb.ly/p6BEV
Planned Changes, Unexpected Challenges & Solutions
Planned Changes & Actions
To adjust the video time and fulfil a detailed story, I have changed the camera angles and character routes. Compared to the blockouts I created in Unreal previously, I needed more installations and models.
Inspired by the art at the Wellcome Collection, I designed the transparent installation to swirl and swing around a stick to simulate the way the sound plays out.
This was to express the idea of the character waking up after hearing the sound.

Unexpected Challenges & Solutions
1. Character rigging in Blender and exporting to Unreal is not smooth
I found several ways to export rigged characters from Blender to Unreal, but most no longer work because both Blender and Unreal have been upgraded.
I tested each method with and without add-ons and found that only the Auto Rig Pro add-on in Blender works well. It supports both auto and manual rigging, which is perfect for rigging a non-human creature.
I replaced the third-person character with my bear and discovered incorrect finger rigging. Although it took me some time to fix this in Blender and re-export to Unreal, the workflow of creating the model and rigging in Blender, then exporting to Unreal with the Auto Rig Pro add-on, is efficient and works in the latest versions of both applications.


2. Groom performance differences between the Movie Render Queue and Unreal gameplay mode
I had a problem with the transparent material and groom transparency. Parts of my character’s fur were not visible after rendering with the Movie Render Queue.

It took me about two days to figure out how to fix this.
I searched for similar problems online and found that many people had the same issue. The problem should be addressed in the following areas:
a. The material settings in Unreal
When using translucent colours in Unreal, overlapping objects affect the rendering result, and the issue is only visible in the Movie Render Queue, not the viewport. Adjust the overlap and layering of transparent objects and grooms before rendering, and choose Before DOF or After DOF to change the sorting order of the objects.

b. The anti-aliasing setting in the Movie Render Queue
The spatial and temporal sample counts affect the output. Keep both values as low as possible to reduce rendering time, but this can make the final output look worse than the viewport in Unreal. If there are particle effects or other Niagara effects that need time to generate in advance, set the Warm Up Count value higher than 16 to allow time to generate the particles. Otherwise, the output may lose particles in the first few frames.

c. The setup in levels affects the physics and therefore the groom
The groom jiggles and clumps in the Movie Render Queue output, compared with its stable state in the viewport. This took more time than expected to figure out, and I had to re-render several sequences.

I found that one of the reasons is that the character has a physics problem if there is no solid ground or plane for the engine to identify. The groom piles up during rendering and may even crash the render as the CPU and GPU struggle to resolve where the character should rest. Add a plane in the level for the character and tick “Actor Hidden in Game”; the plane then won’t be rendered, and Unreal renders the groom without errors and in less time.
3. Live Link Face frame import failure in Unreal
When I imported the videos I’d recorded into my Unreal project, the files were created, but the depth and video frames were not visible in the Unreal content browser. After researching online, I found that people were having the same problems, which seemed to be a bug left over from Unreal 5.1.
The solution I found was to create a new project; in the new project, all the footage and frames could be processed.