
A new version of Unreal Engine 5 has launched, further boosting development efficiency with a range of workflows integrated directly into Unreal Editor, while enabling creators to build and render vast, highly realistic worlds in real time.
Epic Games (CEO Tim Sweeney), the leading interactive entertainment company behind Fortnite, Unreal Editor for Fortnite (UEFN), Unreal Engine, and the Epic Games Store, announced on the 13th that it has officially released Unreal Engine 5.7, which ships with a host of new features and improvements.
The Unreal Engine 5.7 update delivers tools for building expansive, lifelike worlds with rich detail, and rendering them in real time at high quality on current-generation hardware. Creators can author complex layered and blended materials with physical accuracy, and make use of far more lights than before with much greater freedom.
In addition, the release introduces more powerful and intuitive animation and rigging workflows, deeper and more flexible MetaHuman integration, expanded virtual production capabilities, and a new in-editor AI Assistant that offers expert-level guidance throughout the development process.
The Procedural Content Generation (PCG) framework is now production-ready, enabling developers to assemble large-scale, highly immersive environments quickly and organically.
The new PCG Editor Mode offers a customizable library of tools built on the PCG framework, including spline drawing, point painting, and volume creation. Thanks to numerous performance optimizations, PCG GPU compute is now dramatically faster, and new support for GPU parameter overrides lets you set a range of parameter values dynamically when working with GPU nodes.
A new Polygon2D data type and associated operators give creators even more flexibility, while newly added Spline Intersection and Split Splines operators extend spline-based workflows. On top of that, the Procedural Vegetation Editor (PVE) lets you create and customize high-quality vegetation assets directly inside the engine in real time. Through Fab, developers can download a set of five Quixel Megaplants assets now, with hundreds of additional plant presets planned for future updates.
Rendering technologies across the board—including Nanite Foliage, Substrate, and MegaLights—have all been upgraded.
A new geometry rendering system, Nanite Foliage, enables teams to create and animate highly detailed, dense vegetation for large open worlds. It maintains stable frame rates without the need to author LODs and without cross-fades or popping. By leveraging Nanite Assemblies, it reduces storage, memory, and rendering costs, while Nanite Skinning drives dynamic behaviors such as reacting to wind.
Substrate, Unreal Engine’s cutting-edge modular material authoring and rendering framework, is also now production-ready. With inherent support for layered and blended materials, it lets you accurately and convincingly combine physical properties from metals, clear coats, skin, cloth, and more. As a result, effects like multi-layer automotive paint, oiled leather, or blood and sweat on skin can be authored much more easily while maintaining high visual quality.
MegaLights moves into Beta with this release. Creators can add far more dynamic shadow-casting lights to a scene, achieving realistic, soft shadow effects even in complex lighting setups with numerous area lights. This allows for much freer lighting direction than before and supports the creation of larger, richer, and more intricate worlds. Directional lights, translucency, shadow casting for Niagara particle systems, and lighting and shadowing on hair have all been refined, significantly improving overall image quality.
MetaHuman now offers deeper integrations with Unreal Engine and other tools in the production pipeline. The MetaHuman Creator Unreal Engine plugin now supports both Linux and macOS, broadening cross-platform compatibility and accessibility.
Using Python or Blueprint scripting, creators can automate and batch nearly all editing and assembly operations on MetaHuman character assets—interactively in Unreal Editor or offline on a render farm. New functionality to conform meshes in various poses has been added, and round-tripping with external DCC tools is supported via the FBX format.
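As a rough illustration of what such batch automation can look like, the sketch below separates the generic batching logic from the editor-dependent calls. The `unreal` module is only available inside Unreal Editor, and the MetaHuman-specific operation shown in the comment is a placeholder, not a confirmed scripting API; only `unreal.EditorAssetLibrary.list_assets` and `save_asset` are real calls.

```python
# Minimal sketch of a batch-automation pattern for Unreal Editor Python
# scripting. The per-asset operation is injected, so the same loop can run
# interactively in the Editor or as an offline render-farm job.

def batch_process(asset_paths, process_fn):
    """Apply process_fn to each asset path and collect per-asset results."""
    results = {}
    for path in asset_paths:
        results[path] = process_fn(path)
    return results

# Inside Unreal Editor, one might then write something like (the MetaHuman
# edit itself is hypothetical; list_assets/save_asset are real API calls):
#
#   import unreal
#   paths = unreal.EditorAssetLibrary.list_assets("/Game/MetaHumans",
#                                                 recursive=True)
#   batch_process(paths, lambda p: unreal.EditorAssetLibrary.save_asset(p))
```

The injected-function design keeps the batching loop testable outside the Editor, while the editor-only calls stay confined to the lambda passed in at the call site.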
On the animation side, Live Link Face can be connected to an external camera on an iPad or Android device, enabling real-time animation generation and performance capture. Creators can also author and refine hair guides and strands directly in Unreal Engine, and the latest MetaHuman for Houdini update introduces a guide-driven workflow that uses preauthored data to efficiently build hairstyles.
Following the major overhaul of in-editor rigging and animation authoring tools in Unreal Engine 5.6, version 5.7 introduces a refactored Animation Mode that streamlines workflows and optimizes screen real estate.
With the new Selection Sets feature, animators can select multiple controls with a single click. Improvements to the IK Retargeter deliver more natural foot-to-ground contact and support retargeting of squash-and-stretch animation.
On the rigging side, the updated Skeletal Editor lets users move seamlessly between placing bones, painting weights, and sculpting blend shapes directly on a skeletal mesh.
New support for one-way physics world collisions means characters can be dropped into a scene and interact with objects in the environment in a more realistic way—resulting in more convincing ragdolls, dynamic gameplay, and immersive animation tests. A new Dependency View visualizes data flow through Control Rigs or Modular Control Rigs as a clear node-based graph, making it easier and faster to debug and optimize complex control setups.
The newly added Dynamic Constraint Component for Props allows objects to automatically attach to hand positions after motion capture, producing smooth, natural results even during complex actions like juggling.
A new Live Link Broadcast Component enables Unreal Engine itself to act as a source of animation data across a network. This supports a wide range of multi-machine virtual production (VP) and mocap stage workflows—for example, offloading retargeting work to a separate Editor session and streaming the results back into a main scene.
Unreal Engine’s built-in real-time compositing tool, Composure, has also been significantly enhanced. It can now process both live video feeds and file-based image media in real time, delivering instant results even for film and video at 24 fps. New shadow and reflection integration features, along with an improved keyer, allow for more natural compositing of live-action footage and CG elements.
The newly introduced AI Assistant provides Unreal Engine guidance directly within the Editor, offering real-time support that includes C++ code generation and step-by-step instructions. The Unreal Editor Home Panel aggregates key resources—tutorials, documentation, news, forums—in a single location, and an interactive “Getting Started” sample is available for newcomers.
Epic Games Korea will host a webinar on November 27 (Thursday) at 2 p.m. to showcase Unreal Engine 5.7’s new features and celebrate its launch. Anyone interested can enable broadcast notifications via Epic Lounge or the Unreal Engine YouTube channel to take part.
For more detailed information on Unreal Engine 5.7, visit the official Unreal Engine website.
This article was translated from the original that appeared on INVEN.
