Rick Batka
Project 1: Movement & Combat System with Melee & Gun
Unreal Engine 5, C++, Blueprints. Solo project, inspired by Remnant II
Description
Inspired by Remnant II, I set out to create a similar movement system that tackled the problem of seamless switching between gun and melee combat. I wanted to explore how one might implement a melee lock-on system while still supporting a seamless, smooth transition to ADS gun combat. I eventually settled on what is shown in the video: the lock-on effect is simply toggled off when aiming the gun. I found this felt natural when switching from Dark Souls-style melee combat to Gears of War-style gun combat. I also wanted to explore the UE5 animation blending system and hone my abilities in creating fluid moment-to-moment gameplay with “juicy” game-feel, tight controls, and smooth transitions.
Features
Melee Combo System with Preemption: Melee attacks are locked in until a preemption window opens, during which they can be cancelled. Subsequent attack inputs during the preemption window continue the combo, and animators can adjust the window directly in the montage timeline (sketched in the code after this list).
Souls-like Combat: An over-the-shoulder camera seamlessly transitions between melee mode (with lock-on) and aim-down-sights. Weapon switching is automatic, as in Remnant: toggling aim fluidly switches between melee and ranged combat modes.
IK Gun Recoil & Spread: Configurable recoil and spread parameters jolt and jostle the gun realistically, while an IK rig keeps the arms attached to the displaced weapon (see the recoil sketch below).
Dodge Roll I-Frames: Animators can tweak invincibility timings right in the montage timeline editor. Actors and ActorComponents can easily register for AnimStateNotify events through a simple but powerful composition approach (see the dispatcher sketch below).
Lock-On System: The character stays facing the target and switches to a blended strafing animation set, seamlessly switching back and forth as lock-on is toggled (see the facing sketch below).
Dynamic Camera Effects: Screen shake intensity is influenced by gun strength (which also drives recoil and spread), and an FOV zoom animation helps smooth out the transition between melee and gun (see the camera sketch below).
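A minimal sketch of how the montage-driven preemption window could be wired up. The class and member names here (UMeleeComboComponent, UPreemptionWindowNotifyState, bInPreemptionWindow) are illustrative, not the project's actual code; only the AnimNotifyState hooks are stock UE5 API.

#include "Animation/AnimNotifies/AnimNotifyState.h"
#include "Components/ActorComponent.h"
#include "ComboPreemption.generated.h"

UCLASS()
class UMeleeComboComponent : public UActorComponent
{
    GENERATED_BODY()
public:
    bool bInPreemptionWindow = false;

    void OnAttackPressed()
    {
        // Outside the window the current attack is locked in; inside it,
        // a new input cancels into the next attack of the combo.
        if (bInPreemptionWindow)
        {
            // PlayNextComboMontage(); // hypothetical continuation
        }
    }
};

UCLASS()
class UPreemptionWindowNotifyState : public UAnimNotifyState
{
    GENERATED_BODY()
public:
    // Animators drag this notify state's begin/end handles in the montage
    // timeline to tune the window; code never hard-codes timings.
    virtual void NotifyBegin(USkeletalMeshComponent* MeshComp, UAnimSequenceBase* Animation,
                             float TotalDuration, const FAnimNotifyEventReference& EventRef) override
    {
        if (UMeleeComboComponent* Combo = FindCombo(MeshComp)) { Combo->bInPreemptionWindow = true; }
    }

    virtual void NotifyEnd(USkeletalMeshComponent* MeshComp, UAnimSequenceBase* Animation,
                           const FAnimNotifyEventReference& EventRef) override
    {
        if (UMeleeComboComponent* Combo = FindCombo(MeshComp)) { Combo->bInPreemptionWindow = false; }
    }

private:
    static UMeleeComboComponent* FindCombo(USkeletalMeshComponent* MeshComp)
    {
        AActor* Owner = MeshComp ? MeshComp->GetOwner() : nullptr;
        return Owner ? Owner->FindComponentByClass<UMeleeComboComponent>() : nullptr;
    }
};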
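The recoil and spread behavior can be approximated with a per-shot impulse and an interpolated recovery. RecoilImpulse, SpreadDegrees, and RecoverySpeed are assumed parameter names on a hypothetical UGunRecoilComponent, and the resulting AimOffset is what the IK rig chases.

// Members assumed on the component:
//   FRotator AimOffset; float RecoilImpulse, SpreadDegrees, RecoverySpeed;
void UGunRecoilComponent::Fire()
{
    // Kick: a sharp pitch impulse plus a small random yaw jitter (spread).
    AimOffset.Pitch += RecoilImpulse;
    AimOffset.Yaw += FMath::FRandRange(-SpreadDegrees, SpreadDegrees);
}

void UGunRecoilComponent::TickComponent(float DeltaTime, ELevelTick TickType,
                                        FActorComponentTickFunction* ThisTickFunction)
{
    Super::TickComponent(DeltaTime, TickType, ThisTickFunction);

    // Spring the offset back toward zero each frame; stronger guns can use a
    // bigger impulse with the same recovery speed for a heavier feel.
    AimOffset = FMath::RInterpTo(AimOffset, FRotator::ZeroRotator, DeltaTime, RecoverySpeed);
}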
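The composition approach for notify registration might look like a dispatcher component keyed by notify name, so the dodge roll's i-frame window never requires subclassing the character. All names here are hypothetical.

#include "Components/ActorComponent.h"
#include "NotifyDispatcher.generated.h"

DECLARE_MULTICAST_DELEGATE_OneParam(FOnAnimWindow, bool /*bWindowOpen*/);

UCLASS()
class UAnimNotifyDispatcher : public UActorComponent
{
    GENERATED_BODY()
public:
    // Thin AnimNotifyState subclasses (as in the combo sketch above) call
    // this with true at NotifyBegin and false at NotifyEnd.
    void RouteWindow(FName NotifyName, bool bWindowOpen)
    {
        if (FOnAnimWindow* Delegate = Windows.Find(NotifyName))
        {
            Delegate->Broadcast(bWindowOpen);
        }
    }

    FOnAnimWindow& WindowDelegate(FName NotifyName) { return Windows.FindOrAdd(NotifyName); }

private:
    TMap<FName, FOnAnimWindow> Windows;
};

// A health component opts in during BeginPlay:
//   Dispatcher->WindowDelegate("IFrames").AddUObject(this, &UHealthComponent::SetInvincible);
// Damage handling early-outs while invincible, so designers can slide the
// notify window around the dodge montage without touching code.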
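Lock-on facing reduces to steering the control rotation's yaw toward the target each frame. This is a sketch with assumed members (LockOnTarget, LockOnTurnSpeed); it presumes bUseControllerRotationYaw is enabled and bOrientRotationToMovement is disabled while locked on, so the strafing blend space takes over.

void AMeleeCharacter::TickLockOn(float DeltaTime)
{
    if (!LockOnTarget) { return; }

    const FVector ToTarget = LockOnTarget->GetActorLocation() - GetActorLocation();
    FRotator Desired = ToTarget.Rotation();
    Desired.Pitch = GetControlRotation().Pitch; // steer yaw only; keep camera pitch free

    const FRotator Smoothed =
        FMath::RInterpTo(GetControlRotation(), Desired, DeltaTime, LockOnTurnSpeed);
    GetController()->SetControlRotation(Smoothed);
}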
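The camera transition and strength-scaled shake can be as simple as an FOV interpolation plus the scale parameter of UE5's StartCameraShake; member names (FollowCamera, AimFOV, FireShakeClass, and friends) are assumptions.

void AMeleeCharacter::TickCameraFOV(float DeltaTime)
{
    // Zoom in while aiming, back out for melee; the interpolation masks the
    // mode switch and contributes most of the perceived smoothness.
    const float TargetFOV = bAiming ? AimFOV : DefaultFOV;
    FollowCamera->SetFieldOfView(
        FMath::FInterpTo(FollowCamera->FieldOfView, TargetFOV, DeltaTime, FOVInterpSpeed));
}

void AMeleeCharacter::OnShotFired(float WeaponStrength)
{
    if (APlayerController* PC = Cast<APlayerController>(GetController()))
    {
        // One shake asset serves every gun; strength scales it, matching the
        // same value that drives recoil and spread.
        PC->PlayerCameraManager->StartCameraShake(FireShakeClass, WeaponStrength);
    }
}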
Project 2: Unreal Plugin for SpatialOS
Unreal Engine 4, C++. Team Size: 2.
Description
Improbable's SpatialOS was a scalable game and simulation back-end for massive-scale interactive experiences, often showcasing 1 million+ complex actors simulated in real time. Improbable needed a way to demonstrate interoperability with game engines in their scale demos - in this case, Unreal Engine 4. This particular demo used open-source city data to show 500,000 civilians going about their normal pattern of life in a real city. Civilians would enter and exit buildings and react to emergencies by seeking shelter before returning to their routine.
Knowing that Unreal can struggle to render even a few hundred full Actors, we used a combination of tricks to render the 5,000 entities closest to the player at any given time, at 60fps on an average gaming laptop.
Features
500k Complex Actors Simulated: Using SpatialOS's distributed Entity Component System, we were able to write agent behaviors that scaled to massive numbers. By dividing the world spatially (a.k.a. geo-sharding), we could scale nearly indefinitely (see the sharding sketch after this list).
Unreal Plugin Architecture: We built a general-purpose plugin for viewing large SpatialOS simulations in Unreal. The plugin took over the networking for any proxy actors controlled by SpatialOS. It featured a dynamic property-mapping GUI that used Unreal's reflection system to make it easy to map SpatialOS data fields to their Unreal UPROPERTY counterparts (see the reflection sketch below).
5k Animated Entities on Screen in Unreal at 60fps: We used instanced static meshes with pre-baked animations to render thousands of entities in Unreal. With a handful of skinned meshes and a small set of pre-baked animations, the visual variety was convincing despite there being no true Unreal “Actors” in the scene (see the instancing sketch below).
Interest-Based Data Culling: We used SpatialOS's “killer feature” - its infinitely scalable spatial querying system - to limit the entities in view to the 5,000 closest to the camera. Entities were seamlessly loaded and destroyed as the camera moved around the world (see the lifecycle sketch below).
Complex Pathfinding AI for 500,000 Entities: The simulation back-end featured 500k entities navigating the city using efficient A* pathfinding as they moved from building to building. The navigation mesh was built from OpenStreetMap road and sidewalk data, which we processed into a geographically sharded set of smaller NavMesh instances that each simulation worker could use (see the A* sketch below).
Selectable Buildings: Our city geometry contained tens of thousands of buildings in a single mesh and was performant enough to show all of New York City at once. Without separate meshes for each building, though, we needed another way to let the user highlight and inspect individual buildings. We baked unique building IDs into the UV coordinates of each building's vertices, which let us quickly find all faces of the currently selected building, recolor it, and show metadata to enhance the simulation (see the picking sketch below).
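An illustrative core of geo-sharding: hash a world position into a grid cell so each simulation worker owns a contiguous region. The cell size and key scheme here are assumptions, not the production values.

#include <cstdint>
#include <cmath>

struct ShardKey { int32_t X; int32_t Y; };

ShardKey ShardForPosition(double WorldX, double WorldY, double CellSizeMeters = 500.0)
{
    // Entities interact only within (or adjacent to) their cell, so adding
    // workers adds simulation capacity roughly linearly with covered area.
    return { static_cast<int32_t>(std::floor(WorldX / CellSizeMeters)),
             static_cast<int32_t>(std::floor(WorldY / CellSizeMeters)) };
}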
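The backing logic of the property-mapping GUI is a walk over Unreal's reflection data; a minimal sketch follows (UE4.25+ spells the type FProperty, earlier UE4 versions use UProperty).

void ListMappableProperties(UClass* ProxyClass)
{
    // Enumerate every reflected property so the editor GUI can offer it as a
    // target when pairing SpatialOS component fields with UPROPERTYs.
    for (TFieldIterator<FProperty> It(ProxyClass); It; ++It)
    {
        const FProperty* Prop = *It;
        UE_LOG(LogTemp, Log, TEXT("%s : %s"), *Prop->GetName(), *Prop->GetCPPType());
    }
}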
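A sketch of the crowd instancing, assuming one InstancedStaticMeshComponent per mesh variant and two per-instance custom data floats (a UE4.25+ feature) that the vertex-animation material reads to pick a baked clip and de-synchronize playback.

void SpawnCivilianInstance(UInstancedStaticMeshComponent* ISM, const FTransform& Xform,
                           float AnimIndex, float PhaseOffset)
{
    const int32 Index = ISM->AddInstance(Xform);
    ISM->SetCustomDataValue(Index, 0, AnimIndex);   // which baked animation to play
    ISM->SetCustomDataValue(Index, 1, PhaseOffset); // time offset so the crowd isn't in lockstep
}

// One-time setup on the component:
//   ISM->NumCustomDataFloats = 2;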
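Conceptually, the view side keyed lightweight proxies by entity ID and reacted to the spatial query checking entities in and out around the camera. This is a sketch of that lifecycle, not the SpatialOS SDK API.

// FProxyHandle and SpawnProxy/DestroyProxy are hypothetical stand-ins for
// whatever represents one on-screen instance (e.g. an ISM instance slot).
void FProxyManager::OnEntityAdded(int64 EntityId, const FVector& Location)
{
    if (!Proxies.Contains(EntityId))
    {
        Proxies.Add(EntityId, SpawnProxy(Location));
    }
}

void FProxyManager::OnEntityRemoved(int64 EntityId)
{
    FProxyHandle Handle;
    if (Proxies.RemoveAndCopyValue(EntityId, Handle))
    {
        DestroyProxy(Handle); // frees the instance slot for reuse
    }
}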
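The per-worker search was standard A*; here is a textbook sketch over an illustrative node graph (the real graph came from the OpenStreetMap-derived shard).

#include <queue>
#include <unordered_map>
#include <vector>
#include <functional>
#include <cmath>

struct Node { double X, Y; std::vector<std::pair<int, double>> Edges; }; // (neighbor, cost)

std::vector<int> AStar(const std::vector<Node>& Graph, int Start, int Goal)
{
    // Straight-line distance is an admissible heuristic for road networks.
    auto H = [&](int A) { return std::hypot(Graph[A].X - Graph[Goal].X,
                                            Graph[A].Y - Graph[Goal].Y); };
    using QItem = std::pair<double, int>; // (f-score, node)
    std::priority_queue<QItem, std::vector<QItem>, std::greater<QItem>> Open;
    std::unordered_map<int, double> G{{Start, 0.0}};
    std::unordered_map<int, int> CameFrom;

    Open.push({H(Start), Start});
    while (!Open.empty())
    {
        const int Cur = Open.top().second;
        Open.pop();
        if (Cur == Goal) // reconstruct path by walking parents back to Start
        {
            std::vector<int> Path{Goal};
            while (Path.back() != Start) { Path.push_back(CameFrom[Path.back()]); }
            return {Path.rbegin(), Path.rend()};
        }
        for (const auto& [Next, Cost] : Graph[Cur].Edges)
        {
            const double Tentative = G[Cur] + Cost;
            if (!G.count(Next) || Tentative < G[Next])
            {
                G[Next] = Tentative;
                CameFrom[Next] = Cur;
                Open.push({Tentative + H(Next), Next});
            }
        }
    }
    return {}; // goal unreachable
}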
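Picking a building then becomes a line trace plus a UV read. This sketch assumes the ID is packed into UV channel 1 with an example encoding; it relies on Unreal's “Support UV From Hit Results” project setting.

#include "Kismet/GameplayStatics.h"

int32 GetBuildingIdUnderCursor(APlayerController* PC)
{
    FHitResult Hit;
    // bTraceComplex must be true so the hit carries per-face UV data.
    if (PC->GetHitResultUnderCursor(ECC_Visibility, /*bTraceComplex=*/true, Hit))
    {
        FVector2D UV;
        if (UGameplayStatics::FindCollisionUV(Hit, /*UVChannel=*/1, UV))
        {
            // Example encoding: ID = U + V * 4096, baked at mesh build time.
            return FMath::RoundToInt(UV.X) + FMath::RoundToInt(UV.Y) * 4096;
        }
    }
    return INDEX_NONE;
}

Once the ID is known, every face sharing it can be recolored (for instance via vertex colors or a material parameter) and its metadata displayed.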