The Environment Art Team continued their push on Lorville Section One Nine (L19), bringing more and more areas to a final state. With L19 coming alive, they’re kicking off production on the surrounding areas, including the procedural tiles that represent the wider city.
Regarding organic environments, the team wrapped up the first pass on the Hurston biomes and moved on to defining Hurston’s four moons. Each moon has its own look and feel and reflects some of the elements found on Hurston itself. These new moons are an example of the planet tech system’s flexibility – being able to reuse, remix, and create new locations using all elements built to date saves a lot of time, and the process is getting faster as the tools and assets mature.
The DevOps Teams in ATX and DE are coordinating a suite of tools to accommodate the various Feature Teams. This includes extending the existing auto-integrator’s behavior to enable classic chronological integrations as well as parallel, non-chronological integrations based on an acyclic dependency graph, similar to what’s used in code compilation to determine the ordering of tasks. Any stream of changes flowing via auto-integration can communicate directly with the owner of the changelist, or be deferred to a single stream stakeholder who chooses to oversee the resolution of integration conflicts themselves. As new options are added, a counterpart API is being rolled out so that complementary tools can hook in. The first tool to use the API is a merging tool that handles laborious integrations from feature stream teams back into game-dev.
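The ordering idea behind the dependency-graph integrations can be sketched with a topological sort, just as in build systems. This is a minimal illustration, not the actual auto-integrator: the changelist names and dependencies are invented for the example.

```python
from graphlib import TopologicalSorter

# Hypothetical changelist dependency graph: each changelist maps to the
# changelists it must be integrated after (forming an acyclic graph).
deps = {
    "CL102": {"CL100"},           # CL102 touches files changed by CL100
    "CL103": {"CL100", "CL101"},
    "CL104": set(),               # independent: no ordering constraint
}

ts = TopologicalSorter(deps)
ts.prepare()
batches = []
while ts.is_active():
    # Everything returned by get_ready() has no unresolved dependencies,
    # so these changelists could be integrated in parallel.
    ready = ts.get_ready()
    batches.append(sorted(ready))
    ts.done(*ready)

print(batches)
```

Changelists with no mutual dependencies land in the same batch and can proceed non-chronologically; anything with a dependency waits for its predecessors, mirroring how a compiler schedules translation units.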
The Level Design Team focused on areas of the Persistent Universe, including the flagship landing zones of Lorville and Area 18, and the Rest Stop space station layouts generated by the procedural tools. They completed a versatile whitebox version of a security checkpoint that can be easily adapted to locations of different sizes and security levels, and revisited the interiors of the Refinery space station.
The Engine Tools Team improved general game editor stability and usability and fixed bugs for the 3.2 release. They added Look Development Mode, which lets artists unify and isolate light setups for assets. Artists and Designers can now select any asset in any level and activate this mode to get a consistent light environment for tweaking their materials – a consistent, neutral material setup across all assets results in much higher visual quality. Look Development Mode supports a flat light setup for tweaking materials as close to a neutral in-engine light environment as possible, along with a presentation mode to get the best visual quality out of a given asset. With the push of a button, this in-engine mode works with all asset sizes, from small props to capital ships. It avoids error-prone, inconsistent manual light setups, and so saves time tweaking assets for the game.
The QA Team started June by joining the EU Gameplay 5 Feature Team. One QA member was embedded with the team to attend weekly sprint and planning meetings and work closely with Developers to identify and resolve issues related to the Transit system, which includes elevators, metros, trams, and similar methods of transportation. Memory testing with the logging of Environment Variables enabled is also underway. In the first run-through, QA assisted the Engine Team in identifying a memory leak within particle emitters. The Graphics Team quickly created a fix, which was confirmed once the changes were checked in. This type of testing will be a regular occurrence to ensure the team can stay on top of any potential future memory issues.
In addition, they continued to test new versions of the Subsumption Editor, as well as staying on top of regressions and inputting new issues encountered by the development team. The new Subsumption versions consisted of fixes that needed verification, as well as a new option that allows subsequent messages to be skipped when multiple invalid callback functions generate warnings upon initialization. The way Activities read information from the platforms has also been improved: a developer can now directly edit an NPC’s schedule loadout via the Activity the platform is being called from. Testing of the physics refactor has also started, to pre-emptively catch any new issues introduced by the changes. QA will be doing a weekly Physics smoke test every Monday, covering the PU, Arena Commander, and Star Marine in the Game-Dev branch. A report is then generated and sent for review, and any new issues introduced by the Physics refactor are addressed.
The Engine Team continued with Physics optimizations that allow for more overlap during terrain patch physicalization. They also optimized raycasting when flying over the planet grid, along with shadow batch processing. Look Development Mode was introduced to the engine to enable the team to test shading setups in a controlled environment. They improved the quality and performance of horizon SSDO, which is now enabled by default. They started streamlining the asset pipeline flow for consistency in shading, cleaning up asset presets, and changing the plugin to improve consistency.
For general code development, they supported the inline function expansion in callstacks presented via JIRA and Sentry, and improved support to compile Linux targets through Visual Studio. They completed the first iteration of the mining painter, continued to work on the telemetry system, and made skinning and vertex processing improvements.
The Weapons Art Team completed all work, bugfixing, and polish for the 3.2 release. This included work on the Gemini F55 light machine gun, Klaus & Werner Demeco light machine gun, Kastak Arms Scalpel sniper rifle, and the Associated Science and Development distortion repeaters (size 1-3). They also spent time on Vanduul lances and knives.
The VFX Team worked on and optimized the 3.2 mining feature. They tackled effects for the fracture beam, the tractor beam that sucks up rocks, and for various rock impacts and explosions. They also worked on the cinematic destruction pipeline for the soft body destruction simulations. They smoothed out the pipeline for importing the soft body simulations from Houdini into 3ds Max, and then from 3ds Max into the engine. This will be used for bespoke, cinematic destruction sequences.
The System Design Team worked to make enemy ship AI more fun to engage with. Initially, it was built to be as realistic as possible, but that doesn’t always make for the best gameplay. For example, a computer is extremely good at using decoupled mode, far better than any human. While this is technically the best solution for fighting in space, it also feels unnatural and unintuitive to the player. They want to strike a balance between realism and giving the player a fun, challenging experience by cutting down the number of unnatural maneuvers the AI can perform.
The team also focused on FPS AI for multiplayer, making sure they function similarly to the single player mode. The push with the Vanduul reached its completion and their behaviors are almost locked down. A lot of work went into how the Vanduul navigate the environment, attack, defend, melee, and react to various weapons. They also spent time on civilian and guard behaviors, adding as much life to them as needed. The team primarily focused on designing modular conversation vignettes, which are blocks of randomly changing dialogue that can be controlled by various AI parameters like morale, hunger, etc. Essentially, it’s a giant tree of animation/line clusters. The AI simply navigates through the tree, and each time a vignette plays it produces a different outcome based on those parameters, so hearing the same combination of lines will be a rare experience. They also put finishing touches on planetary mining. While some improvements are still needed, they believe it’s at a point where players can enjoy it.
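The vignette idea – a cluster of lines filtered by AI parameters like morale and hunger – can be sketched roughly as follows. All names, lines, and thresholds here are invented for illustration; the actual system is far richer.

```python
import random

# Hypothetical vignette cluster: each candidate line carries a condition
# on the NPC's current AI parameters (morale, hunger, etc.).
VIGNETTES = {
    "greeting": [
        {"line": "Good to see you!",  "ok": lambda s: s["morale"] > 0.6},
        {"line": "What do you want?", "ok": lambda s: s["morale"] <= 0.6},
        {"line": "Any food around?",  "ok": lambda s: s["hunger"] > 0.8},
    ],
}

def pick_line(vignette, state, rng=random):
    # Keep only the lines whose conditions match the NPC's parameters,
    # then pick randomly among them so exact repeats stay rare.
    eligible = [v["line"] for v in VIGNETTES[vignette] if v["ok"](state)]
    return rng.choice(eligible)

line = pick_line("greeting", {"morale": 0.9, "hunger": 0.1})
```

Because the parameter state changes between encounters, the same vignette yields different lines on different playthroughs, which is the effect described above.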
The AI Team was busy implementing new functionality and fixing/optimizing existing systems. For Subsumption, they introduced the concept of global variables: for single player campaigns, designers might require generic variables available across different missions. Global variables are visible across the various missions and can be saved, so that game status can be preserved and restored for the players.
They also improved the way entities can receive ranged events. A ranged event is triggered when an entity comes within a certain range of the event’s source; the range can be specified by designers in the event’s definition.
They introduced a new Subsumption task that can mark the entities to be notified, so that the game code can efficiently calculate ranges only for the specified objects in the world. Work was also done on improving and generalizing the way assignments are handled by entities. Assignments are a sort of command or suggestion designers can send to AI entities, varying from ‘attack my target’ to ‘move to the specified position’, etc. Assignments are a very generic way to influence systemic behaviors and can be used to script mission flow or give AI entities direct commands. The mastergraph is now responsible for bringing the assignment request to the execution phase, unless the behavior is defined to override the handling of the scenario. For example, while throwing a grenade, the behavior might wait to handle an assignment that requests its relocation, while if the NPC is just patrolling, it can fulfill the request immediately.
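The ranged-event optimization above – computing distances only for entities explicitly marked for notification – can be illustrated with a small sketch. The class and field names are assumptions made for the example, not engine code.

```python
import math

class RangedEvent:
    """Only entities explicitly marked for notification get a range
    check each update; the rest of the world is never measured."""

    def __init__(self, position, radius):
        self.position = position
        self.radius = radius
        self._subscribers = []  # entities marked by the Subsumption task

    def mark_for_notification(self, entity):
        self._subscribers.append(entity)

    def update(self):
        # Range checks run only over the marked entities.
        triggered = []
        for entity in self._subscribers:
            dx = entity["pos"][0] - self.position[0]
            dy = entity["pos"][1] - self.position[1]
            if math.hypot(dx, dy) <= self.radius:
                triggered.append(entity["name"])
        return triggered

event = RangedEvent(position=(0.0, 0.0), radius=10.0)
event.mark_for_notification({"name": "guard_01", "pos": (3.0, 4.0)})      # 5m out
event.mark_for_notification({"name": "civilian_07", "pos": (30.0, 0.0)})  # 30m out
in_range = event.update()
```

The cost of each update scales with the number of marked entities rather than the population of the world, which is the point of the new Subsumption task.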
The team had numerous people visit from the UK office for a week of meetings regarding the usables system. They discussed numerous use cases and possible improvements to the system and worked side by side on both existing and new code. Items to be implemented from the meetings include a new tool to speed up the usables pipeline, as well as new functionality in Subsumption to describe complex scenarios of NPCs interacting with the environment. This ensures that their behaviors maintain a simple logic while the complexity remains embedded within the system itself.
Ship behaviors received significant improvements for the future Alpha 3.3 release. The team started implementing different behavior strategies associated with different ship types, so pilots can use the best capabilities of their specific vehicle. They also improved the way accuracy is calculated, so that ships attacking enemies feel more natural and respect the recently implemented skill levels. Human combat is also being polished. The team is currently going through as many use case scenarios as possible to validate that the system works as intended.
The Tech Art Team worked on the Sandbox-Editor-to-Maya live link tool for synchronizing animations between the Digital Content Creation (DCC) and the game’s editor which, for this particular purpose, is essentially used as a rendering backend. Now that the underlying interprocess communication and object serialization frameworks are in place, animators will soon be able to see their changes on in-engine character assets live within the Maya viewport, rendered with all the advanced shading effects that the game engine provides. This will not only allow them to create better animations in less time, it will also facilitate animation asset and rig asset/deformation quality control and make for a much more immediate and precise workflow. They also integrated Motion Based Blending into Maya to enable the animators to easily remove any potential foot sliding after changes were made to an animation.
They implemented a ‘light rig’ switch in Maya that improves performance in heavy scenes and, in turn, raises the frame rate during replay, making it easier for animators to work in heavily populated scenes.
Multiple FPS and Ship Weapon bugs were also addressed for the 3.2 release.
The Lighting Team focused on tasks and bugs related to the upcoming 3.2 release. In addition, they collaborated with the Environment Art and Level Design teams on Lorville. Their current goals are to provide a first pass for lighting and atmosphere across all areas, as well as ensure that the level is built to allow for easy optimization and better performance in the future.
The Cinematics Team worked with UK Gameplay Engineers to create a Cinematic FreeLook control system that works with Star Citizen’s unified 1st and 3rd person character rig. Cinematic FreeLook allows a player that’s locked in position during a 1st person cinematic to look around freely. Cinematic designers can specify up/down/left/right limits and steer the player’s headcam towards specific things, like a character, vista, or an important event in the distance. The original Cinematic FreeLook system was limited, as the player’s body was not fully considered, so looking too far in one direction meant the player could potentially see part of their own face inside their helmet. The new system works in 1st and 3rd person with performance capture on the body/head, additive animation of the headcam rotation, and Cinematic FreeLook active. FreeLook works with mouse and gamepad thumbstick input, with zones that slow down control towards the edges of the ‘look window’ and a smooth re-center after a specified time with no input.
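The ‘look window’ behavior described above – input damped near the edges, a hard clamp at the designer-set limit, and a smooth re-center after idle time – can be sketched for one axis. All numbers and names here are illustrative, not the shipped tuning.

```python
def freelook_step(yaw, yaw_input, yaw_limit, idle_time,
                  recenter_after=2.0, recenter_rate=0.1):
    """One update of a single FreeLook axis (degrees).
    Hypothetical sketch of the behavior, not engine code."""
    if yaw_input != 0.0:
        # Damp the turn rate as the headcam nears the window's edge.
        edge_damping = 1.0 - min(abs(yaw) / yaw_limit, 1.0)
        yaw += yaw_input * edge_damping
    elif idle_time >= recenter_after:
        # Smoothly steer the headcam back to center after the idle timeout.
        yaw -= yaw * recenter_rate
    # Hard clamp to the designer-specified look limits.
    return max(-yaw_limit, min(yaw_limit, yaw))

# Halfway to the 60-degree limit, a 5-degree input only moves 2.5 degrees.
turned = freelook_step(yaw=30.0, yaw_input=5.0, yaw_limit=60.0, idle_time=0.0)
# With no input for 3 seconds, the view eases back toward center.
eased = freelook_step(yaw=30.0, yaw_input=0.0, yaw_limit=60.0, idle_time=3.0)
```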
They also gained the ability to run Mannequin fragments on any NPC in Trackview, which will create seamless blends between locomotion and scene performances driven through Trackview.
The Cinematic Animation Team also worked on testing how far they can technically push the Walk & Talk AI conversations. The goal is to retain the performance that the actors brought to these talks while running numerous real-time additives, such as layering upper body performance capture on top of a personalized per-character locomotion walk set. The tests have so far yielded positive results.