Distributing editor binaries cheaply

Morva Kristóf
Unreal Engine Tech Shreds
6 min read · Feb 17, 2024


Context

Out of the box, Unreal Engine provides three ways to share the same engine builds with your teammates:

  • Use the Epic Launcher to fetch an official release. Its obvious drawback is that you can’t change the engine code.
  • Submit your binaries directly to source control. Easy to do, but your version control storage (and therefore your fees) will keep growing.
  • Use Unreal Game Sync (UGS) with Perforce. It is a more expensive solution (both the Perforce licences and the infrastructure needed for UGS add cost).

There are more pros and cons for each option listed in the Unreal Engine documentation.

At Rapax Games (a tiny team with 0 full-time developers and minimal cash) the first option served us well for quite a while during the development of Polars; artists and designers used the stock 4.27.2 editor from the Launcher, while programmers had a source engine build so they could implement improvements for our packaged games, integrate console SDKs, and so on. This way we could, for example, backport some Ray Tracing fixes from UE5 into our engine, and even though the artists worked in the stock editor, we could still create the Steam shipping packages from our source build with those engine improvements included.

Of course, as time went on, this became more and more uncomfortable: whenever we stumbled upon an engine bug, we had to come up with a workaround in the game, and if that workaround was too sub-optimal, we also had to accompany it with a proper engine fix (basically doubling the work) to make sure the final build did not ship with unwanted performance regressions. Another issue was that our engine fixes were not applied while working in the editor, so if a Ray Tracing reflection looked wrong somewhere, we just had to trust that it would look fine in the packaged build.

Because of this, we’ve recently built a pipeline to be able to share engine binaries with the whole team, so that we can all take advantage of the engine changes, without unexpected differences between PIE and packages.

Considerations

Now, our case is pretty specific, and it might not work for you for different reasons. If you can afford P4 + UGS or having your engine binaries version-controlled, those are probably better options. Our needs were the following:

  • Suitable for programmers too, so that they only need to compile the engine source when they want to change the engine itself or debug something low-level. This means:
    – Debug symbols included.
    – All target platforms (Windows, Linux, Switch, etc) built for all configurations (Development, Shipping, etc).
  • Seamlessly updated on the receiving side when there’s a new version.
  • Trivial to maintain, no VPS management, dealing with IT security, etc.
  • No programs that every user has to install manually.
  • Cheap.

On the other hand, there were a few things we were willing to sacrifice (as we can’t have everything all at once, obviously):

  • No need for multiple engine versions; we can keep our changes backwards-compatible, so a single latest version suffices.
  • It’s a very small team, so we don’t really care about the risk of parallel uploads and downloads (although handling that would be a nice-to-have).
  • We don’t mind if the engine is not minimal in size, as we make changes really rarely.
  • It’s fine if it only works on Windows.

I encourage you to consider all your needs before deciding about the best solution for your team, as it really depends on your size / funding / etc.

Solution

So what we’ve ended up doing is storing an installed engine build without any kind of versioning (to make it really cheap) in a central place, and fetching it automatically when somebody syncs to a new changelist.

I’ll mostly focus on the part that is not covered by other resources: distributing the build itself. The Unreal Engine documentation and WisEngineering’s document can be good starting points for understanding the process of building the installed engine itself (and for alternative ways of distribution).

[Screenshot of the linked UE documentation, highlighting the point this post focuses on: “If you make your own engine distribution without using the launcher, it’s up to you to figure out how to distribute and install it.”]

We chose to use cloud object storage. Any S3-compatible bucket works well, but as a rule of thumb I ruled out services with complex pricing calculators (like Amazon or Google) as well as ones with a minimum monthly fee (DigitalOcean, Linode, etc.), which is how I ended up with Cloudflare R2. Its pricing is pretty simple and affordable for our scale.

For reference, our Unreal Engine installed build weighs 47.1 GB and contains ~138,000 files, which means we can upload the whole engine 7 times a month and download it 72 times for free, so in practice we only need to care about storage space. Our monthly invoice is a grand total of $0.60, which is cheap enough.

[Invoice screenshot: “R2 Data Storage”, 49.13 GB, subtotal $0.60]
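If you are wondering where those numbers come from, they fall out of R2’s free tier at the time of writing (1 million Class A operations, 10 million Class B operations and 10 GB of storage per month), assuming roughly one operation per file:

  • Uploads are Class A operations: 1,000,000 / ~138,000 files ≈ 7 full engine uploads per month for free.
  • Downloads are Class B operations: 10,000,000 / ~138,000 files ≈ 72 full engine downloads per month for free.
  • Storage beyond the free 10 GB costs $0.015 per GB-month: (49.13 − 10) × $0.015 ≈ $0.59, hence the $0.60 invoice.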

Of course, if you choose to store the files not one-by-one but compressed (e.g. in a zip), that will be even more compact; the drawback is that even if you only change a single XML in the engine, clients will have to download the whole archive (which we wanted to avoid).

To make checking for new versions near-instantaneous (if there are no engine updates then we want to have a very fast code path), we stored a version file in a separate bucket that we can compare against on the client side; if it matches, then we don’t need to check the timestamp of every file one-by-one (which can take some time). This is of course not a concern if you store a single zip file in your bucket.
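To illustrate, here is a minimal sketch of that fast path in PowerShell, using Rclone (which our script ended up relying on; more on that below). The remote name (r2), bucket name (engine-version) and local path are placeholders for this example, not the names our actual script uses:

# Compare the locally stored version stamp against the one in the version bucket.
$local  = Get-Content 'C:\UnrealEngine\EngineVersion.txt' -ErrorAction SilentlyContinue
$remote = rclone cat 'r2:engine-version/EngineVersion.txt'
if ($local -eq $remote) {
    Write-Host 'Engine is up to date, nothing to download.'
    exit 0
}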

So our workflow on the upload side is the following:

  • The coder creates an installed engine build with a build script written in PowerShell (the process is covered by the linked references; it boils down to a RunUAT.bat BuildGraph ... call with the InstalledEngineBuild.xml script).
  • After UAT has finished, the build script writes an EngineVersion.txt file with the current Unix timestamp as its value (code below).
  • They then start the upload with our cloud script (using its upload command).
  • The upload script installs all needed dependencies for the S3 CLI.
  • It uploads all changed engine files, detected via file hashes (a rough Rclone sketch follows after the snippet below).
  • Then it uploads the written EngineVersion.txt to the version bucket.
Set-Content 'EngineVersion.txt' ([DateTimeOffset]::UtcNow.ToUnixTimeSeconds())
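For the last two steps of the list above, a rough sketch with Rclone (why Rclone and not the AWS CLI is explained further down) could look like this; the remote and bucket names are placeholders, and error handling is kept to a minimum:

# --checksum makes rclone skip files whose hash already matches the remote copy.
rclone sync 'C:\InstalledEngine' 'r2:engine-build' --checksum --transfers 16 --progress
if ($LASTEXITCODE -ne 0) { throw 'Engine upload failed; not publishing a new version.' }

# Only publish the new version stamp once every file has made it up.
rclone copyto 'EngineVersion.txt' 'r2:engine-version/EngineVersion.txt'

Publishing the version stamp last means clients never see a new version number before the files behind it are actually in the bucket.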

And now to the client side:

  • When the repository is updated, a PlasticSCM trigger runs (Plastic is the version control system we use; Git and other VCSs have client-side hooks too) and executes our cloud script (using its download command). The Plastic trigger registration command is below.
  • The download script installs the dependencies.
  • It checks whether the already installed engine version is the same as the one on the server.
  • If not, it downloads the changed files, adds the engine to the registry and installs UE’s prerequisites.
  • Once that succeeds, it writes the version file locally, so it can be compared against the server’s on the next update (a condensed sketch of these steps is shown after the screenshot below).
cm trigger create after-update "Update Engine" "powershell -ExecutionPolicy Bypass @WKSPACE_PATH/Build/Engine/DownloadEngine.ps1" --server=YourServer@cloud
[Screenshot of the sync progress window, showing a list of deleted files and some transfer statistics]
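And here is the promised condensed sketch of the download side. The remote, buckets, engine path and registry entry name are again placeholders rather than our real setup, and note that UE4 builds ship the prerequisite installer as UE4PrereqSetup_x64.exe instead:

# Pull down only the files whose hashes changed.
rclone sync 'r2:engine-build' 'C:\UnrealEngine' --checksum --transfers 16 --progress

# Register the build so the version selector and .uproject files can find it.
New-Item -Path 'HKCU:\Software\Epic Games\Unreal Engine\Builds' -Force | Out-Null
Set-ItemProperty -Path 'HKCU:\Software\Epic Games\Unreal Engine\Builds' `
    -Name 'RapaxEngine' -Value 'C:\UnrealEngine'

# Install the UE prerequisite runtimes (/quiet is assumed here for a silent install).
& 'C:\UnrealEngine\Engine\Extras\Redist\en-us\UEPrereqSetup_x64.exe' /quiet

# Remember which version we now have, so the next trigger run can early-out.
$remote = rclone cat 'r2:engine-version/EngineVersion.txt'
Set-Content 'C:\UnrealEngine\EngineVersion.txt' $remote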

In the cloud script you’ll see that there’s both AWS CLI and Rclone support; that’s because I originally used the AWS CLI, but it doesn’t handle parallel file transfers and hash-checked uploads well, and it also had an annoying bug, so I switched to Rclone instead (but kept the AWS support there just in case).
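For completeness, pointing Rclone at R2 is just a standard S3 remote with Cloudflare’s endpoint; something along these lines in rclone.conf (the remote name, keys and account ID are placeholders):

[r2]
type = s3
provider = Cloudflare
access_key_id = <your R2 access key>
secret_access_key = <your R2 secret key>
endpoint = https://<account-id>.r2.cloudflarestorage.com
acl = private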

And basically that’s it. I apologize for the abhorrent PowerShell code; I hate this language and run away from it as soon as something starts working.

And of course, every team’s workflow is different, so you will likely want some things to work differently; in that case, feel free to take the scripts and modify them to suit your needs (or contribute back via pull requests).

Future work

Avoiding potential download / upload conflicts would be nice to have (although unnecessary for our current scale). One solution for this might be to have a lock file in the version bucket, so downloads / uploads can wait until conflicting processes have stopped running. Cloudflare R2 does not have object locking, but it does have If-Unmodified-Since conditional uploads, so theoretically it should be possible to acquire file-based locks atomically.
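Purely as a thought experiment (we have not tested this), acquiring such a lock could look like the sketch below: read the lock object’s Last-Modified time, then try to overwrite it while asking R2 to reject the write if the object has changed since that time. Rclone’s --header-upload flag can attach the header; whether R2 rejects the write in exactly this way is an assumption, and the timestamp is hard-coded here only for the example (in practice you would read it from the current lock object):

# Thought experiment, untested: try to take the lock, but only if nobody touched it
# since the Last-Modified time we previously read from the lock object.
Set-Content 'engine.lock' $env:COMPUTERNAME
rclone copyto 'engine.lock' 'r2:engine-version/engine.lock' `
    --header-upload 'If-Unmodified-Since: Sat, 17 Feb 2024 10:00:00 GMT'
if ($LASTEXITCODE -ne 0) { Write-Host 'Lock appears to be taken, retrying later...' }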

Other improvements would be a decent-looking update window with legible progress, Linux support, selective update (coder vs artist), and probably many more.
