RUST Server Performance, Hints and Tips

RUST Server Performance

This RUST Server Performance guide was provided by antisoma and LeDieu of EU BEST, with special thanks to Alistair of Facepunch Studios, wulf from OxideMod, and tyran from Rustoria. It is primarily for RUST server owners running large public servers with high player slot counts (100+), where performance becomes increasingly important.

As a server owner – you have a few main tasks:

  1. Minimize server downtime (crashes, unnoticed outages, DDoS attacks)
  2. Develop a player base
  3. Agonize endlessly over whether the server is performing as well as possible and whether the decisions you have made about hardware and settings are optimal.

We will focus here on the final task – server performance.

Initial Considerations for RUST Server Performance

Server Hardware

The vast majority of large and busy RUST servers are running on dedicated servers in data centers. This isn’t because it is impossible to get good performance on shared hosting, but because it’s often unclear exactly what resources are available to your RUST server.

If possible, you want fast dedicated cores for your RUST server. RustDedicated.exe (the server executable) is largely bound to one or two cores – so we want cores to be as fast as possible.

Facepunch has stated that “hard drive speed matters” for both server loading times and performance generally. So choose a server with an SSD if you can.

RUST Performance is Variable

RustDedicated.exe performance is variable from patch to patch. Occasionally Facepunch Studios will concentrate specifically on optimizing server performance. This will markedly improve server performance but it will tend to degrade over time.

Player Numbers

It’s pretty obvious that greater numbers of players require better server hardware and are more likely to expose game engine limitations.

Lots of players building (especially on large bases) and destroying bases requires the server to do a lot of work, and this hurts RustDedicated.exe performance.

Entities and Colliders

The more things (entities [1]) that exist in the world, the greater the impact on the server.

Players create entities by building bases. Others already exist in the world (trees, for example); these increase with map size and vary somewhat depending on the map seed.

As entities (and colliders) increase, server performance decreases. While there doesn’t appear to be a hard limit on the number of entities, you will need to keep the entity count at a level where your server performance is acceptable. More on this below.

[1] An entity is a Facepunch Studios concept: a networked game object that exists in the RUST world. Entities include walls, doors, furnaces, ores, trees, animals, code locks, sleeping bags, and tools – basically all the things you use or place in RUST. Each entity can have one or more colliders (so that collision detection works). There is no hard limit on entities in the game. There is a “collider limit” because of a Unity/PhysX bug, which is why RUST now batches colliders so that the limit isn’t reached.

[2] Most players have no idea that lag will normally be network/connection related – and usually on their end. However, when it’s consistent for all players, you know it’s either a wider (server/data center) network issue or a performance problem.

Network Performance

This guide isn’t about the connection between the player and the server. This can be poor for a variety of reasons resulting in high latency, packet loss, and dropped connections. It is very common for players to have networking problems and server hosts/data centers will also be responsible for this on occasion. Contact your server provider if many/all of your players are affected by high ping or packet loss.

DDoS (Distributed Denial of Service) attacks are a very real concern for server owners. Find a server provider with protection that can meet your needs – they do exist and they are not extortionately expensive.

Modded Servers (Oxide)

Running Oxide by itself doesn’t impact server performance. But you wouldn’t be running Oxide unless you also wanted to run extensions or plugins, and those do have an impact on performance.

Some plugins might even improve performance over time – consider one that increases building decay and helps limit the number of entities in the world. Every plugin has some impact on server performance when it is active, but for most plugins the impact will be vanishingly small.

How do you assess RUST server performance?

Server Responsiveness

When a server is not performing well, actions like opening boxes and doors will take longer than expected, PVP will start to feel inconsistent, placing building parts will not be smooth, chat is delayed, and players shout “lag” [2] in chat. Once you’ve ruled out network issues, it’s necessary to investigate further.

Server FPS

Server FPS is the most obvious and easy way to check server performance. And because of this many server owners place too much emphasis on server FPS. Facepunch Studios has stated multiple times that a server FPS could be limited to 30 and players would not know the difference.

Setting excessively high server FPS will increase CPU usage with no actual performance gain. So that’s important to consider if you are running multiple RUST servers on one dedicated box.

However, we do want servers with consistent FPS that doesn’t drop close to or below 30. If the server is dropping below 30 FPS (or the FPS is very inconsistent), then you will need to make changes to improve performance.

Server FPS is displayed in the bottom right of the server console window. If you don’t have access to the console, you can type “fps” into the remote server console and it will return the value.
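
If you want to experiment with capping the server framerate, the lines below are a minimal sketch of the console input. “fps” on its own is described above; fps.limit (and the value 60) is an assumption based on the 256 cap mentioned in footnote [6] later in this guide, so verify the convar name on your own build:

fps
fps.limit 60

The first line prints the current server framerate; the second caps it. Capping comfortably above 30 leaves headroom for spikes without spending CPU on frames that make no difference to players.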

CPU, Memory, Hard Drive, and Network Usage

Whether running your server on Windows or Linux you should be using standard tools to monitor CPU, memory, hard drive and network usage.

Let’s describe an ideal situation. You are running a single RUST server on a dedicated box. Using tools like Windows Task Manager and Resource Monitor you determine the following:

  • CPU usage is low (never hitting 100% on any core, even with 100 players) and is spread somewhat across more than one core.
  • Memory usage is as expected. Usage will increase with time since the last RUST server restart and with the number of entities and players (for reference: 100 players, 170k entities, 12 hours of uptime).
  • Network usage is well below maximum bandwidth (connections of 250Mbps and above are common) and appears relatively stable, even with 100 players connected.
  • Hard drive usage is low and the disk queue length [3] does not suggest any bottleneck. Also note which files are being written – they include Oxide files.

[3] https://technet.microsoft.com/en-gb/library/cc938625.aspx
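
If you would rather log these counters than watch Task Manager, the built-in typeperf tool can record them over time. This is a sketch only – the counter paths assume an English-language Windows install and that the process instance is named RustDedicated:

typeperf "\Process(RustDedicated)\% Processor Time" "\Memory\Available MBytes" "\PhysicalDisk(_Total)\Avg. Disk Queue Length" -si 5 -o rust_perf.csv

This samples every 5 seconds and writes the results to rust_perf.csv, which makes it much easier to spot patterns around peak player counts.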

If you check and your resource usage is not as described above, then RustDedicated.exe performance may be impacted.

Keep in mind – even if CPU, memory, hard drive and network usage appear okay, RustDedicated.exe can still perform poorly.

Garbage Collects

Time warnings are not reported in the console by default. You may wish to turn them on (global.timewarning 1) for periods of time to assess performance. The most critical information that this will provide is how long garbage collects are taking.
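
For example, you might switch the warnings on for an evening at peak population and off again afterwards – a minimal sketch of the console input:

global.timewarning 1
global.timewarning 0

Run the first line at the start of the observation window and the second when you are done; in between, watch the console output for warnings showing how long each garbage collect is taking.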

Without going into any real detail – the process of locating and freeing up unused memory is known as garbage collection. Garbage collection is critical to controlling the amount of memory being used and to keeping new memory allocation efficient. While garbage collects are (eventually) required, the process is very costly – while a garbage collect is running, the server stalls and players freeze/lag. Longer-lasting garbage collections can delay a number of important RUST server functions.

The length of garbage collects is a good indication of server performance (particularly as it relates to the number of entities on the map). When garbage collects take longer than 2 seconds each, it becomes quite noticeable for players. Some very knowledgeable server owners will wipe when garbage collects take longer than 2 seconds. But that might not fit with your schedule.

Keep in mind that garbage collects also collect Oxide extension and plugin garbage – and so running plugins will (to some extent) increase the length of the collects.

How do we Improve RUST Server Performance?

Hardware and Operating System

  • As mentioned above, make sure each instance of RustDedicated.exe has at least 2 or 3 fast dedicated cores, sufficient RAM, and if possible a dedicated SSD.
  • Some server owners have reported that turning off hyperthreading has improved server performance. However, the performance improvement seems marginal.
  • In Windows – always make sure the power settings are set for “performance”. Also, check these settings after applying Windows updates as they are known to change. It is incredible how many times reports of terrible performance are fixed by correcting these settings (a powercfg sketch follows this list).
  • If you are running on Windows you can experiment with setting RustDedicated.exe affinity to all but core 0. The reasoning is that Windows preferentially uses core 0 (for Windows tasks) and you don’t want RustDedicated.exe primarily using this core as they will be in competition.
  • You can change the priority of RustDedicated.exe in Task Manager to give it preference. Some owners report improved performance with this setting.
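
For the power settings mentioned above, the active plan can be checked and switched from an elevated command prompt. A quick sketch using the built-in powercfg aliases:

powercfg /getactivescheme
powercfg /setactive SCHEME_MIN

SCHEME_MIN is the alias for the High performance plan (“minimize power savings”); re-running the first command after Windows updates is an easy way to confirm the plan hasn’t been silently reverted.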

Both affinity (/AFFINITY FE) and priority (/HIGH) can be set in the batch file [4] used to run RustDedicated.exe; here is an example:

@echo off
cls
:start
echo Starting server...

rem /HIGH raises the process priority; /AFFINITY FE is a hex bitmask (binary 11111110)
rem that allows every logical core except core 0 on an 8-core machine - adjust the
rem mask to suit your own core count.
start /WAIT /HIGH /AFFINITY FE RustDedicated.exe -batchmode -nographics -silent-crashes +server.ip xx.xx.xx.xx +rcon.ip xx.xx.xx.xx +server.port 28015 +rcon.port 28016 +rcon.password "password" +rcon.web "1" +server.identity "SERVER"

echo.
echo Restarting server...
echo.
goto start

[4] Start – contains information on affinity and priority – http://ss64.com/nt/start.html

RUST Server

  • Restarting the server can improve performance. It is common for heavily modded servers to restart daily (or more). Vanilla servers tend to restart much less frequently. You might need to consider restarting your server if memory usage becomes unacceptably high or to improve some undiagnosed performance problem. The necessity of restarts also changes with each build/update. So be flexible in your approach.
  • Every time the server saves, it impacts players. Consider setting the save interval longer (server.saveinterval). It’s also useful to announce saves as they happen (via a plugin or RUSTAdmin) so players understand what’s happening.
  • Animal AI appears to have a large impact on server performance. Many owners will turn off animal AI at peak times (ai.think 0). This does impact gameplay (bears won’t attack for example) but it might be worth the extra performance. RUSTAdmin has the ability to run commands when player numbers reach a particular level. Play around with it to see what works for your server.
  • Use decay and upkeep settings that work for your server and help to control the growth of entities. You can change decay and upkeep settings [5] to make them more or less aggressive (there are lots of Oxide plugins that adjust decay as well). Unused raided bases and random clutter on the map just waste entities and will force a wipe for performance reasons sooner.
  • Find a wipe schedule that maintains acceptable performance and works for your players. This will largely be dictated by the average number of players on the server, the rate they build, and decay settings.
  • Maybe you remember that there used to be a collider limit in RUST? It was 65k in legacy and 260k in early experimental. Basically, when you hit that limit no one could build and a bunch of other stuff broke. Facepunch Studios introduced collider batching to get around that problem. However, at the moment, building on large bases requires colliders to be unbatched, and this causes a considerable performance loss [6]. This is particularly an issue for modded servers, where large bases are more prevalent because of increased gather rates and kits. While there isn’t a solution, there are ways to mitigate the issue. You can leave colliders unbatched until they approach ~260k. To unbatch colliders, include “batching.colliders 0” in your command line or /cfg/server.cfg (the command can be entered directly in the server console, but it will not take effect without a restart). Keep a close eye on the number of colliders (use the command “colliders” in the server console), because you will need to turn collider batching back on and restart before you reach ~260k colliders. Some servers set a rule about maximum base sizes, which also helps to alleviate this problem (and helps control entities generally). A sketch of these settings follows this list.
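
As a rough sketch of how several of the settings from this list might look in /cfg/server.cfg or on the command line (the values are purely illustrative, not recommendations – tune them to your own wipe schedule and player base):

server.saveinterval 600
ai.think 0
decay.scale 2
batching.colliders 0

Remember that batching.colliders 0 only takes effect after a restart, that ai.think 0 is usually toggled at peak times rather than left off permanently, and that decay.scale only applies if the upkeep system is disabled (see footnote [5] below). While colliders are unbatched, run “colliders” in the server console regularly and re-enable batching (with a restart) well before the count approaches ~260k.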

[5] decay.scale controls the cooldown before decay starts and the time it takes for decay to destroy a foundation. decay.scale 1 is the default, and setting it to 2 would effectively double the rate of decay. (This has now been replaced with the upkeep system, but if the upkeep system is disabled it will fall back to the old decay system and this will then apply.)

[6] On a test server, a single player building as quickly as they can on a large base can drop FPS from 256 (set as the max) to below 80 FPS. This is repeatable and considerably worse on busy servers, where it can reduce FPS to below 30.

Oxide

  • We have no complete way of assessing the impact of plugins on server performance; no plugin profiling exists, and it’s unlikely ever to exist.
  • Remember that every plugin has an impact on performance when running. But also remember the impact of most plugins will be extremely small when compared to the total activity of the server. Modded server owners often obsess over which plugins might be causing performance problems – when the cause is probably more fundamental (hardware or just RUST/Unity). Removing a few plugins is unlikely to make any marked improvement to server performance.
  • Some extensions/plugins are more expensive (performance-wise) than others, and it depends on what they are doing. If a plugin is doing a bunch of stuff every tick (OnTick) or is looping through every single entity frequently, it can cause issues. It can also happen if a plugin has created and is using a large data file (stored within /oxide/data) – clearing out large data files is often advised between server wipes (a sketch follows this list).
  • RUSTIO and LustyMap are examples of an extension and a plugin that do a lot. And so they could have an impact on server performance. But most server owners that run either of these would probably accept the impact on performance because they are important to players.
  • It is possible to see the total hook time for each extension/plugin with the command “plugins” in the server console. This isn’t really of great use – it’s not a measure of impact on the server, just the amount of time that a plugin took. But if you have a plugin that has a high total hook time and you know it is doing a lot, then it could be impacting performance.
  • Make use of plugins that will improve server performance – for example, by managing entity growth more effectively (decay plugins) or kicking players with high ping.
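
As mentioned a couple of items above, here is a minimal batch sketch for clearing large Oxide data files between wipes. Every path and file name below is a placeholder – check your own oxide\data folder and the plugins you actually run before deleting anything:

@echo off
rem Back up the Oxide data folder, then remove specific oversized data files.
rem All paths and file names are examples only - adjust for your install.
set OXIDE_DATA=C:\rustserver\oxide\data
set BACKUP_DIR=C:\rustserver\backups\oxide-data

xcopy "%OXIDE_DATA%" "%BACKUP_DIR%\" /E /I /Y
del /Q "%OXIDE_DATA%\ExampleLargePluginData.json"

Keeping the backup means the data is still there if a plugin turns out to need it, while still following the advice above to clear oversized files between wipes.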
