High CPU usage after rcon connection


#1

Hello! I just recently started using Sponge for a new private server I’m hosting, and I quickly hacked together a little backup script for it using mcrcon and rsnapshot.

In what I initially assumed was an unrelated issue, my Sponge server began showing very high CPU usage (up to 100% of two cores on my VPS). However, I noticed that these ramps in CPU activity appeared as distinct steps of about +50% in the usage graph every two hours, which is when cron runs the backup command for the most frequent snapshot series.

This led me to believe the problem is caused by my backup script's rcon session connecting and then immediately disconnecting, since the behaviour does not occur when I enter the same commands manually.

I am running SpongeVanilla 1.12.2-7.1.0-BETA-59 using Ubuntu 18.04 on a 4GB Linode. The following is the script responsible for backup:

#!/usr/bin/env python3
import subprocess
import sys
import mcrcon

snap = sys.argv[1]  # rsnapshot interval name to run

rcon = mcrcon.MCRcon()
rcon.connect('localhost', 25575, 'rcon')
try:
    # Pause autosaving and flush the world to disk before snapshotting
    rcon.command('save-off')
    rcon.command('save-all')
    subprocess.check_call(['rsnapshot', snap])
finally:
    # Re-enable autosaving even if rsnapshot fails
    rcon.command('save-on')
    rcon.disconnect()

(don’t worry about my trivial rcon password, iptables drops all non-local traffic sent to 25575)
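For context, the script is driven by cron; the entry for the most frequent series looks roughly like this (a sketch — the script path and the interval name alpha are placeholders, not my exact crontab):

```
# /etc/crontab — run the backup script every two hours
0 */2 * * * root /usr/local/bin/mc-backup.py alpha
```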


#2

I am also having this problem with some sort of rcon interaction from localhost. I’m running CentOS 7 with spongeforge-1.12.2-2705-7.1.0-BETA-3361, and I’ve noticed significant CPU usage after an rcon attempt from localhost. I believe the connections come from netdata’s spigotmc chart, but I’m still a Linux newb, so I’m having a difficult time uh… disabling it. Anyway, I figured I’d post about it here to perhaps troubleshoot further.
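In case it helps anyone else trying the same thing: netdata’s SpigotMC collector is part of its python.d plugin, so it should be possible to switch off just that module rather than all of netdata (a sketch, assuming a default netdata install and that the module name is `spigotmc` as in netdata’s docs):

```
# /etc/netdata/python.d.conf — disable the spigotmc collector,
# which polls the Minecraft server over rcon
spigotmc: no
```

After editing, restarting the netdata service should stop the rcon polling.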

I also notice in my debug log that some mixin-related messages are thrown during the rcon attempt.


#3

I do believe this issue was fixed recently. Try updating SpongeForge.


#4

Yep, this issue and the subsequent commit look like they should fix it, thanks tux.

However, since 1.12.2-2705-7.1.0-BETA-3361 is currently the recommended stable version, I’ll leave this here for others.