I had the same problem and almost lost my job because of it; luckily I had a backup. It happened on a dedicated server I manage. I will never use CyberPanel again.
It was reported by users back then, around August or September, but the developers were unable to reproduce the bug because it doesn’t happen with all setups. Later I learned from CyberPanel users that the bug triggers during a restore operation when the server is low on storage.
It was fixed immediately after identifying the issue.
You need to update CyberPanel to get those bug fixes.
With a 2 GB website, more than 10 GB of memory is more than enough, so memory is not the cause this time.
The VPS has used the latest version of CyberPanel since it was released; I have a server for testing new versions, so updates are always prioritized.
When was the last time you ran the upgrade command? CyberPanel minor updates don’t change the version number. If you want to check this in detail, look at the CyberPanel branches on GitHub.
Just run a CyberPanel upgrade using the upgrade command provided here.
Secondly, this issue occurred around August–September, and it happens on servers with little remaining storage when restoring a backup. It was fixed immediately, and so far nobody has reported any file-deletion issues.
If you still have such issues even after upgrading, please share the relevant log files so the developers can see why it happened and fix it.
If the updates are minor, I haven’t updated yet; I only update when the version number on versionManagement changes. I am currently creating a backup of the VPS before running the upgrade command. I will report back if a similar problem occurs.
As I figured out, a backup needs at least twice the size of the website: in the first step, the backup script copies the entire website to a subfolder in the backup directory. After that, all associated databases are dumped into the same folder, along with some more information, so we have twice the size. In the last step, everything is packed with tar.gz.
Maybe a good item for the wish list would be a check of free disk space against the size of the website’s directory: if double that size is more than the free space, the backup should quit before it does anything.
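That check could be sketched in a few lines of shell. This is only a rough illustration of the idea, not CyberPanel’s actual code, and the paths in the example are hypothetical:

```shell
#!/bin/sh
# Sketch of a pre-backup free-space check, as suggested above.
# Quits before the backup touches anything if the destination
# does not have at least twice the website directory's size free.

check_space() {
    site_dir="$1"    # website home directory
    backup_dir="$2"  # filesystem the backup archive will be written to

    site_kb=$(du -sk "$site_dir" | awk '{print $1}')
    free_kb=$(df -Pk "$backup_dir" | awk 'NR==2 {print $4}')

    if [ "$free_kb" -lt $((site_kb * 2)) ]; then
        echo "Not enough space: need $((site_kb * 2)) KB free, have $free_kb KB" >&2
        return 1
    fi
}

# Example with hypothetical paths -- adjust to your setup:
# check_space /home/example.com /home/backup || exit 1
```

Running this at the very start of the backup script would stop it cleanly instead of letting a restore or backup fill the disk halfway through.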
My “solution” to prevent losing data is to regularly move the backup file out of the website’s home directory to the global /home/backup.
And I take snapshots at regular intervals (CyberPanel is running on a virtual root server).
And I managed to install the “Synology Backup Agent”. That’s my tip for all owners of a Synology NAS: the built-in backup tool called “Active Backup for Business” is a perfect tool for me. After installing its agent on your machine, you can run automated, regular incremental backups. I do it every hour. And best of all: there is a client version for Linux servers.