CyberPanel CLI backup not showing in Web GUI

AlmaLinux release 8.8 (Sapphire Caracal)
Current Version: 2.3
Build: 4
Current Commit: d65b6b5bcd4e89ce1a0eeb74131f98093e85bb3a
Latest Version: 2.3
Latest Build: 4
Latest Commit: d65b6b5bcd4e89ce1a0eeb74131f98093e85bb3a

My typical stack is CloudLinux/cPanel/LSWS Enterprise on an EC2 instance on AWS. I use cPanel’s build in backup functionality to do nightly backups to AWS S3. This is a private server for a client, so I am deviating a little with a VPS on VULTR running on AlmaLinux. The CyperPanel version is licensed as LSWS Enterprise bundled with CyberPanel.

Because CyberPanel does not appear to have this functionality built in, I am writing a script to do it. The basic logic of the script being use the CyberPanel CLI createBackup function to create the account backups, then use AWS CLI v2 to sync the /home/user/backup folders to an S3 bucket. The script would cleanup local backups more than 5 days old, and a S3 bucket policy would cleanup remote backups after 90 days.

The first step was planning, and for step two I tested by running this command: cyberpanel createBackup --domainName bamtrail.com at which point everything appeared to run correctly. It didn’t throw any errors, it informed me of the log location it wrote to, and I confirmed the existence of /home/bamtrail.com/backup/backup-bamtrail.com-09.15.2023_21-40-07.tar.gz

However, since I used the CLI call to native CyberPanel tool, I expected to be able to find this backup somewhere in the web GUI for easy restoration, and it doesn’t appear to be there. If I am not going to be able to restore backups from the GUI easily, there is no real reason to maintain the last 5 locally.

Am I misunderstanding what cyberpanel createBackup is suppose to do, or is something wrong here? Thank you.

I found the documentation on this page: CyberPanel Command Line Interface and gave it another read. I am rather confused by it. It seems like a poor design in the CLI functionality.

Why have the backup command save the backups under /home/${WEBSITE}/backup/ when the restore in the web GUI will only ever look under /home/backup? Furthermore, why does the backup command not have an option for saving to a specified path so you could direct it to save to /home/backup? It appears the CLI command was actually written to be incompatible with the Web GUI, as to even make it work you have to follow every backup from the command line with `mv /home/${WEBSITE}/backup/backup-${WEBSITE}-*.tar.gz /home/backup/’ to even get the web GUI to recognize the backup’s existance.

I found an old script I created on another one of my servers to do backups to S3 for a single site and email the results. I reused that and modified it to backup all sites on the server. I am going to put my finalized script here in case it is useful to anyone else. You can use it and/or modify it to fit your needs. I have made the variables generic for posting on the internet without referring to my server setup.

In my personal use, I have an S3 bucket dedicated to each server for backups, so the backups just go to the root of the bucket. If you reuse buckets, you can modify this to sync to a subfolder. I also have Thunderbird email filters that trigger on the presence of “:white_check_mark:” and “- Successful” then tag the message so it is green, and “:no_entry:” and “- Failure” then tag the message so it is red. Then I have good visual indications in the email subject to easily identify backup status emails, and if they succeed or failed. That way every morning I look for either a failed/absence of success versus the presence of success, to ensure the backups completed on my servers ok.

This script runs well both manually, and as a cron job.

#!/bin/bash

#title           : Backup CyberPanel to S3
#description     : Backup CyberPanel websites to AWS S3 Storage.
#author          : John C. Reid <[email protected]>
#date            : 6/21/2022
#last update     : 9/18/2023
#usage           : backup
#prerequisites   : This script has the following prerequisites:
#                : - Required for AWS:
#                :   - S3 Bucket Exists
#                :   - IAM User Exists
#                :   - IAM Bucket Access Policy Set
#                :   - S3 Bucket Retention Policy Set (RetentionPolicy: Expire 90)
#                : - AWS CLI v2 installed and configured to give bucket access
#                : - jq installed
#                : - swaks - Swiss Army Knife for SMTP installed
#notes           : AWS Bucket Policy is set to remove S3 backups after 90 days,
#                : Local backups will be maintained for 5 days for quick restore
#                : from CyberPanel web GUI.
#================================================================================

set -e

AWS_S3_BUCKET="serverHostname-Backups" # Replace with your S3 Bucket name

# Email settings
email_from_address="[email protected]" # Replace with desired From
email_address="[email protected]" # Replace with the To: address
email_subject_success="✅ MTNC Server Backup - Successful"
email_subject_failure="⛔ MTNC Server Backup - Failure"
smtp_server="shastaemail.com" # Replace with your SMTP host
smtp_port="587"
smtp_username="[email protected]" # Replace with your SMTP username
smtp_password="&Zs0yoBDvNfiEdXCKpTBPe" # Replace with your SMTP password
smtp_elho="mtnc.prime42.net" # Replace with your sending server identification

##### End basic config #####
# stop editing here
#================================================================================

PID_FILE=/tmp/cyberpanel_backup_running.pid

# Function to remove PID file
function remove_pid_file {
  if [ -f "$PID_FILE" ]; then
    rm -f "$PID_FILE"
  fi
}

# Set up a trap to remove the PID file upon script exit
trap remove_pid_file EXIT

# prevent multiple backup running at the same time
if [ -f "$PID_FILE" ]; then
    echo "Process is running! Exiting..."
    exit 0
fi
touch $PID_FILE

LIST_WEBSITES=$(cyberpanel listWebsitesJson | jq -r '. | fromjson')

# Function to delete files older than 5 days in the /home/backup folder
function delete_old_files() {
  local folder_path="/home/backup"
  local days=5

  if [ -d "${folder_path}" ]; then
    find "${folder_path}" -type f -mtime +${days} -exec rm -f {} \;
    echo "Deleted files older than ${days} days in ${folder_path}"
  else
    echo "Error: The specified folder does not exist: ${folder_path}"
  fi
}

# Run backup process and capture output
output=$((
for WEBSITE in $(echo "${LIST_WEBSITES}" | jq -r '.[].domain'); do
    echo "Backing up ${WEBSITE}"
    cyberpanel createBackup --domainName ${WEBSITE}

    echo "Moving backup to /home/backup..."
    mv /home/${WEBSITE}/backup/backup-${WEBSITE}-*.tar.gz /home/backup/
done

echo "Uploading to S3..."
aws s3 sync --storage-class=GLACIER_IR --no-progress /home/backup/ s3://$AWS_S3_BUCKET/

echo "Remove old backup..."
delete_old_files
) 2>&1)

# Check if an error occurred and send the appropriate email
if echo "$output" | grep -Eiq "^error|^fatal error"; then
  echo "$output" | swaks -tls --silent 3 --to "$email_address" --from "$email_from_address" --server "$smtp_server" --port "$smtp_port" --auth LOGIN --auth-user "$smtp_username" --auth-password "$smtp_password" --ehlo "$smtp_elho" --header "Subject: $email_subject_failure" --body -
else
  echo "$output" | swaks -tls --silent 3 --to "$email_address" --from "$email_from_address" --server "$smtp_server" --port "$smtp_port" --auth LOGIN --auth-user "$smtp_username" --auth-password "$smtp_password" --ehlo "$smtp_elho" --header "Subject: $email_subject_success" --body -
fi
1 Like

This topic was automatically closed 3 hours after the last reply. New replies are no longer allowed.