NOTE: I have changed some values in my code to hide critical information.

I needed to automate a full backup of a remote cPanel-based server. My constraints were:

I needed to store the backup to a Drobo 5N NAS which does not offer scp or the ability to mount the file system using sshfs.

I didn't want to copy the backup across the network to my local server, then copy it to the Drobo, and then delete it from my local server.

I also wanted to perform most of the tasks from the cloud server, except for moving the backup to my local network.

My first step was to log into cPanel, find the full backup button, click the button, and copy the URL.

I then logged out of cPanel and attempted to go to the URL I had copied. It redirected me to this page:
https://cpanel.mysite.com/login

I copied the source for that login page.

Parsing the source code, I pulled these form fields out (yours may vary):

goto_uri=/frontend/paper_lantern/backup/dofullbackup.html
goto_app=""
user
pass

The first 2 were automatically filled-in hidden fields. The last 2 asked for the username and password in order to proceed to the goto_uri.
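As a sanity check, those hidden fields can be pulled back out of the saved page source with a little grep/sed. The HTML below is a stripped-down stand-in for the real login page, which is far longer:

```shell
# Sketch: extract the hidden input fields from a saved cPanel login page.
# The sample HTML here is illustrative only -- your page source will differ.
cat > /tmp/login.html <<'EOF'
<form action="/login" method="post">
<input type="hidden" name="goto_uri" value="/frontend/paper_lantern/backup/dofullbackup.html">
<input type="hidden" name="goto_app" value="">
<input type="text" name="user">
<input type="password" name="pass">
</form>
EOF

# List the name=value pairs of the hidden inputs.
grep -o '<input type="hidden"[^>]*>' /tmp/login.html |
  sed 's/.*name="\([^"]*\)".*value="\([^"]*\)".*/\1=\2/'
```

Run the same grep/sed pair against your real saved page source to list every hidden name=value pair you need to POST.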

Next I installed sshfs:
Code:
zypper install sshfs
I then created the mount point directory ~/mnt/mysite.

In order to automatically mount the remote server's file system I needed to be able to ssh to the server without a password. Good instructions for that can be found here:
https://www.linuxtrainingacademy.com...thout-password.
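In short, the key setup boils down to something like this (the key file name is my own choice; substitute your user, host, and port):

```shell
# Generate a dedicated key pair. The empty passphrase (-N '') lets cron
# use the key non-interactively.
ssh-keygen -t ed25519 -f ~/.ssh/backup_key -N '' -q

# Copy the public key to the remote server (note the non-standard port).
ssh-copy-id -i ~/.ssh/backup_key.pub -p 87 myuser@mysite.com

# Verify: this should run without prompting for a password.
ssh -i ~/.ssh/backup_key -p 87 myuser@mysite.com true
```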

I tested my mounting with this command (modify as needed):

Code:
# -p 87 because the host does not use the default ssh port.
sshfs myuser@mysite.com:/home/myuser /home/d3vnull/mnt/mysite/ -p 87
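Two related commands worth knowing: a dropped or stale mount can fool a plain directory test, so `mountpoint -q` is a more direct liveness check, and `fusermount -u` unmounts cleanly. A sketch using my paths:

```shell
# Re-mount only if the path is not already a live mount point.
if ! mountpoint -q /home/d3vnull/mnt/mysite/
then
  sshfs myuser@mysite.com:/home/myuser /home/d3vnull/mnt/mysite/ -p 87
fi

# Unmount when finished with the backup.
fusermount -u /home/d3vnull/mnt/mysite/
```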
When performing a full backup, this particular remote server creates the backup-*.tar.gz file at the root of the mounted file system (i.e., the top level of the user's home directory).

Next I connected to the remote server through ssh and created two directories in my home directory:

~/backups
~/bin

Inside of bin I created two scripts.

fullbackup.sh

Code:
#!/bin/bash

# The purpose of cd here is simply because the wget produces a cookie file.
cd /home/myuser/bin

# This performs the full backup: log in via POST and follow goto_uri.
wget -q -O /dev/null --user-agent=Mozilla/5.0 \
  --save-cookies cookies.txt \
  --post-data 'goto_uri=/frontend/paper_lantern/backup/dofullbackup.html&goto_app=""&user=myuser&pass=mypassword' \
  --no-check-certificate https://cpanel.mysite.com/login
I used the cPanel interface to cron fullbackup.sh for midnight.

mvbackup.sh

Code:
#!/bin/bash

home='/home/myuser/'

backups="${home}backups/"

# Check to see if a backup file exists. Counting matches with ls/wc works
# even if more than one backup is present, where [ -f ... ] would not.
if [ `ls -1 ${home}backup-*.tar.gz 2>/dev/null | wc -l` -gt 0 ]
then
  # Move the backup(s) to the backups directory and clean up any cookie
  # file left over from the wget.
  mv ${home}backup-*.tar.gz ${backups}
  rm ${home}bin/cookies.txt
else
  echo 'No backup!'
fi
I used the cPanel interface to cron mvbackup.sh for 2 AM.
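For reference, cPanel's cron UI is just writing crontab entries behind the scenes; the equivalent lines would look roughly like this (paths are my guess, assuming the scripts live in /home/myuser/bin):

```
# m  h  dom mon dow  command
0    0  *   *   *    /home/myuser/bin/fullbackup.sh   # midnight: trigger the full backup
0    2  *   *   *    /home/myuser/bin/mvbackup.sh     # 2 AM: move it into ~/backups
```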

Finally, on my local server I created mysite_backup.sh to pull the backup off the remote server, place it on the Drobo, and recover the space on the remote server.

mysite_backup.sh

Code:
#!/bin/bash
nahome=/home/myuser/
mnt=/home/d3vnull/mnt/mysite/

# Check that the remote server's file system is mounted. If it isn't, mount it.
if [ ! -d ${mnt}backups ]
then
  sshfs myuser@mysite.com:${nahome} ${mnt} -p 87
fi

# Check to see if there are any backups.
cd ${mnt}backups/ || exit 1

# NOTE: This test failed if more than one backup was on the server:
#if [ -f backup-*gz ]
# Used this test instead:
if [ `ls -1 backup-*gz 2>/dev/null | wc -l` -gt 0 ]
then
  # Copy the backup to the Drobo using smbclient.
  smbclient //192.168.200.100/Data -W WORKGROUP -U 'DroboOUser%DroboPassword' -c 'cd \backups\;prompt;mput backup*gz;exit'

  # Remove the backup archives from the remote server.
  rm ${mnt}backups/backup-*gz
fi
A local cron job runs mysite_backup.sh at 6 AM.
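The glob failure noted in the script above is easy to reproduce locally: with two matching files the wildcard expands to two words, so test -f gets too many arguments and errors out, while the ls | wc -l count keeps working.

```shell
# Reproduce the multiple-backup glob problem in a scratch directory.
cd "$(mktemp -d)"
touch backup-1.tar.gz backup-2.tar.gz

# The glob expands to two file names, so test fails with an error;
# with a single backup this same line would have succeeded.
[ -f backup-*gz ] 2>/dev/null || echo "test -f failed with multiple backups"

# Counting matches works no matter how many backups exist.
if [ `ls -1 backup-*gz 2>/dev/null | wc -l` -gt 0 ]
then
  echo "found backups"
fi
```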

Hope this helps someone.