[Plugin] CA Appdata Backup / Restore - Deprecated


Squid

Recommended Posts

One could only be so lucky for so long. :(

 

My server just froze again. This time it must have frozen during the backup as I did not get the email that it finished.

 

I logged in to the server physically, which went slowly but worked.

I ran top just to see if something was hogging any resources but everything was looking normal.

Man how I regret not checking the log file first...

I instead ran diagnostics and nothing happened. I gave up after waiting for 30 minutes and force rebooted it again.

 

I guess my system is haunted :/

Link to comment

I love this app; thank you, @Squid. Many times in the past I've wished I had set this up to run automatically. It would have saved me a lot of grief!

 

The script works great, and I could use some advice on the best way to make it more efficient. As has been noted many times already in this thread, metadata folders from Plex, Sonarr, Radarr, etc. contain a lot of small files. Working with or storing such a large number of small files is inefficient, both for disk operations and for file-system block usage. From reading this thread, there seem to be a few ways to solve it:

  1. Add a feature to tar or compress the backup. This seems like the right solution, but the feature doesn't exist today.

  2. Exclude the metadata sub-directories that cause this inefficiency from the backup. This isn't much of a solution IMO, since that data then isn't backed up at all. One could argue whether it's even worth backing up in the first place, given it can easily be re-downloaded.

  3. Don't use the dated-backup option. This would allow rsync to compare much more quickly and replace only the files that have changed. All this media metadata is pretty static, so this may be the best option to pick?

What's everyone's thoughts?

 

Link to comment

I've been suffering from random server crashes for the past month or two and have just made the connection that it happens at the same time every week, when CA Backup is running. I've carried out some of the suggestions in here, and we'll see what happens next Monday morning at 5am :)

Link to comment

Got an error on last night's backup:


Event: Community Applications
Subject: appData Backup
Description: Backup of appData complete - Errors occurred
Importance: warning

Error in file I/O

 

Not sure what this means. I wanted to attach diagnostics but can't connect to my home network. My guess is that the OpenVPN docker didn't get restarted. Looks like Plex is also still shut down. Can't check the rest remotely.

Link to comment
2 hours ago, wgstarks said:

Got an error on last night's backup:

 


Event: Community Applications
Subject: appData Backup
Description: Backup of appData complete - Errors occurred
Importance: warning

Error in file I/O

 

 

Not sure what this means. I wanted to attach diagnostics but can't connect to my home network. My guess is that the OpenVPN docker didn't get restarted. Looks like Plex is also still shut down. Can't check the rest remotely.

The errors that occurred should be logged.  It should have still restarted the containers though... O.o

Link to comment
21 hours ago, Lev said:

Do not use the dated back-up option. This would allow rsync to much more quickly compare and only replace files that have changed. All this media metadata is pretty static. So this may be the best option to pick?

You can use dated backups and still copy only changed files. Use "attempt faster rsync" in the settings. The first couple of backups will have to copy everything, but once a backup is slated to be deleted due to its age, it winds up being updated with the changed files instead.

 

Personally, I don't bother with dated backups though.

Link to comment

For those suffering from lockups, etc. after the backup is completed, during the removal stage of old dated backup sets: I am currently testing (in between my work's unreasonable demands upon my time) a slightly different way of starting the deletion process.

 

I still have never been able to replicate the issue, but I have come up with a theory as to what might be going on (and why the lockups don't occur when manually removing the directories via the command line using an identical command).

 

Link to comment
10 hours ago, Squid said:

The errors that occurred should be logged.  It should have still restarted the containers though... O.o

Found that about half my dockers were stopped (Jackett, NowShowing, OpenVPN-AS and PlexMediaServer) and wouldn't start. Couldn't get diagnostics: I was able to open the webGUI, but when I tried to download diagnostics I got a 404 error. I connected via telnet, ran diagnostics, and got a warning that the file couldn't be written because there was no space left on the disk, even though I have 15 GB free on flash.

 

After rebooting, everything seems to be OK. The only thing I see that is unusual is that the docker log is at 17%. Seems a little high since the server has only been up about 5 minutes, but I'm not sure how to check for log spamming. I've attached the current diagnostics; it might show something. Never seen anything like this before. Maybe it's Gremlins??? :S

 

brunnhilde-diagnostics-20170731-1715.zip

 

 

 

Edit: Looks like maybe this was caused by a dropped connection to the UD drive I use for the backups, connected via USB. I noticed it wasn't shown in Main and cycled the power to the UD drive to get it mounted again. I checked the CA Backup settings and saw that the destination was set to RecycleBin. After reconfiguring the settings, backups seem to be running OK now.

Edited by wgstarks
Link to comment
15 hours ago, Squid said:

For those suffering from lockups etc after the backup is completed, during the removal stage of old dated backup sets, I am currently testing (in between my work's unreasonable demands upon my time) a slightly different way of starting the deletion process.

 

I still have never been able to replicate the issue, but I have come up with a theory as to what might be going on (and why the lockups don't occur when manually removing the directories via the command line using an identical command).

 

 

After my last post I attempted to delete my old backup folders via Midnight Commander, and again my server locked up while removing a random file amongst my Plex metadata folders.

 

I'm looking to get rid of my 20+ dated backups and go non-dated, but I'm now at a bit of a loss as to how to get these old backups deleted.

Edited by NeoDude
Link to comment
17 hours ago, NeoDude said:

OK, so I managed to get my backups deleted by using "rm -rf *" at the CLI. Strange that it works here but causes a hang in MC. Doesn't MC use exactly the same command?

Yes.  And the exact same command the plugin uses...

Link to comment

Something I did notice was that the share the plugin was backing up to didn't actually exist. What I mean is that the folder was physically there when looking via Windows Explorer and within the plugin, but it was not listed under Shares in the Unraid GUI.

 

Also, you say the same command is used with MC, but when I delete via MC it looks like a "Directory Scan" is carried out first before the delete process starts. When running rm from the CLI I've noticed there is no "Directory Scan"; it just starts blitzing through the files.

Edited by NeoDude
Link to comment
On 8/1/2017 at 10:49 AM, Kewjoe said:

Is there a particular format needed for the Exclude Folders section?

 

If I typed "Appdata\EmbyServer\metadata", would that work? And if I want more than one folder, do I separate them with commas?

 

Anyone know an answer to above? :D

Link to comment
On 8/1/2017 at 10:49 AM, Kewjoe said:

Is there a particular format needed for the Exclude Folders section?

 

If I typed "Appdata\EmbyServer\metadata", would that work? And if I want more than one folder, do I separate them with commas?

You would want to use EmbyServer/metadata assuming that the folder that you're backing up is Appdata

Link to comment
  • 2 weeks later...

Sorry to be a total n00b here, but I'm looking to run the backup on the 1st Sunday of every month at 5am. I understand that this requires a custom cron.

I thought I'd use one of the many different cron syntax generators online, but they all seem to give different outputs! Worse, the many different validators also give differing opinions on the validity of the cron I entered. 

So I thought I'd take a look on the forums to see if anyone recommended a particular cron syntax generator, but I have found none. 

So here I am. Can anyone confirm if my cron syntax is correct?

0 0 5 ? * 0#1 *

Again, I'd like it to run at 5am on the 1st Sunday of every month.

Many thanks.

Link to comment

Hi Guys,

 

I've just started using CA Appdata Backup, but it's disappointing to check the dockers and VMs in the morning and find that they were either shut down improperly or no longer running the tasks set up the night before, because of the shutdown requirements of CA Appdata Backup.

 

Does anyone know of a way to perform a scheduled "open" file backup of the appdata folder? (I know there's a way to exclude some dockers from the backup, but that's not the same as an "open" file backup.) If not, this would be a cool feature to implement some day.

 

Thanks guys for any help that is offered.

Edited by Joseph
Link to comment
49 minutes ago, Joseph said:

Hi Guys,

 

I've just started using CA Appdata Backup, but it's disappointing to check the dockers and VMs in the morning and find that they were either shut down improperly or no longer running the tasks set up the night before, because of the shutdown requirements of CA Appdata Backup.

 

Does anyone know of a way to perform an scheduled "open" file backup on the appdata folder? (I know there's a way to exclude some dockers from the backup, but that's not the same as an "open" file backup) If not, this would be a cool feature to implement some day.

 

Thanks guys for any help that is offered.

For the improper shutdown, increase the shutdown time before the container is forcibly killed. This is a Docker thing: stopping a running container will always try to do it cleanly, with an adjustable timeout before it's killed.
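For reference, `docker stop -t <seconds>` is the knob here: Docker sends SIGTERM, waits up to the timeout, then SIGKILLs. The same sequence can be sketched in plain shell (a hypothetical illustration, not what the plugin runs):

```shell
# Mimic docker's stop sequence: SIGTERM, grace period, then SIGKILL.
graceful_stop() {
  local pid=$1 grace=$2
  kill -TERM "$pid" 2>/dev/null || return 0   # already gone
  for _ in $(seq "$grace"); do
    kill -0 "$pid" 2>/dev/null || return 0    # exited cleanly in time
    sleep 1
  done
  kill -KILL "$pid" 2>/dev/null               # timed out: force-kill
}
```

For a container, `docker stop -t 120 <container>` raises the default 10-second grace period the same way.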

 

If it's doing some sort of task, then take your chances on keeping things running.

 

"Open" file backup.  Keep everything running.  But then you also run the chance that the backup is incorrect because one file is in the middle of being written to, and another related file hasn't be written yet, etc etc.  Open files on backups will always possibly result in indeterminate states on an ultimate restore that may or may not render the application non-functional.

 

Up to you.

Link to comment
On 8/15/2017 at 2:57 PM, jademonkee said:

Sorry to be a total n00b here, but I'm looking to run the backup on the 1st Sunday of every month at 5am. I understand that this requires a custom cron.

I thought I'd use one of the many different cron syntax generators online, but they all seem to give different outputs! Worse, the many different validators also give differing opinions on the validity of the cron I entered. 

So I thought I'd take a look on the forums to see if anyone recommended a particular cron syntax generator, but I have found none. 

So here I am. Can anyone confirm if my cron syntax is correct?


0 0 5 ? * 0#1 *

Again, I'd like it to run at 5am on the 1st Sunday of every month.

Many thanks.

 

I've tried to look into this and honestly I can't find a solution without running some kind of custom scripting. 

Link to comment
3 hours ago, kizer said:

 

I've tried to look into this and honestly I can't find a solution without running some kind of custom scripting. 

Try this as a custom cron entry

0 5 * * 0 [[ $(date +%e) -le 7 ]] && 

Should run the backup first Sunday of every month @ 5am

 

In a nutshell, that cron entry fires every Sunday, but it also includes a test that checks whether the day of the month is <= 7, and only then executes the script I'm supplying.

 

Obviously not tested, since it's not the first week of August.
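The guard in that cron entry can be tried on its own: `date +%e` prints the day of the month, and the first Sunday of any month always falls on day 1 through 7. A standalone sketch (the echo stands in for the real backup command):

```shell
# In cron this guard sits after "0 5 * * 0", so it only ever runs on a
# Sunday; the day-of-month test then limits it to the first Sunday.
if [[ $(date +%e) -le 7 ]]; then
  echo "first-week window -- run the backup"
fi
```

Run standalone (outside cron), the test passes on any day in the first week, not only Sundays; cron's day-of-week field supplies the Sunday restriction.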

Edited by Squid
Link to comment
