landS Posted August 31, 2017
5 hours ago, Helmonder said: So what you are saying is that whenever I delete a file on my main system it will be moved to sync/archive on my secondary system? That actually sounds great... I only have to monitor that archive dir for accidental deletions...
Erm... how about the Recycle Bin plugin, plus a nightly cron rsync job to move the deletion folders?
Helmonder Posted August 31, 2017
3 hours ago, landS said: Erm... how about the Recycle Bin plugin, plus a nightly cron rsync job to move the deletion folders?
That would work too... There are a few ways to recreate the CrashPlan experience; it's just that CrashPlan had it all in one nice package. 1
jimsefton Posted August 31, 2017
1 minute ago, Helmonder said: That would work too... There are a few ways to recreate the CrashPlan experience; it's just that CrashPlan had it all in one nice package.
Yeah, and CrashPlan seemed pretty lightweight compared to most other options.
landS Posted August 31, 2017
I too am not thrilled about the loss of the 'Home' features... Gotta make do, sadly.
jimsefton Posted August 31, 2017 (edited)
Seems like it was a pretty valuable differentiating feature for CrashPlan. I wonder if it is an opportunity for one of the competitors to introduce it. Not expecting it to necessarily be free, but I don't necessarily need/want all my data in the cloud.
Edited August 31, 2017 by jimsefton
Helmonder Posted August 31, 2017
4 hours ago, jimsefton said: Seems like it was a pretty valuable differentiating feature for CrashPlan. I wonder if it is an opportunity for one of the competitors to introduce it. Not expecting it to necessarily be free, but I don't necessarily need/want all my data in the cloud.
Agree on that... I would pay for that function.
denishay Posted August 31, 2017
Helmonder said: Agree on that... I would pay for that function.
Same here. Would gladly pay to have that back.
Shadowrunner Posted September 4, 2017
Just migrated to the Pro version; the docker updated OK. Also, having fully expected to re-upload my 17TB, the whole lot migrated over without re-uploading a single byte. I've confirmed with CrashPlan support that this is correct, as I didn't want any nasty surprises if something had gone wrong. This might not be the case for everybody, but it looks like the 5TB migration limit isn't that rigid.
SR 1
mbc0 Posted September 4, 2017
2 hours ago, Shadowrunner said: Just migrated to the Pro version; the docker updated OK. Also, having fully expected to re-upload my 17TB, the whole lot migrated over without re-uploading a single byte. I've confirmed with CrashPlan support that this is correct, as I didn't want any nasty surprises if something had gone wrong. This might not be the case for everybody, but it looks like the 5TB migration limit isn't that rigid. SR
I can confirm this as well with a 7.5TB backup. My docker has updated to Pro and all my files have been scanned; it shows 7.5TB complete, but it cannot back up any more because it says the subscription expired. I upgraded to Small Business and can log in to Code42, and all seems well there. I restarted the docker but it is still the same. Any ideas?
Shadowrunner Posted September 4, 2017
It did that to me too. CP support told me it was just because the migration hadn't completed yet. I left it overnight and all was fine the next day. I'd try leaving it for a bit and see if it comes back. If not, or if it's already been a while, I'd log a ticket with support. They've been pretty good in the past when I've needed them, even with an unsupported headless installation.
SR
t33j4y Posted September 6, 2017
I have now migrated my subscription to Small Business and moved to the CrashPlan PRO docker by jlesage. What caused the most grief was that the subscription got stuck in a 30-day trial instead of the migration subscription, so I had to get CP support to intervene. (For those who need to get in touch with them: use the chat function, as email is hopeless.) Want to say a BIG thank you to gfjardim for the docker that I have been using since I don't know when. Had it not been for this change by CP, I would have stayed with it. :-)
FreeMan Posted September 14, 2017
I haven't upgraded to Pro (or figured out a different solution) yet. I came home to this tonight: I know it didn't have that message yesterday. Has anyone else seen anything like this?
dadarara Posted September 26, 2017
Please advise: how do I upgrade the 4.8.0 version in the docker to the 4.8.3 Home version? I have downloaded the CrashPlan_4.8.3_Linux.tgz file and can copy it into the docker via SSH. Can I simply uninstall the old version and install the new one? Will I lose all the configuration? Will I need to adopt backups, etc.?
The steps I have are:
1. Uncompress the TGZ file to Downloads.
2. Open Terminal and enter: cd ~/Downloads/crashplan-install
3. Then enter: sudo ./uninstall.sh -i /usr/local/crashplan
4. At the prompt, type YES to uninstall and press Enter.
5. To install the latest copy, make sure you are still in the crashplan-install folder, then enter: sudo ./install.sh
My subscription expires in one year, so I am staying with what I have for now. But recently CP only synchronizes the data to the cloud without actually backing up; it has been 6 days without a backup and I am getting notifications about it. CP support advised upgrading to the 4.8.3 version.
Thanks
Djoss Posted September 26, 2017
CrashPlan in this container should update itself automatically. You may look at the history and/or logs to see why it's not happening.
dadarara Posted September 27, 2017 (edited)
Do people have version 4.8.3 installed (those who did not move to Business)? I don't see anything in the history tab, though I might have missed it; there are no RED error lines, and it's hard to see. I can't copy the history message list. Which log files can I look at? In any case, can I update manually?
Edited September 27, 2017 by dadarara
Leifgg Posted September 27, 2017 (edited)
This container will auto update (or actually, the app will). Mine was in June. Check history.log in the log folder:
I 06/15/17 12:32PM Downloading a new version of CrashPlan.
I 06/15/17 12:32PM Download of upgrade complete - version 1436674800483.
I 06/15/17 12:32PM CrashPlan has downloaded an update and will restart momentarily to apply the update.
I 06/15/17 12:33PM Installing upgrade - version 1436674800483
I 06/15/17 12:33PM Upgrade installed - version 1436674800483
I 06/15/17 12:33PM CrashPlan started, version 4.8.3
Edited September 27, 2017 by Leifgg
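If you want to check your own container the same way, a quick filter over history.log does it. The log location depends on where your container maps its /config folder, so the sketch below writes a sample log into a temporary file that stands in for the real one:

```shell
# Grep history.log for upgrade activity. On a real system, point
# LOG at the container's log folder (the path depends on your
# /config mapping); a temporary sample file stands in for it here.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
I 06/15/17 12:32PM Downloading a new version of CrashPlan.
I 06/15/17 12:33PM Upgrade installed - version 1436674800483
I 06/15/17 12:33PM CrashPlan started, version 4.8.3
EOF

# Show only the lines that mention an upgrade or a version change.
grep -i -E 'upgrade|version' "$LOG"
```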
jaj08 Posted September 27, 2017
I just noticed I am running 4.8.0 as well... I saw that my environment auto-upgraded to 4.8.3 in July, but for some reason it now shows that it's running 4.8.0. No clue when or what would have caused the downgrade. No new errors or auto-upgrade attempts have been made since.
Leifgg Posted September 27, 2017
My guess is that you have forced an update of the container and that version 4.8.0 is the one embedded in the container (one more guess…). In my case the CrashPlan app notices the old version and updates automatically; I have no idea why yours doesn't behave the same. Sorry...
jaj08 Posted September 27, 2017
Yeah, I update the containers whenever prompted. I manage a total of 4 unRAID + CrashPlan environments. Looks like 3 of the 4 are stuck on 4.8.0.
johmei Posted October 15, 2017 (edited)
On 9/4/2017 at 3:56 AM, Shadowrunner said: Just migrated to the Pro version; the docker updated OK. Also, having fully expected to re-upload my 17TB, the whole lot migrated over without re-uploading a single byte. I've confirmed with CrashPlan support that this is correct, as I didn't want any nasty surprises if something had gone wrong. This might not be the case for everybody, but it looks like the 5TB migration limit isn't that rigid. SR
Can anyone else please confirm that it did not delete their archive over 5TB? Because I'm getting some pretty solid, very clear "Your data WILL be deleted" messages and warnings. It's not saying it might; it's saying it WILL. Did you get these warnings when you migrated, Shadowrunner? I'm trying to decide if I should delete enough data to bring it below 5TB or risk having to re-upload it all. Granted, it's not near the size of yours, but I would still rather not have to upload over 2TB of data if possible. Also, something that's confusing me: the migration page reports my backup as 5.8TB, while the CrashPlan app (I'm using gfjardim's) says 7.3TB of 7.7TB completed. Anyone have any ideas about that? (Just checked my last backup report email and it says 7.6TB... so confused.)
Edited October 15, 2017 by johmei
mbc0 Posted October 19, 2017 (edited)
Has CrashPlan done something to prevent the fast upload we could achieve by editing my.service.xml? I used to get 18mb/s upload after changing dataDeDupAutoMaxFileSizeForWan from 0 to 1, but I am now getting 1mb/s upload on CrashPlan PRO, which will take 3 years to complete instead of a month! I also notice the <dataDeDupAutoMaxFileSize>1073741824</dataDeDupAutoMaxFileSize> line, which I do not remember. Can anyone here still take advantage of higher upload speeds with CrashPlan PRO like they did on CrashPlan Home? My settings:
<dataDeDupAutoMaxFileSize>1073741824</dataDeDupAutoMaxFileSize>
<dataDeDupAutoMaxFileSizeForWan>1</dataDeDupAutoMaxFileSizeForWan>
<dataDeDuplication>MINIMAL</dataDeDuplication>
Edited October 19, 2017 by mbc0
Helmonder Posted October 20, 2017
My CrashPlan is not backing up anymore... Seems to have happened since the latest update, although it might also be a coincidence. I have traced the issue to not enough inotify watches, as described in this post: https://support.code42.com/CrashPlan/4/Troubleshooting/Linux_real-time_file_watching_errors
I want to make the fix, and for that I of course have to log into the CrashPlan docker. However, when I give the command: docker exec -it CrashPlanPRO bash
it does not work and I get the following error message:
rpc error: code = Unknown desc = oci runtime error: exec failed: container_linux.go:262: starting container process caused "exec: \"bash\": executable file not found in $PATH"
The same command works fine with my other dockers.
Djoss Posted October 20, 2017
Is your question about gfjardim's container (it looks like that's not the one you are using)? BTW, the inotify limit must be increased on the host, not in containers...
Leifgg Posted October 20, 2017
On 10/19/2017 at 10:47 AM, mbc0 said: Has CrashPlan done something to prevent the fast upload we could achieve by editing my.service.xml? I used to get 18mb/s upload after changing dataDeDupAutoMaxFileSizeForWan from 0 to 1, but I am now getting 1mb/s upload on CrashPlan PRO, which will take 3 years to complete instead of a month! Can anyone here still take advantage of higher upload speeds with CrashPlan PRO like they did on CrashPlan Home?
Yes, you can edit the XML file in CrashPlan PRO in the same way.
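As a concrete sketch of that edit: the value can be flipped with sed while the CrashPlan engine is stopped (otherwise it may rewrite the file on exit). The conf path varies per container, so the demo below performs the edit on a stand-in copy of my.service.xml rather than a real install:

```shell
# Flip dataDeDupAutoMaxFileSizeForWan from 0 to 1 in my.service.xml.
# Stop the CrashPlan engine before editing, or it can overwrite the
# change. On a real system, XML should point at the container's conf
# directory (location depends on your /config mapping); a stand-in
# copy of the file is used here so the edit can be tried safely.
XML=$(mktemp)
cat > "$XML" <<'EOF'
<dataDeDupAutoMaxFileSize>1073741824</dataDeDupAutoMaxFileSize>
<dataDeDupAutoMaxFileSizeForWan>0</dataDeDupAutoMaxFileSizeForWan>
<dataDeDuplication>MINIMAL</dataDeDuplication>
EOF

# In-place substitution; the surrounding tags keep the match exact.
sed -i 's|<dataDeDupAutoMaxFileSizeForWan>0<|<dataDeDupAutoMaxFileSizeForWan>1<|' "$XML"
grep dataDeDupAutoMaxFileSizeForWan "$XML"
```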
Helmonder Posted October 20, 2017 (edited)
1 hour ago, Djoss said: Is your question about gfjardim's container (it looks like that's not the one you are using)? BTW, the inotify limit must be increased on the host, not in containers...
I am using: jlesage/crashplan-pro
But hey, if it's on the host then I can do that right away, thanks!!
EDIT: Just did that; it did not immediately work, and CrashPlan also kept restarting, so I also increased the memory to 8GB (it had been working on 4GB so far, and I have backed up 14 terabytes of files to CrashPlan without a problem). CrashPlan's advice is to have 1GB per terabyte, but that is based on the average number of files on a system, not on media storage (which typically is a lot of data but not that many files / large files). If the memory usage extrapolates in the same way, I should be fine until I hit 35TB (the already-backed-up set also includes my music library, which is a lot of files again).
Edited October 20, 2017 by Helmonder
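For reference, the host-side inotify fix amounts to a sysctl change. The sketch below reads the current limit (via /proc, which works without extra tools) and shows the write commands commented out; the value 1048576 is illustrative, not a Code42 recommendation, and the /boot/config/go line is the usual unRAID way to persist a setting across reboots:

```shell
# Read the host's current inotify watch limit. CrashPlan's
# real-time file watcher fails when the backup set needs more
# watches than this allows.
CUR=$(cat /proc/sys/fs/inotify/max_user_watches)
echo "current fs.inotify.max_user_watches: $CUR"

# To raise it (run as root; 1048576 is an illustrative value):
#   sysctl -w fs.inotify.max_user_watches=1048576
# On unRAID, persist it across reboots by adding that same line
# to /boot/config/go.
```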