[CONTAINER] CrashPlan & CrashPlan-Desktop


Recommended Posts

5 hours ago, Helmonder said:

 

So what you are saying is that whenever I delete a file on my main system it will be moved to sync/archive on my secondary system? That actually sounds great... I only have to monitor that archive dir for accidental deletions...

Erm... how about the Recycle Bin plugin... and a cron rsync job to move the deletion folders nightly?
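A minimal sketch of that cron/rsync idea. The SRC/DEST paths are placeholders, not the Recycle Bin plugin's actual layout — adjust them to your own shares:

```shell
#!/bin/sh
# Nightly sweep: move everything the recycle bin collected into an archive share.
# SRC and DEST are placeholder paths - point them at your real shares.
SRC="${SRC:-/mnt/user/.Recycle.Bin/}"
DEST="${DEST:-/mnt/user/archive/deleted/}"

if [ -d "$SRC" ]; then
    mkdir -p "$DEST"
    # Copy files into the archive, then delete the source copies.
    rsync -a --remove-source-files "$SRC" "$DEST"
    # Prune the empty directories rsync leaves behind.
    find "$SRC" -mindepth 1 -type d -empty -delete
fi
```

Scheduled from a nightly cron entry, e.g. `0 3 * * * /boot/custom/sweep-deleted.sh` (on unRAID the User Scripts plugin is the usual way to schedule this).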

Link to comment

Seems like it was a pretty valuable differentiating feature for CrashPlan. I wonder if it is an opportunity for one of the competitors to introduce it. I'm not expecting it to necessarily be free, but I don't necessarily need/want all my data in the cloud.

Edited by jimsefton
Link to comment
4 hours ago, jimsefton said:

Seems like it was a pretty valuable differentiating feature for CrashPlan. I wonder if it is an opportunity for one of the competitors to introduce it. I'm not expecting it to necessarily be free, but I don't necessarily need/want all my data in the cloud.

 

Agree on that... I would pay for that function..

Link to comment

Just migrated to the Pro version, docker updated ok. 

 

Also - Having fully expected to re-upload my 17TB, the whole lot migrated over without having to re-upload a single byte. I've confirmed this is correct with Crashplan support as I didn't want any nasty surprises if something had gone wrong.

 

This might not be the case for everybody, but it looks like the 5TB migration limit isn't that rigid.

 

SR

  • Upvote 1
Link to comment
2 hours ago, Shadowrunner said:

Just migrated to the Pro version, docker updated ok. 

 

Also - Having fully expected to re-upload my 17TB, the whole lot migrated over without having to re-upload a single byte. I've confirmed this is correct with Crashplan support as I didn't want any nasty surprises if something had gone wrong.

 

This might not be the case for everybody, but it looks like the 5TB migration limit isn't that rigid.

 

SR

 

I can confirm this as well with a 7.5TB backup. Although my docker has updated to Pro and all my files have been scanned, it shows 7.5TB complete but cannot back up any more, saying the subscription has expired. I upgraded to Small Business and can log in to Code42, and all seems well there. I restarted the docker but it's still the same. Any ideas?

 

 

Capture.JPG

Link to comment

It did that to me too. CP support told me it was just because the migration hadn't completed yet. I left it overnight and all was fine the next day.

 

I'd try leaving it for a bit and see if it comes back. If not, or if it's already been a while, I'd log a ticket with support. They've been pretty good in the past when I've needed them, even when running an unsupported headless installation.

 

SR

Link to comment

I have now migrated my subscription to Small Business and moved to the CrashPlan Pro docker by jlesage. What caused the most grief was that the subscription got stuck in a 30-day trial instead of the migration subscription, so I had to get CP support to intervene. (For those who need to get in touch with them: use the chat function, as email is hopeless.)

 

Want to say a BIG thank you to gfjardim for the docker that I have been using since I don't know when. Had it not been for this change by CP, I would have stayed with that :-) 

Link to comment
  • 2 weeks later...

Please advise?

 

How do I upgrade version 4.8.0 in the docker to the 4.8.3 Home version?

 

I have downloaded the CrashPlan_4.8.3_Linux.tgz file. I can copy it into the docker via SSH/bash access.

Can I simply uninstall the old version and install the new one?

Will I lose all the configuration? Will I need to adopt backups etc...?

  1. Uncompress the TGZ file to Downloads.
  2. Open Terminal and enter: cd ~/Downloads/crashplan-install
  3. Press Enter.
  4. Then enter: sudo ./uninstall.sh -i /usr/local/crashplan
  5. Press Enter.
  6. At the prompt, type YES to uninstall and press Enter.

Install latest copy

  1. Make sure you are in the crashplan-install folder in terminal
  2. Then enter: sudo ./install.sh
  3. Press Enter.

 

My subscription expires in one year, so I am staying with what I have for now. But recently I see CP only synchronizing data to the cloud without actually backing it up.

It has already been 6 days without a backup and I am getting notifications about it, so CP support advised upgrading to version 4.8.3.

 

thanks

 

 

Link to comment

Do people have version 4.8.3 installed? (Those who did not move to Business...)

I don't see anything in the history tab, though I might be missing it. There are no RED error lines, and it's hard to see; I can't copy the history message list.

Which log files can I look at?

 

In any case, can I update manually?

Edited by dadarara
Link to comment

This container will auto-update (or actually, the app will). Mine updated in June. Check history.log in the log folder.

I 06/15/17 12:32PM Downloading a new version of CrashPlan.
I 06/15/17 12:32PM Download of upgrade complete - version 1436674800483.
I 06/15/17 12:32PM CrashPlan has downloaded an update and will restart momentarily to apply the update.
I 06/15/17 12:33PM Installing upgrade - version 1436674800483
I 06/15/17 12:33PM Upgrade installed - version 1436674800483
I 06/15/17 12:33PM CrashPlan started, version 4.8.3
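To the "which log files" question: history.log is the one shown here. A quick way to pull the upgrade lines out of it — the appdata path below is an assumption, so point LOG at whatever host folder you mapped to the container's log directory:

```shell
# LOG is a placeholder path - adjust it to your container's history.log.
LOG="${LOG:-/mnt/user/appdata/CrashPlan/log/history.log}"
if [ -f "$LOG" ]; then
    # Show the most recent upgrade/version entries.
    grep -E 'version|Upgrade|upgrade' "$LOG" | tail -n 20
fi
```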

 

Edited by Leifgg
Link to comment

I just noticed I am running 4.8.0 as well... I saw that my environment auto-upgraded to 4.8.3 in July... but for some reason it now shows that it's running 4.8.0. No clue when, or what would have caused the downgrade.

 

No new errors or auto-upgrade attempts have been made since.

Link to comment
  • 3 weeks later...
On 9/4/2017 at 3:56 AM, Shadowrunner said:

Just migrated to the Pro version, docker updated ok. 

 

Also - Having fully expected to re-upload my 17TB, the whole lot migrated over without having to re-upload a single byte. I've confirmed this is correct with Crashplan support as I didn't want any nasty surprises if something had gone wrong.

 

This might not be the case for everybody, but it looks like the 5TB migration limit isn't that rigid.

 

SR

Can anyone else please confirm that it did not delete their archive over 5TB? Because I'm getting some pretty solid, very clear, "Your data WILL be deleted" messages and warnings. It's not saying it might; it's saying it WILL. Did you get these warnings when you migrated, Shadowrunner? I'm trying to decide if I should delete enough data to bring it below 5TB or risk having to re-upload it all. Granted, it's not near the size of yours, but I would still rather not have to upload over 2TB of data if possible.

Also, something that's confusing me: the migration page reports my backup as 5.8TB, but the CrashPlan app (I'm using gfjardim's) says completed 7.3TB of 7.7TB. Anyone have any ideas about that? (Just checked my last backup report email and it says 7.6TB... so confused.)

Edited by johmei
Link to comment

Have CrashPlan done something to prevent the fast upload we could achieve by editing my.service.xml? I used to get 18mb/s upload after changing dataDeDupAutoMaxFileSizeForWan from 0 to 1, but I am now getting 1mb/s upload on CrashPlan Pro, which will take 3 years to complete instead of a month! I also notice the <dataDeDupAutoMaxFileSize>1073741824</dataDeDupAutoMaxFileSize> line; I do not remember this. Can anyone else here still take advantage of higher upload speeds with CrashPlan Pro like they did on CrashPlan Home?

 

<dataDeDupAutoMaxFileSize>1073741824</dataDeDupAutoMaxFileSize>
<dataDeDupAutoMaxFileSizeForWan>1</dataDeDupAutoMaxFileSizeForWan>
<dataDeDuplication>MINIMAL</dataDeDuplication>

Edited by mbc0
Link to comment

My CrashPlan is not backing up anymore. It seems to have happened since the latest update, although it might also be a coincidence. I have traced the issue to not enough inotify watches, as described in this post:

 

https://support.code42.com/CrashPlan/4/Troubleshooting/Linux_real-time_file_watching_errors
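For anyone following along, the fix that article describes amounts to raising the inotify watch limit on the host. A minimal sketch — the value 1048576 is a commonly suggested limit, not a figure from Code42:

```shell
# Show the current limit (readable without root):
cat /proc/sys/fs/inotify/max_user_watches

# Raise it for the running system (needs root; lost on reboot):
#   sysctl -w fs.inotify.max_user_watches=1048576
# On unRAID, append that sysctl line to /boot/config/go to make it persistent.
```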

 

I want to make the fix, and for that I of course have to log into the CrashPlan docker...

 

However when I give the command:

 

docker exec -it CrashPlanPRO bash

 

it does not work and I get the following error message:

 

rpc error: code = Unknown desc = oci runtime error: exec failed: container_linux.go:262: starting container process caused "exec: \"bash\": executable file not found in $PATH"

The same command works fine with my other dockers..
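Likely cause (an assumption about jlesage's base image): the container is Alpine-based, so it ships a BusyBox/POSIX shell rather than bash. The fallback is to exec plain sh:

```shell
# bash is not in this image, so exec the POSIX shell it does ship:
#   docker exec -it CrashPlanPRO /bin/sh
# Sanity check that a POSIX sh exists (it is what the exec above relies on):
command -v sh
```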

Link to comment
On 10/19/2017 at 10:47 AM, mbc0 said:

Have CrashPlan done something to prevent the fast upload we could achieve by editing my.service.xml? I used to get 18mb/s upload after changing dataDeDupAutoMaxFileSizeForWan from 0 to 1, but I am now getting 1mb/s upload on CrashPlan Pro, which will take 3 years to complete instead of a month! I also notice the <dataDeDupAutoMaxFileSize>1073741824</dataDeDupAutoMaxFileSize> line; I do not remember this. Can anyone else here still take advantage of higher upload speeds with CrashPlan Pro like they did on CrashPlan Home?

 

<dataDeDupAutoMaxFileSize>1073741824</dataDeDupAutoMaxFileSize>
          <dataDeDupAutoMaxFileSizeForWan>1</dataDeDupAutoMaxFileSizeForWan>
          <dataDeDuplication>MINIMAL</dataDeDuplication>
 

Yes, you can edit the xml file in CrashPlan Pro in the same way.

Link to comment
1 hour ago, Djoss said:

Is your question about gfjardim's container (it looks like that's not the one you are using)?

BTW, inotify limit must be increased on the host, not in containers...

 

I am using: jlesage/crashplan-pro

 

But hey! If it's on the host then I can do that right away, thanks!!

 

EDIT: Just did that; it did not work immediately, and CrashPlan also started restarting. So I also increased the memory to 8GB (it had been working on 4GB so far, and I have backed up 14 terabytes of files to CrashPlan without a problem...).

 

CrashPlan's advice is to have 1GB of memory per terabyte, but that is based on the average number of files on a system, not on media storage (which is typically a lot of data but not that many files / large files).

 

If the memory usage extrapolates in the same way, I should be fine until I hit 35TB (the already backed-up set also includes my music library, which again is a lot of files).
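The purely linear version of that extrapolation is easy to check (numbers from the post; the more optimistic 35TB figure presumably assumes that future media-only growth costs less memory per TB than the file-heavy existing set):

```python
# Numbers from the post: 14 TB backed up fine on a 4 GB allocation, now raised to 8 GB.
old_mem_gb, backed_up_tb = 4, 14
new_mem_gb = 8

tb_per_gb = backed_up_tb / old_mem_gb      # observed ratio: 3.5 TB per GB of memory
linear_limit_tb = tb_per_gb * new_mem_gb   # straight-line extrapolation
print(linear_limit_tb)                     # 28.0 TB by a purely linear estimate
```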

Edited by Helmonder
Link to comment
