massive tar file, can't untar
hi all
I'm trying to tar up my web server directory, which ends up being around 13 GB, and move it over ssh. But when I try to untar it on the other server it eats a lot of CPU and memory, the ssh connection eventually drops out, and tar just quits. Is there any way I can get around this? Like tarring into several smaller archives and then merging them when I untar? thanks, dave |
I'm no expert in tar, but I would generally avoid handling anything in chunks that big; too many different issues can get you.
What's the downside of moving 3-4 smaller "tarballs" rather than 1 huge one? |
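One way to get several tarballs, as suggested above, is simply to archive each top-level subdirectory separately. A minimal sketch, assuming GNU tar; the `webroot` layout below is a made-up example, not from the thread:

```shell
#!/bin/sh
# Demo setup (in real use, webroot/ is the existing 13 GB web directory):
mkdir -p webroot/site1 webroot/site2
echo "index" > webroot/site1/index.html
echo "index" > webroot/site2/index.html

# One archive per top-level subdirectory instead of a single huge file.
# -C webroot makes the archive paths relative to the web root.
for d in webroot/*/ ; do
    name=$(basename "$d")
    tar -czf "$name.tar.gz" -C webroot "$name"
done
# Result: site1.tar.gz, site2.tar.gz, ...
```

Each archive can then be copied and extracted independently, so one dropped connection only costs you one piece.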
hey, thanks for your reply
that's what I'm asking: is it possible to run something like `tar -cf -split 500MB file%d.tar` and end up with file01.tar, file02.tar, file03.tar, file04.tar, then use something like `tar -xf -join file01.tar file02.tar file03.tar file04.tar` to join them all back together and untar in one step? |
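tar itself has no `-split`/`-join` options, but the same effect is standard practice with `split` and `cat`: pipe the tar stream through `split` on the way out, and concatenate the pieces back into tar on the way in. A sketch, assuming GNU tar and coreutils `split`; the file names and the tiny demo sizes are examples (use something like `-b 500m` for real data):

```shell
#!/bin/sh
# Demo setup (in real use this is the 13 GB web directory):
mkdir -p webroot
dd if=/dev/urandom of=webroot/big.bin bs=1024 count=64 2>/dev/null

# Create one compressed tar stream and split it into fixed-size
# pieces: webroot.tgz.part-aa, webroot.tgz.part-ab, ...
# (16k here so the demo actually splits; use -b 500m for real.)
tar -czf - webroot | split -b 16k - webroot.tgz.part-

# On the destination, concatenate the pieces back into a single
# stream and untar in one step; no intermediate 13 GB file needed.
mkdir -p restore
cat webroot.tgz.part-* | tar -xzf - -C restore
```

Because `split` names the pieces in sorted order, the `webroot.tgz.part-*` glob on the receiving side reassembles them in the right order automatically.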
Hi...
Sorry to resurrect such an old thread. Maybe the real problem is that the long-running job is tied to the ssh session, so when the connection drops, tar is killed with it. So how about running it detached from ssh? Programs like dtach, screen, or tmux could help here.
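The simplest detach tool, if screen or tmux isn't installed, is plain `nohup`. A sketch of the idea; the host name and paths are examples, and the demo below runs the same pattern locally so it can be verified without a remote machine:

```shell
#!/bin/sh
# Over ssh, you would start the untar on the REMOTE host under
# nohup so it survives the connection dropping, e.g.:
#   ssh user@remote 'nohup tar -xzf /tmp/site.tgz -C /var/www \
#       >/tmp/untar.log 2>&1 &'
# then log out and check /tmp/untar.log later.

# The same idea, demonstrated locally:
mkdir -p src dest
echo "hello" > src/index.html
tar -czf site.tgz src
nohup tar -xzf site.tgz -C dest > untar.log 2>&1 &
wait    # in real use you would just disconnect instead of waiting
```

With screen or tmux the workflow is similar: start the session on the remote host, run tar inside it, detach, and reattach later to check on it.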
|