Linux - General
How to backup large files (> 2G or 4G)?

myunicom 09-24-2003 09:35 AM

How to backup large files( > 2G or 4G)?
I have one directory with many subdirectories, about 11G in total. The files in that directory change every day (some are created, some are deleted). I want to back up this directory once a day while keeping file permissions and ownership unchanged (because I want to restore it in the future). Since the total size is large (11G) and backing up the whole directory every time would take a long time, I only want to back up the changed files (created or deleted) after the first full backup. I have tried the following methods and run into trouble:

1. cpio or tar: they cannot handle such large files (they have a limit of 2G or 4G?).

2. cp -a (thanks for Cheers' help): it can handle large files, and the -a option keeps file permissions and ownership unchanged, but it has to copy the whole directory every time.
I have also tried cp -au: it backs up only the newly created or updated files (unchanged old files are skipped), but it cannot delete files from the backup directory when the corresponding files in the original directory have been deleted.
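The behaviour described above (copy only what changed, preserve permissions and ownership, and remove files from the backup when they disappear from the source) is exactly what rsync's archive mode with --delete does. A minimal sketch, assuming rsync is installed; /data and /backup are example paths, not from the post:

```shell
# -a       archive mode: preserves permissions, ownership, timestamps,
#          and symlinks, and recurses into subdirectories
# --delete remove files from the backup that no longer exist in the source
# Trailing slash on the source means "copy the contents of /data".
rsync -a --delete /data/ /backup/
```

After the first full copy, each later run transfers only files that changed, so a daily run over 11G of mostly unchanged data should be fast.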

My computer environment:

OS : redhat linux 9
filesystem: reiserfs

Who can help me?


myunicom 09-24-2003 10:55 AM

Who can help me?

From myunicom

pk21 09-24-2003 01:23 PM

The 2.1 GB is a file-size limit: with a standard kernel you just can't make a file grow beyond about 2.1 GB (2^31 bytes).
You will have to compile a kernel with support for large files if you want to exceed this limit.
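One way to check whether the kernel and filesystem actually allow large files is to query the file-size bits and try writing past the 2 GB mark with a sparse file. A sketch, assuming GNU dd and getconf are available; /tmp/bigtest is a throwaway example path:

```shell
# FILESIZEBITS reports how many bits the filesystem uses for file sizes;
# 64 means files well beyond 2 GB are allowed.
getconf FILESIZEBITS /tmp

# Seek 3 GB into a new file and write a single byte; this only succeeds
# if large-file support is in place. The file is sparse, so it is fast
# and uses almost no disk space.
dd if=/dev/zero of=/tmp/bigtest bs=1 count=1 seek=3G 2>/dev/null \
  && echo "large files OK"
ls -l /tmp/bigtest
rm -f /tmp/bigtest
```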

myunicom 09-24-2003 07:56 PM

My kernel supports large files.


kev82 09-24-2003 08:17 PM

I don't know how useful it will be, but I'm thinking you could write a makefile. make executes commands based on whether files are older than their dependencies, so could you put cp and rm commands in a makefile?
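The make-style idea above (only act on files newer than a recorded point in time) can be approximated in plain shell with a timestamp marker file and find -newer. A sketch under assumed paths: /data is the source, /backup the destination, and .last-backup a hypothetical marker file; note this copies changed files but, unlike rsync --delete, does not remove deleted ones:

```shell
STAMP=/backup/.last-backup
mkdir -p /backup

if [ -f "$STAMP" ]; then
    # Incremental run: copy only files modified since the last run,
    # preserving mode, ownership, and timestamps (-p); --parents keeps
    # the directory structure under /backup.
    find /data -type f -newer "$STAMP" \
        -exec cp -p --parents {} /backup/ \;
else
    # First run: full copy with attributes preserved.
    cp -a /data/. /backup/
fi

# Record the time of this backup for the next incremental run.
touch "$STAMP"
```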
