LinuxQuestions.org
Old 09-24-2003, 09:35 AM   #1
myunicom
LQ Newbie
 
Registered: Sep 2003
Posts: 14

Rep: Reputation: 0
How to back up large files (> 2 GB or 4 GB)?


I have a directory with many subdirectories; the total size is about 11 GB. The files in it change every day (some are created, some are deleted). I want to back up this directory once a day and keep file permissions and ownership unchanged, because I want to be able to restore it later. Since the total size is so large (11 GB), backing up the whole directory every time takes a long time, so after the first full backup I only want to back up the changes. I have tried the following methods and run into trouble:

1. cpio or tar: they cannot handle such large files (they seem to have a 2 GB or 4 GB limit?).

2. cp -a (thanks to Cheers for the help): it can handle large files, and the -a option keeps file permissions and ownership unchanged, but it copies the whole directory every time.
I have also tried cp -au; it copies only new and changed files (unchanged files are skipped), but it does not delete files from the backup directory when the corresponding files in the original directory have been deleted.
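For illustration, a toy sketch of this approach together with the delete step that cp -au is missing, using only cp and find. This assumes GNU cp; the mktemp directories here are small stand-ins for the real 11 GB tree.

```shell
# Stand-in source and backup directories (placeholders for the real paths).
SRC=$(mktemp -d)
DST=$(mktemp -d)

echo one > "$SRC/a"
echo two > "$SRC/b"

# Incremental copy: -a preserves mode/owner/timestamps, -u copies only
# files newer than (or missing from) the backup copy.
cp -au "$SRC/." "$DST/"

# Simulate a deletion in the source, then prune it from the backup:
# any file in the backup whose original no longer exists is removed.
rm "$SRC/a"
( cd "$DST" && find . -type f | while IFS= read -r f; do
    [ -e "$SRC/$f" ] || rm -f "$f"
  done )

ls "$DST"   # prints: b
```

The prune pass is the piece cp itself does not provide; run it after each cp -au so the backup mirrors deletions too.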


My computer environment:

OS: Red Hat Linux 9
Filesystem: ReiserFS


Who can help me?

Thanks!
 
Old 09-24-2003, 10:55 AM   #2
myunicom
LQ Newbie
 
Registered: Sep 2003
Posts: 14

Original Poster
Rep: Reputation: 0
Who can help me?

From myunicom
 
Old 09-24-2003, 01:23 PM   #3
pk21
Member
 
Registered: Jun 2002
Location: Netherlands - Amsterdam
Distribution: RedHat 9
Posts: 549

Rep: Reputation: 30
The 2 GB figure is a file-size limit: with 32-bit file offsets a file cannot grow past 2^31 bytes.
You need a kernel and tools built with large-file support if you want to exceed this limit.
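One quick way to check this is to try writing a single byte just past the 2 GB (2^31-byte) mark as a sparse file. A sketch, assuming GNU dd and a filesystem that allows sparse files:

```shell
# Try to seek one byte past 2^31 and write; if the tools and filesystem
# are large-file aware this succeeds without allocating 2 GB of disk.
f=$(mktemp)
if dd if=/dev/zero of="$f" bs=1 count=1 seek=2147483648 2>/dev/null; then
    result="large files OK"
else
    result="stuck at the 2 GB limit"
fi
echo "$result"
rm -f "$f"
```

If this fails, the limit is in the tools or filesystem rather than in your data.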
 
Old 09-24-2003, 07:56 PM   #4
myunicom
LQ Newbie
 
Registered: Sep 2003
Posts: 14

Original Poster
Rep: Reputation: 0
My kernel supports large files.

thanks.
 
Old 09-24-2003, 08:17 PM   #5
kev82
Senior Member
 
Registered: Apr 2003
Location: Lancaster, England
Distribution: Debian Etch, OS X 10.4
Posts: 1,263

Rep: Reputation: 50
I don't know how useful it will be, but I'm thinking you could write a makefile. make executes commands based on whether files are older than their dependencies, so could you put cp and rm commands in a makefile?
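The same "only if newer" idea can be sketched in plain shell with a timestamp file instead of make: copy only files modified since the last run. This assumes GNU find and cp (for --parents); all paths here are throwaway stand-ins.

```shell
# Stand-in directories (placeholders for the real source and backup).
SRC=$(mktemp -d)
DST=$(mktemp -d)
STAMP="$DST/.last-backup"   # records when the last backup ran

echo old > "$SRC/old"
cp -a "$SRC/." "$DST/"      # initial full copy, preserving mode/owner
touch "$STAMP"

sleep 1
echo new > "$SRC/new"       # a file created after the stamp

# Incremental pass: copy only files newer than the stamp, preserving
# their relative paths under $DST (--parents).
( cd "$SRC" && find . -type f -newer "$STAMP" \
    -exec cp -a --parents {} "$DST" \; )
touch "$STAMP"

ls "$DST"
```

Like cp -au, this copies only what changed; deletions in the source would still need a separate prune pass.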
 