LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Slackware (https://www.linuxquestions.org/questions/slackware-14/)
-   -   file size limit (https://www.linuxquestions.org/questions/slackware-14/file-size-limit-25216/)

jaysan 07-06-2002 09:49 PM

file size limit
 
hello,

i have just installed slackware 8.1 with reiserfs.
i tried to backup data from our sun workstations using cpio but
there seems to be a 2Gb limit on the filesize.

is there a max file size limit for cpio?
is there a way to get around this limit?

the kernel installed is 2.4.18.

I also experienced this problem with slackware 8.0,
i installed the 2.2.19 kernel using ext2fs. I was hoping that
upgrading to slackware 8.1 using the 2.4.18 kernel and reiserfs would help solve this problem.

please help...

TIA

Excalibur 07-06-2002 11:16 PM

I have experienced similar problems. Even though a file system is supposed to support files larger than 2 GB, most programs do not, and even if you find an app that does, most operations you attempt with the result will also fail. Your mileage may vary.

The only solution I was able to work with consistently and reliably was to make smaller files.
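One common way to "make smaller files", not spelled out in the thread, is to pipe the tar stream through split so that no single output file ever reaches the 2 GB boundary. The sketch below uses a tiny demo directory and a tiny chunk size so it can be tried safely; in real use you would point tar at something like /home and use a chunk size such as "-b 1900m" (the paths and sizes here are illustrative, not from the original posts).

```shell
# Sketch: dodge a 2 GB per-file limit by splitting the archive stream.
# Demo directory and 512-byte chunks are stand-ins for real data and
# a chunk size like 1900m.
mkdir -p /tmp/demo_src /tmp/demo_out
echo "sample data" > /tmp/demo_src/file.txt

# Write the tar stream as fixed-size pieces instead of one big file.
tar -C /tmp -cf - demo_src | split -b 512 - /tmp/demo_out/backup.tar.part-

# Restore: concatenate the pieces back into a single tar stream.
cat /tmp/demo_out/backup.tar.part-* | tar -C /tmp/demo_out -xf -
```

Because split names the pieces with lexically ordered suffixes, a plain shell glob reassembles them in the correct order.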

Cpio may well have a file size limitation imposed through bash if your output goes to a file, or it may have an internal limit if the file to be backed up is larger than 2 GB. I am sure I have been able to far exceed 2 GB of total output using tar to a device (not a file), like a hard disk, DVD or tape. My daily backup on my workstation exceeds 3 GB on DVD+RW, using tar piped to sdd to handle the write operation. But tar is purely sequential, so restoring a single file from such a backup is a slow process.

You might want to investigate a program called "taper"; if I recall correctly, it allows file selection through an index list, though I think its output was designed for tape drives. If you have the disk space and can set it up, for example on a network, you may also want to look at rsync, although I do not know whether rsync has a problem with a single file over 2 GB.

I have not used cpio direct to a device, but I think it supports tape drives, so you could attempt the following: on, for example, a 20 GB hard disk reserved for backups, create four primary partitions of 5 GB each, then perform your backup using the device name directly, with no file system on it. Reference the partitions as /dev/hdd1, hdd2, hdd3, hdd4. Modify for your needs as required.
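The raw-partition idea above can be sketched as follows. Since the commands here should be safely runnable, a plain file stands in for the partition device; in real use you would substitute /dev/hdd1 (the device name from the post) and double-check it, because writing a tar archive to the wrong device destroys whatever was on it.

```shell
# Sketch: back up straight to a raw partition that has no filesystem.
# A plain file stands in for the device so this demo is harmless.
DEV=/tmp/fake_hdd1            # in real use: DEV=/dev/hdd1 (be careful!)
mkdir -p /tmp/part_src
echo "payload" > /tmp/part_src/data.txt

# Write the archive directly to the "device" node; no mount, no filesystem.
tar -C /tmp -cf "$DEV" part_src

# List and restore straight from the device, exactly as with a tape drive.
tar -tf "$DEV"
mkdir -p /tmp/part_restore && tar -C /tmp/part_restore -xf "$DEV"
```

The restore step works the same way a tape restore does: tar reads the archive stream from the device node rather than from a file in a mounted filesystem.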

If you are interested in a backup medium faster than tape, you may want to investigate DVD+RW. Last night my system backed up 3.9 GB in 20 minutes. (200 MB/minute.) The capacity without compression is 4.7 GB.

Good luck.

jaysan 07-11-2002 06:28 PM

filesize limit
 
Excalibur,

thank you very much for your suggestion. I tried using tar but it did not work. It also stopped at 2GB.

With regard to the partitions you suggested: the backup drives are not physically attached to this machine; they are mounted from our NT-based stations using smbmount. Could it be that smbmount has a problem with file sizes exceeding 2 GB?

I wish we had a tape drive but we cannot afford it. :(

Anyway, i also have a red hat linux machine with 2.4.2 kernel using ext2fs. It can tar files at more than 2GB.

zelgadis 07-11-2002 10:03 PM

Very complete information in:

http://www.suse.de/~aj/linux_lfs.html

and more links about this and other subjects in:

http://loll.sourceforge.net/linux/links/
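The large-file-support page linked above distinguishes kernel/filesystem limits from per-tool limits. A quick way to probe the first half, assuming a reasonably current dd, is to create a sparse file whose end lies past the 2 GB mark; if that succeeds, the kernel and filesystem handle large files and the limit is in the userland tool (tar, cpio, gzip, or the smbmount transport). The path below is just an example.

```shell
# Create a sparse file ending past 2 GB (2050 MiB). Succeeds only if the
# kernel + filesystem support large files; uses almost no real disk space.
dd if=/dev/zero of=/tmp/lfs_probe bs=1M seek=2049 count=1 2>/dev/null
ls -l /tmp/lfs_probe
```

If dd fails here with "File too large", the problem is below the backup tools; if it succeeds but tar or cpio still stop at 2 GB, those tools were built without large file support.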

Excalibur 07-11-2002 11:17 PM

Zelgadis, thank you for the suse.de link. It was most informative, but it appears we are still not quite there yet.

I was able to duplicate the 2 GB limit using cpio on local source files to local output. However, when I used tar to perform the same backup, it created a file that the "ls" command wouldn't even display in the directory, and "rm" would not delete it. I had to use the tar command again to truncate it to an acceptable size, then delete it.

For the cpio backup limitation, I can only recommend breaking the backup up into smaller file sizes, or performing data-only backups. You can also consider piping the output through gzip to compress it, but that is only a limited workaround until the output reaches the 2 GB limit again, because gzip has the same limitation. Consider also the network usage and time constraints of these massive transfer operations when doing complete backups. For a data-only backup of a Linux box, consider the directories /etc, /home, and /root; I also include /var/spool/mail and /var/spool/mqueue if it is a mail server, and the Apache files if it is a web server.
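The data-only backup with gzip compression described above can be sketched as below. Small stand-in directories are used so the commands can be tried as-is; in practice you would list /etc /home /root (plus the mail and web directories), and, as noted, a gzip of that era shares the 2 GB limit, so this only postpones the problem for large trees.

```shell
# Sketch: data-only backup piped through gzip. The do_* directories are
# stand-ins for /etc /home /root.
mkdir -p /tmp/do_etc /tmp/do_home
echo "config" > /tmp/do_etc/app.conf

tar -C /tmp -cf - do_etc do_home | gzip > /tmp/data-only.tar.gz

# Verify the compressed archive without extracting it.
gzip -dc /tmp/data-only.tar.gz | tar -tf -
```

Listing through `gzip -dc | tar -tf -` is a cheap integrity check worth running after every backup, since a truncated archive (for example, one cut off at 2 GB) will fail it.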

In regards to my suggestion about the hard disk partitions, I would think it best for the disk to be physically in the machine where the tar/cpio command is executed. But I do not see why it couldn't be remote on a Linux box; just use HOSTNAME:/dev/hddx to reference it, the way you would a remote tape drive. If the destination is on an NT workstation using NTFS, it should not have a 2/4 GB limit; if it is something else, such as Win2K Pro using FAT32, there is a high possibility of a 2/4 GB limit.

Good luck on your endeavor.

jaysan 07-12-2002 03:43 AM

filesize limit
 
hi,

i tried to tar it on the local drive; it reached 2.3GB when i stopped it because my drive was almost out of space.

could it be that smbmount is the problem?

zelgadis,

thanks for the link, very informative, though i think i need to read it a second time...:) am still a newbie...

