Old 03-12-2008, 03:15 PM   #1
richinsc
Member
 
Registered: Mar 2007
Location: Utah
Distribution: Ubuntu Linux (20.04)
Posts: 224

Rep: Reputation: 32
Get around 2GB Tar Limit


Are there any suggestions or tips on getting around the 2.0GB size limit for tar files in Debian Etch? I would like to use tar for backups, but some of my backups well exceed 2.0GB, and this causes problems with my backup scripts.

I have thought about using dd to an image file, but I would prefer tar or a similar utility that will work with my backup scripts. Suggestions are greatly appreciated.
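
To illustrate, a minimal sketch of the kind of tar backup involved; the archive name and paths here are hypothetical, not taken from the actual scripts:

Code:
# Hypothetical backup command; the reported problem shows up once the
# resulting archive grows past 2GB on the destination.
tar -czpf /mnt/backup/home-$(date +%F).tar.gz /home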
 
Old 03-12-2008, 05:31 PM   #2
michaelk
Moderator
 
Registered: Aug 2002
Posts: 25,691

Rep: Reputation: 5894
Three things affect file size limitations: 1. the kernel, 2. the file system, and 3. the application. Since Debian Etch uses a 2.6 kernel, I would rule out 1 unless you have compiled your own kernel and did not include large file support. 2 would only be a problem if you are trying to save the file to a FAT32 file system (or using a kernel without large file support), and 3 would be a problem if you are using an old version of tar that does not handle large files.

So: what kernel are you running, what file system are you trying to save the tar file on, and what version of tar are you actually using? From some googling there appears to be a single-file size limitation of 2GB in older versions, but I believe the latest version can handle a single file of 68GB.

To see if your version of tar has large file support:
strings `which tar`|grep 64
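
As a quick way to gather that information, a minimal sketch (the destination path below is a placeholder for wherever the archive is written):

Code:
uname -r                      # kernel version
df -T /path/to/destination    # file system type of the target
tar --version | head -n1      # tar version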

If you are trying to save to a network share, then you could be running into samba limitations, depending on the version.

So basically it is difficult to say what the problem is without additional information.
 
Old 03-12-2008, 07:11 PM   #3
kilgoretrout
Senior Member
 
Registered: Oct 2003
Posts: 2,987

Rep: Reputation: 388
You can have bash configured to limit created file sizes. Check that by running:

$ ulimit -a

in a console. If you get a line of output that reads:

file size (blocks, -f) unlimited

You're OK there. I know the Bastille security program can be configured to limit the size of created files as well. There are probably other examples, but the point is that there is no inherent limitation in tar itself that restricts archives to 2GB; I routinely create much larger ones than that. It's probably not a tar problem; it's more likely something in the way your system is configured, unless you are running some really ancient version of tar for some reason.
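
For reference, a minimal sketch of checking and lifting just the file size limit in the current shell (it only affects that shell and its children):

Code:
ulimit -f             # show the file size limit, in blocks
ulimit -f unlimited   # lift it for this shell session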
 
Old 03-13-2008, 09:04 AM   #4
richinsc
Member
 
Registered: Mar 2007
Location: Utah
Distribution: Ubuntu Linux (20.04)
Posts: 224

Original Poster
Rep: Reputation: 32
kilgoretrout, here is my output. Let me know what you think.


Code:
$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
max nice                        (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) unlimited
max rt priority                 (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) unlimited
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
Quote:
Originally Posted by michaelk View Post
If you are trying to save to a network share then you could have samba limitations depending on version.
I am trying to have the file go over a samba share to a Windows box that has a shared folder on an NTFS file system.

Quote:
Originally Posted by michaelk View Post
To see if your version of tar has large file support:
strings `which tar`|grep 64
This command does not work for me; the strings command is not found. Is strings in a package I need to install? Here is the output of tar's version.

Code:
tar (GNU tar) 1.16
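
As for the missing strings command: on Debian it comes from the binutils package, so (assuming apt is available) something like this should let the check above be run:

Code:
apt-get install binutils          # provides the strings utility
strings `which tar` | grep 64     # re-run the large file support check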
I will try to see if I can bypass the limit by tarring to the local file system. I am using an ext3 file system with the stock kernel.

Code:
$ uname -r
2.6.18-6-686
 
Old 03-13-2008, 09:42 AM   #5
michaelk
Moderator
 
Registered: Aug 2002
Posts: 25,691

Rep: Reputation: 5894
Yes, depending on the version, samba does have a 2GB limitation if you do not mount the share with the lfs option. The latest versions use cifs, which should not have this problem.

mount -t smbfs -o lfs //server/share /localdir
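
For comparison, a minimal sketch of the equivalent cifs mount; the server, share, local directory, and credentials file are placeholders:

Code:
mount -t cifs //server/share /localdir -o credentials=/etc/samba/cred-file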

Last edited by michaelk; 03-13-2008 at 09:43 AM.
 
Old 03-13-2008, 09:49 AM   #6
richinsc
Member
 
Registered: Mar 2007
Location: Utah
Distribution: Ubuntu Linux (20.04)
Posts: 224

Original Poster
Rep: Reputation: 32
How would I enable lfs via fstab? My fstab is below.

Code:
# /etc/fstab: static file system information.
#
# <file system> <mount point>   <type>  <options>       <dump>  <pass>
proc            /proc           proc    defaults        0       0
/dev/hda2       /               ext3    defaults,errors=remount-ro 0       1
/dev/hda1       none            swap    sw              0       0
/dev/hdc        /media/cdrom0   udf,iso9660 user,noauto     0       0
/dev/fd0        /media/floppy0  auto    rw,user,noauto  0       0
//10.4.12.50/FTP /mnt/cfsscop1 smbfs rw,auto,credentials=/etc/samba/cred-file,uid=richinsc,gid=users,fmask=0770,dmask=0770 0 0
//cfsscdm1/SFTP /data   smbfs   rw,auto,credentials=/etc/samba/cred-file,uid=$i,gid=users,fmask=0770,dmask=0770 0 0
 
Old 03-13-2008, 10:01 AM   #7
michaelk
Moderator
 
Registered: Aug 2002
Posts: 25,691

Rep: Reputation: 5894
Try:
//cfsscdm1/SFTP /data smbfs rw,auto,credentials=/etc/samba/cred-file,uid=$i,gid=users,fmask=0770,dmask=0770,lfs 0 0
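
After changing /etc/fstab, the share has to be remounted for the new option to take effect; a minimal sketch, run as root:

Code:
umount /data
mount /data    # picks up the options from /etc/fstab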
 
Old 03-13-2008, 10:03 AM   #8
richinsc
Member
 
Registered: Mar 2007
Location: Utah
Distribution: Ubuntu Linux (20.04)
Posts: 224

Original Poster
Rep: Reputation: 32
I was able to test tarring to the local ext3 file system and it worked; the backup came out at a nice 3.6GB. The problem lies with samba. So now I just need to get the samba share to allow tar files over 2GB, or else tar to the local system and mv the archive to the remote share via my scripts. The installed samba version is unknown; how would I find out the samba version? I was unable to get it via /etc/init.d/samba --version.
 
Old 03-13-2008, 10:16 AM   #9
michaelk
Moderator
 
Registered: Aug 2002
Posts: 25,691

Rep: Reputation: 5894
Well it should work (must be root):
/usr/sbin/smbd -V
or
smbstatus
 
Old 03-13-2008, 10:19 AM   #10
richinsc
Member
 
Registered: Mar 2007
Location: Utah
Distribution: Ubuntu Linux (20.04)
Posts: 224

Original Poster
Rep: Reputation: 32
Thanks michaelk, problem solved, and this thread/issue can now be closed. Adding the lfs option to the smb mount really helped. I can now run my scripts and not worry about them failing because of a file size limitation. Again, thanks so much; you made my life much easier. And in the process we've created a help doc for other users who have this problem.

The version of samba that requires the lfs option is:
Code:
Samba version 3.0.24

Last edited by richinsc; 03-13-2008 at 10:20 AM. Reason: Samba Version
 
Old 03-13-2008, 10:24 AM   #11
michaelk
Moderator
 
Registered: Aug 2002
Posts: 25,691

Rep: Reputation: 5894
Excellent.
 
  

