Linux - Newbie This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's this is the place!


Old 12-01-2004, 08:53 AM   #1
LQ Newbie
Registered: Jul 2004
Posts: 12

Rep: Reputation: 0
Question: maximum Linux file size?

I am running Red Hat 9.0 and I am trying to copy a file that is 3 GB from a Windows 2000 server to the Red Hat machine. When the file copies, it takes up all 40 GB of the hard drive on the Linux machine. I've tried the command "split -b 650000000 /mnt/windowsshare/filename /backups/filename" and it just created about 25 files of 650 MB each until the hard drive was filled up.

I am more familiar with Windows than Linux. Can Linux recognize files larger than 2 GB properly?

Any ideas what my problem could be?

Any help is much appreciated.
Old 12-01-2004, 09:25 AM   #2
Registered: Apr 2004
Distribution: Debian
Posts: 185

Rep: Reputation: 30
Old 12-01-2004, 09:30 AM   #3
Senior Member
Registered: Sep 2003
Location: Sweden
Distribution: Debian
Posts: 3,032

Rep: Reputation: 58
Linux, or rather the filesystems commonly used with Linux, can handle big files. It depends on the type of filesystem you use, and on this page there is a table describing the maximum file and filesystem sizes of the different filesystems.

The split command should work, so I'm guessing the file size is being reported incorrectly by the Windows file system.
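For reference, a minimal sketch of how split and reassembly fit together (file names and paths are illustrative, not from the thread):

```shell
# Split a large file into 650 MB pieces. Pieces get a two-letter
# suffix appended to the given prefix: bigfile.part.aa, .ab, ...
split -b 650000000 /mnt/windowsshare/bigfile.ext /backups/bigfile.part.

# Later, reassemble the pieces in shell-glob (alphabetical) order
# and verify the result matches the original byte for byte.
cat /backups/bigfile.part.* > /backups/bigfile.ext
cmp /mnt/windowsshare/bigfile.ext /backups/bigfile.ext
```

cmp exits silently with status 0 if the reassembled file is identical to the original.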

Old 12-01-2004, 09:49 AM   #4
LQ Newbie
Registered: Jul 2004
Posts: 12

Original Poster
Rep: Reputation: 0
That chart says the maximum file size on an ext3 filesystem (the filesystem I am using) is 2 GB, so that may be part of my problem. But that doesn't seem to explain why the split didn't work right. I believe the file size is being reported properly; the file is only taking up 3 GB of space on the Windows machine.
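A quick way to check whether the kernel and local filesystem actually accept files larger than 2 GB is to create a sparse test file past the 2 GB mark (the path is illustrative; a sparse file barely uses any real disk space):

```shell
# Seek 3000 MB into a new file and write a single 1 MB block.
# On a filesystem with large-file support this succeeds and
# ls reports a file slightly over 3 GB.
dd if=/dev/zero of=/backups/lfs-test bs=1M count=1 seek=3000
ls -l /backups/lfs-test
rm /backups/lfs-test
```

If the filesystem really capped files at 2 GB, the dd step would fail with a "File too large" error instead.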
Old 12-01-2004, 10:00 AM   #5
Registered: Oct 2004
Location: Tartu, Århus,Nürnberg, Europe
Distribution: Debian, Ubuntu, Puppy
Posts: 619

Rep: Reputation: 45
The chart is perhaps a bit outdated. The caption itself says "and once the 2.4 kernels come out, I am sure the limits will be extended", and the 2.4 kernels (which Red Hat 9 uses) have been out for a while now.

Old 12-01-2004, 10:53 AM   #6
LQ Newbie
Registered: Jul 2004
Posts: 12

Original Poster
Rep: Reputation: 0
So the file size limit has been extended?
Old 12-01-2004, 05:01 PM   #7
LQ Guru
Registered: Jan 2002
Posts: 6,042

Rep: Reputation: Disabled
Linux can handle files up to 16 TB with certain filesystems. I use XFS and ReiserFS because they are much better than ext3; I use XFS for most of my partitions.

I have never seen this problem. It could be your Windows system not calculating the correct size, or smbclient not calculating it correctly.

The only problem I have had is copying a 4 GB file through Samba, which could only handle 2 GB per file at the time. I used the split command and it made four 1 GB files.

BTW, how many bytes did smbclient calculate for the file that you are trying to copy?
Old 12-03-2004, 07:57 AM   #8
LQ Newbie
Registered: Jul 2004
Posts: 12

Original Poster
Rep: Reputation: 0
OK, I think the problem must be with smbmount. I am trying to copy the file nightly in a script, so I am using something like this:

smbmount //machine/share /mnt/share -o username=user,password=password
split -b 650000000 /mnt/share/bigfile.ext /backup/bigfile.

When I mount the share with smbmount and do an ls -l of the directory it is mounted in, the file is reported as a crazy huge 18 exabytes (18 EB is roughly 2^64 bytes, which suggests the size field is overflowing or coming back as -1). If I connect with the smbclient command and do an ls, it reports the correct size.

So smbclient is working correctly for me but smbmount is not. Does anyone know of a way I can use the smbclient command in a script instead of smbmount? Will something like this work?:

cd /backup
smbclient //machine/share
split -b 650000000 /mnt/share/bigfile.ext
#rest of script................

Any help is appreciated.
Old 12-03-2004, 10:31 AM   #9
LQ Newbie
Registered: Jul 2004
Posts: 12

Original Poster
Rep: Reputation: 0
OK, I've figured out how to script the smbclient command; here it is:

smbclient //machine/share -U username%password -c "lcd /backup; get filename"

I've tested it and it works on small files. What I'm worried about is that it will hang when I run it on the large 3 GB file. Does anyone know how I could pipe the above command into something like split, so I can break the file up into smaller portions that can be transferred by smbclient?
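One possible approach (an untested sketch; check your Samba version's smbclient man page, since the tar mode flags and the "-" stdout convention are assumptions here) is to stream the remote file to standard output with smbclient's tar mode and let split read from the pipe:

```shell
# "-Tc -" asks smbclient to write a tar archive of the named file
# to stdout instead of a local tar file; split with "-" reads the
# stream from stdin and cuts it into 650 MB pieces.
smbclient //machine/share -U username%password -Tc - bigfile.ext \
  | split -b 650000000 - /backup/bigfile.tar.part.

# Reassemble the pieces later and unpack the tar wrapper:
cat /backup/bigfile.tar.part.* > /backup/bigfile.tar
tar -xf /backup/bigfile.tar -C /backup
```

Note the pieces are chunks of a tar stream, not of the raw file, so they only make sense after being concatenated back together and untarred.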

