LinuxQuestions.org
Old 12-01-2004, 07:53 AM   #1
alec77
LQ Newbie
 
Registered: Jul 2004
Posts: 12

Rep: Reputation: 0
maximum linux file size?


I am running Red Hat 9.0 and I am trying to copy a file that is 3GB from a Windows 2000 server to the Red Hat machine. When the file copies, it takes up all 40GB of the hard drive on the Linux machine. I've tried the command "split -b 650000000 /mnt/windowsshare/filename /backups/filename" and it just created about 25 650MB files until the hard drive was filled up.

I am more familiar with Windows than Linux. Can Linux recognize files larger than 2GB properly?

Any ideas what my problem could be?

Any help is much appreciated.
 
Old 12-01-2004, 08:25 AM   #2
peacebwitchu
Member
 
Registered: Apr 2004
Distribution: Debian
Posts: 185

Rep: Reputation: 30
yes.
 
Old 12-01-2004, 08:30 AM   #3
hw-tph
Senior Member
 
Registered: Sep 2003
Location: Sweden
Distribution: Debian
Posts: 3,032

Rep: Reputation: 57
Linux, or rather the filesystems commonly used with Linux, can handle big files. It depends on the type of filesystem you use; on this page there is a table describing the limits of different filesystems when it comes to file and filesystem size.

The split command should work, so I'm guessing the file size is being reported wrong by the Windows file system.
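To rule out bad size reporting, comparing what different tools see is a quick sanity check. A sketch assuming GNU coreutils, using the paths from the original post:

```shell
# Apparent size in bytes, as the kernel reports it through the mount
stat -c %s /mnt/windowsshare/filename

# Disk usage of the local copies on the ext3 target
du -h /backups/filename*

# Free space remaining on the target drive
df -h /backups
```

If `stat` already shows a nonsense number through the mount, the problem is upstream of `split`.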


Håkan
 
Old 12-01-2004, 08:49 AM   #4
alec77
LQ Newbie
 
Registered: Jul 2004
Posts: 12

Original Poster
Rep: Reputation: 0
That chart says the maximum file size on an ext3 filesystem (the filesystem I am using) is 2GB, so that may be part of my problem. But that doesn't explain why the split didn't work. I believe the file size is being reported properly; the file only takes up 3GB of space on the Windows machine.
 
Old 12-01-2004, 09:00 AM   #5
otoomet
Member
 
Registered: Oct 2004
Location: Tartu, Århus, Nürnberg, Europe
Distribution: Debian, Ubuntu, Puppy
Posts: 588

Rep: Reputation: 45
The chart is perhaps a bit outdated. The caption says:
Quote:
and once the 2.4 kernels come out, I am sure the limits will be extended
Best,

Ott
 
Old 12-01-2004, 09:53 AM   #6
alec77
LQ Newbie
 
Registered: Jul 2004
Posts: 12

Original Poster
Rep: Reputation: 0
so the file size limit has been extended?
 
Old 12-01-2004, 04:01 PM   #7
Electro
Guru
 
Registered: Jan 2002
Posts: 6,042

Rep: Reputation: Disabled
Linux can handle up to 16 TB with certain filesystems. I use XFS and ReiserFS because they are much better than ext3; I use XFS for most of my partitions.

I have never seen this problem. It could be your Windows system not calculating the correct size, or smbclient not calculating it correctly.

The only problem I had was copying 4 GB through Samba, which could only handle 2 GB per file at the time. I used the split command and it made four 1 GB files.
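For what it's worth, a split like that is easy to reverse on the other end with plain cat, since the shell's glob ordering matches split's alphabetical suffixes. A sketch with hypothetical file names:

```shell
# Split a large file into 1 GB pieces named bigfile.part.aa, .ab, ...
split -b 1000000000 bigfile.iso bigfile.part.

# Later, glue the pieces back together; glob order (aa, ab, ...) is the split order
cat bigfile.part.* > bigfile.iso.restored

# Verify the reassembled file matches the original
md5sum bigfile.iso bigfile.iso.restored
```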

BTW, how many bytes did smbclient calculate for the file that you are trying to copy?
 
Old 12-03-2004, 06:57 AM   #8
alec77
LQ Newbie
 
Registered: Jul 2004
Posts: 12

Original Poster
Rep: Reputation: 0
OK, I think the problem must be with smbmount. I am copying the file nightly in a script, so I am using something like this:

smbmount //machine/share /mnt/share -o username=user,password=password
split -b 650000000 /mnt/share/bigfile.ext /backup/bigfile.

When I mount the share with smbmount and do an ls -l of the directory it is mounted in, the file is reported as a crazy huge 18 exabytes. If I connect with the smbclient command and do an ls, it reports the correct size.

So smbclient is working correctly for me but smbmount is not. Does anyone know of a way I can use the smbclient command in a script instead of smbmount? Will something like this work?

cd /backup
smbclient //machine/share
split -b 650000000 /mnt/share/bigfile.ext
exit
#rest of script................

Any help is appreciated.
 
Old 12-03-2004, 09:31 AM   #9
alec77
LQ Newbie
 
Registered: Jul 2004
Posts: 12

Original Poster
Rep: Reputation: 0
OK, I've figured out how to script the smbclient command; here it is:

smbclient //machine/share -U username%password -c "lcd /backup; get filename"

I've tested it and it works on small files; what I'm worried about is that it will hang when I run it on the large 3GB file. Does anyone know how I could pipe the above command to something like split, which would allow me to split the file up into smaller portions that can be transferred by smbclient?
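One approach that avoids a full-size local copy is smbclient's tar mode (-T), which can write its archive to stdout (using "-" as the tar file name) and be piped straight into split. A sketch, untested against your server; machine, share, username, password, and bigfile.ext are placeholders taken from your posts, and note the output is a tar stream, so you untar after reassembly:

```shell
# Pull bigfile.ext via smbclient tar mode to stdout, splitting the stream
# into 650 MB CD-sized pieces under /backup/
smbclient //machine/share -U username%password -Tc - bigfile.ext \
  | split -b 650000000 - /backup/bigfile.tar.

# To restore later: concatenate the pieces in glob order and unpack the tar
cat /backup/bigfile.tar.* | tar xvf -
```

Since split is consuming a pipe rather than a file, nothing larger than one 650 MB piece ever has to exist in memory or as a single file on the ext3 side.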
 
  

