LinuxQuestions.org

Wilbs 08-11-2004 03:25 PM

Redhat 9.0 >2GB file limit
 
Hi,

First post here so go easy! :-)

I have a vanilla Red Hat 9.0 install with all the latest updates, running the 2.4.20-31.9smp kernel.

At boot time I successfully mount an smbfs file system from a Windows 2000 server. I can read from and write to this share from the Linux box just fine. However, if I try to copy a file larger than 2GB to this mount, it fails with an error message similar to 'file size exceeded'. The same is true if I mount using smbmount, and I also have this issue with NFS.
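
For reference, the commands I'm using look roughly like this (the server, share, and path names here are just placeholders):

    # mount the Windows 2000 share at boot (smbfs on the 2.4 kernel)
    mount -t smbfs -o username=someuser //w2kserver/share /mnt/win

    # copying anything over 2GB dies partway through
    dd if=/dev/zero of=/mnt/win/bigtest bs=1M count=2100
    # -> 'File size limit exceeded' once the file hits 2147483647 bytes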

I can also repeat this issue using ftp to and from the box.

This is all installed on an ext3 filesystem, which does support large files; I have successfully created an 8GB file locally without issue.
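
The local test was along these lines (the path is just an example):

    # create an 8GB file on the local ext3 filesystem
    dd if=/dev/zero of=/data/bigfile bs=1M count=8192
    ls -l /data/bigfile    # 8589934592 bytes, no error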

I have read about numerous occurrences of this on other forums but have yet to find a definitive solution.

I guess my questions are:

1) Does the stock kernel shipped with RH9 come built with LFS (Large File Support)? If not, what patches need to be applied? (A quick way to check for LFS is sketched after this list.)

2) Does the bundled Samba (2.2.7a) support LFS? If not, does the newer 3.0 release support it?

3) What FTP client/server supports large file transfers?

4) Anybody know of a list of apps that do or do not support large files specific to RH9?

5) <small rant> How can HP/Compaq/Dell certify RH9 to work with high-end workstations when the default config fails to support something like this? Some MSC analysis packages, for example, quite easily generate files in excess of 2GB. </small rant>
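
In case it helps anyone answer question 1, here is the kind of check I have in mind (a rough sketch; on glibc systems getconf reports the LFS compile flags, and any app built without them is stuck at 2GB no matter what the kernel and filesystem support):

    # ask glibc what flags a program needs for 64-bit file offsets
    getconf LFS_CFLAGS     # typically prints: -D_FILE_OFFSET_BITS=64
    getconf LFS_LDFLAGS

    # an app compiled without these flags uses a 32-bit off_t and
    # fails with EFBIG / 'File size limit exceeded' at the 2GB mark
    gcc $(getconf LFS_CFLAGS) $(getconf LFS_LDFLAGS) app.c -o app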

TIA.

huntz 08-11-2004 05:57 PM

I think it's a combination of Samba and NTFS.

I had the same problem. However, if I use NFS to copy the files to another Linux server, I have no problems at all.

In my case I was backing up an array, copying tar files to a NAS server running Windows 2000 Advanced Server. It would reach 2GB and fail. So I set up a Linux server with a 250GB hard drive and mounted it using NFS. No problems at all; I'm copying 20GB tar files across now with no 'file size exceeded' errors.
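
The setup is roughly this (hostnames and paths are made up; one thing to watch is that the client negotiates NFSv3, since the NFSv2 protocol itself caps file size at 2GB):

    # /etc/exports on the Linux NFS server
    /backup    client.example.com(rw,sync)

    # reload the export table after editing
    exportfs -ra

    # on the client; nfsvers=3 matters, NFSv2 stops at 2GB
    mount -t nfs -o nfsvers=3 nfsserver:/backup /mnt/backup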

I've searched everywhere for info on the Samba/NTFS issue but haven't found anything.

Wilbs 08-12-2004 01:26 PM

If I repeat the process using 'mount -t cifs' instead of smbfs on a Gentoo install running a 2.6 kernel, it works just fine.
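
For anyone who finds this later, the working mount is along these lines (names are placeholders again; as far as I can tell the cifs client in 2.6 kernels was written with large file support, which the old smbfs client predates):

    # on the 2.6 (Gentoo) box
    mount -t cifs //w2kserver/share /mnt/win -o username=someuser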

