LinuxQuestions.org > Forums > Linux Forums > Linux - Newbie
Old 07-09-2008, 04:58 AM   #1
slideh
LQ Newbie
 
Registered: Jul 2008
Posts: 29

Rep: Reputation: 15
Samba file size limit 2GB


I am running RH9.

Is there a way of getting around the 2 GB file size limit so I can work across the network with files larger than this?

Thank you
 
Old 07-09-2008, 05:43 AM   #2
billymayday
LQ Guru
 
Registered: Mar 2006
Location: Sydney, Australia
Distribution: Fedora, CentOS, OpenSuse, Slack, Gentoo, Debian, Arch, PCBSD
Posts: 6,678

Rep: Reputation: 122
Which Samba limit are you referring to? Any limitation is probably the result of the file system you are using. How is your partition formatted? It is also possible that RH9 (which is very out of date and unsupported) has a limitation of its own.
 
Old 07-09-2008, 06:16 AM   #3
slideh
LQ Newbie
 
Registered: Jul 2008
Posts: 29

Original Poster
Rep: Reputation: 15
Quote:
Originally Posted by billymayday
What Samba limit do you refer to? Any limitation is probably the result of the file system you are using. How is your partition formatted? It is also possible that RH9 (which is very out of date and unsupported) has a limitation.
I can create unlimited-ish* files on the local hard drive, and I can create them on the host server from a Windows machine, but I can't create or copy them across the network from the Linux machine.

Would I be right in assuming, therefore, that it is not a matter of the Linux file system being used or of the way the partitions have been formatted?

* I have created image files of 15 GB and more
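The local test described above can be reproduced with a sparse file just over 2 GB; a minimal sketch (the paths are placeholders, not values from the thread):

```shell
# Create a sparse file just over 2 GB without writing 2 GB of data:
# write 1 MiB at an offset of 2049 MiB, giving a 2050 MiB file.
# Point the path at the smbfs mount instead of /tmp to test the share.
dd if=/dev/zero of=/tmp/lfs-test.img bs=1M count=1 seek=2049
ls -l /tmp/lfs-test.img
rm /tmp/lfs-test.img
```

If the same command succeeds locally but fails on the mount, the limit is in the network layer rather than in the disk file system.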
 
Old 07-09-2008, 06:18 AM   #4
slideh
LQ Newbie
 
Registered: Jul 2008
Posts: 29

Original Poster
Rep: Reputation: 15
Quote:
Originally Posted by slideh
I am running RH9.

Is there a way of getting around the 2 GB file size limit so I can work across the network with files larger than this?

Thank you


PS: I have mounted the Windows server with

mount -t smbfs //windowsserver /mnt/windowsserver -o lfs,username=(),password=()
 
Old 07-09-2008, 06:55 AM   #5
stress_junkie
Senior Member
 
Registered: Dec 2005
Location: Massachusetts, USA
Distribution: Ubuntu 10.04 and CentOS 5.5
Posts: 3,873

Rep: Reputation: 332
2 GB is a common file size limit with FAT file systems (FAT32 itself allows files up to 4 GB, but tools that use signed 32-bit offsets stop at 2 GB). Traditional ISO 9660 CD-ROM file systems have a similar limit; the UDF format raises it.

You might also hit a file size limit on ext2/ext3 if the file system was formatted with a small block size. For example, I have experienced a file size limit of about 17 GB on ext2/ext3 when I deliberately formatted a file system with a 1024-byte block size. (The inode count limits how many files you can create, not how large they can be.)

So there you are: the limit depends on the file system format and, above all, on the block size chosen when the file system was created.

The resolution is to avoid FAT32 for storing large files and to format with a larger block size. If you are using FAT32 and need compatibility with Windows, consider NTFS instead. Otherwise, any of the modern file systems used on Linux today should allow much larger files when the block size is 4096 bytes.

Last edited by stress_junkie; 07-09-2008 at 07:04 AM.
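The block-size dependence described above can be checked directly with `stat`; a small sketch (the path is illustrative):

```shell
# Report the file system type and block size for the volume holding a path.
# An ext2/ext3 volume formatted with 1 KiB blocks caps files near 16 GiB;
# the common 4 KiB block size allows far larger files.
stat -f -c 'fs type: %T, block size: %s bytes' /tmp
```

On ext2/ext3 specifically, `tune2fs -l /dev/device` also reports the block size chosen at format time.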
 
Old 07-09-2008, 07:01 AM   #6
slideh
LQ Newbie
 
Registered: Jul 2008
Posts: 29

Original Poster
Rep: Reputation: 15
Quote:
Originally Posted by stress_junkie
2 GB is the typical file size limit in FAT32. It is also a limit in CD-ROM file systems in the traditional format. CD-ROM file size limit can be doubled by using UDF format.

You might experience a 2 GB file size limit in other file systems, such as NTFS or even ext2 if you don't have a generous number of inodes and/or if your partition cluster size is too small. This issue can cause various limits of file sizes. I have experienced a 17 GB file size limit in ext2/ext3 when I deliberately formatted a file system with a small cluster size.

So there you are. The issue is about what file system format you are using and the number of inodes and the disk block cluster size. These characteristics work together to form some limit on file size.



I can create unlimited-ish* files on the local hard drive, and I can create them on the host server from a Windows machine, but I can't create or copy them across the network from the Linux machine.

Would I be right in assuming, therefore, that it is not a matter of the Linux file system being used or of the way the partitions have been formatted?

* I have created image files of 15 GB and more
 
Old 07-09-2008, 07:08 AM   #7
stress_junkie
Senior Member
 
Registered: Dec 2005
Location: Massachusetts, USA
Distribution: Ubuntu 10.04 and CentOS 5.5
Posts: 3,873

Rep: Reputation: 332
Quote:
Originally Posted by slideh
I can create unlimited-ish* files on the local hard drive, and I can create them on the host server from a Windows machine, but I can't create or copy them across the network from the Linux machine.

Would I be right in assuming, therefore, that it is not a matter of the Linux file system being used or of the way the partitions have been formatted?

* I have created image files of 15 GB and more
I am not aware of a file size limit for Samba or for network file shares. You may have to use Google or read the Samba documentation at www.samba.org.
 
Old 07-09-2008, 07:24 AM   #8
michaelk
Moderator
 
Registered: Aug 2002
Posts: 14,931

Rep: Reputation: 1520
Yes, Samba did have a 2 GB file limitation, but I do not remember what the default was for RH9. Try using the lfs option, i.e.
mount -t smbfs -o lfs //server/share /mount/point

As stated, if you do not need to run a legacy application, it would be best to upgrade to the latest Fedora.
 
Old 07-09-2008, 07:32 AM   #9
syg00
LQ Veteran
 
Registered: Aug 2003
Location: Australia
Distribution: Lots ...
Posts: 14,834

Rep: Reputation: 1820
Sounds like the (old) lfs problem. Otherwise, is cifs an option?

Edit: too slow typing again - back to sleep for me ...

Last edited by syg00; 07-09-2008 at 07:34 AM.
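For reference, a cifs mount of the kind syg00 suggests might look like the following; the server, share, and credentials are placeholders, not values from the thread:

```shell
# Mount a Windows share with the cifs file system, which handles files
# over 2 GB without the smbfs lfs workaround. Requires root and a
# reachable server, so this is a sketch rather than a runnable demo.
mount -t cifs //windowsserver/share /mnt/windowsserver \
    -o username=myuser,password=mypass
```

cifs replaced smbfs in later kernels, so on anything newer than RH9 it is the natural choice.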
 
Old 07-09-2008, 07:54 AM   #10
slideh
LQ Newbie
 
Registered: Jul 2008
Posts: 29

Original Poster
Rep: Reputation: 15
Quote:
Originally Posted by michaelk
Yes, samba did have a 2GB file limitation but I do not remember what the default was for RH9. Try using the lfs option i.e.
mount -t smbfs -o lfs //server/share /mount/point

As stated if you do not need to run a legacy application it would be best to upgrade to the latest Fedora.


This is how I mounted the Windows server:

mount -t smbfs //windowsserver /mnt/windowsserver -o lfs,username=(),password=()

But it still didn't allow files over 2 GB.
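One thing worth checking at this point (an editor's suggestion, not something tried in the thread) is whether the lfs option actually made it into the mount table:

```shell
# List smbfs entries and their mount options; if 'lfs' does not appear
# here, the option was silently dropped and the 2 GB limit remains.
grep smbfs /proc/mounts || echo "no smbfs mounts found"
```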
 
Old 07-09-2008, 10:36 AM   #11
trickykid
LQ Guru
 
Registered: Jan 2001
Posts: 24,149

Rep: Reputation: 234
RH9 used Samba 2.2. This wasn't an issue with Samba or the file system per se, but with the smbmount tools and the kernel. Patches were available, but the site doesn't seem to be around any longer.

I'd recommend upgrading Samba to get around this 2 GB issue. Red Hat 9 is old, and there have been plenty of updates, security fixes, and enhancements since, which you'll probably benefit from. You may want to try a newer version of Red Hat; if you can't afford it, go with something like CentOS, which is a clone of Red Hat ES/AS.
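Before upgrading, it is worth recording what is actually installed; a sketch assuming an RPM-based system such as RH9:

```shell
# Show the running kernel and the installed Samba package version;
# large-file smbfs support depended on both on 2.4-era systems.
uname -r
rpm -q samba 2>/dev/null || echo "samba package not queried (rpm unavailable)"
```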
 
Old 07-09-2008, 01:55 PM   #12
kohtwelay
LQ Newbie
 
Registered: Jul 2008
Posts: 2

Rep: Reputation: 0
How do I create a cluster if I have three computers?
Please reply to me by mail with the commands.
 
Old 07-09-2008, 02:44 PM   #13
trickykid
LQ Guru
 
Registered: Jan 2001
Posts: 24,149

Rep: Reputation: 234
Quote:
Originally Posted by kohtwelay
how to create cluster if i have three computers.
please reply me by mail with commands
Perhaps you should start a new thread instead of hijacking an existing one that has nothing to do with your actual question. And don't ask members to email you, that's not the point of forums.
 
  


Tags: file size limit