Old 03-21-2007, 05:54 AM   #1
xpucto
Member
 
Registered: Sep 2005
Location: Vienna, Austria
Distribution: Mint 13
Posts: 524

Rep: Reputation: 31
apache: is there a maximum file size?


Hi!

I run Apache 2.0.53 and it seems that Apache doesn't allow access to some files when they are too big. Let me explain: in one directory I have a few files (zip, iso, tar.bz2, rar, ...). With the browser I can reach all the files that are up to 1.3GB. But I have one file that is 2.7GB and another that is 3.8GB, and when I enter their URLs I get the message
Quote:
Forbidden
You don't have permission to access......
Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request.
The rights and owners are exactly the same for all files, and all files are in the same directory, so I suppose that the problem has to do with the size of the files. Is there a line in httpd.conf that I could change in order to access those files?
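To show what I mean, this is roughly what I am checking; the host name and file names below are placeholders, not the real ones:
Code:
# Everything in the directory has the same owner and mode; only the sizes differ:
ls -lh /home/myuser/public_html/

# The big files give 403, the smaller ones work:
curl -I http://myserver.example.org/~myuser/bigfile.rar
curl -I http://myserver.example.org/~myuser/smallfile.iso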

thanks for any advice.
 
Old 03-21-2007, 07:19 AM   #2
reddazz
LQ Guru
 
Registered: Nov 2003
Location: N. E. England
Distribution: Fedora, CentOS, Debian
Posts: 16,298

Rep: Reputation: 77
Moved: This thread is more suitable in the Linux - Server forum and has been moved accordingly to help your thread/question get the exposure it deserves.
 
Old 03-21-2007, 11:52 PM   #3
Sagebrush Gardener
LQ Newbie
 
Registered: Mar 2007
Posts: 29

Rep: Reputation: 15
I couldn't find a definitive answer, but I did see some posts that suggest there is a 2GB limit on Apache downloads. For example:

http://www.linuxquestions.org/questi...hreadid=387137

Apache has a LimitRequestBody directive to limit request size, and the maximum you can set it to is 2GB:

http://httpd.apache.org/docs/1.3/mod...mitrequestbody

You might check your error_log for additional clues, but I would recommend using FTP instead for files of this size.
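If it helps, both of those checks can be done quickly from a shell; the paths below assume a Red Hat-style layout (Debian-style systems use /etc/apache2/ and /var/log/apache2/ instead):
Code:
# See whether a LimitRequestBody value is configured anywhere:
grep -ri "LimitRequestBody" /etc/httpd/conf/ /etc/httpd/conf.d/ 2>/dev/null

# Watch the error log while retrying the download in the browser:
tail -f /var/log/httpd/error_log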
 
Old 03-22-2007, 05:01 AM   #4
xpucto
Member
 
Registered: Sep 2005
Location: Vienna, Austria
Distribution: Mint 13
Posts: 524

Original Poster
Rep: Reputation: 31
Quote:
Originally Posted by Sagebrush Gardener
I couldn't find a definitive answer, but I did see some posts that suggest there is a 2GB limit on Apache downloads. For example:

http://www.linuxquestions.org/questi...hreadid=387137

Apache has a LimitRequestBody directive to limit request size, and the maximum you can set it to is 2GB:

http://httpd.apache.org/docs/1.3/mod...mitrequestbody

You might check your error_log for additional clues, but I would recommend using FTP instead for files of this size.
Thanks! I really have to get into the habit of looking in the error logs! I would have seen this:
Quote:
[Tue Mar 20 18:44:31 2007] [error] [client yyy.xxx.xxx.zzz] (75)Value too large for defined data type: access to /~myuser/mydata.rar failed
This 2GB limit doesn't make things any easier for me. I do not want to install an FTP server just because of a very few files that concern only one person. I guess the person will have to use scp then (I hope there isn't any limit there!).
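For what it's worth, the scp route is a one-liner anyway; the host and paths here are made up, so substitute your own:
Code:
# Push the large file straight to the other person's machine over SSH.
# scp itself should not hit a 2GB limit on a system built with large-file support.
scp /home/myuser/public_html/mydata.rar someone@their-host.example.org:/tmp/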
 
Old 03-22-2007, 04:48 PM   #5
Sagebrush Gardener
LQ Newbie
 
Registered: Mar 2007
Posts: 29

Rep: Reputation: 15
If you really want to use HTTP for this, you can probably find a utility to chop the file up into smaller pieces. Then the user can download the pieces and put them back together. I don't know if it will be worth the trouble, though.
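For example, with the standard split utility (file names are just placeholders):
Code:
# On the server: cut the archive into pieces that stay under the 2GB limit.
split -b 1000m mydata.rar mydata.rar.part_

# On the client, after downloading all the pieces over HTTP:
cat mydata.rar.part_* > mydata.rar
md5sum mydata.rar    # compare against a checksum published next to the pieces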
 
Old 03-23-2007, 04:07 AM   #6
xpucto
Member
 
Registered: Sep 2005
Location: Vienna, Austria
Distribution: Mint 13
Posts: 524

Original Poster
Rep: Reputation: 31
Quote:
Originally Posted by Sagebrush Gardener
If you really want to use http for this, you can probably find a utility to chop up the file into smaller pieces. Then the user can download the pieces and put them back together. I don't know if that will be worth the trouble though.
Probably not. Some users want to deliver big files to other users of the server, so this problem isn't part of the main activity of the server (it's a webserver). For those big files they will have to use scp, or make smaller archives if they think it's worth it.
 
Old 03-28-2007, 08:36 AM   #7
pk21
Member
 
Registered: Jun 2002
Location: Netherlands - Amsterdam
Distribution: RedHat 9
Posts: 549

Rep: Reputation: 30
Not sure, but isn't the 2GB limit a limitation of the 2.4 kernel instead of an Apache limitation? If so, you could try upgrading to kernel 2.6.
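One quick way to see what is actually in play; the file path is just an example, and the Apache binary may be called apache2 or live in /usr/sbin/ depending on the distribution:
Code:
uname -r                                     # running kernel version
ls -lh /home/myuser/public_html/mydata.rar   # the 2.7GB file already exists on disk, so the
                                             # kernel/filesystem side seems to cope with >2GB
httpd -V                                     # Apache version and compile-time settings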
 
  

