Linux - Server
This forum is for the discussion of Linux software used in a server-related context.
I run Apache 2.0.53 and it seems that Apache refuses access to some files when they are too big. Let me explain: in one directory I have a few files (zip, iso, tar.bz2, rar, ...). I can reach all files up to 1.3 GB with the browser. But I have one file that is 2.7 GB and another that is 3.8 GB, and when I enter their URLs I get the message
Quote:
Forbidden
You don't have permission to access......
Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request.
The permissions and owners are exactly the same for all files, and all files are in the same directory. So I suppose that the problem has to do with the size of the files. Is there a line in httpd.conf that I could change in order to access those files?
You might check your error_log for additional clues, but I would recommend using FTP instead for files of this size.
Thanks! I really have to get into the habit of looking in the error logs! I would have seen this:
Quote:
[Tue Mar 20 18:44:31 2007] [error] [client yyy.xxx.xxx.zzz] (75)Value too large for defined data type: access to /~myuser/mydata.rar failed
This 2 GB limit doesn't make my business easier. I do not want to install an FTP server just for the very few files that concern only one person. I guess the person will have to use scp then (I hope there isn't any limit there!).
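For context, errno 75 (EOVERFLOW, "Value too large for defined data type") is what stat() returns when a file's size does not fit in a 32-bit off_t, whose maximum is 2^31 - 1 = 2147483647 bytes; Apache 2.0 builds without large-file support (-D_FILE_OFFSET_BITS=64) hit exactly this on anything over 2 GB. A rough sketch of the boundary using sparse files, which occupy almost no real disk space (the filenames are made up for the demo):

```shell
# 2^31 - 1 = 2147483647 bytes is the largest size a 32-bit off_t can hold;
# anything larger makes a non-LFS stat() fail with EOVERFLOW (errno 75).
# Sparse files let us create these sizes without consuming real disk space.
dd if=/dev/zero of=under_2g.bin bs=1 count=0 seek=2147483647 2>/dev/null
dd if=/dev/zero of=over_2g.bin  bs=1 count=0 seek=3G         2>/dev/null

# Both stat fine on a modern 64-bit system, but a 32-bit Apache 2.0 built
# without large-file support can only serve the first one.
stat -c '%n %s' under_2g.bin over_2g.bin
```

The first file is exactly at the 32-bit ceiling; the second (3 GB) is the kind of file that triggered the 403 above.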
If you really want to use http for this, you can probably find a utility to chop up the file into smaller pieces. Then the user can download the pieces and put them back together. I don't know if that will be worth the trouble though.
Quote:
If you really want to use http for this, you can probably find a utility to chop up the file into smaller pieces. Then the user can download the pieces and put them back together. I don't know if that will be worth the trouble though.
Probably not. Some users want to deliver big files to other users of the server, so this problem doesn't constitute the main activity of the server (it's a web server). For those big files, they will have to use scp, or make smaller archives if they think it's worth it.
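Should anyone go the split-and-reassemble route suggested above, it can be sketched with plain coreutils. The filenames below are placeholders for a small demo; in real use something like `split -b 1000m mydata.rar mydata.rar.part_` keeps every piece well under the 2 GB ceiling:

```shell
# Small random file standing in for the multi-GB archive.
dd if=/dev/urandom of=archive.bin bs=1k count=16 2>/dev/null

# Chop into pieces; split names them archive.bin.part_aa, _ab, ...
split -b 4k archive.bin archive.bin.part_

# Receiving side: concatenate the pieces in name order and verify.
cat archive.bin.part_* > reassembled.bin
cmp archive.bin reassembled.bin && echo "reassembled copy matches"
```

Publishing an md5sum alongside the pieces lets the downloader confirm the reassembled file is intact before unpacking it.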