File System Limitations
Are there any?
I've been writing my own software for my site: it monitors all incoming/outgoing traffic, downloads and bandwidth, blocks traded passwords, does the toplist, affiliates, and handles the news, galleries, comments, and forum. All within the one user account... It 'stopped working' early this morning, in the sense that it stopped creating new directories (which the traffic/toplist script does for every IP number) and stopped writing information to the text files it uses for user registration and for the comments/forum. It was still creating directories/files, but the text files were empty. I've since deleted many tens of thousands of directories/files from last year's traffic logs, and it started working again! Did I hit some kind of a ceiling? |
Every file system has some limit on the number of files and directories.
As for what they are, I'm pretty sure the limits on modern file systems (and their root directories) are very large. But then again, you mentioned very large numbers of directories, so tell us what file system you're using :) |
I'm not sure, I'm on a virtual account with DirectAdmin. I think it's a Red Hat distro. It's probably the default setting.
Is there a way of finding out for sure or do I have to ask my host? |
If you are at a console and type mount, it will show you the currently mounted file systems and their types.
|
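As an aside for anyone trying this later, the console check is only a line or two. A sketch, assuming a Linux box (the file system type is the third field of mount's output):

```shell
# List mounted file systems: "device on mountpoint type fstype (options)"
mount

# Or ask df for the type of the file system holding the current
# directory; -T adds a "Type" column to the report.
df -T .
```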
I haven't got telnet/ssh access. Is there a perl command, or simple perl script that will tell me?
|
Hmmmm... possibly, if it can extract the information from the /proc file system. Sorry, I know nothing of Perl. |
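Following up on the /proc idea: on Linux the mount table is exposed as a plain text file, /proc/mounts, so a Perl (or any other) script can get at it without shell access just by reading that file. A sketch of the idea in shell:

```shell
# /proc/mounts has one whitespace-separated line per mount:
#   device  mountpoint  fstype  options  dump  pass
# Print the file system type of the root mount.
awk '$2 == "/" { print $3 }' /proc/mounts
```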
I'll ask my host. I also lost all email on that domain for the period in question, although directadmin reports I've still got about 1.5GB space left on the account, which made me think about the directories, especially since it all started working again once I started deleting the old stuff.
In the meantime, does anyone have any figures for how many directories/files the various file systems can handle, and would less than a million directories (with a few small files in each) be a problem for any of them? |
Sounds like an inode problem - ask your host provider to supply some numbers for your filesystems if you can't get to a console.
|
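To make the inode suggestion concrete: if you do eventually get console access, GNU df can report inode usage directly with -i, which is the quickest way to tell "out of inodes" apart from "out of space". A sketch:

```shell
# Block (space) usage, human-readable.
df -h .

# Inode usage: if IUse% is at 100% while the space report above still
# shows free room, new files/directories will fail exactly as described.
df -i .
```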
Here you go:
http://en.wikipedia.org/wiki/Comparison_of_file_systems is a good list, but it doesn't show the root directory limits. I'm still looking and will update this post when I find the info. |
Thanks for the link!
It doesn't say how many directories or files you can have though. It does say there's 'no limits' on the pathname though (except ods-5 and udf), so I'm assuming that means the number of directories within directories (and therefore the number of directories as a whole) is limited only by the size of the disk? |
Oh, and if the number of directories/files wasn't the problem, could it be hackers?
I've seen lots of Germans from Deutsche Telekom popping up in the traffic logs, sniffing around for password scripts! |
Yeah, that's what I'm finding too: the number of directories is limited by disk space. Hmmm, I dunno about hackers, though.
|
Might be space, might be inodes - you need to get data from your provider as I said.
As an experiment, I had a look at a partition of mine, just under a gig in total size. Space was 30% used, inodes were 16% used. I created 10,000 directories (no files, just directories), and the numbers became: space 34% used, inodes 34% used. It doesn't take much imagination to conceive of a situation where the inodes are completely used up but actual disk space remains. |
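For anyone who wants to repeat that experiment, here's roughly how it can be done (scaled down to 1,000 directories, with the scratch path coming from mktemp rather than a real partition). Each empty directory costs one inode plus a block or so for its contents, which is why inodes can run out well before space does:

```shell
scratch=$(mktemp -d)          # throwaway directory on some file system

df -i "$scratch"              # inode usage before

# Create 1,000 empty directories; each one consumes an inode.
for i in $(seq 1 1000); do
    mkdir "$scratch/dir$i"
done

df -i "$scratch"              # IUsed should have grown by roughly 1,000

count=$(ls "$scratch" | wc -l)
echo "created $count directories"
rm -rf "$scratch"
```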
So you've got a backup for the internet then? ;)
|