Linux - Software: This forum is for Software issues.
Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.
I know this is because there are too many files in the directory. Apart from using the "find" command, how can I fix it? Is it because there is not enough memory to list all the files? If so, how can I change a parameter to increase the memory? Thanks.
The "argument list too long" error is due to a kernel limitation relative to the number of characters that can be processed in the arguments list. I don't know if this limitation can be tweaked without recompiling the kernel, nor if it would be safe to do this. Anyway, you can also process one file at a time by the method you mentioned or by a simple for loop, e.g.
Code:
for file in *.txt ; do grep "pattern" "$file" ; done
If some filenames contain blank spaces, quote the variable ("$file", as above) so that the names are not split into two or more parts. A similar concern applies when you use find; its -print0 option prevents this behavior.
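To illustrate (a minimal sketch; the /tmp/argdemo directory, the file names, and the "pattern" string are all placeholders), quoting "$file" keeps a name containing a space in one piece, and the find -print0 / xargs -0 pipeline avoids shell expansion of the file list entirely:

```shell
#!/bin/sh
# Create a small demo directory with a file whose name contains a space
mkdir -p /tmp/argdemo && cd /tmp/argdemo
printf 'pattern here\n' > "file one.txt"
printf 'no match\n'     > plain.txt

# Quoting "$file" keeps "file one.txt" as a single argument to grep
for file in *.txt ; do grep -l "pattern" "$file" ; done

# Alternative: find hands the names to grep NUL-terminated via xargs,
# so neither word splitting nor the argument-list limit is an issue
find . -maxdepth 1 -name '*.txt' -print0 | xargs -0 grep -l "pattern"
```

Both commands report only "file one.txt", since plain.txt does not contain the pattern.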
The "argument list too long" error is due to a kernel limitation relative to the number of characters that can be processed in the arguments list. I don't know if this limitation can be tweaked without recompiling the kernel, nor if it would be safe to do this. Anyway, you can also process one file at a time by the method you mentioned or by a simple for loop, e.g.
Code:
for file in *.txt ; do grep "pattern" $file ; done
if some filename contains blank spaces, you have to set the IFS (input field separator) in order to avoid names splitted in two or more parts. A similar approach applies when you use find with the -print0 option to prevent this behavior.
thx reply , use "find" can solve my problem , however , other users have the same problem , I don't want to change the command they are used to apply , on the other hand , I hv many scripts need to change if use find , can advise other solution ? thx
I'm afraid there is no other solution. The limit is set as a kernel parameter (ARG_MAX), and on Linux the only way to increase this value is to recompile the kernel itself. That is not advisable, though, since the limit is tied to memory allocation during the execution of commands.
You can retrieve this value for your system by issuing
Code:
getconf ARG_MAX
or looking at /usr/include/linux/limits.h
You can find a good explanation of the ARG_MAX limit in the Coreutils FAQ, here and an even more detailed one, here.
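As a practical illustration of why xargs sidesteps the limit (a sketch only; the /tmp/manyfiles directory, the file count, and the search pattern are made up for the demo), xargs splits the file list coming from find into several grep invocations, each kept below ARG_MAX, so even a very large directory can be searched:

```shell
#!/bin/sh
# Print the kernel limit on the argument list, in bytes
getconf ARG_MAX

# Build a directory with many small files to search through
mkdir -p /tmp/manyfiles && cd /tmp/manyfiles
i=0
while [ "$i" -lt 500 ]; do
    echo "pattern" > "file$i.txt"
    i=$((i + 1))
done

# A plain "grep pattern *.txt" could exceed ARG_MAX with enough files;
# xargs batches the names so each grep command line fits under the limit
find . -name '*.txt' -print0 | xargs -0 grep -l "pattern" | wc -l
```

With 500 matching files the final count printed is 500, no matter how long the combined file names are.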