Old 01-13-2012, 02:08 PM   #1
qweeak
LQ Newbie
 
Registered: Jan 2012
Posts: 24

Rep: Reputation: 2
Too many arguments error


Hi Guys,

When I run some bash commands, I get a "Too many arguments" error. I'm generalizing a bit here. I know we can pipe the output to another command and it works fine. For example:

Code:
rm -rf *             # error
ls * | xargs rm -rf  # success

Some commands' output can't be piped correctly. So is there a method to avoid this error other than piping or looping? Will setting ARG_MAX to a huge number work? Which is the best shell for this kind of operation involving a huge number of arguments?
 
Old 01-13-2012, 03:07 PM   #2
T3RM1NVT0R
Senior Member
 
Registered: Dec 2010
Location: Internet
Distribution: Linux Mint, SLES, CentOS, Red Hat
Posts: 2,385

Rep: Reputation: 477

Hi qweeak,

Welcome to LQ!!!

The command rm -rf * works perfectly fine for me. Could you let us know where you are running this command, and what the contents of that directory are?

It would also be good if you could let us know which distribution you are using.
 
Old 01-14-2012, 12:53 AM   #3
Nominal Animal
Senior Member
 
Registered: Dec 2010
Location: Finland
Distribution: Xubuntu, CentOS, LFS
Posts: 1,723
Blog Entries: 3

Rep: Reputation: 948
Quote:
Originally Posted by qweeak View Post
Some commands' output can't be piped correctly.
Yes they can, if you use ASCII NUL (zero byte) as the separator. For example,
Code:
find . -mindepth 1 -maxdepth 1 -print0 | xargs -r0 command...
will run command..., splitting the directory entries of the current directory into as many groups as necessary. It will work for all possible file names and all possible command-line arguments, regardless of what characters they may contain. (You should make sure you run that using the C or POSIX locale, i.e. set both the LANG and LC_ALL environment variables to C. That way the commands will work correctly regardless of the character set used for the file names, even when more than one character set is used.)
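For the rm case in the original post, a minimal sketch (assuming GNU find and xargs, and that the target is the current directory) would be:
Code:
# Use the C locale so file names in any character set pass through unmangled
export LANG=C LC_ALL=C

# Remove everything in the current directory, letting xargs split the list
# into as many rm invocations as needed; -r skips rm entirely if find
# produces no output, and -- protects against names starting with a dash.
find . -mindepth 1 -maxdepth 1 -print0 | xargs -r0 rm -rf --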

If you have multiple parallel data streams, you will have to use temporary files. There, too, you'll need to use NULs as separators to support all possible strings.

For binary data, you'll need to use a separate file for each logical parameter.

You see, all POSIX systems use the NUL (\0) as an end-of-string mark, internally. It is the only byte value you cannot use in a parameter string given to a syscall. (Binary data is only supported by specific syscalls, and even then, the length of the binary data is always exactly specified. In the command line, such binary data is always specified as encoded strings -- for example, consider IP addresses, MAC addresses, device numbers, and so on. Thus, you almost always need to only worry about passing string data correctly.)

The problem, of course, is that not all shells and utilities support NUL separators at all. (For example, I don't know of any shell that supports NUL as the internal field separator, IFS.)

Some commands, such as GNU tar (--null -T filename), can read file names from a file instead of the command line, using NUL separators. This is important when you need to supply more parameters to a single command than the kernel might allow. Again, this is an extra feature only available in some commands -- although in general, it is actually quite simple to write a patch to add this functionality to most utilities.
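As a sketch of that tar feature (the archive name and the find expression are just placeholders here), the name list can also be fed over a pipe by using -T -:
Code:
# Archive every regular file under the current directory; tar reads the
# NUL-separated name list from standard input (-T -) instead of argv.
find . -type f -print0 | tar -czf backup.tar.gz --null -T -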

Quote:
Originally Posted by qweeak View Post
Which is the best shell for this kind of operation involving huge no of arguments ?
Bash, because Bash's read built-in supports the NUL separator internally (-d ""). GNU find has -print0 and also supports -printf "stuff\0", so you can use it for everything regarding directories and files. GNU awk (gawk) also supports RS="\0" and FS="\0", so it can very easily filter result lists that use NUL separators.
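A small sketch of such a gawk filter (the .log pattern is only an example), keeping the list NUL-separated end to end:
Code:
# Keep only names ending in ".log" from a NUL-separated list,
# and emit the result NUL-separated as well.
find . -type f -print0 |
gawk 'BEGIN { RS = "\0"; ORS = "\0" } /\.log$/' |
xargs -r0 ls -ld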

Here is a contrived example:
Code:
#!/bin/bash

# Make sure find et al. ignore file name character set, and Just Work.
export LANG=C LC_ALL=C

# "$WORK": automatically removed temporary directory for temp files.
WORK="$(mktemp -d)" || exit $?
trap "rm -rf '$WORK'" EXIT

( if [ $# -gt 0 ]; then
      # Supply all command line parameters to find
      find "$@" -print0
  else
      # No command line parameters, so do a default find
      find . -print0
  fi
) | while read -rd "" FILE ; do

  # Do something with each "$FILE".

  if file "$FILE" | grep -qe executable ; then
      printf '%s\0' "$FILE"
  fi

done > "$WORK/executables"

# List all files containing executable content,
# regardless of whether the files are marked executable or not.
xargs -r0 ls -ld < "$WORK/executables"
There are better ways to do the things the above scriptlet does. I only wanted to illustrate using a subshell as part of a pipe, a while loop to read (and also to emit) strings with NUL separators, and how to use a temporary file (in an automatically removed temporary directory) to store NUL-separated strings. These are usually enough to handle most situations.

(I considered using tar -czf - --null -T "$WORK/executables" as the last line instead; it would have emitted a gzipped tarball containing all executable files, even if there were millions of them -- certainly more than you can ever supply to a single command using command-line parameters.)

Because you provided no specifics, my answer is a bit vaguer than I'd really prefer. If you can provide a few more details -- like whether the number of parameters is the actual problem, or whether the parameters may contain characters that are easily mangled by shells, or whether you need more than one parallel data channel -- I might be able to help you more.
 
4 members found this post helpful.
Old 01-24-2012, 08:25 AM   #4
qweeak
LQ Newbie
 
Registered: Jan 2012
Posts: 24

Original Poster
Rep: Reputation: 2
Thanks guys, really appreciate it.

Regards
qweeak
 
Old 01-24-2012, 04:22 PM   #5
Ramurd
Member
 
Registered: Mar 2009
Location: Rotterdam, the Netherlands
Distribution: Slackwarelinux
Posts: 703

Rep: Reputation: 111
Quote:
Originally Posted by T3RM1NVT0R View Post
Hi qweeak,

Welcome to LQ!!!

The command rm -rf * works perfectly fine for me. Could you let us know where you are running this command, and what the contents of that directory are?

It would also be good if you could let us know which distribution you are using.
AFAIK the maximum number of arguments is 1024 (give or take); it might be that over the years this number has increased. I've seen the "too many arguments" issue before: we had an application that created a small logfile in /tmp on each invocation. It got invoked a few thousand times per day, so eventually they ran out of inodes. Cleaning that up was an exercise in itself. rm -rf would not work due to the number of arguments; piping the output of ls (without the asterisk) and feeding it to xargs helped work around that.
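For reference, the actual limit can be checked on the system itself; a quick sketch (the glob in the second command gives only a rough estimate of how much space an expansion would need):
Code:
# Maximum number of bytes allowed for the argument list plus environment
getconf ARG_MAX

# Rough size, in bytes, that expanding * in the current directory would use
printf '%s\0' * | wc -c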

This was not in reply to the OP, but shows one way of running into an issue like this, which is not as unlikely as it may seem. Nominal went out of his way for a (once again) very detailed post. Worthy reading fodder!
 
1 member found this post helpful.
Old 01-25-2012, 05:10 PM   #6
T3RM1NVT0R
Senior Member
 
Registered: Dec 2010
Location: Internet
Distribution: Linux Mint, SLES, CentOS, Red Hat
Posts: 2,385

Rep: Reputation: 477

@ Nominal Animal,

Thank you for sharing the solution; it is much faster than the one I used. This is what I did (sketched below):

1. Created many files with touch (not sure how many, but enough that rm -rf started giving the "too many arguments" error).
2. Deleted them using find /test -delete, where /test is the directory under which I created those files.
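Roughly, in command form (the /test path and the file count are just placeholders, and xargs is used so the touch step itself does not hit the limit):
Code:
mkdir -p /test && cd /test

# Create enough files that "rm -rf *" fails with the argument limit
seq -f 'file%g' 1 200000 | xargs touch

# Delete them without putting any of the names on a command line
# (this also removes /test itself once it is empty)
find /test -delete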

@ Ramurd,

Thanks for the explanation.
 