LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - Newbie (https://www.linuxquestions.org/questions/linux-newbie-8/)
-   -   lpr: error - too many files - (https://www.linuxquestions.org/questions/linux-newbie-8/lpr-error-too-many-files-657213/)

swazi.lee 07-21-2008 10:47 AM

lpr: error - too many files -
 
Hi, I have a server running Red Hat Enterprise Linux, and on this box we need to print more than 4000 .ps files. The command we have been using is
lpr *.ps (and even lp *.ps).
The problem is we get the error "lpr: error - too many files - filename" repeated over and over.
If we break the job into batches of 500 or less they print, but we want to do them all in one go.

Any help would be appreciated.

matthewg42 07-21-2008 11:53 AM

It's probably your shell, which tries to expand the pattern before calling the program but runs into a limit on the total size of the argument list.

You can get round it by making a list of files to process (e.g. using find) and feeding these into xargs, with lpr as the command for xargs to run.

First here's how to do it:

Code:

find . -maxdepth 1 -name '*.ps' -print0 | xargs -0 lpr
Now for some explaining... xargs reads a list of arguments from standard input, splits them into groups and then executes the specified command (in this case lpr) for each group.

find simply prints a list of files/directories on standard output (which we pipe into the standard input of xargs).

The arguments to find are:
  • . means start finding files in the current directory.
  • -name '*.ps' tells find to only print files whose name matches the pattern '*.ps'. The reason this doesn't trigger the long argument list problem which the original command causes is that the pattern is quoted and sent to find as a literal string. In the original command, the shell tries to expand the pattern before passing the result to lpr.
  • -maxdepth 1 don't go into sub-directories - just find things in the current directory.
  • -print0 instead of printing one result per line of output, separate the results with the ASCII NUL character. This can prevent problems when there are weird characters in the file names, and is generally a good idea, even if it isn't strictly necessary in most cases.

The -0 argument to xargs tells it to expect NUL delimited input (normally it expects arguments to be separated by whitespace or new lines).
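To see the grouping and the NUL handling in action, here is a safe demonstration with echo standing in for lpr (the -n 2 option is only there to force small groups; the filenames are made up):

```shell
# Simulate five .ps filenames separated by NUL bytes, as find -print0 would
# emit them. 'echo lpr' stands in for the real lpr, so nothing is printed.
# -0 reads NUL-delimited input; -n 2 allows at most 2 arguments per command.
printf 'a.ps\0b 1.ps\0c.ps\0d.ps\0e.ps\0' | xargs -0 -n 2 echo lpr
# prints:
# lpr a.ps b 1.ps
# lpr c.ps d.ps
# lpr e.ps
```

Note that 'b 1.ps' contains a space but is still passed as a single argument - that is exactly what the -print0/-0 pair buys you.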

I have to confess that this "argument list too long" thing is an irritation, and the workaround I propose here is rather convoluted. I can't remember exactly where the limitation comes from (maybe it's just historical), but I wish it would be removed.

swazi.lee 07-24-2008 05:43 AM

Thanks for the reply. I ran the command
find . -maxdepth 1 -name '*.ps' -print0 | xargs -0 lpr
but still got the same error lpr: error - too many files - filename
Have tried changing the MaxJobs setting in cupsd.conf but it didn't help either.

chrism01 07-24-2008 05:58 AM

If you change the setting, you have to restart the cups daemon/service:

service cups restart
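For reference, the directive in question is MaxJobs in /etc/cups/cupsd.conf (path assumed for a default RHEL install). Its default of 500 would also explain why batches of 500 or less went through. A sketch of the change:

```
# /etc/cups/cupsd.conf -- raise the ceiling on jobs the scheduler will hold
# (the default is 500; MaxJobs 0 disables the limit entirely)
MaxJobs 5000
```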

matthewg42 07-24-2008 09:57 AM

Ah, so cups has some limit of number of jobs? Didn't know that.

If you cannot raise the limit enough to cope with all of them, you might need to submit them in smaller groups, waiting for one group to finish before starting the next group.

One way to do this would be to create a list of files, split that into chunks, and then run lpr for each group of files, one group at a time.

e.g.
Code:

find . -maxdepth 1 -name '*.ps' > full_list
split --lines=100 full_list toprint_
This will create a bunch of files with the prefix "toprint_", each containing the names of up to 100 files to print.

This command will send one of the files to the print system, and prompt you to press return when that group is done:
Code:

for f in toprint_*; do cat "$f" |xargs lpr ; echo "press RETURN to continue" ; read r ; done

Tinkster 07-24-2008 10:49 AM

How about a 30 second sleep to make it less tedious ;}

matthewg42 07-24-2008 11:17 AM

Quote:

Originally Posted by Tinkster (Post 3225067)
How about a 30 second sleep to make it less tedious ;}

It would probably be better not to assume how long it is going to take - what if the paper runs out? Then it will be a massive pain in the backside to work out which ones have printed and which have not.

If it must be totally automatic, then it would be better to replace the manual interaction with something like:

Code:

for f in toprint_*; do
  cat "$f" |xargs lpr
  echo "waiting for print queue to clear"
  waiting=yes
  while [ "$waiting" = "yes" ]; do
    sleep 10
    lpq | grep -q "your_printer_id is ready" && waiting=no
  done
done

...where your_printer_id is set according to what your printer is called. Just do lpq in a terminal to find out the ID.

The reason I didn't suggest it right away is that I'm not sure when this "your_printer_id is ready" message appears... if lpq reports it as soon as there is even one free slot in the print queue, then you should set the number of files per group to half the maximum to prevent overflows...

It's all rather fiddly. For a one-time task, a little manual labour is probably less hassle.

Tinkster 07-24-2008 01:05 PM

Heh. I wouldn't want to press enter 4000 times, that's for sure.

matthewg42 07-24-2008 02:49 PM

4000 / maximum simultaneous submissions. If that's 500 files at a time that makes:
Code:

4000 / 500 = 8
It's not so many times to press return. Especially when you consider that you still have to handle the physical bulk of the printout of 4000 files.

jschiwal 07-24-2008 02:58 PM

Look at the manpage for xargs. There are a few options which limit how much it handles at a time: -n limits the number of arguments per command invocation, and -L (or the older -l) limits the number of input lines per invocation. You can also limit the number of processes launched at once with -P.
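Combining that with the find command from earlier, a sketch (assuming batches of 500, the size the original poster said went through) would be:

```shell
# Submit the PostScript files at most 500 per lpr invocation, so no single
# submission exceeds the queue limit. 'echo' stands in for lpr here so the
# line can be run safely; drop 'echo' to actually print.
find . -maxdepth 1 -name '*.ps' -print0 | xargs -0 -n 500 echo lpr
```

This doesn't wait for the queue to drain between batches, though, so it only helps if the limit is on the size of one submission rather than on the total number of queued jobs.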

chrism01 07-24-2008 09:33 PM

Most printers/print queues just block when the printer runs out of paper, so no jobs, not even the current one, are lost.
I've certainly had times (at work) when my job didn't print; I checked the printer, put new paper in, and then had to wait for the other 20 jobs piled up in front of mine...
:(


All times are GMT -5.