Hi, I have a server running Red Hat Enterprise Linux, and on this box we need to print more than 4000 .ps files. The command we have been using is lpr *.ps (and we have also tried lp *.ps).
The problem is we get the error "lpr: error - too many files - filename" repeated over and over.
If we break the job into batches of 500 or fewer they print, but we want to do them all in one go.
Now for some explaining... xargs reads a list of arguments from standard input, splits them into groups, and then executes the specified command (in this case lpr) once for each group.
find simply prints a list of files/directories on standard output (which we pipe into the standard input of xargs).
The arguments to find are:
. means start finding files in the current directory.
-name '*.ps' tells find to only print files whose name matches the pattern '*.ps'. The reason this doesn't trigger the long argument list problem which the original command causes is that the pattern is quoted and sent to find as a literal string. In the original command, the shell tries to expand the pattern before passing the result to lpr.
-maxdepth 1 don't go into sub-directories - just find things in the current directory. (GNU find prefers -maxdepth to appear before tests like -name; it prints a warning otherwise, but still works.)
-print0 instead of printing one result per line of output, separate the results with the ASCII NUL character. This can prevent problems when there are weird characters in the file names, and is generally a good idea, even if it isn't strictly necessary in most cases.
The -0 argument to xargs tells it to expect NUL delimited input (normally it expects arguments to be separated by whitespace or new lines).
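Putting the pieces together (with -maxdepth placed before the tests, as GNU find prefers), the command being described is:

```shell
# Print every .ps file in the current directory without hitting the
# argument-length limit: find emits NUL-separated filenames, and xargs
# batches them into as many lpr invocations as necessary.
find . -maxdepth 1 -name '*.ps' -print0 | xargs -0 lpr
```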
I have to confess that this "argument list too long" thing is an irritation, and the workaround I propose here is rather convoluted. I can't remember exactly where the limitation comes from (maybe it's just historical), but I wish it would be removed.
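(For what it's worth, on Linux the limit comes from the kernel: execve() refuses command lines whose combined arguments and environment exceed ARG_MAX bytes. You can check the value on your own box with getconf:)

```shell
# Show the maximum combined size (in bytes) of arguments plus
# environment that a single exec'd command may receive.
getconf ARG_MAX
```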
Thanks for the reply. I ran the command find . -name '*.ps' -maxdepth 1 -print0 |xargs -0 lpr
but still got the same error lpr: error - too many files - filename
Have tried changing the MaxJobs setting in cupsd.conf but it didn't help either.
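(For reference, if CUPS itself is the bottleneck, the relevant directive is MaxJobs in /etc/cups/cupsd.conf - it defaults to 500, which may be why batches of 500 print - and setting it to 0 removes the cap. A sketch of the change, assuming a stock CUPS install:)

```
# /etc/cups/cupsd.conf
# MaxJobs limits how many jobs cupsd will keep queued; 0 = no limit.
# Restart cupsd after editing (e.g. service cups restart).
MaxJobs 0
```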
How about a 30 second sleep to make it less tedious ;}
It would probably be better not to assume how long it is going to take - what if the paper runs out? Then it will be a massive pain in the backside to work out which ones have printed and which have not.
If it must be totally automatic, then it would be better to replace the manual interaction with something like:
for f in toprint_*; do
    cat "$f" | xargs lpr
    echo "waiting for print queue to clear"
    waiting=yes
    while [ "$waiting" = "yes" ]; do
        sleep 30
        lpq | grep -q "your_printer_id is ready" && waiting=no
    done
done
...where your_printer_id is set according to what your printer is called. Just do lpq in a terminal to find out the ID.
The reason I didn't suggest it right away is that I'm not sure when this "your_printer_id is ready" message is appearing... if it does this as soon as there is even 1 space in the print queue, then you should set the number of files per group to half the maximum to prevent overflows...
It's all rather fiddly. For a one-time task, a little manual labour is probably less hassle.
Look at the man page for xargs. There are options to limit how many arguments it handles at a time: -n caps the number of arguments per invocation, and -L (or its deprecated synonym -l) caps the number of input lines instead. You can also limit the number of processes launched at once with -P.
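For example, to hand lpr at most 100 files per invocation (100 here is just an arbitrary batch size):

```shell
# -n 100 caps each lpr invocation at 100 arguments; xargs keeps
# running lpr until the whole file list has been consumed.
find . -maxdepth 1 -name '*.ps' -print0 | xargs -0 -n 100 lpr
```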
Most printers/print queues just block when the printer runs out of paper, so no jobs are lost - not even the current one.
I've certainly had times (at work) when my job didn't print, checked printer, put new paper in, had to wait for the other 20 jobs piled up in front of mine...