LinuxQuestions.org
Welcome to the most active Linux Forum on the web.
Linux - Newbie This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's this is the place!

Old 07-21-2008, 10:47 AM   #1
swazi.lee
LQ Newbie
 
Registered: Jul 2008
Posts: 2

Rep: Reputation: 0
lpr: error - too many files -


Hi, I have a server running Red Hat Enterprise Linux, and on this box we need to print more than 4000 .ps files. The command we have been using is
lpr *.ps (and even lp *.ps).
The problem is we get the error "lpr: error - too many files - filename" repeated over and over.
If we break the job into batches of 500 or fewer they print, but we want to do them all in one go.

Any help would be appreciated.
 
Old 07-21-2008, 11:53 AM   #2
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 63
It's probably your shell, which tries to expand the pattern before calling the program but runs into a limit on the total length of the argument list.

You can get round it by making a list of files to process (e.g. using find) and feeding that list into xargs, with lpr as the command argument to xargs.

First here's how to do it:

Code:
find . -maxdepth 1 -name '*.ps' -print0 | xargs -0 lpr
Now for some explaining... xargs reads a list of arguments from standard input, splits them into groups and then executes the specified command (in this case lpr) for each group.

find simply prints a list of files/directories on standard output (which we pipe into the standard input of xargs).

The arguments to find are:
  • . means start finding files in the current directory.
  • -name '*.ps' tells find to only print files whose name matches the pattern '*.ps'. The reason this doesn't trigger the long argument list problem which the original command causes is that the pattern is quoted and sent to find as a literal string. In the original command, the shell tries to expand the pattern before passing the result to lpr.
  • -maxdepth 1 means don't descend into sub-directories; only look at things in the current directory.
  • -print0 means instead of printing one result per line of output, separate the results with the ASCII NUL character. This prevents problems when there are weird characters (spaces, newlines) in the file names, and is generally a good idea even when it isn't strictly necessary.

The -0 argument to xargs tells it to expect NUL delimited input (normally it expects arguments to be separated by whitespace or new lines).
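To see what -print0 and -0 buy you, here's a throwaway sketch (using echo in place of lpr, and made-up file names) showing that a name containing a space survives the pipeline as a single argument:

```shell
# work in a scratch directory with two throwaway .ps files,
# one of which has a space in its name
dir=$(mktemp -d)
cd "$dir"
touch plain.ps 'with space.ps'

# NUL-delimited pipeline: each file name reaches the command intact,
# one invocation per file thanks to -n 1
find . -maxdepth 1 -name '*.ps' -print0 | xargs -0 -n 1 echo FILE:
```

With plain -print and no -0, xargs would split "with space.ps" on the space and hand the command two mangled arguments.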

I have to confess that this "argument list too long" thing is an irritation, and the workaround I propose here is rather convoluted. I can't remember exactly where the limitation comes from (maybe it's just historical), but I wish it would be removed.
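As an aside (details vary by system), the limit the shell hits is the kernel's cap on the total size of a new process's argument list, exposed as ARG_MAX. A quick sketch to compare the limit against what the glob would expand to:

```shell
# the kernel's limit on the combined size of a command's arguments
getconf ARG_MAX

# rough byte count of what the shell would try to pass to lpr
printf '%s\0' *.ps | wc -c
```

If the second number approaches the first, the plain `lpr *.ps` invocation can't work and you need the find/xargs route.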
 
Old 07-24-2008, 05:43 AM   #3
swazi.lee
LQ Newbie
 
Registered: Jul 2008
Posts: 2

Original Poster
Rep: Reputation: 0
Thanks for the reply. I ran the command
find . -name '*.ps' -maxdepth 1 -print0 | xargs -0 lpr
but still got the same error: lpr: error - too many files - filename
I have tried changing the MaxJobs setting in cupsd.conf, but it didn't help either.
 
Old 07-24-2008, 05:58 AM   #4
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Centos 6.9, Centos 7.3
Posts: 17,374

Rep: Reputation: 2383
If you change the setting, you have to restart the CUPS daemon/service:

service cups restart
 
Old 07-24-2008, 09:57 AM   #5
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 63
Ah, so CUPS has a limit on the number of jobs? I didn't know that.

If you cannot raise the limit enough to cope with all of them, you might need to submit them in smaller groups, waiting for one group to finish before starting the next group.

One way to do this would be to create a list of files, split that into chunks, and then run lpr for each group of files, one group at a time.

e.g.
Code:
find . -maxdepth 1 -name '*.ps' > full_list
split --lines=100 full_list toprint_
This will create a bunch of files with the prefix "toprint_", each containing up to 100 file names to print.

This command will send one of the files to the print system, and prompt you to press return when that group is done:
Code:
for f in toprint_*; do cat "$f" |xargs lpr ; echo "press RETURN to continue" ; read r ; done
 
Old 07-24-2008, 10:49 AM   #6
Tinkster
Moderator
 
Registered: Apr 2002
Location: in a fallen world
Distribution: slackware by choice, others too :} ... android.
Posts: 23,067
Blog Entries: 11

Rep: Reputation: 910
How about a 30 second sleep to make it less tedious ;}
 
Old 07-24-2008, 11:17 AM   #7
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 63
Quote:
Originally Posted by Tinkster View Post
How about a 30 second sleep to make it less tedious ;}
It would probably be better not to assume how long it is going to take - what if the paper runs out? Then it will be a massive pain in the backside to work out which ones have printed and which have not.

If it must be totally automatic, then it would be better to replace the manual interaction with something like:

Code:
for f in toprint_*; do
  cat "$f" | xargs lpr
  echo "waiting for print queue to clear"
  waiting=yes
  while [ "$waiting" = "yes" ]; do
    sleep 10
    # poll the queue; stop waiting once lpq reports the printer idle
    lpq | grep -q "your_printer_id is ready" && waiting=no
  done
done
...where your_printer_id is replaced with whatever your printer is called. Just run lpq in a terminal to find out the ID.

The reason I didn't suggest it right away is that I'm not sure when this "your_printer_id is ready" message appears... if it shows up as soon as there is even one free slot in the print queue, then you should set the number of files per group to half the maximum to prevent overflows...

It's all rather fiddly. For a one-time task, a little manual labour is probably less hassle.
 
Old 07-24-2008, 01:05 PM   #8
Tinkster
Moderator
 
Registered: Apr 2002
Location: in a fallen world
Distribution: slackware by choice, others too :} ... android.
Posts: 23,067
Blog Entries: 11

Rep: Reputation: 910
Heh. I wouldn't want to press enter 4000 times, that's for sure.
 
Old 07-24-2008, 02:49 PM   #9
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 63
4000 / maximum simultaneous submissions. If that's 500 files at a time, that makes:
Code:
4000 / 500 = 8
That's not so many times to press return, especially when you consider that you still have to handle the physical bulk of a print-out of 4000 files.

Last edited by matthewg42; 07-24-2008 at 02:52 PM.
 
Old 07-24-2008, 02:58 PM   #10
jschiwal
LQ Guru
 
Registered: Aug 2001
Location: Fargo, ND
Distribution: SuSE AMD64
Posts: 15,733

Rep: Reputation: 671
Look at the manpage for xargs. There are two or three options to limit how much it handles at a time: -n limits the number of arguments per command invocation, and -L (or the older -l) limits the number of input lines. With -P you can also limit the number of processes launched at once.
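Applied to this thread's problem, a sketch combining -n with the earlier find pipeline (the batch size of 400 is an assumption, picked to stay under the 500-job limit observed above):

```shell
# hand lpr at most 400 file names per invocation, so each
# submission stays under the print queue's job limit
find . -maxdepth 1 -name '*.ps' -print0 | xargs -0 -n 400 lpr
```

xargs then calls lpr as many times as needed: for 4000 files, ten invocations of 400 names each, with no manual batching.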
 
Old 07-24-2008, 09:33 PM   #11
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Centos 6.9, Centos 7.3
Posts: 17,374

Rep: Reputation: 2383
Most printers/print queues just block when the printer runs out of paper, so no jobs are lost, not even the current one.
I've certainly had times (at work) when my job didn't print: I checked the printer, put new paper in, then had to wait for the other 20 jobs piled up in front of mine...
 
  

