AIX: This forum is for the discussion of IBM AIX.
eserver and other IBM related questions are also on topic.
I have created a print queue with attachment type "other", which is defined as a "user-defined backend". I would like to leave this queue in a down state for most of the day and only bring it up for short periods of time. However, when jobs have been queued in this queue, I cannot bring it back up. No other type of queue I have set up has this problem. Am I doing something wrong, or does the queue have to be empty before it can be brought back up?
I've never seen this particular problem, but there's always a way around it with a relatively simple shell script using lpstat, grep, awk, cancel, and enable if the queue must be empty before bringing it up.
It would be ugly, but it would work.
How much of a problem would it be to lose the jobs in the queue?
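For what it's worth, that "ugly but workable" script might look something like the sketch below, assuming losing the jobs is acceptable. The queue name is a placeholder, and the `awk` field positions assume the classic AIX `lpstat` layout (queue, device, status, job number, ...), so verify them against the actual output on your box:

```shell
#!/bin/sh
# Placeholder queue name -- substitute your own.
QUEUE=VENDORQ

# Pull the job numbers for our queue out of lpstat output.
# Assumes the job number is the 4th whitespace-separated field
# on lines that begin with the queue name; check your format.
list_jobs() {
    awk -v q="$QUEUE" '$1 == q && $4 ~ /^[0-9]+$/ { print $4 }'
}

# Only attempt the live commands where the AIX print subsystem
# (enq and friends) actually exists.
if command -v enq >/dev/null 2>&1; then
    lpstat -p"$QUEUE" | list_jobs | while read -r job; do
        cancel "$job"        # the queued jobs are lost at this point
    done
    enable "$QUEUE"
fi
```

Ugly, as promised, and only viable if the jobs can be thrown away.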
Eh, that's kinda my problem. I can't lose the jobs that are in the queue. I have a Perl script tied to the "backend" of the queue. The script picks up whatever I send to the queue and FTPs it to a specified location. The vendor requested that I send the files at a specific time, so I was going to have a shell script bring up the queue at that time and then disable it once it finished processing the jobs. This would let me send jobs to the queue as needed while being assured they were only sent when the vendor wanted them.
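The timed window described above would look roughly like this sketch, run from cron at the vendor's requested hour. It is exactly the approach the queue bug is blocking, but it shows the intended flow. The queue name and schedule are placeholders, and the `lpstat` field positions again need checking against real output:

```shell
#!/bin/sh
# Sketch of the timed send window, meant to run from cron at the hour
# the vendor wants the batch. Queue name is a placeholder.
QUEUE=VENDORQ

# Exits 0 while the piped lpstat output still shows jobs on $QUEUE;
# assumes the job number is the 4th field -- verify locally.
jobs_pending() {
    awk -v q="$QUEUE" '$1 == q && $4 ~ /^[0-9]+$/ { found = 1 } END { exit !found }'
}

# Only run the live part where the AIX print subsystem exists.
if command -v enq >/dev/null 2>&1; then
    enable "$QUEUE"                       # open the window
    while lpstat -p"$QUEUE" | jobs_pending; do
        sleep 30                          # let the backend drain the queue
    done
    disable "$QUEUE"                      # close it until the next run
fi
```

A crontab line such as `0 22 * * * /usr/local/bin/vendor_window.sh` (a 10 p.m. window, purely as an example) would then open and close the queue automatically.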
I've asked around my department and the answer I usually got was:
What is this guy's Perl code doing, and why can't he send the "print jobs" to a flat file that gets FTP'd with cron at the appropriate time?
So I'm guessing that the Perl code is doing some sort of formatting or filtering? (We have some C code that does that for special print jobs).
This is an interesting problem and I'd love to help, but I'm stumped at the moment.
Do the jobs get written to text files that remain on the hard drive after being printed? We have some apps that create files in /tmp before sending them to print. If a job fails, we can just grep for it in /tmp and resubmit. These are large, important reports so people get upset if they can't be retrieved. We then have to use cron jobs to clean up /tmp periodically.
The Perl code is encrypting the file before sending it off and doing a few other checks to make sure it is the correct file. The "print jobs" are essentially flat files. The application we are using keeps all of its reports in a report directory (each report is its own file). When someone prints a report to my queue, my Perl code picks up the path of the report from stdout, encrypts the file, and sends it to its destination.
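The encrypt-and-stage half of such a backend could be sketched in shell like this (the poster's real one is Perl). The paths and passphrase are placeholders, and `openssl` symmetric encryption stands in for whatever the real script uses:

```shell
#!/bin/sh
# Sketch of the backend's encrypt-and-stage step. Everything named
# here (paths, passphrase) is a placeholder, not the poster's setup.
stage_report() {
    report=$1                       # path to the report file
    stage=$2                        # local staging directory
    [ -f "$report" ] || return 1    # sanity check before touching it
    mkdir -p "$stage"
    # Symmetric encryption as a stand-in for the real scheme; a key
    # file or recipient key would replace pass:changeme in practice.
    openssl enc -aes-256-cbc -pbkdf2 -pass pass:changeme \
        -in "$report" -out "$stage/$(basename "$report").enc"
}

# The FTP leg would follow here, e.g. (placeholders, not run):
#   stage_report "$1" /var/spool/vendor_out && send_to_vendor
```

Keeping the encryption step separate from the FTP step is what makes the batching ideas below the thread workable.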
Yep, makes sense. You're submitting a job to a print queue that is really a Perl script, which catches the path to the file from stdout, encrypts the file, and sends it to a destination.
The problem is that the destination company only wants a single batch of all the jobs sent at one time during the day. Yet, you can't let people just submit jobs to the downed queue and let them sit there because with jobs in the queue the queue won't come back up correctly. I think that's it, right?
I see two options, and both assume that you have the ability to modify the Perl code.
1. Have the Perl script FTP the files to another local server and then have that server send out the batch to the external vendor at the specified time via cron. This way the queue can stay up all the time and process the jobs as normal. This involves minimal change to the Perl code because you're only changing the FTP destination. The FTP cron job on the additional server can then be Perl or simply a shell script.
2. Modify the Perl code so that the encryption and formatting is one piece of code that can be run at the time of job submission and then take the last part of the code, make a separate script, and cron the FTP portion of the transaction.
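The cron half of option 2 might look like the sketch below: gather everything staged during the day and emit one ftp command script for it. The host, paths, and flags are placeholders, and credentials would normally live in `~/.netrc` rather than the script:

```shell
#!/bin/sh
# Sketch of the nightly batch sender for option 2. All names here
# (directories, hostname) are placeholders.

# Print an ftp command script that uploads every staged file.
batch_commands() {
    stage=$1
    printf 'binary\n'
    for f in "$stage"/*.enc; do
        [ -e "$f" ] || continue     # empty directory: upload nothing
        printf 'put %s\n' "$f"
    done
    printf 'bye\n'
}

# Typical cron usage (placeholders; not executed here):
#   batch_commands /var/spool/vendor_out | ftp -i vendor.example.com
#   mv /var/spool/vendor_out/*.enc /var/spool/vendor_sent/
```

Moving the sent files to a separate directory afterward, instead of deleting them, leaves a trail if the vendor ever claims a file never arrived.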
It seems to me that this is easier to work on in the code side than the "printer queue side."
Yea, I think you're right. I'm going to have the script move the files to another directory instead of FTPing them, and then have cron invoke the FTP process at the specified time. Thanks for the help, dood!