LinuxQuestions.org
Go Back   LinuxQuestions.org > Forums > Linux Forums > Linux - Software
View Poll Results: Is a central Job automation/scheduling tool needed?
Yes 2 7.14%
No 1 3.57%
Why? We got cron! 16 57.14%
Sure, makes jobs control better. 9 32.14%
Voters: 28.

Old 11-03-2006, 01:29 AM   #1
eisman
Member
 
Registered: Jul 2004
Location: The Netherlands
Distribution: Sidux , Debian, Haiku, PC-BSD, CentOS --> XenExpress
Posts: 77

Rep: Reputation: 16
Job Automation program


I wonder if there is a job automation program for Linux.

I do not mean cron, but a separate program where you can create/modify/delete/... jobs for several machines (PCs), and where the jobs are started by the server according to the schedule.

These jobs should live in a central database and be monitorable graphically, so that all jobs of all servers/clients are in one database.

Does anyone know such a tool/scheduling system?
 
Old 11-04-2006, 09:49 AM   #2
MensaWater
LQ Guru
 
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,831
Blog Entries: 15

Rep: Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669
I haven't used them on Linux specifically, but such tools exist for UNIX and I imagine they have Linux equivalents. They are expensive, but in a shop where you're doing a lot of batch work, especially jobs that do things on different servers, they are invaluable for that kind of scheduling.

The one I've used most often used to be called Maestro but is now called Tivoli Workload Scheduler.

In the larger shops I was in we used it for all sorts of batch jobs. It was good because you could create a "schedule" that had multiple tasks ("jobs") within it. These could have dependencies (or not, as you wish), so that we could ensure that if the job on machine C failed, the one on machine D would not run until that was resolved.

While it was bought for batch processing (e.g. customer bill routines), it was quite handy for systems administration tasks such as scheduling backups to make them dependent on other schedules completing. (E.g. if the batch processing didn't finish, we didn't really want to put the database into hot standby mode for the BCV split, because that would have impacted performance in trying to finish the batch job. We could hold the backup until the batch processing issue was resolved, or even cancel it altogether if the window would be too late once the batch job finished.) In this way you can make schedules dependent on each other (or not).

Cron is a good tool for in-box scheduling, but not for cross-box dependencies like those needed for large batch processing and sophisticated backups that use things such as BCVs and alternate servers for mounting the copies. It can be done (e.g. by having cron on one machine send an "at" job to another), but only by kludges, and it doesn't really give much control (either comment out the job or let it run).
 
Old 11-04-2006, 10:05 AM   #3
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 65
You can do this with cron and ssh... Have your scheduling server crontab contain remote execution commands using ssh, and configure ssh so you don't need to provide a password interactively.

Simple, secure, and more importantly, pre-existing and stable. Writing new code is fun, but you'll always have bugs, so it makes sense to re-use an existing, well-known mechanism if you can.

I could imagine wanting to make a new system if I had requirements not met by the cron+ssh solution. Perhaps multiple job dependency/flow, and a nice graphical flow chart to show where in this flow the system is... Also the cron solution isn't so good at handling a changing cluster (fail-overs etc), but I suspect this could be easily rectified with a simple job wrapper script which chose an actual server based on a functional server name and availability.
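The "functional server name" wrapper could be sketched roughly like this. This is a toy sketch, not anything from the thread: the probe command, host names, and job path are all invented for illustration.

```shell
# Hypothetical sketch: resolve a "functional server name" to the first
# member host that passes an availability probe. In real use the probe
# would be an ssh check; BatchMode prevents a password prompt and
# ConnectTimeout stops a dead host from hanging the whole run.
first_alive() {
    probe=$1; shift
    for h in "$@"; do
        if "$probe" "$h" 2>/dev/null; then
            echo "$h"
            return 0
        fi
    done
    return 1
}

probe_ssh() { ssh -o BatchMode=yes -o ConnectTimeout=5 "$1" true; }

# Example cron-driven use (host names are placeholders):
#   host=$(first_alive probe_ssh app1 app2 app3) && ssh "$host" /path/to/job
```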

If you have something implemented, and you're trying to work out if there is a market for it, please post details of what it can do that cron+ssh can't. I have a personal interest in these scheduling systems, as I worked for many years on big-iron batch systems. They were never 100% satisfying to me, and I have often considered making something myself.
 
Old 11-04-2006, 10:08 AM   #4
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 65
Quote:
Originally Posted by jlightner
The one I've used most often used to be called Maestro but is now called Tivoli Workload Scheduler.
That's one I've used. I didn't know it had been renamed. Not in the places I was working in... I guess we had an old version.
 
Old 11-04-2006, 08:19 PM   #5
MensaWater
LQ Guru
 
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,831
Blog Entries: 15

Rep: Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669Reputation: 1669
The question isn't so much whether it can be done using existing (and free) things, but rather how much time/effort you want to spend managing/supporting them. TWS/Maestro was really cool in the environments where I used it. These were places where we had dozens of UNIX/Linux servers and lots of batch processes. In those places we typically used it to schedule backups as well, rather than using NetBackup scheduling. (Which, by the way, might have benefited some of the folks complaining about scheduling issues in NBU 6 now.)

It was a lot simpler to look at the master server, see which schedule had failed, and go directly to the job in that schedule, than to work out which of the 10 servers involved in stopping applications, stopping databases, starting BCVs, splitting BCVs, mounting the BCVs on other servers, and starting the backup from the master had the problem. And this was just for one Production environment. Add in all your other backups and other Production environments (one job I was at had 7 Production environments) and you can see this can come in quite handy. Especially since it keeps logs, so you can quickly trace what happened in the log for the stopped job/schedule.

Just checked and verified TWS is available for Linux:
http://www-306.ibm.com/software/tivo...cheduler-apps/

As to the new name - that was one of the banes of my past two job searches. People would ask if I'd worked on something and I'd say "no", only to find out later that someone had renamed the product. At least with Maestro it was done because the company that made it got bought by Tivoli (which is a subdivision of IBM). What annoys me is when marketing types at a company rename a product for no good reason. HP is one of the worst.

Of course OEM is another fun thing. I've worked on the same model tape library at 3 of my last 4 jobs, but it had a different name at each of them (STK's original name, Sun's name for it at another [before Sun bought STK of course], and HP's SureStore name for it). Headhunters often don't get that, so on resumes I'd typically list all the names so they'd see it when they did their keyword searches.
 
Old 11-05-2006, 03:34 AM   #6
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 65
Yeah, I think you're right that a home-brew solution might not scale as well as one of these big products. I guess it's a matter of working out what is right for the job. Certainly using cron and ssh isn't going to give you a nice graphical display of the status without spending some time writing that...

In general therefore I suppose the answer to the question is "yeah, the more software the merrier", although developing/porting something like a big cross machine scheduling system might not have enough mass appeal to be worth the effort. I suppose in the case of TWS/Maestro they thought it was worth it.
 
Old 11-05-2006, 08:52 AM   #7
trickykid
LQ Guru
 
Registered: Jan 2001
Posts: 24,149

Rep: Reputation: 269Reputation: 269Reputation: 269
You could check out http://www.tildeslash.com/monit/

It's more of a monitoring tool but you could probably script it to start/edit/schedule running services and processes on remote machines from one central location.
 
Old 11-05-2006, 09:35 AM   #8
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 65
I once wrote a very simple replacement for a multiple-dependency job work-flow controller using make. It worked a treat, except there was no mechanism to visualize what was going on, and it wasn't possible to "hold" a job or re-run specific jobs (you just had to kill make and/or wait for it to stop, and then re-run it to catch failed jobs).
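For what it's worth, the idea can be reconstructed as a toy sketch along these lines. The job names (extract, load, report) and commands are invented; in real use each recipe would be a remote ssh invocation.

```shell
# Toy reconstruction of make as a job controller: each job touches a
# "done" file on success, and dependencies between done-files define the
# work-flow. After a failure, re-running make skips jobs that completed.
cd "$(mktemp -d)"
# printf expands \t into the literal tab that make recipes require
printf 'all: report.done\n\nextract.done:\n\techo running extract; touch extract.done\n\nload.done: extract.done\n\techo running load; touch load.done\n\nreport.done: load.done\n\techo running report; touch report.done\n' > Makefile
make all    # runs extract, load, report in dependency order
make all    # second run: nothing to do, everything is up to date
```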

It would be super if there was a tool which could generate Graphviz-like dependency charts from make output, and update the graph as the processes complete, especially if there was some mechanism to graphically select a job and update its state - re-run it, etc.

Anyone know of such a tool?
 
Old 11-06-2006, 02:12 AM   #9
eisman
Member
 
Registered: Jul 2004
Location: The Netherlands
Distribution: Sidux , Debian, Haiku, PC-BSD, CentOS --> XenExpress
Posts: 77

Original Poster
Rep: Reputation: 16
I work with SMA OpConXPS, but at the moment it is only available for Windows.
It works quite nicely and has some graphic displays of how your jobs run and so on.
I was looking for something like that for Linux.
TWS is not a free product, but perhaps worth a try?
 
Old 11-06-2006, 02:15 AM   #10
eisman
Member
 
Registered: Jul 2004
Location: The Netherlands
Distribution: Sidux , Debian, Haiku, PC-BSD, CentOS --> XenExpress
Posts: 77

Original Poster
Rep: Reputation: 16
Quote:
Originally Posted by matthewg42
Yeah, I think you're right that a home-brew solution might not scale as well as one of these big products. I guess it's a matter of working out what is right for the job. Certainly using cron and ssh isn't going to give you a nice graphical display of the status without spending some time writing that...

In general therefore I suppose the answer to the question is "yeah, the more software the merrier", although developing/porting something like a big cross machine scheduling system might not have enough mass appeal to be worth the effort. I suppose in the case of TWS/Maestro they thought it was worth it.

Well, I think it is worth having such a program for Linux, because Linux is getting more and more into the server business. So a good multi-platform job automation program is needed to oversee all running jobs on several servers.
 
Old 11-06-2006, 02:17 AM   #11
eisman
Member
 
Registered: Jul 2004
Location: The Netherlands
Distribution: Sidux , Debian, Haiku, PC-BSD, CentOS --> XenExpress
Posts: 77

Original Poster
Rep: Reputation: 16
Are there any examples of how you can use cron with ssh to remotely start jobs on a different server?
And will the central server get a message back about the job status (finished, failed, running)?
 
Old 11-06-2006, 03:24 AM   #12
teebones
Member
 
Registered: Aug 2005
Location: /home/teebones
Distribution: sometimes this, sometimes that..
Posts: 502

Rep: Reputation: 56
Cron baby!
 
Old 11-06-2006, 05:24 AM   #13
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 65
Quote:
Originally Posted by eisman
Are there any examples how you can use cron with ssh to remotely start jobs on a different server?
And will the remote server get a message back of how the job status! (Finished, failed, running)?
Step 1: Set up your remote machine to allow ssh logins without an interactive password (i.e. using public key authentication).
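For step 1, the usual recipe is something like the following. This is a sketch assuming OpenSSH on both ends; "user" and "remotehost" are placeholders, and the commands should be run as the user that owns the crontab.

```
ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa   # create a passphrase-less key pair (or use ssh-agent)
ssh-copy-id user@remotehost                # append the public key to the remote authorized_keys
ssh user@remotehost uptime                 # should now run without a password prompt
```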

Step 2: Edit your crontab on the central server and add a line something like this:
Code:
15,30,45 * * * * ssh remotehost /path/to/executable
In this case, /path/to/executable will be run on remotehost three times an hour, at 15, 30 and 45 minutes past the hour. The error level of the remote process is reported to cron as usual, as is any output. However, there's no simple way to tell if the process is running, other than parsing the output of ps.

If you want to have some sort of logging system, you'd probably make a wrapper script on the local system which took the remote host name and executable path (and any arguments) as arguments and redirected output to a suitably named log file, returning the error level to cron as necessary. The "running" status could be determined from the log file (if you have standard "process started" and "process ended" messages), or you could have the wrapper script update some other file, a database entry... whatever you like.
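A minimal sketch of such a wrapper, under the assumptions above. All names here (run_job, the /tmp log path, the message format) are invented choices, not from any real tool.

```shell
# Hypothetical wrapper: run a command (e.g. "ssh remotehost /path/to/job"),
# log START/END lines with a timestamp, and hand the exit status back to cron.
# While the log has a START without a matching END, the job is "running".
run_job() {
    name=$1; shift
    log="/tmp/$name.log"           # per-job log file; adjust the path to taste
    printf '%s START %s\n' "$(date '+%F %T')" "$name" >> "$log"
    "$@" >> "$log" 2>&1
    rc=$?
    printf '%s END %s rc=%d\n' "$(date '+%F %T')" "$name" "$rc" >> "$log"
    return $rc
}

# The crontab line would then become something like:
#   15,30,45 * * * * /path/to/wrapper.sh nightly ssh remotehost /path/to/executable
```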

At this point I suppose you're getting into the territory of a "real" scheduling system, so wrapping it up with cron may or may not make sense, depending on what you want to do. Certainly, cron doesn't cater for dependencies between jobs, but then maybe make will help you there. It's not always necessary to start from scratch, even if it does seem like more fun.

The more "off the shelf" proven components you can use for the difficult/fiddly bits, the faster you'll be able to knock something together, and the fewer silly bugs you'll generate. Of course, your creation might look very odd and ugly, and not be as easy to understand as a new bespoke solution. There's a judgment to be made. And there's always the point that making a new system is FUN and a good way to learn, which counts in favour of rolling your own.
 
Old 11-07-2006, 06:35 AM   #14
eisman
Member
 
Registered: Jul 2004
Location: The Netherlands
Distribution: Sidux , Debian, Haiku, PC-BSD, CentOS --> XenExpress
Posts: 77

Original Poster
Rep: Reputation: 16
Quote:
Originally Posted by matthewg42
Step 1: Set up your remote machine to allow ssh logins without an interactive password (i.e. using public key authentication).

Step 2: Edit your crontab on the central server and add a line something like this:
Code:
15,30,45 * * * * ssh remotehost /path/to/executable
In this case, /path/to/executable will be run on remotehost three times an hour, at 15, 30 and 45 minutes past the hour. The error level of the remote process is reported to cron as usual, as is any output. However, there's no simple way to tell if the process is running, other than parsing the output of ps.

Thanks for your answer, I will look into it.

What I want is some way of storing the job information in a database, say MySQL, and starting jobs when the schedule requires it. With one central point you could make dependencies between the jobs. And it would be nice if there was a graphical tool to monitor these things.

I found a program for it, http://www.ortro.net/doku.php
 
Old 11-07-2006, 06:53 AM   #15
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 65
Sounds just like Maestro.

If you're serious about implementing something like this, I might be interested in collaborating on such a project. If you are, mail me at matthew<at>porpoisehead<dot>net
 
  


