Perl - How to do multiple child processes in parallel?
I'm looking for a way in Perl to take a list of servers, ssh several commands to each, and store the results. If I do this serially, one hung server can stall the whole script, and even when nothing hangs it takes hours to complete.
I'm thinking what I need is a parent loop that spawns a separate child process per server: the parent passes the server name to the child, and the child runs all the commands I have defined in its own process. If one server hangs, at least that won't stop the script from getting through the rest of the list. I'm guessing fork() would serve me best; however, all the online descriptions I've found have been vague at best. Any suggested reading that gives a better description, or does anyone have a better suggestion? I've done this sort of thing plenty on Windows by pushing psexec out in a loop, just not on Unix.

Devon
Why should he launch with system(), if all he needs is an SSH connection? system() launches a shell, which is wasted overhead. DevonB, do you understand the semantics of the fork + exec model in general? If not, there is plenty of literature online that explains it; ask back here about the parts that aren't clear.
In Perl, the usage is almost identical to what you would do in C, only easier (like most things in Perl vs. C). The basic premise is that you can create an arbitrary number of child processes (using fork()), and each of them can exec() another program, thereby turning that child into a different program. That's how all processes are launched on Unix/Linux. To get you started:
Code:
# When we launch child processes, we don't care when they die; we

--- rod.
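Rod's snippet is cut off above, so here is a minimal sketch of the fork + exec pattern he describes, not his original code. The host names, the `uptime` command, the per-host output files, and the 10-second ConnectTimeout are all placeholder assumptions; substitute your own.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @hosts = qw(server1 server2 server3);   # placeholder host list
my %pid_to_host;

for my $host (@hosts) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: send output to a per-host file, then replace this
        # process with ssh. ConnectTimeout keeps one dead host from
        # hanging its child forever at the connect stage.
        open STDOUT, '>', "$host.out" or die "can't write $host.out: $!";
        exec 'ssh', '-o', 'ConnectTimeout=10', $host, 'uptime'
            or die "exec failed: $!";
    }
    # Parent: remember which child handles which host and move on
    # immediately -- all the ssh sessions run in parallel.
    $pid_to_host{$pid} = $host;
}

# Reap the children as they finish. A hung ssh only delays its own
# slot here; add alarm()/kill logic if you want a hard timeout.
while (%pid_to_host) {
    my $pid = wait();
    last if $pid == -1;
    my $host = delete $pid_to_host{$pid};
    printf "%s finished with exit status %d\n", $host, $? >> 8;
}
```

Note the list form of exec() is used deliberately: it runs ssh directly with its arguments and never involves a shell, which is exactly the waste the post above objects to with system().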
Still, if one insists on forking, there is a convenient module: http://search.cpan.org/~rybskej/forks-0.34/lib/forks.pm .
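For completeness, a sketch of how the forks module linked above might be used here. It is a drop-in for the threads API but implemented with fork(); the hosts, the `uptime` command, and the ConnectTimeout value are placeholder assumptions, and the module must be installed from CPAN first.

```perl
use strict;
use warnings;
use forks;   # threads-compatible API, implemented with fork()

my @hosts = qw(server1 server2 server3);   # placeholder host list

# One "thread" (really a forked child) per host; each captures
# its ssh output and returns it to the parent.
my @workers = map {
    threads->create(sub {
        my ($host) = @_;
        return `ssh -o ConnectTimeout=10 $host uptime 2>&1`;
    }, $_);
} @hosts;

# join blocks until that child finishes and hands back its result,
# so a hung host only delays its own join, not the other children.
for my $i (0 .. $#hosts) {
    my ($output) = $workers[$i]->join;
    print "=== $hosts[$i] ===\n", defined $output ? $output : "(no output)\n";
}
```

The appeal over raw fork() is that passing results back to the parent comes for free: forks serializes each sub's return value, where a hand-rolled fork solution needs pipes or temp files.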