Is there any software/way for Linux to run one command and get it executed at 10 web servers?
For me at home I just run ssh in a for loop over my machines. I never thought to look at other solutions. Something like this?
Code:
for machine in "$@" ; do
    ssh "$USER"@"$machine" "command" &
done
wait    # let every background ssh finish before exiting
exit 0
I'm hoping someone will suggest a better alternative than this. Seems like a hack, but it works for my usage.
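One way to make the loop above a bit less hacky is to wrap it in a function that logs each host's output to its own file and waits for all the background jobs. This is only a sketch: the SSH_CMD override, the `out.<host>.log` names, and the `uptime` command are my own inventions so it can be dry-run with echo instead of a real ssh connection.

```shell
# Sketch of the loop above as a function: fans out over all hosts,
# logs each host's output separately, and waits for every job.
# SSH_CMD and the out.<host>.log names are assumptions for illustration.
run_on_all() {
    for machine in "$@"; do
        ${SSH_CMD:-ssh} "$USER@$machine" uptime > "out.$machine.log" 2>&1 &
    done
    wait    # block until every background job has exited
}

# Dry run: substitute echo for ssh so nothing actually connects.
SSH_CMD=echo
run_on_all host1 host2
cat out.host1.log    # shows the command line that would have been run
```

With real ssh, `out.host1.log` would instead hold host1's uptime output, and a non-zero exit from any ssh stays visible in that host's log.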
If you have multiple commands to run, use a function.
Code:
myfunction () {
    ssh "$USER"@"$1" "command"
    ssh "$USER"@"$1" "command2"
    ssh "$USER"@"$1" "command3"
}
for machine in "$@" ; do
    myfunction "$machine" &
done
exit 0
Maybe it is more straightforward to bundle the commands:
Code:
mycommands="
command
command2
command3
"
for machine in "$@" ; do
    ssh "$USER"@"$machine" "$mycommands" &
done
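A quoted heredoc is another way to build that command bundle without fighting nested quotes. A sketch; the `bundled.txt` file and the specific commands are just for illustration, not part of the post above:

```shell
# Bundle several commands into one string with a quoted heredoc;
# <<'EOF' suppresses local expansion, so variables expand on the remote side.
mycommands=$(cat <<'EOF'
hostname
uptime
df -h /
EOF
)

# Write the bundle out so it can be inspected (illustration only).
printf '%s\n' "$mycommands" > bundled.txt
cat bundled.txt
# Each host would then get: ssh "$USER"@"$machine" "$mycommands" &
```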
BTW, bash puts the literal string "$machine" in as the job name, so the jobs command does not give useful information. Other shells do this better: they substitute each $machine with its value before putting it into the job list!
Quote:
Originally Posted by frankbell
Take a look at clusterssh. I have heard it highly recommended by a Linux sysadmin.
Heh... I might have been that admin. However, I just built clusterssh on a recent Tumbleweed installation and it failed to pass all the tests. I installed it anyway and it fails to pass along my credentials and prompts for a password in all the xterms it pops up. (Oh good! More debugging!) If it even pops them up, that is. Tools like Ansible (recommended in another reply) might be OK for some things but being able to do an ad hoc command on a bunch of systems and inspect the responses is invaluable. I haven't seen any Linux distributions include clusterssh/cssh in any of their repositories. Pity.
I used clusterssh in the past to manage 6 servers, both physical and containers. Clusterssh worked great but required me to install too many perl dependencies on Slackware. I switched to "tmux with tm", a helper script found here: https://blog.ganneff.de/2013/03/tmux...ust-nicer.html and found it worked for me without needing additional dependencies.
I like both clusterssh and "tmux with tm" for a small group of servers. They both can multiplex commands to all servers simultaneously or switch back and forth from non-multiplex mode to isolate commands to a specific server. Since I found "tmux with tm" I haven't looked back to use clusterssh.