Linux - Newbie
This Linux forum is for members that are new to Linux.
Just starting out and have a question?
If it is not in the man pages or the how-to's, this is the place!
I have a PHP web page running on a low-powered web server (home use only) that writes FFMPEG commands to a file for later processing by a more powerful machine. Is it possible to write a shell script, to run on the more powerful machine, that will read the first line of the queue (all relevant files are available on a Samba share), remove that line from the file, execute it, and repeat until the file is empty? I could then run this on multiple computers to encode at the same time (I have potentially a couple of thousand files to encode). All machines are running current Debian.
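A minimal sketch of such a worker loop might look like this (the queue path is a made-up placeholder; note that the head + sed pair is not atomic, so two machines draining the same file at once can occasionally grab the same line):

```shell
#!/bin/sh
# Hypothetical queue location on the Samba mount -- adjust to your share.
QUEUE=${QUEUE:-/mnt/share/ffmpeg-queue.txt}

# drain_queue: run the first queued command, remove it, repeat until empty.
# NB: head + sed is not atomic, so two machines running this concurrently
# may take the same line.
drain_queue() {
    while [ -s "$QUEUE" ]; do
        cmd=$(head -n 1 "$QUEUE")   # read the first line of the queue
        sed -i '1d' "$QUEUE"        # remove that line from the file
        sh -c "$cmd"                # execute the encode command
    done
}

drain_queue
```

Each worker machine would run this same script against the shared queue file.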
It's certainly possible, but it might require some custom processing. Also, you'll have to be careful that both systems don't try to update the file at once. I'm not sure how well CIFS handles file locking, so that might be an issue.
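If the script does run on several machines at once, the read-and-delete step needs to be serialized. One hedged option is to wrap it in flock(1) -- though, as above, flock's behaviour over a CIFS mount is not guaranteed, so test it on your setup first. Paths here are hypothetical:

```shell
#!/bin/sh
# Hypothetical paths on the shared mount. flock over CIFS is not
# guaranteed to be reliable -- verify on your mount before trusting it.
QUEUE=${QUEUE:-/mnt/share/ffmpeg-queue.txt}
LOCK=${LOCK:-/mnt/share/ffmpeg-queue.lock}

# pop_line: print and remove the first queue line under an exclusive
# lock, so no two workers can take the same line.
pop_line() {
    (
        flock -x 9               # block until we hold the lock
        head -n 1 "$QUEUE"       # hand the first line to the caller
        sed -i '1d' "$QUEUE"     # and drop it from the queue
    ) 9>"$LOCK"
}
```

A worker would then call pop_line in a loop and run whatever it returns.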
You might want to look at free distributed queuing systems such as TORQUE, Open Grid Engine, or SLURM. They're not quite designed for your use case, but they could be used. I think most big render farms use DrQueue, so that one might be worth checking out too, but I have no personal experience with it.
Thanks btmiller, I'll check out those queuing systems.
To avoid the possible issue of two machines accessing the file at once, I could flip the whole thing around a bit. If the server pinged a list of possible IPs and from that created a list of active machines, it could iterate through them over SSH: first test each client machine's 5-minute average CPU load using uptime; if it is less than, say, 80%, send the FFMPEG command and then move on to the next IP address. If all client machines are over 80% load, wait 60 seconds and repeat.
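A rough sketch of that dispatcher, assuming the SSH keys are already in place. The host list, queue path, and threshold are all made up, and note that uptime reports load averages rather than a CPU percentage, so the threshold below is a load figure, not 80%:

```shell
#!/bin/sh
# All of these values are hypothetical placeholders.
HOSTS="192.168.0.10 192.168.0.11"
QUEUE=${QUEUE:-/var/spool/ffmpeg-queue.txt}
MAX_LOAD=4    # skip a host whose 5-minute load average exceeds this

# five_min_load: pull the 5-minute figure out of an uptime(1) line.
five_min_load() {
    printf '%s\n' "$1" | awk -F'load average: ' '{print $2}' \
        | cut -d, -f2 | tr -d ' '
}

# dispatch_loop: hand queue lines to idle, reachable hosts until empty.
dispatch_loop() {
    while [ -s "$QUEUE" ]; do
        sent=0
        for host in $HOSTS; do
            [ -s "$QUEUE" ] || break
            # Only consider hosts that answer a single ping.
            ping -c 1 -W 1 "$host" >/dev/null 2>&1 || continue
            load=$(five_min_load "$(ssh "$host" uptime)")
            # Floating-point comparison via awk: skip busy hosts.
            awk -v l="$load" -v m="$MAX_LOAD" 'BEGIN{exit !(l<m)}' || continue
            cmd=$(head -n 1 "$QUEUE")
            sed -i '1d' "$QUEUE"
            ssh "$host" "$cmd" &    # run the encode in the background
            sent=1
        done
        [ "$sent" = 1 ] || sleep 60   # everyone busy: wait and retry
    done
}

dispatch_loop
```

Since only the server ever touches the queue file, this sidesteps the shared-file locking question entirely.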
It would mean setting up pre-shared SSH keys, which I've never done, but there are countless tutorials online.
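The key setup is a one-time job per client, roughly as follows. The client address is a placeholder, and the key is generated without a passphrase so the script can log in unattended:

```shell
#!/bin/sh
# One-time setup on the server. "user@192.168.0.10" is a placeholder.

# 1. Generate a key pair with no passphrase, for unattended use.
mkdir -p "$HOME/.ssh"
chmod 700 "$HOME/.ssh"
[ -f "$HOME/.ssh/id_ed25519" ] || \
    ssh-keygen -q -t ed25519 -N "" -f "$HOME/.ssh/id_ed25519"

# 2. Install the public key on each client (asks for its password once):
#      ssh-copy-id user@192.168.0.10
# 3. Verify -- this should now log in without a password prompt:
#      ssh user@192.168.0.10 uptime
```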