I have a script that works fine when run sequentially. It calls several sub-scripts that do the legwork. The sub-scripts do NOT depend directly on one another, but they do seem to be interested in a few of the same .a files.
I keep running into an inconsistent (but frequent) problem where a .a file cannot be found by scons, even though when I look, it is there.
I am sending 8 scripts to the background and waiting for them all to complete. Different scripts return the error each time, but they all run into the same issue with this file (and occasionally a different error), and it changes each time I run it.
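Roughly, the launcher does something like this (heavily simplified; the script names here are just placeholders):

#!/bin/bash
# Simplified sketch of the launcher: start the sub-scripts in the
# background, then wait for all of them to finish.
for script in sub1.sh sub2.sh sub3.sh sub4.sh sub5.sh sub6.sh sub7.sh sub8.sh; do
    ./"$script" &
done

# 'wait' with no arguments blocks until every background job has exited
wait
echo "all sub-scripts finished"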
Could this be related to running the scripts in parallel?
Any ideas on how I can diagnose this as well?
I am running CentOS 7 in vSphere with 8 processors and 32 GB of memory available to it.
Could this be related to running the scripts in parallel?
Yes. When you run processes in parallel, you have to allow for the fact that any file written to by more than one process can only safely be accessed by one process at a time. Each process using such a file has to obtain exclusive control of it before writing and release that control when it is finished with the file.
You can use either lock files or mutex locks, but every process has to use the same method, i.e. you can't mix lock files and mutexes for the same item being controlled.
The following links will give you a grounding in the theory of how lock files and mutexes work. Note that mutexes can be used for a wide range of non-shareable items in addition to files.
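As an illustration, here is one way to do the lock-file approach from a shell script with flock(1); the lock-file path and the work inside the critical section are only placeholders:

#!/bin/bash
# Sketch: serialize access to a shared file with flock(1).
# Every script that touches the shared file must agree on the
# same lock-file path (the one below is arbitrary).
LOCKFILE=/tmp/mybuild.lock

(
    flock -x 9              # take an exclusive lock on fd 9; blocks until free

    # ----- critical section: safe to use the shared file here -----
    cp libfoo.a /tmp/work/  # placeholder for the real work

) 9>"$LOCKFILE"
# The lock is released automatically when fd 9 is closed.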
Thanks Steve. After reading up on file locking, I think I see what my problem is. I do not need to write to the .a file, I only need to read from it, so it surprises me that I cannot have multiple processes looking at the same file at the same time, since all the data stays the same. Unless the first process to access the file puts a lock on it? I do not know whether that is the case or how to tell.
Besides diving (very) deep into the code and putting a checkFileIsFree function before each access, is there a way I can manage the processes accessing the same file at the same time, like a queue or something?
Thanks,
Scott
How big is the file? Would it be reasonable to slurp the entire file into a variable, Perl-style? If not, there are other techniques.
You should probably redirect the STDERR and STDOUT output of each process to its own file so that you can see what they might be reporting. Be sure to code them in such a way that they don't "assume" that something will work "merely because it always did."
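For example, in the launcher (the script and log names are made up):

# Give each background sub-script its own stdout/stderr log so you
# can see exactly which one reported the error.
mkdir -p logs
for script in sub1.sh sub2.sh sub3.sh sub4.sh sub5.sh sub6.sh sub7.sh sub8.sh; do
    ./"$script" > "logs/${script%.sh}.out" 2> "logs/${script%.sh}.err" &
done
wait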
It is possible to open a file for shared access, provided that you use locks or other means to synchronize access to it and you make sure to flush buffers to disk before releasing control.
If you want several processes to read a file at the same time, you may need to specify "shared access" when opening the file.
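From a shell script, one way to express that is a shared lock for readers and an exclusive lock for whatever rewrites the file; the lock path and the commands below are only placeholders:

LOCKFILE=/tmp/libfoo.lock

# Readers take a shared lock (-s): any number of them can hold it at once.
flock -s "$LOCKFILE" -c 'nm libfoo.a > /dev/null'    # placeholder read

# The writer/rebuilder takes an exclusive lock (-x) and waits for the readers.
flock -x "$LOCKFILE" -c 'ar r libfoo.a newobj.o'     # placeholder write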
Besides diving (very) deep into the code and putting a checkFileIsFree function before each access, is there a way I can manage the processes accessing the same file at the same time, like a queue or something?
Yes, you can use mutexes. The advantage of a mutex is that it is all done in memory, which is much faster than writing lock files to a hard drive and then deleting them later.
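If these are plain shell scripts, one rough way to get that effect is to keep the lock on a tmpfs filesystem such as /dev/shm, so nothing is written to disk; the path and the critical-section command are just placeholders:

# Approximate an in-memory mutex from shell: lock a file that lives
# on tmpfs (/dev/shm), so no disk writes are involved. Every script
# must agree on the same path (this one is arbitrary).
MUTEX=/dev/shm/libfoo.mutex

(
    flock -x 9          # acquire the "mutex"
    scons libfoo.a      # placeholder for the real build step
) 9>"$MUTEX"            # released when fd 9 is closed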