Any way to tell if a compile was successful in the past?
Hi folks,
I wonder if it's possible to tell whether a compile has already been done?
Assume you have a shell script where you fetch various sources and build them. While getting the script to behave nicely you start it over and over again, and it compiles everything over and over again. Can gcc shorten this process? Or would I have to check for a specific file to tell that the program is already compiled?
Quote:
While getting the script to behave nicely you start it over and over again, and it compiles over and over again. Can gcc shorten this process?
You have heard of "make", haven't you? With a properly written makefile, source files that have already been compiled successfully won't be re-compiled. But stuff that failed to compile will be.
I can't tell from your description above if you are trying to debug a problem compiling your source code, or debug a problem in your makefile dependencies. If you've got a buggy makefile, you could have all kinds of failures - things could re-compile needlessly, things that still need to be compiled might be missed, etc. You have to set up your dependencies correctly in the makefile.
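As a minimal sketch of what haertig describes (the file names and targets here are hypothetical, not from the thread), a makefile with explicit dependencies lets make compare timestamps and skip anything that is already up to date:

```makefile
# Hypothetical project: one binary built from two C sources.
CC     = gcc
CFLAGS = -Wall -O2

coolproject: main.o util.o
	$(CC) $(CFLAGS) -o $@ main.o util.o

# Each object depends on its source (and a shared header).
# make rebuilds a target only if a prerequisite is newer.
main.o: main.c util.h
	$(CC) $(CFLAGS) -c main.c

util.o: util.c util.h
	$(CC) $(CFLAGS) -c util.c

.PHONY: clean
clean:
	rm -f coolproject *.o
```

With a makefile like this, running make a second time without touching any source does nothing; with GNU make it reports that the target is up to date.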
@haertig it's not a problem with the makefile or the build not finishing. Let me explain a bit better what I'm after.
Code:
#!/bin/bash
# download some c sources we need for our project
wget some.sourcecode/coolproject
wget some.even/coolerproject
wget this.project/is/the/bomb
# untar the projects
...
# cd into each dir and compile
cd coolproject
make && make install
# create some very cool stuff using the compiled sources
do some_crazy stuff
# we are done now
echo "your überproject is done sire"
Take this pseudo pseudo code. We get some stuff and compile it. Everything compiles fine. Just at the point of "do some_crazy stuff" we refine and sharpen our shell script some more. So every time we run this script, the compilation is done too. And that's the point where I need info. On the first run, just compile everything. But on the second run I would not need to compile because it has already been compiled. But as far as I've seen, the compile starts over on every call to the script. No real problem, just time consuming.
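One direct answer to "would I have to check for a specific file" is yes: test for the file the build produces before compiling. A hedged sketch (the path coolproject/coolbinary is hypothetical; substitute whatever your build actually outputs or installs):

```shell
#!/bin/sh
# Sketch: decide whether to compile by checking for the build's
# output file. "coolproject/coolbinary" is a hypothetical path.
action="compile"
if [ -x coolproject/coolbinary ]; then
    action="skip"
fi

if [ "$action" = "compile" ]; then
    echo "compiling coolproject"
    # ( cd coolproject && make && make install )
else
    echo "coolproject already built, skipping"
fi
```

The real make/make install is commented out so the sketch runs anywhere; the point is only the existence check.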
Hmm, my choice would be to not have all that stuff in one script. Downloading source is one thing, compiling it is another, and executing the results is a third - I don't put all those separate operations into one script myself. But you obviously know what you're doing and have a reason for it.
make will not normally re-compile stuff that doesn't need re-compiling (given that you have a well written makefile). However, you are re-downloading and re-untarring your source each time, so make sees that as newly updated source and thus recompiles it.
Even if you can control source file timestamps so that make doesn't re-compile, you will still be wasting a lot of time re-downloading and re-untarring.
What I would do is remove the downloading, untarring and make'ing from your script. Use the script only to execute the result. And then put all the other operations into a separate script that is invoked via cron at some "who cares?" time in the middle of the night. You could improve that cron-initiated script to only re-download when a new version of the source is available, or you could just leave it inefficiently doing things over and over again in the middle of the night (when nobody really cares anyway).
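The cron half of that suggestion could look something like this (a sketch only - the script name, paths, and the 03:30 schedule are placeholders, and the command is run through sh so $HOME expands):

```
# Hypothetical crontab entry: fetch and build at 03:30 every night.
# Fields: minute hour day-of-month month day-of-week command
30 3 * * *  $HOME/bin/fetch-and-build.sh >> $HOME/build.log 2>&1
```

Your main script then only executes the result, and never waits on downloads or compiles.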
Quote:
Originally Posted by haertig
Hmm, my choice would be to not have all that stuff in one script. Downloading source is one thing, compiling it is another, and executing the results is a third - I don't put all those separate operations into one script myself. But you obviously know what you're doing and have a reason for it.
Seems I can hide my incapacity quite well. That would be my approach too, but right now I'm using someone else's script, so I first need to understand it completely before I rip it apart. Also not sure if it's worth it.
Quote:
Originally Posted by haertig
make will not normally re-compile stuff that doesn't need re-compiling (given that you have a well written makefile). However, you are re-downloading and re-untarring your source each time, so make sees that as newly updated source and thus recompiles it.
The actual script uses git, so it's not re-downloading everything. I was just oversimplifying my pseudo pseudo script. The "well written makefile" got me started: there is a make clean at the start of every build. That's fine on the first run, but it destroys everything on further compiles, IIRC. Guess I'll tackle that after breaking the script into pieces.
Quote:
Originally Posted by haertig
Even if you can control source file timestamps so that make doesn't re-compile, you will still be wasting a lot of time re-downloading and re-untarring.
What I would do is remove the downloading, untarring and make'ing from your script. Use the script only to execute the result. And then put all the other operations into a separate script that is invoked via cron at some "who cares?" time in the middle of the night. You could improve that cron-initiated script to only re-download when a new version of the source is available, or you could just leave it inefficiently doing things over and over again in the middle of the night (when nobody really cares anyway).
There is a make clean at the start of every build.
Well, that would do it. "make clean" removes all the intermediate files - object files, etc. - that make uses to determine whether something needs to be recompiled. When you run "make clean", yeah, you're telling make to recompile everything again.
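One way around the unconditional make clean is to make it opt-in, so normal runs keep make's incremental behaviour. A sketch (script name and usage are hypothetical; the real make calls are commented out so it runs standalone):

```shell
#!/bin/sh
# Sketch: run "make clean" only when explicitly requested.
# Usage: ./build.sh [clean]
do_clean=no
if [ "$1" = "clean" ]; then
    do_clean=yes
fi

if [ "$do_clean" = "yes" ]; then
    echo "wiping previous build first"
    # make clean
fi
echo "building"
# make && make install
```

Called with no arguments, it just builds; ./build.sh clean forces the full rebuild when you actually want one.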
Thanks for clarifying. So I found the bad guy. As I hope this is the last test run of the script with my changes, I can start taking it apart and putting markers on individual steps so they can be skipped if already done.
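Those "markers on individual steps" can be sketched with stamp files: a step writes an empty file on success, and later runs skip any step whose stamp exists. Everything here (the run_once helper, the .stamps directory, the step names) is hypothetical, and dummy echo commands stand in for the real download/compile steps:

```shell
#!/bin/sh
# Sketch: stamp files mark steps already done, so re-runs skip them.
STAMPDIR=.stamps
mkdir -p "$STAMPDIR"

# run_once NAME CMD...  runs CMD only if NAME's stamp is missing,
# and writes the stamp only if CMD succeeds.
run_once() {
    step=$1; shift
    if [ -f "$STAMPDIR/$step" ]; then
        echo "step '$step' already done, skipping"
    else
        "$@" && touch "$STAMPDIR/$step"
    fi
}

run_once download echo "downloading sources"
run_once compile  echo "compiling sources"
run_once compile  echo "compiling sources"   # skipped: stamp exists
```

To force one step to re-run, delete its stamp file; to force everything, remove the .stamps directory.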