[SOLVED] Rewrite a program using the 'Unix philosophy'
A few years ago I wrote a program in Python/PyGTK for transferring mp3 albums stored on my hard drive to my mp3 player (it started off as a way to ensure that an album's tracks were copied to the player in order for correct playback, but grew into letting me select albums by id3 tags, i.e. genre, artist etc.).
I've been meaning for a year to add new features, but I have been put off because the program's structure is a bit of a mess and I haven't done anything in Python/PyGTK for ages, so I would need to go through PyGTK again to see how it works. Since then I have got into using the command line as much as possible, as well as Vim, and this has got me thinking about other possibilities for interacting with programs. I also recently came across the Unix philosophy of program design, and it opened my eyes to designing programs that are broken down into small, easily maintained sub-programs, using text streams where possible (if my understanding of it is correct). With this in mind, and since my mp3 transfer program isn't too big, it might be better to rewrite it than to go through it figuring out how it works and where to make changes.
The changes I am thinking of making are as follows:
* Use C programming language (I've been tinkering with it on and off and this will give me an opportunity to do something with it)
* The mp3 id3 tag information is currently stored in a MySQL database. I intend to replace that with a text file and use AWK to extract the information.
* Write a program that scans all mp3 albums on my hard drive and store it in a text file.
* Write a program that searches for any new mp3 albums and updates text file.
* Write a program that takes input via the command line, i.e. an id3 tag value, searches the text file, and outputs all albums (including their file locations) whose tag values match the inputted tag.
* Write a main program that uses the above programs and copies the selected album to mp3 player.
As I will be using external programs like AWK and Sed, it should mean I will use less code and hopefully make it easier to maintain in future.
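As a rough sketch of the text-file-plus-AWK step: the snippet below assumes a tab-separated index with artist, album, genre and path columns, a format chosen here purely for illustration (the toy data stands in for a real library).

```shell
#!/bin/sh
# Toy id3 index, tab-separated: artist, album, genre, path.
# (This file format is an assumption made for the example.)
idx=/tmp/id3index.txt
printf '%s\t%s\t%s\t%s\n' \
    'Miles Davis' 'Kind of Blue' 'Jazz' '/music/kob/01.mp3' \
    'Miles Davis' 'Kind of Blue' 'Jazz' '/music/kob/02.mp3' \
    'Nirvana'     'Nevermind'    'Rock' '/music/nm/01.mp3'  > "$idx"

# Print every album (with file location) whose genre matches a tag value
awk -F'\t' -v g='Jazz' '$3 == g { print $2 "\t" $4 }' "$idx"
```

The same one-liner, with a different field number or `g` value, covers artist or album lookups as well, which is much of what the planned search program needs.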
Obviously I will have to sit down and do a bit of thinking in terms of design before I attempt it, but in the meantime I was wondering: would it be better to write the main program in C, or to write it as a Bash shell script?
Bash, like other interpreted/scripting languages, is sometimes also referred to as a prototyping language.
Meaning you can quickly flesh out an idea and have a working prototype much more easily than by going directly into a language like C.
Also, if you wish, you can then write the program in C after you have a working Bash version.
This will ease the writing process: with a complete and functioning layout you can go element by element until you have written away the need for any Bash code or any calls out to awk or sed.
You talk of tinkering on and off with C and I think this would be a great way to develop your skills further in an approachable way.
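To illustrate that bash-first approach, here is one way the pipeline might be prototyped; every name, path, and line of toy data below is a placeholder, not taken from the original program.

```shell
#!/bin/sh
# Throwaway prototype of the transfer pipeline. The index file,
# source tracks, and "player" directory are all fake placeholders.
INDEX=/tmp/proto_index.txt
SRC=/tmp/proto_src
PLAYER=/tmp/proto_player

mkdir -p "$SRC"
echo x > "$SRC/01.mp3"
echo x > "$SRC/02.mp3"
printf '%s\t%s\t%s\t%s\n' \
    'A' 'Demo Album' 'Rock' "$SRC/01.mp3" \
    'A' 'Demo Album' 'Rock' "$SRC/02.mp3" > "$INDEX"

list_albums() {               # every distinct album in the index
    awk -F'\t' '{ print $2 }' "$INDEX" | sort -u
}

tracks_of() {                 # track paths for album $1, in index order
    awk -F'\t' -v a="$1" '$2 == a { print $4 }' "$INDEX"
}

copy_album() {                # copy track by track so play order survives
    mkdir -p "$PLAYER/$1"
    tracks_of "$1" | while read -r f; do cp "$f" "$PLAYER/$1/"; done
}

list_albums
copy_album 'Demo Album'
```

Once a skeleton like this works end to end, each function is a natural candidate to be rewritten as a small C program later.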
Last edited by cin_; 02-25-2012 at 04:30 AM.
Reason: gramm`err
Quote:
Obviously I will have to sit down and do a bit of thinking in terms of design before I attempt it, but in the meantime I was wondering: would it be better to write the main program in C, or to write it as a Bash shell script?
IMHO, if you intend to call external programs from your main, it is better to write it as a shell script (or perhaps Python, Perl, or something like that). While C has means of launching external programs, I have always found it somewhat clumsy. In C, you would be better off using functions from libraries you can link into your program. Also, if you intend to use your main program as a wrapper/interface for those external apps and all the work will be performed by them, there's little performance to be gained by writing it in C.
Quote:
I also recently came across the Unix philosophy of program design, and it opened my eyes to designing programs that are broken down into small, easily maintained sub-programs, using text streams where possible (if my understanding of it is correct).
I don't think that the point of this philosophy is to break your own programs down into idiosyncratic single-purpose programs, but rather to supply an OS end-user with a set of simple, single-purpose tools with which to build more complex functionality. This makes scripting more powerful because the smaller programs provide more advanced "generic" functionality to otherwise-skeletal scripting languages. With that in mind, breaking your existing code down into smaller parts could be helpful, but putting those parts into separate executable files only makes sense if you can achieve more robust functionality by executing those parts from within a scripting language. That probably isn't the case if it's practical to use C to string together all of the smaller executables. Sometimes it's useful to have a text file with bash functions that you . from multiple front-end scripts. Of course, there are some front-end programs like k3b and kvpnc that really only execute command-line programs, but those programs are independently useful.
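The shared-function-file idea mentioned above might look like the following sketch; the file name and the helper functions are made up for illustration.

```shell
#!/bin/sh
# A file of common helpers that several front-end scripts can share.
# (The name mp3lib.sh and these helpers are invented for the example.)
cat > /tmp/mp3lib.sh <<'EOF'
die()     { echo "error: $*" >&2; exit 1; }
slugify() { echo "$1" | tr 'A-Z ' 'a-z_'; }
EOF

# Each front-end script then begins by sourcing it with `.`:
. /tmp/mp3lib.sh
slugify 'Kind of Blue'        # prints kind_of_blue
```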
Kevin Barry
Where I am coming from with the 'Unix design philosophy': my existing program was based around having a small program which could get information from a master file containing all the id3 tags of the mp3 files stored on my hard drive. With that in place, I could then write another program which would use it for the mp3 transfer. It got me thinking that I could then reuse the id3 tag program in other scripts in the future if I needed it, for instance for batch tag maintenance or even a custom-built mp3 player.
I was leaning towards using a scripting language, preferably Bash, for the main program, but the problem is how I would interact with it. At the moment my existing program has a list tree structure. I've started using the Vimperator plugin for Firefox and am quite impressed with the way that, when you type into its command line, it displays a list of auto-completed commands. I thought that would be a great approach, along with keyboard combinations. I'm not sure if there is a graphical toolkit for the CLI, like Zenity, that could achieve this?
Check out this post and thread for some interaction tips.
Basically you will want to have all of your individual scripts in one folder, and make sure the scripts have execute privileges...
Code:
# chmod a+x FILE
... the `a` means all users and groups (if you need to, you can specify further; see `man chmod`), and `x` is for execute.
... add that folder to your PATH and then you can call the scripts from the command line of your terminal (rxvt, aterm, gnome-terminal, etc.) like any other program.
At the command line there is an auto-complete feature: while you type the name of the program you would like to run, hitting the Tab key once will complete any unique string, and hitting it twice will show all possible completions.
To call another script that you have written, and whose folder has been added to PATH, from a script you are writing, use backticks ``. This tells Bash to run whatever is between them as a command, and it returns the standard output.
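For example, the backtick form next to the equivalent `$( )` form, which nests more cleanly and is generally preferred in new scripts:

```shell
#!/bin/sh
# Capture a command's standard output into a variable, two ways.
greeting=`echo hello`     # backticks: run the command, keep its stdout
same=$(echo hello)        # identical result, modern nestable syntax
echo "$greeting $same"    # prints: hello hello
```

Here `echo hello` stands in for any script on your PATH whose output you want to capture.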
I'll immediately say this: use Perl, Ruby, Python, PHP, Haskell ... basically anything but bash-scripts or straight-C.
You have a marvelous feature in the shell: #!commandname, otherwise known as "shebang."
With it, you can write "a command" in any language that you please, and the end-user will never know the difference. But each of these languages provides, in addition to a "real" programming environment, access to various large libraries of well-tested software for doing just about everything. ("I can write an entire web server in one line...")
Instead of building from scratch, you stand upon the shoulders of giants.
I've found bash scripts the easiest to maintain and reuse. If I don't like the script I cut out what I need, paste it in a new script and I'm done. In C ... well, let's just say if you're an expert, you can do about the same.
I disagree with sundialsvcs, there's nothing wrong with C and bash. I use them both and they supplement each other very well. You can use them together to surpass the efficiency, maintainability and readability of any of those languages listed there. It will also be 100% backwards compatible if written properly ... the same cannot be said for those listed.
My only mild self-defense would be this ... "Language Implementors are Gods."
Some of the very best "C"-programming out there, anywhere, will be found in the guts of a commercial-grade language implementation. And you can leverage all of that, "effortlessly."
Furthermore: these languages have vast libraries of bulletproof code, covering just about any subject imaginable, which you can simply "install and use."
The execution-time "price" that you pay is, frankly, inconsequential. "At X-billion operations per second, no one can hear you scream."
Hence... while I am not attempting in any way whatsoever to "refute" what has previously been said, I do cordially submit that, "I rest my case." Try it for yourself, as so many others have done, and very quickly you will see precisely what I mean. (And, the technical limits and tradeoffs will be instantaneously apparent.) There is ample room for both opinions, of course: both of them are worthy; and they are truly in parallel, not in conflict.
One of the single most important maxims that I ever encountered, and which I found in the pages of a book that described a Paradox for DOS tool-set, was this TRVTH:
Quote:
Actum Ne Agas: Do Not Do A Thing Already Done.
If there is to be any one overwhelming feature of the Linux/Unix environments, versus Microsoft Windows (say... and, "no hard feelings, Redmond...") it is the enormous power of "shebang," and the plethora of absolutely zero-cost languages that can be used freely thereby.
I dig the debate, but hardly understand comp sci arguments.
Let the code speak for itself.
You swear by Perl, Python, PHP, C... I can agree on all accounts.
But let your argument be a small code sample implementing your defense.
This would allow the OP, and future readers brought here by search, not only to have an introduction to how to begin the work, but also to choose the syntax they would be most comfortable with.
Last edited by cin_; 02-28-2012 at 11:06 PM.
Reason: gramm`err
Thanks everybody for providing additional comments.
I've been giving this some more thought and I've kind of come full circle. I think the best way forward is just to create a standalone program that can take input from command-line arguments, query an SQL database containing the id3 tags, then output the results. I can then redesign my program so that it calls this external program and puts the output into an array or other variable for use.
While I appreciate that the real issue is that my original program is not properly designed, as a casual programmer I like the idea of having a program that does one specific job with an interface I understand, i.e. command-line arguments and text output, so that I can take it and quickly write scripts for whatever I require. I understand the same could apply to program libraries, but I sometimes don't do any coding for months, and it is kind of hard to get motivated and dive into APIs to figure out how they work and fit them into the program I want.
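A minimal sketch of that query tool's text interface might look like this; the table and column names are guesses, not taken from any actual schema.

```shell
#!/bin/sh
# Build the SQL that the standalone query tool would send to the
# database. Table name "tracks" and its columns are assumptions.
sql_for() {    # sql_for FIELD VALUE
    printf "SELECT album, path FROM tracks WHERE %s = '%s';\n" "$1" "$2"
}

sql_for genre Jazz
# prints: SELECT album, path FROM tracks WHERE genre = 'Jazz';
```

The real tool would pipe statements like this into `mysql` or `sqlite3`, and a calling script could then capture the rows with command substitution, e.g. `albums=$(tagquery -g Jazz)` (where `tagquery` is a hypothetical name for the finished program).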
I dig the debate, but hardly understand comp sci arguments.
...
But let your argument be a small code sample implementing your defense.
There really isn't anything to "defend." Just an aside. My only two points are, first, that you have lots of languages to choose from, all of them free, all of them equally available. And, second, that it is the surrounding libraries of code... written by other people, and rigorously tested in production by other people... that are the real pay-dirt of the entire proposition. The code that you actually have to write ought to be minimal; even sparse. The payback is huge, and it will certainly change your entire approach to many problems (including this one).
For example, in the case of Perl: http://search.cpan.org/search?query=mp3&mode=all. There you go... at the moment, 247 modules to choose from. ("And that's just for Perl.") You know that all of them will only install themselves on your system if every one of perhaps hundreds of rigorous self-tests have passed ... both for themselves and for every module they might depend on. You can search to see every bug-report anyone has ever posted about them, on any environment. You can look through the self-tests that are automatically run, all the time, on every such module, in every major computing environment that's out there. In other words: you know that they will work, and that they will work on your system, because they just did.
And so, you begin your project from that starting point rather than a blank screen.
"Priceless."
Last edited by sundialsvcs; 02-29-2012 at 08:54 AM.