Old 06-22-2006, 07:36 PM   #1
adilturbo
Member
 
Registered: Jun 2006
Location: morocco
Posts: 195

Rep: Reputation: 30
Dividing a program into modules


Hi all,
I have divided my program into modules. The main program uses functions that are defined in separate .cpp files (e.g. function1.cpp, function2.cpp, ...); for example, in main I have something like this:

Code:
int main() {
    int i = function1();        // use function1 from function1.cpp
    bool is_ok = function2();   // use function2 from function2.cpp
    // ...
    return 0;
}
So how can I tell the main program to call these functions when each one lives in its own .cpp file?

I heard that this can be done by "making" my program.
Any ideas or suggestions are welcome.

Thank you.
 
Old 06-22-2006, 07:57 PM   #2
taylor_venable
Member
 
Registered: Jun 2005
Location: Indiana, USA
Distribution: OpenBSD, Ubuntu
Posts: 892

Rep: Reputation: 43
Normally the way one does this is by putting the prototypes for the functions in header (.h) files, which are then included (#include "header.h") in any file that wants to call those functions. So you'd have function1.h and function2.h (which contain the prototypes, e.g. "int foo(int bar);") and then function1.cpp and function2.cpp (which contain the real definitions, e.g. "int foo(int bar) { return bar; }"). Your function1.cpp and function2.cpp have to #include their respective headers as well. Then in your main file you include both headers ("#include "function1.h"" and "#include "function2.h"" on separate lines) and compile all the source (.cpp) files together into one binary. Or, compile each .cpp file individually into an object file, then link all the object files together (again, just use g++ with the file names) into a single binary.
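
For example, here is a minimal sketch of that layout using the function names from the original post; the bodies and return values are just placeholders:
Code:
// function1.h -- prototype only, guarded against double inclusion
#ifndef FUNCTION1_H
#define FUNCTION1_H
int function1();
#endif

// function1.cpp -- the real definition; it includes its own header
#include "function1.h"
int function1() {
    return 7;   // placeholder value
}

// function2.h
#ifndef FUNCTION2_H
#define FUNCTION2_H
bool function2();
#endif

// function2.cpp
#include "function2.h"
bool function2() {
    return true;   // placeholder value
}

// main.cpp -- needs only the prototypes, not the definitions
#include "function1.h"
#include "function2.h"
int main() {
    int i = function1();        // use function1 from function1.cpp
    bool is_ok = function2();   // use function2 from function2.cpp
    return 0;
}
Compiling everything together, e.g. g++ main.cpp function1.cpp function2.cpp -o main, then produces a single binary.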

When doing this it may help to remember that the compiler reads each source file from top to bottom, which means you have to give a prototype for a function before you can (safely) use it. With this method, you aren't really telling your program which file that function lives in; you're just telling it how to pass parameters and what to expect back. Which function actually gets executed (or, more correctly, how the function gets executed) is determined at link time, when all dependencies are uniquely and finally resolved.

The process of "making" a program is more general, and usually refers to using a Makefile along with the Make program to generate a binary. A Makefile is like a recipe for building a complex program based on dependencies. More info can be found by doing an internet search for "GNU Make".
 
Old 06-23-2006, 12:42 AM   #3
elyk1212
Member
 
Registered: Jan 2005
Location: Chandler, AZ USA
Distribution: Mandrake/Mandriva 10.2
Posts: 186

Rep: Reputation: 30
Hi adilturbo, is there a reason you want to have the functions separated from the main .cpp file? For an assignment, fun, tinkering, or something? I am just asking since it looks like you want just one function in each file.
 
Old 06-23-2006, 09:57 AM   #4
adilturbo
Member
 
Registered: Jun 2006
Location: morocco
Posts: 195

Original Poster
Rep: Reputation: 30
Hi elyk1212, I'm trying to divide my program into modules because that is what is usually advised. Why?
Suppose you have a big project with hundreds of lines of code: while compiling you get hundreds of errors, and correcting them all at once is hard. With modules the work is easier, I mean debugging and testing a single function: you can easily use a driver for each function, which means writing a short auxiliary program whose purpose is to provide the necessary input for the function, call it, and evaluate the result.
So by using drivers each function can be isolated and studied by itself, and thereby errors can often be spotted quickly.
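
For illustration, here is a minimal sketch of such a driver for function1() from the first post; the expected value it checks is purely hypothetical:
Code:
// driver1.cpp -- a short auxiliary program that exercises function1() in isolation
#include <iostream>
#include "function1.h"

int main() {
    int result = function1();    // call the function under test
    std::cout << "function1() returned " << result << std::endl;
    // evaluate the result; the expected value 7 is only an example
    return (result == 7) ? 0 : 1;
}
Built on its own (for example with g++ driver1.cpp function1.cpp -o driver1), it tests the function without involving the rest of the program.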

Divide and conquer is the best approach to complex problems.

Hi taylor_venable, I am going to try the ideas you gave me and report back.

Thank you.

Last edited by adilturbo; 06-23-2006 at 10:00 AM.
 
Old 06-23-2006, 01:49 PM   #5
sundialsvcs
LQ Guru
 
Registered: Feb 2004
Location: SE Tennessee, USA
Distribution: Gentoo, LFS
Posts: 10,649
Blog Entries: 4

Rep: Reputation: 3934
You can see this sort of thing throughout, say, the Linux source code, where related subroutines are put into separate modules, but they are grouped logically, not "one file per routine."

In the C language, file boundaries are significant because "private" routines and variables can be declared which are visible only to other routines in the same source file.
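
A small sketch of what that looks like in practice (the file and function names here are invented for illustration): marking a function or variable static gives it internal linkage, so it is visible only inside its own source file.
Code:
// helpers.cpp -- only do_work() is visible to other source files
static int call_count = 0;      // file-local ("private") variable

static void log_call() {        // file-local ("private") helper
    ++call_count;
}

int do_work() {                 // declared in a header, callable from other files
    log_call();
    return call_count;
}
Other files can call do_work() through a prototype in a header, but any attempt to reference log_call() or call_count from outside this file will fail to link.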
 
Old 06-23-2006, 01:55 PM   #6
elyk1212
Member
 
Registered: Jan 2005
Location: Chandler, AZ USA
Distribution: Mandrake/Mandriva 10.2
Posts: 186

Rep: Reputation: 30
You're right, it is always best (for me too, anyhow) to debug one piece of functionality and one function at a time. However, I usually keep the functions I am testing within the same file (if appropriate, e.g. similar functionality). This is especially the case if I am using an OO approach.

There are definitely cases where I could see having many separate files (I do so all the time); however, I was asking what your design/requirement thoughts were, since it appears you have only one function per file. It seems like an unusual way to do things. I would likely use the different files to organize related tools, in a sort of hierarchical way. But even if these individual functions are several thousand lines long, debugging should be no problem if they are contained in a file with related functions and utilities. You would just do as you were mentioning, and isolate test cases to each function one at a time. Most compile-time errors are conveniently reported by line number, making their resolution a 'trivial' task (as always, calling any debugging 'trivial' is hopeful at best).
 
Old 06-23-2006, 02:00 PM   #7
elyk1212
Member
 
Registered: Jan 2005
Location: Chandler, AZ USA
Distribution: Mandrake/Mandriva 10.2
Posts: 186

Rep: Reputation: 30
Quote:
Originally Posted by sundialsvcs
You can see this sort of thing throughout, say, the Linux source code, where related subroutines are put into separate modules, but they are grouped logically, not "one file per routine."

In the C language, file boundaries are significant because "private" routines and variables can be declared which are visible only to other routines in the same source file.
Yes, yes, my point exactly. I also had not even thought about the scope of name resolution, but this is important too, as you will be 'extern'ing your brains out for constants and any global symbols you may have. Separate files where 'appropriate'.
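
For illustration (the names config.h and verbosity are invented here), sharing a global across source files usually means one definition in a single .cpp file plus an extern declaration in a header that every user includes:
Code:
// config.h -- declaration only; included by every file that needs the global
#ifndef CONFIG_H
#define CONFIG_H
extern int verbosity;   // "an int named verbosity is defined in exactly one .cpp file"
#endif

// config.cpp -- the one and only definition
#include "config.h"
int verbosity = 0;

// logger.cpp -- any other file just includes the header and uses the symbol
#include "config.h"
void set_verbose() {
    verbosity = 2;
}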
 
Old 06-23-2006, 06:05 PM   #8
adilturbo
Member
 
Registered: Jun 2006
Location: morocco
Posts: 195

Original Poster
Rep: Reputation: 30
Thank you so much taylor_venable, it works with g++ by compiling all the source files together into one main binary.
Thank you all.

One more question: imagine we have a lot of files: func1.cpp, func2.cpp, func3.cpp, ...
Do I have to type
Quote:
g++ func1.cpp func2.cpp func3.cpp ... -o main
?
Don't you find that tedious?

Last edited by adilturbo; 06-23-2006 at 06:09 PM.
 
Old 06-23-2006, 06:25 PM   #9
taylor_venable
Member
 
Registered: Jun 2005
Location: Indiana, USA
Distribution: OpenBSD, Ubuntu
Posts: 892

Rep: Reputation: 43
It would be very tedious (and a huge pain) to have to type each source file individually. In that case you could use a shell glob (I don't think it's a very good idea, but you could: `g++ *.cpp -o main`), or use Make.

With Make, you have a couple of options: the first is to use a suffix rule as shown at http://www.gnu.org/software/make/man...l#Suffix-Rules (this may not be a good idea because it's rather inflexible), and the second is to simply list each file along with its prerequisites and the compilation rules that apply to it.

This last option is the best, because the Makefile (when carefully and correctly written) will ensure that a minimum number of compilation steps is performed to get the correct result. Plus you always know exactly what is going to be compiled and in what order. The downside is that you have to write a rule for each source file, but the upside is that you only have to write it once.
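
As a minimal sketch of that last approach, using the file names mentioned in this thread (the -Wall flag and the intermediate object files are illustrative assumptions, and recipe lines must start with a tab):
Code:
# Makefile -- one explicit rule per target, listing its prerequisites
CXX = g++
CXXFLAGS = -Wall

main: main.o func1.o func2.o
	$(CXX) $(CXXFLAGS) main.o func1.o func2.o -o main

main.o: main.cpp func1.h func2.h
	$(CXX) $(CXXFLAGS) -c main.cpp

func1.o: func1.cpp func1.h
	$(CXX) $(CXXFLAGS) -c func1.cpp

func2.o: func2.cpp func2.h
	$(CXX) $(CXXFLAGS) -c func2.cpp
Typing make then rebuilds only the pieces whose prerequisites have changed since the last build.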
 
  

