Programming | This forum is for all programming questions. The question does not have to be directly related to Linux and any language is fair game.
05-14-2009, 11:28 PM | #1
Member | Registered: Apr 2009 | Location: Boston MA | Distribution: CentOS 6.2 x86_64 GNU/Linux | Posts: 59
calculate the average of cells in columns in separate txt files
Hi there,
I have file1.txt that contains:
and file2.txt that contains:
Code:
10.5
6
7.5
10.5
10.5
Both files have the same number of rows, only one column.
Now I'm looking for a script that can write me file_avg.txt that contains the average of the cells in the columns above.
One more thing: I have 5 files, each with one column as above, and I'd like to get an average across all of them (and if this works, maybe more files).
Any help is appreciated a lot!
Last edited by Mike_V; 05-15-2009 at 01:31 PM.
05-15-2009, 12:06 AM | #2
Senior Member | Registered: Aug 2006 | Posts: 2,697
What have you tried?
05-15-2009, 12:15 AM | #3
Member | Registered: Apr 2009 | Location: Boston MA | Distribution: CentOS 6.2 x86_64 GNU/Linux | Posts: 59 | Original Poster
I've tried searching the forum and checked the "similar threads" below... but no luck.
Other than that, nothing. I'm hoping there is a relatively easy solution, and that one of you has that "easy" solution... I know it can be done with Matlab, but I'd have to learn basic Matlab first... which may take a lot of time.
Last edited by Mike_V; 05-15-2009 at 12:16 AM.
05-15-2009, 12:30 AM | #4
Senior Member | Registered: Aug 2006 | Posts: 2,697
You have been exposed to shell scripting in your previous posts; I don't believe you can't at least produce something now. Remember awk? You can use it to do calculations like this. There is also the bc tool, and programming languages like Python and Perl can be used too. What have you learnt so far, 20+ posts later?
05-15-2009, 12:47 AM | #5
Member | Registered: Apr 2009 | Location: Boston MA | Distribution: CentOS 6.2 x86_64 GNU/Linux | Posts: 59 | Original Poster
c'mon...
05-15-2009, 02:00 AM | #6
LQ Guru | Registered: Aug 2004 | Location: Sydney | Distribution: Rocky 9.x | Posts: 18,443
Give it a try. We'll help you debug, but we're not going to do it for you.
05-15-2009, 02:12 AM | #7
Senior Member | Registered: Sep 2003 | Posts: 3,171
It certainly is an easy enough problem. I'd probably do it in C just because I could do it fastest that way. When I say that I must also note that I have "in the can" routines to search the directory for the relevant files, build a list (including their sizes), then open them one by one. But it is also easy enough in bash, and trivial in PHP, Perl, or Python.
05-15-2009, 03:17 AM | #8
LQ Guru | Registered: Oct 2005 | Location: $RANDOM | Distribution: slackware64 | Posts: 12,928
I suggest using 'paste' to paste the files together, then it should be easy to use 'awk' to find the averages on a per-line basis.
05-15-2009, 08:49 AM | #9
Member | Registered: Apr 2009 | Location: Boston MA | Distribution: CentOS 6.2 x86_64 GNU/Linux | Posts: 59 | Original Poster
OK, OK: after a night of sleep:
Code:
paste file1.txt file2.txt | awk '{ sum = $1 + $2 ; avg = sum / 2 ; print avg }' > file_avg.txt
and it didn't take much sleep, as you can imagine... just the gawk manual
http://www.gnu.org/software/gawk/manual/gawk.html
and some little adjustments.
Cheers!
Last edited by Mike_V; 05-18-2009 at 09:46 AM.
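For the "one more thing" above (5 or more files), the same paste-into-awk idea generalizes by looping over all fields with awk's built-in NF instead of hardcoding $1 and $2. A minimal sketch, using made-up sample values (the thread's real file contents aren't shown):

```shell
# Sample single-column files (values invented for illustration).
printf '1\n2\n3\n' > file1.txt
printf '3\n4\n5\n' > file2.txt
printf '5\n6\n7\n' > file3.txt

# paste joins the files line by line; awk then averages however many
# columns each line has (NF = number of fields), so adding more files
# to the paste command needs no change to the awk program.
paste file1.txt file2.txt file3.txt |
  awk '{ sum = 0; for (i = 1; i <= NF; i++) sum += $i; print sum / NF }' > file_avg.txt

cat file_avg.txt   # → 3, 4, 5 (one per line)
```

The per-line loop means the command works unchanged for 2, 5, or 50 pasted files, which covers the "maybe more files" case as well.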
05-15-2009, 11:04 AM | #10
LQ 5k Club | Registered: Aug 2005 | Distribution: OpenSuse, Fedora, Redhat, Debian | Posts: 5,399
Nicely done, Mike_V. Not only did you accomplish your objective, but by my estimation you pretty much nailed the definitive solution. Sometimes a little RTFMing goes a long way.
--- rod.
05-15-2009, 11:26 AM | #11
Senior Member | Registered: May 2005 | Posts: 4,481
Quote:
Originally Posted by Mike_V
OK, OK: after a night of sleep:
paste file1.txt file2.txt | awk '{ sum = $1 + $2 ; avg = sum / 2 ; print avg }' > file_avg.txt
and it didn't take much sleep, as you can imagine... just the gawk manual
http://www.gnu.org/software/gawk/manual/gawk.html
and some little adjustments.
Cheers!
Once in Oregon I saw a license plate surrounded by plastic frame with "True men read manuals" on it.
05-15-2009, 12:23 PM | #12
LQ Guru | Registered: May 2005 | Location: boston, usa | Distribution: fedora-35 | Posts: 5,330
Quote:
Originally Posted by Mike_V
Hi there,
I have file1.txt that contains:
and file2.txt that contains:
Code:
10.5
6
7.5
10.5
10.5
Both files have the same number of rows, only one column.
Now I'm looking for a script that can write me file_avg.txt that contains the average of the cells in the columns above:
One more thing: I have 5 files, each with one column as above, and I'd like to get an average (and if this works, maybe more files)
Any help is appreciated a lot!
I don't see how the average of 4 and 6 is 10. Do you mean you want the sum of the two fields?
___________________
Quote:
Originally Posted by Mike_V
OK, OK: after a night of sleep:
paste file1.txt file2.txt | awk '{ sum = $1 + $2 ; avg = sum / 2 ; print avg }' > file_avg.txt
and it didn't take much sleep, as you can imagine... just the gawk manual
http://www.gnu.org/software/gawk/manual/gawk.html
and some little adjustments.
Cheers!
Also, +1 for at least making an attempt.
Last edited by schneidz; 05-15-2009 at 12:26 PM.
05-15-2009, 01:17 PM | #13
LQ Guru | Registered: Oct 2005 | Location: $RANDOM | Distribution: slackware64 | Posts: 12,928
Quote:
Originally Posted by schneidz
I don't see how the average of 4 and 6 is 10. Do you mean you want the sum of the two fields?
___________________
also, + 1 for at least making an attempt.
That was probably a mistake; the rest makes sense.
Also, good job on finding a solution yourself; it isn't all that hard.
05-15-2009, 01:37 PM | #14
Member | Registered: Apr 2009 | Location: Boston MA | Distribution: CentOS 6.2 x86_64 GNU/Linux | Posts: 59 | Original Poster
Quote:
Originally Posted by schneidz
I don't see how the average of 4 and 6 is 10. Do you mean you want the sum of the two fields?
also, + 1 for at least making an attempt.
yeah... (6+4)/2=5, sorry.
Quote:
Originally Posted by Sergei Steshenko
Once in Oregon I saw a license plate surrounded by plastic frame with "True men read manuals" on it.
If all manuals were as clear as that gawk one, I would agree.
05-15-2009, 01:42 PM | #15
Senior Member | Registered: May 2005 | Posts: 4,481
Quote:
Originally Posted by Mike_V
yeah... (6+4)/2=5, sorry.
If all manuals were as clear as that gawk one, I would agree.
If all people first at least tried to read manuals and then asked questions on things that were not clear to them, I would agree with you.