LinuxQuestions.org (/questions/)
-   Programming (http://www.linuxquestions.org/questions/programming-9/)
-   -   Converting the contents of a column with the output of the date command using awk (http://www.linuxquestions.org/questions/programming-9/converting-the-contents-of-a-column-with-the-output-of-the-date-command-using-awk-906094/)

mystupidquestion 10-02-2011 03:00 PM

Converting the contents of a column with the output of the date command using awk
 
Hello,
I have a beginner text processing/bash/awk question. I have a text file of several columns, the output of sar. Here is a sample:

01:50:01 PM all 2.01 0.00 0.57 1.08 0.00 96.33
02:00:01 PM all 0.55 0.00 0.24 0.16 0.00 99.05
02:10:01 PM all 0.22 0.00 0.17 0.16 0.00 99.45
02:20:02 PM all 0.43 0.91 1.59 3.08 0.00 93.99


Column one has the time in the 12 hour format. I need the format of each item in column one to be in the 24 hour format. I know the date command will convert this for me:

date --date="07:40:01 PM" +%T

Returns: 19:40:01

I *think*, awk is the tool I need to use here. My game plan is: for each line, get the value of column 1. Run the value through the date command, and output the result plus the rest of the line to standard out or a file.
I've gotten awk to hand me the time in 24 hour format, but I'm stuck on using the date command on the returned field within awk. How do I do this? I looked in awk's man page, and I see there is a system function, but it returns the exit value. I want what's on standard out, not the exit status, and I want to append the other columns to the output. Any pointers/advice is appreciated.
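That game plan can also be sketched in plain bash, in case it helps clarify the idea (assuming GNU date and a sample file named sarfile, which is just a stand-in name):

```shell
# Read each line, splitting off the time and AM/PM fields;
# convert them with date and re-emit the rest of the line unchanged.
while read -r t ampm rest; do
    printf '%s %s\n' "$(date --date="$t $ampm" +%T)" "$rest"
done < sarfile
```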

colucix 10-02-2011 03:11 PM

Code:

awk '{("date --date=" $1 $2 " +%T") | getline time; print time}' file
This uses 4.9.6 Using getline into a Variable from a Pipe from the GNU awk manual.
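One caveat worth adding (not part of the post above): each distinct command string opens a pipe that awk keeps open, so on long inputs it is safer to build the command once and close() the pipe after reading. A variant:

```shell
# Build the command string once, read its output into a variable,
# then close the pipe so awk does not accumulate open file descriptors.
awk '{cmd = "date --date=" $1 $2 " +%T"
      cmd | getline time
      close(cmd)
      print time}' file
```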

Tinkster 10-02-2011 03:33 PM

Slight modification:

Code:

awk '{("date --date=" $1 $2 " +%T") | getline $1;$2=""; print $0}' file
13:50:01  all 2.01 0.00 0.57 1.08 0.00 96.33
14:00:01  all 0.55 0.00 0.24 0.16 0.00 99.05
14:10:01  all 0.22 0.00 0.17 0.16 0.00 99.45
14:20:02  all 0.43 0.91 1.59 3.08 0.00 93.99



Cheers,
Tink

mystupidquestion 10-02-2011 04:06 PM

Great. So this means....
 
Thanks, so, am I correct in interpreting this statement:
awk '{("date --date=" $1 $2 " +%T") | getline $1;$2=""; print $0}' file

means:
For each line, send the first and second columns' entries (time plus AM/PM) to date. Pipe the date command's output to getline, and assign the result to column 1, right? Then the value of column 2 (AM or PM) is set to a null string. Is $field="" the preferred way to remove a column? Are the other columns shifted left when you do this? Are there not two field separators left behind when you set $2 to ""? Does it even matter? Lastly, print $0 tells awk to print the current line, after your modifications.

Thanks!

Tinkster 10-02-2011 04:22 PM

Quote:

Originally Posted by mystupidquestion (Post 4488349)
Thanks, so, am I correct in interpreting this statement:
awk '{("date --date=" $1 $2 " +%T") | getline $1;$2=""; print $0}' file

means:
For each line, send the first and second columns' entries (time plus AM/PM) to date. Pipe the date command's output to getline, and assign the result to column 1, right? Then the value of column 2 (AM or PM) is set to a null string. Is $field="" the preferred way to remove a column? Are the other columns shifted left when you do this? Are there not two field separators left behind when you set $2 to ""? Does it even matter? Lastly, print $0 tells awk to print the current line, after your modifications.

Thanks!

You got that all right.

You gain one extra space from the $2="", but from awk's perspective that won't matter,
as it treats any run of whitespace (spaces and tabs) as a separator by default.

If you're concerned, you could explicitly print all the fields you need
individually. There's always more than one way to do it ;}
Code:

awk '{("date --date=" $1 $2 " +%T") | getline time;print time" "$3" "$4" "$5" "$6" "$7" "$8" "$9}' file
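A loop over the remaining fields is yet another variant (same idea, same assumed input), and avoids hard-coding $3 through $9 so the column count of the sar output no longer matters:

```shell
# Convert the timestamp, then print every field from the third onward,
# separated by single spaces.
awk '{cmd = "date --date=" $1 $2 " +%T"
      cmd | getline time
      close(cmd)
      printf "%s", time
      for (i = 3; i <= NF; i++) printf " %s", $i
      print ""}' file
```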

