Programming This forum is for all programming questions.
The question does not have to be directly related to Linux and any language is fair game.
10-08-2013, 03:08 PM
#1
Member
Registered: Feb 2008
Distribution: Oracle Linux
Posts: 32
CSV formatting
Hi,
I have a CSV like the one below:
USA
1 newyork
2 wasington
UK
1 London
2 Manchester
Japan
1 Tokyo
I need to format it like this:
1 USA Newyork
2 USA wasington
3 UK LONDOn
4 UK Manchester
5 Japan Tokyo
Preferably in shell script. Can anyone please help me?
10-08-2013, 03:33 PM
#2
LQ Guru
Registered: Sep 2009
Location: Perth
Distribution: Manjaro
Posts: 10,007
Well I am not seeing many c's in the c sv, but that aside, what have you done so far to solve the problem?
1 member found this post helpful.
10-08-2013, 04:30 PM
#3
Senior Member
Registered: Apr 2010
Location: Apex, NC, USA
Distribution: Mint 17.3
Posts: 1,881
With this InFile ...
Code:
USA
1 New York
2 Washington
3 Chicago
4 Los Angeles
UK
1 London
2 Manchester
Japan
1 Tokyo
... this
awk ...
Code:
awk '{if ($0!~/[0-9]/) country=$0; else {$1=$1" "country; print}}' $InFile >$OutFile
... produced this OutFile ...
Code:
1 USA New York
2 USA Washington
3 USA Chicago
4 USA Los Angeles
1 UK London
2 UK Manchester
1 Japan Tokyo
Daniel B. Martin
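One difference from the original request: this output restarts the numbering for each country, while the OP's example used a single running count (1 through 5). A variant with one continuous counter, as an editorial sketch rather than code from the thread ($InFile and $OutFile are placeholders as in Daniel's post):

```shell
# Lines without a leading digit name a country; remember it and skip.
# For city lines, replace the per-country number with one running counter
# followed by the remembered country, then print the rebuilt record.
awk '$0 !~ /^[0-9]/ {country=$0; next}
     {$1 = ++n " " country; print}' $InFile >$OutFile
```

With the OP's sample input this yields "1 USA newyork" through "5 Japan Tokyo".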
10-12-2013, 01:32 PM
#4
Member
Registered: Feb 2008
Distribution: Oracle Linux
Posts: 32
Original Poster
Rep:
Thanks, Daniel. Can you please help me remove the trailing "," characters from lines like the one below:
2013-10-13 06:00:00, USA, NEWYORK,123,,,,,,,,,,,,
10-12-2013, 02:02 PM
#5
Senior Member
Registered: Apr 2010
Location: Apex, NC, USA
Distribution: Mint 17.3
Posts: 1,881
Quote:
Originally Posted by
scream
Thanks, Daniel. Can you please help me remove the trailing "," characters from lines like the one below:
2013-10-13 06:00:00, USA, NEWYORK,123,,,,,,,,,,,,
Try this ...
Daniel B. Martin
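The code block in this post did not survive the archive. Judging from the OP's reply in post #6 and Daniel's own post #10, the suggestion was a sed that strips trailing commas; a minimal sketch of that approach (OutFile stands for whichever file holds the comma-padded lines):

```shell
# Delete any run of commas anchored at the end of each line; output goes
# to stdout. With -i, sed would edit the named file in place instead.
sed -e 's/,*$//' OutFile
```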
10-13-2013, 04:09 PM
#6
Member
Registered: Feb 2008
Distribution: Oracle Linux
Posts: 32
Original Poster
Thanks again, Daniel. I used "sed -i -e 's/,*$//' OutFile" but didn't get any changes.
10-13-2013, 04:19 PM
#7
LQ Guru
Registered: Jul 2003
Location: Birmingham, Alabama
Distribution: SuSE, RedHat, Slack,CentOS
Posts: 26,634
Quote:
Originally Posted by
scream
Thanks again, Daniel. I used "sed -i -e 's/,*$//' OutFile" but didn't get any changes.
Well, as grail asked you: can you show us what YOU have done/tried to solve this?? Can you not experiment and think about the solutions/tools given to you so far, to find the solution on your own?
That's the point of homework, isn't it?
10-13-2013, 07:14 PM
#8
Senior Member
Registered: Apr 2010
Location: Apex, NC, USA
Distribution: Mint 17.3
Posts: 1,881
Quote:
Originally Posted by
scream
I used "sed -i -e 's/,*$//' OutFile" but didn't get any changes.
Take a close look at your code. The
sed needs some kind of input to chew on. Where did you give it the input?
Daniel B. Martin
10-14-2013, 01:55 PM
#9
Member
Registered: Feb 2008
Distribution: Oracle Linux
Posts: 32
Original Poster
Quote:
Originally Posted by
TB0ne
Well, as grail asked you: can you show us what YOU have done/tried to solve this?? Can you not experiment and think about the solutions/tools given to you so far, to find the solution on your own?
That's the point of homework, isn't it?
Dear TB0ne,
I am trying to develop a script to format a CSV file (cu.csv) and load it into a database. My file looks like the one below:
cu.csv
(group--121 XYZ) ,,,,,,,,,,,,,,,,,,,,,,,,,
2013-09-12 01:00:00+06:00,9.95,20.61,463,2.15,4.45,6.60,1,15,,,,,,,,,,,,,,,,,
2013-09-12 02:00:00+06:00,5.1,13.09,463,1.10,2.83,3.93,1,15,,,,,,,,,,,,,,,,,
2013-09-12 03:00:00+06:00,2.36,7.88,463,0.51,1.70,2.21,0,15,,,,,,,,,,,,,,,,,
2013-09-12 04:00:00+06:00,1.65,6.21,463,0.36,1.34,1.70,0,15,,,,,,,,,,,,,,,,,
I need the output (NFILE) to look like:
NFILE
2013-09-12 01:00:00 121 9.95 20.61 463 2.15 4.45 6.60 1 15
2013-09-12 02:00:00 121 5.1 13.09 463 1.10 2.83 3.93 1 15
I used the code below to format the output:
awk '{if ($1!~/[0-9]/) c=$2; else {$1=$1","c","; print $0}}' cu.csv >OutFile
sed -i -e 's/group--//g' OutFile
sed -i -e 's/ //g' OutFile
sed -i -e 's/+06:00//g' OutFile
After all this, so far I have:
(OutFile)
2013-09-12,121,01:00:00,9.95,20.61,463,2.15,4.45,6.60,1,15,,,,,,,,,,,,,,,,,
2013-09-12,121,02:00:00,5.1,13.09,463,1.10,2.83,3.93,1,15,,,,,,,,,,,,,,,,,
2013-09-12,121,03:00:00,2.36,7.88,463,0.51,1.70,2.21,0,15,,,,,,,,,,,,,,,,,
2013-09-12,121,04:00:00,1.65,6.21,463,0.36,1.34,1.70,0,15,,,,,,,,,,,,,,,,,
I need to remove the "," separators and re-arrange the fields. I tried the code below, but the first row remains the same:
awk '{FS=",";print $1"\t"$3"\t"$2"\t"$4"\t"$5"\t"$6"\t"$7"\t"$8"\t"$9"\t"$10"\t"$11"\t"$12;}' OutFile >NFile
(NFile)
2013-09-12,121,01:00:00,9.95,20.61,463,2.15,4.45,6.60,1,15,,,,,,,,,,,,,,,,,
2013-09-12 02:00:00 121 5.1 13.09 463 1.10 2.83 3.93 1 15
2013-09-12 03:00:00 121 2.36 7.88 463 0.51 1.70 2.21 0 15
2013-09-12 04:00:00 121 1.65 6.21 463 0.36 1.34 1.70 0 15
Can you tell me where I missed something?
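The reason the first row survives untouched is an awk subtlety: the script above assigns FS inside the main action block, but by the time that assignment runs, the first record has already been split using the default whitespace separator. Assigning FS in a BEGIN block, before any input is read, applies it to every record. A minimal illustration of the difference (an editorial sketch, not code from the thread):

```shell
# FS set in the action block only takes effect from the *next* record, so
# the first line is still split on whitespace. Setting FS in BEGIN applies
# the comma separator to every record, including the first:
awk 'BEGIN{FS=","; OFS="\t"} {print $1, $3, $2}' OutFile >NFile
```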
---------- Post added 10-15-13 at 12:56 AM ----------
Quote:
Originally Posted by
danielbmartin
Take a close look at your code. The sed needs some kind of input to chew on. Where did you give it the input?
Daniel B. Martin
Martin,
OutFile is the input file. I tried to modify that.
10-14-2013, 02:16 PM
#10
Senior Member
Registered: Apr 2010
Location: Apex, NC, USA
Distribution: Mint 17.3
Posts: 1,881
Quote:
Originally Posted by
scream
OutFile is the input file. I tried to modify that.
Starting with InFile2 ...
Code:
2013-09-12,121,01:00:00,9.95,20.61,463,2.15,4.45,6.60,1,15,,,,,,,,,,,,,,,,,
2013-09-12,121,02:00:00,5.1,13.09,463,1.10,2.83,3.93,1,15,,,,,,,,,,,,,,,,,
2013-09-12,121,03:00:00,2.36,7.88,463,0.51,1.70,2.21,0,15,,,,,,,,,,,,,,,,,
2013-09-12,121,04:00:00,1.65,6.21,463,0.36,1.34,1.70,0,15,,,,,,,,,,,,,,,,,
... this code ...
Code:
Path=$(cut -d'.' -f1 <<< ${0})
InFile2=$Path"inp2.txt"
sed -i -e 's/,*$//' $InFile2
... changed InFile2 to this ...
Code:
2013-09-12,121,01:00:00,9.95,20.61,463,2.15,4.45,6.60,1,15
2013-09-12,121,02:00:00,5.1,13.09,463,1.10,2.83,3.93,1,15
2013-09-12,121,03:00:00,2.36,7.88,463,0.51,1.70,2.21,0,15
2013-09-12,121,04:00:00,1.65,6.21,463,0.36,1.34,1.70,0,15
This code snippet assumes you have the files InFile2 and the script itself in the same folder.
Daniel B. Martin
10-14-2013, 03:08 PM
#11
Senior Member
Registered: Oct 2008
Distribution: Debian sid
Posts: 2,683
Code:
awk 'BEGIN{FS=","}{print $1"\t"$3"\t"$2"\t"$4"\t"$5"\t"$6"\t"$7"\t"$8"\t"$9"\t"$10"\t"$11"\t"$12;}' OutFile >NFile
If you don't mind trailing tabs:
Code:
awk 'BEGIN{FS=",";OFS="\t"}{t=$3;$3=$2;$2=t;print}' OutFile
If you don't want the trailing tabs:
Code:
awk 'BEGIN{FS=","}{t=$3;$3=$2;$2=t}{printf $1}{for (i=2;i<12;i++)printf "\t"$i}{printf "\n"}' OutFile
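Both trailing-tab-free variants hard-code the field count at 12. An alternative sketch (mine, not from the thread) truncates the record instead, by assigning to NF; decreasing NF this way is supported by gawk and mawk, though not guaranteed by every historical awk:

```shell
# Swap fields 2 and 3, keep only the 11 real fields, and let OFS supply
# the tabs when the record is rebuilt and printed.
awk 'BEGIN{FS=","; OFS="\t"} {t=$3; $3=$2; $2=t; NF=11; print}' OutFile >NFile
```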
10-16-2013, 02:29 PM
#12
Member
Registered: Feb 2008
Distribution: Oracle Linux
Posts: 32
Original Poster
Thanks, Firerat and Martin. My problem is solved.