Programming
This forum is for all programming questions. The question does not have to be directly related to Linux, and any language is fair game.
I have a CSV file containing people with addresses.
From this file I need to create 2 output files:
1) people who live in a single person household (one person per address)
2) people who live with other people at the same address
(I do not want to filter out duplicates, just split people into 2 different output files.)
The address is identified by columns 11-16.
What's the best way forward?
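Assuming "columns 11-16" means character positions within each record, the address key can be pulled out with cut — a minimal sketch using hypothetical records and a hypothetical file name:

```shell
# Hypothetical fixed-width records where characters 11-16 hold the address key
printf '%s\n' 'Ann Smith AAA111 rest' 'Bob Jones AAA111 rest' > people.csv

# -c selects character positions 11-16 (not comma-separated fields),
# printing just the 6-character address key from each line
cut -c11-16 people.csv
```

If "columns" instead means comma-separated fields, `cut -d, -f11-16` would be the analogous form.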
Help us to help you. Provide a sample input file (10-15 lines will do). Construct a sample output file which corresponds to your sample input and post both samples here. With "InFile" and "OutFile" examples we can better understand your needs and also judge if our proposed solution fills those needs.
Your description contains ambiguities, at least to my eye. Reading it literally, both output files contain only personal names, and no addresses. Is that what you really want?
Quote:
I suspect it's a case for awk but I am not particularly good at awk scripts.
... and then say:
Quote:
I have already isolated the addresses with cut.
It's a shame that grep -c doesn't work in conjunction with -f, returning a count for each line in the filter file!
Reading these, it is hard to work out whether you have made an attempt, or whether you are stuck but don't want to show where.
Again, provide examples, as suggested by Daniel, but then also show what code you have already written and where you are stuck.
This is not your first post, so one would think you know how to ask questions, and what the usual response is from the people on these forums when you don't seem to show any attempt.
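The per-address tally that the quoted grep -c / -f remark is reaching for comes straight from `sort | uniq -c` on the address field — a sketch, assuming the address is the second comma-separated field as in the sample data in this thread:

```shell
# Sample name,address records from the thread
cat > people.csv <<'EOF'
Jackson Browne,13 Park Ave
Hank Williams,713 Violet Ave
Lou Reed,5 Lynn Rd
Gene Vincent,5 Lynn Rd
Ray Charles,13 Park Ave
Tom Petty,3333 Morningside Dr
Carly Rae Jepsen,9 Chestnut Ln
Willie Nelson,713 Violet Ave
Bob Dylan,713 Violet Ave
EOF

# uniq -c prefixes each distinct address with its occurrence count,
# which is exactly the per-line count grep -c -f cannot give
cut -d, -f2 people.csv | sort | uniq -c
```

Piping that through `awk '$1 == 1'` keeps only the single-occupant addresses.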
This InFile ...
Code:
Jackson Browne,13 Park Ave
Hank Williams,713 Violet Ave
Lou Reed,5 Lynn Rd
Gene Vincent,5 Lynn Rd
Ray Charles,13 Park Ave
Tom Petty,3333 Morningside Dr
Carly Rae Jepsen,9 Chestnut Ln
Willie Nelson,713 Violet Ave
Bob Dylan,713 Violet Ave
... this code ...
Code:
awk -F, -v W1="$Singles" -v W2="$Cohabs" \
    '{a[$2] = a[$2] ", " $1; c[$2]++}
     END {for (j in a) {OutRec = j " => " substr(a[j], 3)
          if (c[j] == 1) print OutRec > W1
          else print OutRec > W2}}' "$InFile"
echo "Singles ..."; cat "$Singles"; echo "End Of File"
echo "Cohabs ..."; cat "$Cohabs"; echo "End Of File"
... produced this result ...
Code:
Singles ...
9 Chestnut Ln => Carly Rae Jepsen
3333 Morningside Dr => Tom Petty
End Of File
Cohabs ...
5 Lynn Rd => Lou Reed, Gene Vincent
713 Violet Ave => Hank Williams, Willie Nelson, Bob Dylan
13 Park Ave => Jackson Browne, Ray Charles
End Of File
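If the goal is instead to keep the original CSV records in the two output files ("just split people into 2 different output files") rather than the aggregated address => names lines, a two-pass awk is one option — a sketch keying on field 2 like the awk above, with hypothetical output names singles.csv and cohabs.csv:

```shell
# Sample name,address records from the thread
cat > people.csv <<'EOF'
Jackson Browne,13 Park Ave
Hank Williams,713 Violet Ave
Lou Reed,5 Lynn Rd
Gene Vincent,5 Lynn Rd
Ray Charles,13 Park Ave
Tom Petty,3333 Morningside Dr
Carly Rae Jepsen,9 Chestnut Ln
Willie Nelson,713 Violet Ave
Bob Dylan,713 Violet Ave
EOF

# Pass 1 (NR == FNR) counts residents per address; pass 2 routes
# each record, unchanged, according to that count.
awk -F, 'NR == FNR {c[$2]++; next}
         {print > (c[$2] == 1 ? "singles.csv" : "cohabs.csv")}' \
    people.csv people.csv
```

Reading the same file twice avoids holding whole records in memory and preserves the input order within each output file.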