[SOLVED] Remove the duplicate and count the line!!
Thanks for your guidance on this new beginning in my Linux world. :-)
Code:
$TTL 900
biz. IN SOA a.anish.biz. hostmaster.anish.biz. (
45266628 ; Serial
900 ; Refresh
900 ; Retry
604800 ; Expire
86400 ) ; Minimum
;
; Generation start time = SEP-06-2011 21:06:03
; TLD RECORDS
a.anish.biz. 518400 IN A 165.154.124.45
b.anish.biz. 518400 IN A 165.154.125.45
; A RECORDS
NS1.000A IN A 209.190.16.82
NS2.000A IN A 72.36.219.162
;
; NS RECORDS
0--0 IN NS DOCS03.RZONE.DE.
0--0 IN NS SHADES04.RZONE.DE.
0--1 IN NS 01.DNSV.JP.
0--1 IN NS 02.DNSV.JP.
ZZZZZZZZZZZZZZZZZZ IN NS DOCS09.RZONE.DE.
ZZZZZZZZZZZZZZZZZZ IN NS SHADES17.RZONE.DE.
; Generation end time = SEP-06-2011 21:07:42
; END OF FILE
From this sample file I need the same output as before, but this file is quite different from the old one.
I want output like this:
Code:
0--0
0--1
ZZZZZZZZZZZZZZZZZZ
Total domains 3
I am working on this. I can remove the duplicates, but I am not able to skip the first and last few lines, so they also end up in the count. :-(
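One way to sidestep the header and footer lines entirely (a sketch, assuming the zone file is saved as zone.txt, and that the domain lines are exactly the ones whose third field is NS): comment lines start with ";" and the SOA/A records never have NS as the third field, so none of them get counted.

```shell
# Print each NS owner name once and count the unique ones.
# Comment lines (";"), the SOA line, and the A records are skipped
# automatically because their third field is not "NS".
awk '$3 == "NS" && !seen[$1]++ { print $1; n++ }
     END { print "Total domains", n }' zone.txt
```

Run against the sample file above, this prints 0--0, 0--1, ZZZZZZZZZZZZZZZZZZ, and then "Total domains 3".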
Basically, the idea is that you need to find whatever item is common to all the files you are looking at, or a group of items
which are unique (these can be separated like /COM|NS/), and place that at the start as I have shown.
Now I just need an idea to complete my script.
The process is: we download a zone file, count the values, then compare the counts from the current and previous zone files; only then do we know how many new domains there are.
for example
I have two files, named
08092011.com
09092011.com
In the file 08092011.com, I run:
Code:
awk '/Total/' u.txt | cut -f2 -d: | sed 's/^ *//g'
Using this command I get just the value:
4
and in 09092011.com, using the same process, I get the value
6
Now I want the output to be the subtraction: file2 - file1 = 2.
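The subtraction itself can be sketched like this (assuming the count is the last field of each file's "Total" line; the filenames are the ones from the example above):

```shell
# Pull the count (last field of the Total line) out of each file,
# then let the shell do the arithmetic.
v1=$(awk '/Total/ { print $NF }' 08092011.com)
v2=$(awk '/Total/ { print $NF }' 09092011.com)
echo "new domains: $((v2 - v1))"
```

With 4 in the first file and 6 in the second, this prints "new domains: 2".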
But I can only do this process manually. Suppose I download a new file on Sep 10; that file will be named 10092011.com,
and then the same process needs to compare these two files:
09092011.com
10092011.com
I don't know how to find the latest file to compare against the newly downloaded one. I need your guidance to complete this script.
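One way to pick the two newest files automatically (a sketch, assuming the names always have the form DDMMYYYY.com): rewrite each date as YYYYMMDD so that a plain textual sort equals date order, then keep the last two entries.

```shell
# List the .com zone files, prefix each name with its date rewritten
# as YYYYMMDD, sort textually (now the same as date order), and keep
# the two newest names.
ls *.com |
  sed 's/^\(..\)\(..\)\(....\)\.com$/\3\2\1 &/' |
  sort | tail -2 | cut -d' ' -f2
```

With 08092011.com, 09092011.com, and 10092011.com present, this prints 09092011.com and 10092011.com, which is exactly the pair to feed into the comparison step.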
awk '/Total/{if(tot)print "diff is",tot - $NF;else tot = $NF}' $(</var/tmp/v1) $(</var/tmp/v2)
Using this code I am able to get the output,
and
Code:
awk 'NR==FNR{a[$0];next} $0 in a' file1 file2
This is the command for intersecting two files, right? The reason I ask is that I did the same intersection process using a database, but the two results differ. Please share your ideas, and thanks for your patience in answering my questions so far. I need your guidance to resolve this thread.
Your second awk prints the lines in file2 that are exactly the same as lines in file1. If the lines differ in any way,
they will not be printed. So essentially, yes, it shows you the lines that are the same.
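To see the idiom in action, here is a tiny demonstration (the file names and contents are made up for illustration):

```shell
# NR==FNR is true only while awk reads the first file: those lines
# are stored as keys in array a, and "next" skips the print.
# For the second file, a line is printed only if it also appeared
# in the first file.
printf 'aaa\nbbb\nccc\n' > file1
printf 'bbb\nccc\nddd\n' > file2
awk 'NR==FNR { a[$0]; next } $0 in a' file1 file2
# prints:
# bbb
# ccc
```

Note that the comparison is on whole lines, byte for byte, so trailing spaces or different field order will make two "equal" records fail to match; that may explain why your database intersection gives a different count.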