Perl > don't overwrite or stop if already present.
Hi, I have a pcap reader I wrote in Perl, but I am very new to Perl. I have been running the script on tcpdump captures manually and deleting the old ones after they are loaded into my table in MySQL. The code does all this, but I need help to either have it move each file after it has been dumped, or have the code skip a file that has already been added.
Could someone help me understand which method should be used, and help by adding it to my code?
MY ORIGINAL PCAP READER:
Code:
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use Net::TcpDumpLog;
use NetPacket::Ethernet;
use NetPacket::IP;
use NetPacket::TCP;

# Log in to MySQL
my $dbh = DBI->connect('DBI:mysql:events:10.1.10.129', 'root', 'root')
    or die "Could not connect to database: $DBI::errstr";

# Prepare the insert once, with placeholders, so packet fields are
# passed as bound values instead of being interpolated into the SQL
my $sth = $dbh->prepare(
    'INSERT INTO TCPdump (Date, Source, Destination, Packets, Port, Server)
     VALUES (?, ?, ?, ?, ?, ?)'
);

my $dir = 'C:/Documents and Settings/jordant/Desktop/Dump';
opendir(DIR, $dir) or die "Cannot open $dir: $!";
while (my $file = readdir(DIR)) {
    # Use a regular expression to find files ending in .pcap
    next unless $file =~ m/\.pcap$/;
    my $log = Net::TcpDumpLog->new();
    $log->read("$dir/$file");
    # Info from the pcap file, one packet at a time
    foreach my $index ($log->indexes) {
        my ($length_orig, $length_incl, $drops, $secs, $msecs) = $log->header($index);
        my $data = $log->data($index);
        my $eth_obj = NetPacket::Ethernet->decode($data);
        next unless $eth_obj->{type} == NetPacket::Ethernet::ETH_TYPE_IP;
        my $ip_obj = NetPacket::IP->decode($eth_obj->{data});
        next unless $ip_obj->{proto} == NetPacket::IP::IP_PROTO_TCP;
        my $tcp_obj = NetPacket::TCP->decode($ip_obj->{data});
        # Get the date/time stamp of the packet
        # (localtime only uses whole seconds, so $msecs is ignored)
        my ($sec,$min,$hour,$mday,$mon,$year,$wday,$yday,$isdst) = localtime($secs);
        $mon += 1;
        my $time = sprintf("%02d-%02d %02d:%02d:%02d",
                           $mon, $mday, $hour, $min, $sec);
        # One row in the table per TCP packet
        $sth->execute($time, $ip_obj->{src_ip}, $ip_obj->{dest_ip},
                      $ip_obj->{len}, $tcp_obj->{dest_port}, 'agslnx1');
    }
}
# closedir (not close), and only after the whole directory has been read;
# the original close(DIR) inside the loop shut the handle on the first pass
closedir(DIR);
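One way to handle the "don't add a file that has already been added" half of the question is to keep a small ledger file listing every capture that has already been loaded, and skip anything that appears in it on the next run. This is only a sketch in core Perl; the ledger filename and the two helper subs are made up for illustration, and the actual pcap reading and database insert are elided:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical ledger file recording which pcap files were already loaded
my $ledger = 'processed.txt';

# Read the ledger into a hash for quick lookups
sub load_processed {
    my ($path) = @_;
    my %seen;
    if (open my $fh, '<', $path) {
        while (my $line = <$fh>) {
            chomp $line;
            $seen{$line} = 1;
        }
        close $fh;
    }
    return \%seen;
}

# Record a filename once its packets are safely in the database
sub mark_processed {
    my ($path, $file) = @_;
    open my $fh, '>>', $path or die "Cannot append to $path: $!";
    print $fh "$file\n";
    close $fh;
}

my $seen = load_processed($ledger);
for my $file ('a.pcap', 'b.pcap', 'a.pcap') {
    next if $seen->{$file};   # already loaded on an earlier run, skip it
    # ... read the pcap and insert its rows here ...
    mark_processed($ledger, $file);
    $seen->{$file} = 1;
}
```

On the next run the script reads the ledger first, so files that were already loaded are skipped without being reopened, and nothing gets inserted twice.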
Maybe move the file in another directory after it has been processed ?
Code:
my $dir = 'C:/Documents and Settings/jordant/Desktop/Dump';
# dir to store already processed files
my $bdir = 'C:/Documents and Settings/jordant/Desktop/Backup';
...
...
rename "$dir/$file", "$bdir/$file";
}
closedir(DIR);
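One wrinkle with the rename approach: if the Backup directory already holds a file with the same name, a plain rename would silently replace it. A small guard avoids that. This is only a sketch in core Perl; the dump/backup directory names and the archive_file sub are placeholders for illustration, not the real Windows paths from the thread:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Copy qw(move);
use File::Spec;

# Placeholder directories standing in for the Dump and Backup folders
my $dir  = 'dump';
my $bdir = 'backup';
mkdir $dir  unless -d $dir;
mkdir $bdir unless -d $bdir;

# Move a processed capture into the backup directory, refusing to
# overwrite a file that is already archived there
sub archive_file {
    my ($from_dir, $to_dir, $file) = @_;
    my $src = File::Spec->catfile($from_dir, $file);
    my $dst = File::Spec->catfile($to_dir,  $file);
    if (-e $dst) {
        warn "$file already archived, leaving it alone\n";
        return 0;
    }
    move($src, $dst) or die "Could not move $file: $!";
    return 1;
}
```

Calling archive_file($dir, $bdir, $file) in place of the bare rename moves the file the first time and returns 0 on later attempts, so an already-archived copy is never clobbered.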
That would be one way, but is there a way I could run it so it wouldn't overwrite my old ones?
Quote:
Originally Posted by Cedrik
Maybe move the file in another directory after it has been processed ?
It doesn't really overwrite the file. This is what I mean when I say that: the information is getting put into a MySQL database table that I created. If that pcap file is already in the MySQL database, the script will still read the pcap file and put it into the MySQL table, so the same entry ends up in the database twice.
Yep, I was just thinking of it as a safety net, so if something happened to be in there twice it wouldn't duplicate the info. I did use what you posted, though, and for now it seems to be working wonderfully. Thank you so much.