Question about grepping something
Hello all,
I am trying to split up an XML file, and I want to use the grep command to do so. My problem is that when I run

grep -n "</bf:log>" ev_14567-20050215_0956.xml

I get output like this:

2404628:</bf:log>

Because I want to do some calculations in a script with that line number (for using head and tail), I need the output to be just this:

2404628

So actually I just want to chop off the ":</bf:log>" piece (and be able to use 2404628 as a numerical value in a script). I have looked in the forums, and I saw that the awk command was advised, but after trying for a while I cannot get the output the way I want it. Can someone please give me a hand? Thank you very much. |
grep blah | cut -f 1 -d:
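A minimal sketch of the whole pipeline, using a throwaway two-line file (the file name and contents here are made up for illustration):

```shell
# Create a small sample file whose second line contains the closing tag.
printf '<a>\n</bf:log>\n' > sample.xml

# grep -n prints "linenumber:match"; cut keeps the part before the first colon.
line=$(grep -n '</bf:log>' sample.xml | cut -f 1 -d:)
echo "$line"

# The result is usable as a number, e.g. to take everything before the tag:
head -n "$((line - 1))" sample.xml
```

Note that if the tag occurs more than once, grep -n prints one line per match, so you may want `grep -n -m 1` (GNU grep) or `| head -n 1` to keep only the first line number.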
|
Thanks a lot! It's working!! (Of course it is, you'll think :D )
|
;)
Doh!! even simpler.... grep -l (minus ell) |
I hope you can help me with the following as well, if I may ask.
I have a list of XML files that need to be parsed by a Java parser (from selectbf). When the files are too big it no longer functions, so I decided to write a "tiny", "easy" bash script that would split up the XML files and make sure all the tags stay correct. On paper it looks pretty good; I've got it all worked out. I only need a piece of code (:D )

My current problem is that I have 3 XML files in a directory:

-rw-r--r-- 1 grave grave 129800403 Mar 1 18:19 ev_14567-20050215_0956.xml
-rw-r--r-- 1 grave grave   4305438 Mar 1 18:19 ev_14567-20050218_2158.xml
-rw-r--r-- 1 grave grave      2109 Mar 1 18:19 voorbeeld_part1.xml

The criterion for a file to be split up into smaller ones is 30 MB. So if it is bigger than 30 MB it needs to be copied to a special directory where the splitting process will take place. However, I can't get my loop in bash working correctly, since it moves all the XML files instead of just the 129 MB one. Here is my code:

#!/bin/bash
#This script will (try to) split xml-logfiles from BF1942 into decent, usable ones of 30 mb in size.
#detection of how many xml-files we have that are bigger than 30 MB cause they need to be split up
#how many xmlfiles are in the directory to start with ?

#First we set all variables to zero, just making sure ;)
number_of_xmlfiles=0
filesize_kb=0
filesize_lines=0
filesize_limit_kb=30000
filename_bigxml=0
filename_anyfile=0
path_to_all_xmlfiles="/home/grave/download/logs/xmltest/"
dest_path_big_xmlfiles="/home/grave/download/logs/xmltest/test2"
ftp_url="ftpserver"
ftp_pass="password"
ftp_user="username"
i=0

#here is some room for the ftp-procedure
wget ftp://username:password@ftpserver_ur...942/logs/*.xml

#lets make a textfile that contains all the xmlfiles, big and small, just to see how many we got
listing=`ls /home/grave/download/logs/xmltest/*.xml`
echo "$listing">listing.txt
#and echo it out as a check; we can remove or comment this later, right now it is useful for error handling
echo "$listing"

#how many xml files do we have exactly
number_of_xmlfiles=`wc -l listing.txt|cut -f1 -dl`
echo "$number_of_xmlfiles"

# Next we're gonna copy all xml files bigger than 30 mb to a separate (parsing) directory
# However we need to define the xml file that the script is currently looking at :
while [ "$i" -le "$number_of_xmlfiles" ] && [ "$filesize_kb" -gt "$filesize_limit_kb" ]; do
    current_filename=`head -n "$i" listing.txt`   # RIGHT HERE IT GOES WRONG
    echo "$current_filename"
    echo "$i"
    echo "$number_of_xmlfiles"
    echo "$filesize_kb"
    echo "$filesize_limit_kb"
    mv "$path_to_all_xmlfiles"/*.xml "$dest_path_big_xmlfiles"/
    echo "this seems to be going well"
    i=$(($i+1))
done

#filesize_lines=`wc -l |cut -f1 -de`   #this is for later on, neglect it for now
echo $filesize
exit

I have tried to comment it as well as possible, and the programming might be just a 'bit' crappy, but please, if someone can tell me what is wrong with my loop, I'm all ears. The splitting of the files is something I will have to deal with later on; I try to take one step at a time. If someone has time, please let me know what I'm doing wrong here. |
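For reference, a minimal sketch of just the selection step, testing each file's size individually instead of moving the whole *.xml glob in one mv. The use of `du -k` to fill in the per-file size is an assumption on my part; the original script never assigns filesize_kb at all, and the directory paths are taken from the post:

```shell
#!/bin/bash
# Hypothetical rewrite of the selection step only (not the splitting).
src="/home/grave/download/logs/xmltest"
dest="/home/grave/download/logs/xmltest/test2"
limit_kb=30000

for f in "$src"/*.xml; do
    size_kb=$(du -k "$f" | cut -f 1)   # size of this one file, in KB
    if [ "$size_kb" -gt "$limit_kb" ]; then
        mv "$f" "$dest"/               # move only the file that is too big
    fi
done
```

Looping over the glob directly also sidesteps the listing.txt/head bookkeeping entirely.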
I didn't want to try and debug the loop :)
How about this?

find -size +30000k -exec mv {} /home/grave/download/logs/xmltest/test2/. \; |
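One caveat with the one-liner: as written it searches from the current directory and moves any file over 30 MB, not just the XML logs, and it will recurse into subdirectories (including the destination). A variant with an explicit start path, a name filter, and -maxdepth (GNU find assumed) would be:

```shell
# Only *.xml files over 30 MB, only in the top-level log directory.
find /home/grave/download/logs/xmltest -maxdepth 1 -name '*.xml' -size +30000k \
     -exec mv {} /home/grave/download/logs/xmltest/test2/ \;
```

Here +30000k means "more than 30000 kilobytes", matching the filesize_limit_kb=30000 from the script.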
:cry: I just found it :newbie:
Guess I've been staring at it for too long. |