LinuxQuestions.org

-   Programming (https://www.linuxquestions.org/questions/programming-9/)
-   -   sed replace string (https://www.linuxquestions.org/questions/programming-9/sed-replace-string-559325/)

octeto 06-05-2007 09:32 AM

sed replace string
 
Hi all

I am a very novice Unix user. I help manage a web hosting server that was recently infected by a worm or some malware that inserted several lines into almost all index files (htm, html, php, and so on) across our domains. In most cases these strings have no visible effect and don't appear on the affected pages, but for obvious reasons we would like to clean them out.
In another forum I got the following suggestion:

QUOTE:
If the infected pages have some links attached and all of them are identical (i.e. something like:

URL(somedomain. com. index .htm)
(spaced since I can't post URLs)
then you can use standard *nix tools like find, grep or sed and replace this code with something neutral. Just search Google for "sed delete last string", "sed replace string" or similar.
END OF QUOTE.

I have never used this sed command and am not sure whether it can help here.

Plainly put, the question is whether there is a command to perform this cleanup, and also whether you can suggest a way to find the worm or malware that produced it (cleaning manually doesn't work, because the strings reappear after a while).



Thanks,

any help would be highly appreciated.

regards
Eduardo

taylor_venable 06-05-2007 10:18 AM

How about:
Code:

for x in `find /var/www/htdocs -name 'index.*'`; do
  echo "Working on file: $x"
  sed -i.bak 's|domain.tld/index.htm|mysite.tld/index.html|g' "$x"
done

This changes the text "domain.tld/index.htm" to "mysite.tld/index.html" in all files with names starting with "index." located beneath /var/www/htdocs. The originals are kept in files with the same name plus a .bak extension.
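A recursion-safe sketch of the same idea, using find's -exec so filenames with spaces survive intact (this assumes GNU sed for -i.bak; the small demo tree below stands in for the real docroot and the domain names are the example ones from the post):

```shell
# Build a tiny demo tree standing in for /var/www/htdocs.
mkdir -p demo/site1 demo/site2
printf '<a href="http://domain.tld/index.htm">link</a>\n' > demo/site1/index.html
printf 'clean page\n' > demo/site2/index.php

# -type f skips directories; -exec ... {} + passes filenames directly,
# so names with spaces are fine. GNU sed's -i.bak edits in place and
# leaves a .bak backup next to each file it processes.
find demo -type f -name 'index.*' \
  -exec sed -i.bak 's|domain.tld/index.htm|mysite.tld/index.html|g' {} +

grep -r 'mysite.tld' demo   # verify the replacement landed
```

The -exec form avoids the word-splitting that happens when a backtick-driven for loop expands find's output.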

octeto 06-05-2007 01:31 PM

Thanks for the help!

this is working:

#!/bin/sh
for file in /root/abc123a/public_html/coppermine/*
do
sed 's/taubetapi.org/hacked/g' "$file" > "$file.new"
mv "$file.new" "$file"
done
exit 0


The problem is that the 'for file ...' line raises warnings for folders; we need a recursive command that walks all the folders without having to list them separately.
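A recursive variant of the script above can be sketched with find piped into a while-read loop (the path and search string are the ones from this post; adjust to taste):

```shell
#!/bin/sh
# Sketch of a recursive version: find -type f visits regular files in every
# subdirectory, so directories never reach sed; reading line by line keeps
# filenames containing spaces intact, unlike an unquoted for loop.
find /root/abc123a/public_html/coppermine -type f |
while IFS= read -r file; do
  sed 's/taubetapi.org/hacked/g' "$file" > "$file.new" && mv "$file.new" "$file"
done
```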

chrism01 06-06-2007 12:21 AM

The 'find' cmd is recursive.
If the problem keeps recurring, you need to fix that first, even if it means a re-install.
Check the Security forum for tips, e.g. the chkrootkit and rkhunter tools, also Tripwire.
You really need to make sure the systems are up to date and keep them that way.

syg00 06-06-2007 02:09 AM

Might as well keep all the help in Brisbane ...
Try (watch the backticks):
Code:

for file in ` find /root/abc123a/public_html/coppermine/ -iname "*" `
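A variant of the one-liner above that sidesteps the backtick word-splitting pitfall and only touches files that actually contain the injected string (a sketch: the path and domain are the ones quoted earlier in the thread, and -i.bak assumes GNU sed):

```shell
# grep -rl lists only the files containing the string; the while-read loop
# handles filenames with spaces, which backtick substitution would split.
# GNU sed's -i.bak edits in place and keeps a .bak backup of each file.
grep -rl 'taubetapi.org' /root/abc123a/public_html/coppermine |
while IFS= read -r file; do
  sed -i.bak 's/taubetapi.org/hacked/g' "$file"
done
```

Filtering with grep first also means clean files never get rewritten or backed up.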


All times are GMT -5. The time now is 07:40 PM.