Originally Posted by pan64
I do not really understand what you need, but here are some tips:
for dir in /home/shanaka/test1/*/public_html; do
    if [ ! -e "$dir/robots.txt" ]; then
        ls -l "$dir" | mail -s "404: file not found" firstname.lastname@example.org
        cp "$file" "$dir"    # you may try to use link (ln) instead of copy...
    else
        result=$(diff "$file" "$dir/robots.txt")    # probably?
        # if [ ! -e 0 ] ### <<<< what is this ? [ -e 0 ] tests for a file literally named "0"
        if [ -n "$result" ]; then    # diff output is text, so test for non-empty rather than -gt
            echo "$dir is not same"
            cp "$file" "$dir"
        fi
    fi
done
Thank you very much, friends.
What I really want is to check for the robots.txt file in every web root. If it is there, check whether it disallows or not; if it is not there, I need to copy the robots.txt file to the web directories where it is not available.
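In other words, something like this sketch is what I am after (assuming the master copy sits at /home/shanaka/robots.txt; that path is only an example, and I am not sure yet whether a file without a Disallow line should be replaced too):

master=/home/shanaka/robots.txt
for dir in /home/shanaka/test1/*/public_html; do
    if [ ! -e "$dir/robots.txt" ]; then
        echo "$dir: robots.txt missing, copying it in"
        cp "$master" "$dir/robots.txt"
    elif ! grep -qi '^Disallow' "$dir/robots.txt"; then
        # file exists but has no Disallow rule
        echo "$dir: robots.txt has no Disallow rule"
        # cp "$master" "$dir/robots.txt"   # uncomment to replace these as well
    fi
done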
This is my result from the previous code:
/home/shanaka/test1/project1/public_html is not same
/home/shanaka/test1/project2/public_html is not same
/home/shanaka/test1/project3/public_html is not same
/home/shanaka/test1/project4/public_html is not same
/home/shanaka/test1/project5/public_html is not same
So here, only the file in the /home/shanaka/test1/project1/public_html directory is actually different, but it is showing all the directories as wrong.
Therefore I want to pick out only the different ones and copy the file to those directories.
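From what I have read, cmp -s (or diff -q) exits non-zero when two files differ, so the exit status can be tested directly instead of parsing diff's text output. Something like this sketch might do it (again assuming /home/shanaka/robots.txt is the master copy):

master=/home/shanaka/robots.txt
for dir in /home/shanaka/test1/*/public_html; do
    if ! cmp -s "$master" "$dir/robots.txt"; then
        # cmp -s is silent and exits non-zero when the files differ,
        # or when $dir/robots.txt does not exist at all
        echo "$dir is not same"
        cp "$master" "$dir/robots.txt"
    fi
done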