Linux - General
This Linux forum is for general Linux questions and discussion. If it is Linux related and doesn't seem to fit in any other forum, then this is the place.
I need a little assistance. I need this script to delete all files that are not *.uz files, including files starting with a dot, such as (but not limited to) .htaccess.
Code:
#!/bin/bash
SRC=/home/ftp/uploads/uz/
DEST=/home/uz/www/
set -e
cd $SRC || exit 1
ls -l -- *.uz | awk '{print $3, $4, $NF}' | while read user group file
do
    if [ ! -f $DEST/${file} ] ; then
    {
        mv -- ${file} $DEST/${file}
        chown ${user}:${group} $DEST/${file}
    }
    else
        rm -f -- $SRC/${file}
    fi
done
for i in $(ls | grep -v ".uz")
do
    rm -f -- "$i"
done
for i in $(ls | grep -v ".uz")
do
rm -f -- "$i"
done
I believe it does delete all files, but for some reason I think it's skipping the files that begin with a dot. Some bot, I believe, is gung-ho about logging into my anonymous FTP server and uploading a .htaccess file into every directory. It's not that big of a deal, since it's only an FTP server, so the file never gets moved to an HTTP environment, but it's annoying that I have to go in and manually delete them. I want the script to do it for me.
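For the record, a single find invocation can handle both regular files and dotfiles, because find does not rely on shell globbing the way *.uz does. A minimal sketch in a throwaway directory (the real path /home/ftp/uploads/uz/ is replaced with a temp directory here):

```shell
#!/bin/bash
# Sketch: remove every regular file that is not *.uz, including
# dotfiles such as .htaccess; find sees hidden files, unlike the
# shell glob used in the script above.
SRC=$(mktemp -d)    # stand-in for /home/ftp/uploads/uz/
touch "$SRC/keep.uz" "$SRC/drop.txt" "$SRC/.htaccess"

# -maxdepth 1 keeps it to the top level; ! -name '*.uz' selects
# everything that does not end in .uz, dotfiles included.
find "$SRC" -maxdepth 1 -type f ! -name '*.uz' -delete

ls -A "$SRC"    # only keep.uz remains
```

This sidesteps the dotfile problem entirely rather than patching the glob.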
If I were to edit the script to do that, this is what I would do. Is this correct?
Code:
#!/bin/bash
SRC=/home/ftp/uploads/uz/
DEST=/home/uz/www/
set -e
cd $SRC || exit 1
ls -l -- *.uz | awk '{print $3, $4, $NF}' | while read user group file
do
    if [ ! -f $DEST/${file} ] ; then
    {
        mv -- ${file} $DEST/${file}
        chown ${user}:${group} $DEST/${file}
    }
    else
        rm -f -- $SRC/${file}
    fi
done
for i in $(ls | grep -v ".uz")
do
    rm -f -- "$i"
done
rm -Rf -- *
I just think that if this script were accidentally run outside of the source directory (the root directory, for example), it could really destroy my server.
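One way to hedge against that risk is to refuse to run any destructive command unless the current directory really is the source directory. A sketch (the temp directory stands in for the real upload path):

```shell
#!/bin/bash
# Sketch: abort before any destructive command unless we are
# genuinely inside the intended source directory.
SRC=$(mktemp -d)              # placeholder for /home/ftp/uploads/uz/
SRC=$(cd "$SRC" && pwd -P)    # resolve symlinks once, up front

cd "$SRC" || exit 1

# Guard: compare the resolved current directory to $SRC before
# anything like rm -Rf -- * gets a chance to run.
if [ "$(pwd -P)" != "$SRC" ]; then
    echo "refusing to run outside $SRC" >&2
    exit 1
fi
guard=passed                  # destructive commands would go here
echo "guard $guard"
```

Together with the existing `cd $SRC || exit 1`, this makes a stray invocation from / fail closed instead of deleting everything in sight.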
Another problem I've been facing with this script: if any of the file names contain a space, it stops the script. It only reads the information after the space, so when it goes to move the file, it returns a "not found" error and stops the script.
What can I do, so that it will keep the file names, with spaces, together?
Well, I am using the find command towards the top of the script now. The last portion of the script ran just fine before, but since I made the change to use find instead of the previous ls command, it's telling me the last part of the script is ending too soon. Why?
There is something wrong with this section:
Code:
for i in $(ls | grep -v ".uz")
do
rm -f -- "$i"
done
I tried to change it to this, but I got the same error:
Code:
for i in $(find | grep -v ".uz")
do
rm -f -- "$i"
done
I don't see why you need to use find in that way, personally. I recommend (with the least change to how your code behaves) going back to your original ls version and making it look like this:
Code:
for i in $(ls -1 | grep -v ".uz"); do
Regards,
Alunduil
Well, I have the do on the next line, so I added the ; at the end of that line and then tested the script again. Still getting the same error:
./move_test.sh: line 22: unexpected EOF while looking for matching `"'
I then removed the do that was on the next line and changed the line to look like the one you suggested. Still the same error.
What am I doing wrong? I get this error:
./move_test.sh: line 22: unexpected EOF while looking for matching `"'
The transposed } and " in mv -- "${file}" $DEST/"${file"} are wrong. BTW, none of the { and } are actually required; they are not wrong, but they are not necessary either.
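With the quotes placed correctly, a name containing spaces survives the move intact. A minimal sketch in throwaway directories:

```shell
#!/bin/bash
# Sketch: correctly quoted mv keeps a file name with spaces whole.
src=$(mktemp -d)
dest=$(mktemp -d)
file="name with spaces.uz"
touch "$src/$file"

cd "$src" || exit 1
mv -- "${file}" "$dest/${file}"    # quote each expansion; } stays inside the quotes

ls "$dest"    # shows: name with spaces.uz
```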
What can I do, so that it will keep the file names, with spaces, together?
This will cope with spaces and any other possible character in file names
Code:
while IFS= read -r -d '' file
do
commands using "$file"
done < <(find . -type f -iname '*.uz' -print0)
EDIT: when testing a script that may cause data loss, it is prudent to put echo in front of the mv, rm, or whatever commands at first, so you can see that the script is generating the correct commands. When ready to run for real, remove the echo.
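As a concrete version of that dry-run advice, here is the null-delimited loop from above with echo in front of the destructive command; it prints what it would do without deleting anything (temp directory used as a stand-in):

```shell
#!/bin/bash
# Sketch: the null-delimited find/read loop, run in dry-run mode.
# echo prints each would-be rm; drop the echo to run for real.
dir=$(mktemp -d)
touch "$dir/a.uz" "$dir/b c.uz"

while IFS= read -r -d '' file
do
    echo rm -f -- "$file"    # dry run: prints the command, touches nothing
done < <(find "$dir" -type f -iname '*.uz' -print0)
```

Both files, including the one with a space, are listed one per command, and both still exist afterwards.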
Code:
#!/bin/bash
SRC=/home/uploadu/uploads/redirect/
DEST=/home/uzutfi/public_html/
set -e
cd $SRC || exit 1
find -iname "*.uz" | while read file
do
    if [ ! -f $DEST/"${file}" ] ; then
    {
        mv -- "${file}" "$DEST/${file}"
        chown ${user}:${group} "$DEST/${file}"
    }
    else
        rm -f -- "$SRC/${file}"
    fi
done
for i in $(ls -l | grep -v "*.uz");
do
    rm -f -- "$i"
done
PS
I have a backup of all the data this script was moving around and deleting, so nothing could have been lost; these are just real-life problems the script has run into with users uploading custom files with weird, non-standard names.
So far it has solved a bunch of issues.
File names that begin with - I have fixed with the -- option, and file names with spaces I have solved with the find command. I also had a problem with files that contained uz in their names: they were ignored by the last rm command, so files with .uz2 extensions were skipped. I got that solved as well.
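The .uz2 problem comes from grep treating ".uz" as an unanchored regular expression: the dot matches any character and the match may sit anywhere in the name, so file.uz2 also matches and gets spared by -v. One way to pin it down is to escape the dot and anchor at the end of the line. A sketch:

```shell
#!/bin/bash
# Sketch: unanchored grep -v ".uz" also spares names like extra.uz2,
# since "." matches any character and the match may sit anywhere.
names=$(mktemp)
printf '%s\n' good.uz extra.uz2 plain.txt > "$names"

# Anchored, escaped pattern: only names literally ending in .uz
# survive -v, so the delete list is exactly the non-.uz files.
grep -v '\.uz$' "$names"    # prints extra.uz2 and plain.txt
```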
Any other suggestions, specifically security issues that may be encountered?
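One detail worth flagging in the script above: the find | while read loop only populates file, so the ${user} and ${group} in the chown line are unset there (the earlier awk version filled them from ls -l output). A hedged sketch of recovering them per file with stat instead (assumes GNU coreutils):

```shell
#!/bin/bash
# Sketch: derive owner and group per file with stat instead of
# relying on variables the read loop never sets.
dir=$(mktemp -d)
touch "$dir/sample.uz"

file="$dir/sample.uz"
user=$(stat -c '%U' "$file")     # owning user name
group=$(stat -c '%G' "$file")    # owning group name
echo "$user:$group"
```

Inside the loop, those two stat calls per file would give chown real values to work with.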
Would removing the { and } brackets make the script run faster? Currently the script is executed every 5 minutes.
Well, I'm not worried about it being as fast as possible; I'm just wondering if using the { } in those areas makes this particular script slower. If so, I can easily remove them; if they won't make any difference, then the script I'm currently using is plenty fast enough. Usually a client can only upload ~10 files every 5 minutes, depending on the connection speed. The only time it has to move a bunch of files is when the script breaks, though I think I have solved all those issues now. I had to work with ~2000 files just now, and it only took a couple of minutes to complete them all, even though I ran the script four or five times with the same files to ensure it was working correctly.