Directory Traverse & Rename Script
I am running this script in Cygwin, but since I found the original here, I figured it would be OK to post it here.
I'm using this to extract the contents of a version control system folder dump into something usable. The files were previously gzipped (e.g. 1.1.gz), but that's been fixed. The way my modified script works now, it runs through all subdirectories and renames files with names like "1.1" to foldername.ext, which is perfect. It also has the bonus of overwriting a 1.1 file with a 1.2 file (which happens to be the newer version), which is also the behavior I want. What isn't perfect is when it does something I don't want, like this:
Code:
mv PATH\fileone.ext PATH.ext
fileone.ext will NEVER start with "1", and I'd like to stop the script from performing any commands on a file that begins with "1". Here's the directory structure:
Code:
C:\PATH
Code:
#!/bin/bash
-Wayne
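A minimal dry-run sketch of the behavior being described (this is a reconstruction, not the poster's actual script; the ",d" folder suffix is taken from the mv example later in the thread, and `list_renames` is a made-up name):

```shell
#!/bin/bash
# Sketch (not the original script): walk all subdirectories and print the
# rename for each version file like "1.1", moving it up one level under the
# name of its containing ",d" folder.  -name "1.*" restricts the loop to
# version files, so a file like fileone.ext is never touched.
list_renames() {
    local root="$1"
    # sort so 1.2 is listed after 1.1 (the later mv overwrites the earlier one)
    find "$root" -type f -name "1.*" | sort | while read -r file; do
        dir=$(dirname "$file")     # e.g. .../foo.ext,d
        # basename with a second argument strips the ",d" suffix
        echo mv "$file" "$(dirname "$dir")/$(basename "$dir" ,d)"
    done
}
```

Piping the `echo mv` output into bash (as done later in the thread) turns the dry run into the real thing.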
First you say that you want it to process files that start with a 1, but then you ask to stop it from touching them. To make find skip files whose names begin with "1":
Code:
find -type f -name "[!1]*"
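For reference, the `[!1]` glob class matches any single character other than "1", so the pattern selects names that do not begin with "1". A quick demonstration (the file names below are made up):

```shell
# Demo of the "[!1]*" glob used with find's -name test.
# [!1] matches any one character except "1", so of the three files created
# below only fileone.ext matches.
tmpdir=$(mktemp -d)
touch "$tmpdir/1.1" "$tmpdir/1.2" "$tmpdir/fileone.ext"
matches=$(find "$tmpdir" -type f -name "[!1]*" -exec basename {} \;)
echo "$matches"
```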
Cygwin will access the C:\ drive as "/cygdrive/c/". It would help if you posted a segment of the output of a find command so we can see what it looks like exactly. Then describe the "after" better with an example.

Also, it appears that FOLDERPATH is supposed to be replaced with the base directory to start the search; you are taking it literally. I would suggest adding a line at the top like "FOLDERPATH=/cygdrive/c/projects" and then changing the last line to "done < <(find "$FOLDERPATH" -type f)". That way the last line uses a variable which is defined before the loop starts.
Code:
echo mv "$file" "$dirname/$newname"
Is the index command a bash function, which you haven't posted? There is a C library function index() which returns a pointer to the first instance of a character in a string. You might consider writing your own function that checks for a (.[[:digit:]]+)+ pattern at the end of a string. Look at the "=~" conditional operator.
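Put together, those suggestions might look like the sketch below. The FOLDERPATH value is the example from the advice above, `process_tree` is a made-up name, and the ",d" suffix handling is an assumption based on the mv example later in the thread:

```shell
#!/bin/bash
# Sketch combining the suggestions above (not the poster's actual script).
# FOLDERPATH defaults to the example path from the advice; override it in
# the environment to point at a real tree.
FOLDERPATH=${FOLDERPATH:-/cygdrive/c/projects}

process_tree() {
    while read -r file; do
        dir=$(dirname "$file")
        newname=$(basename "$dir" ,d)   # assumption: version folders end in ",d"
        base=$(basename "$file")
        # "=~" tests against an extended regex: digits separated by dots,
        # i.e. version-style names such as 1.1 or 1.2.3
        if [[ $base =~ ^[[:digit:]]+(\.[[:digit:]]+)+$ ]]; then
            echo mv "$file" "$dir/$newname"
        fi
    done < <(find "$FOLDERPATH" -type f)
}
```

The process substitution `< <(find ...)` keeps the while loop in the current shell, so it is run by bash, not sh.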
I did get it functional, but ran into the ugly beast of spaces in the filenames. So I echoed all the commands into a file, which I piped into bash. That worked well. I ended up doing this in two parts, and the old version of my script was overwritten... anywho, 80,000 files later it ran into "filename too long" problems, but only with files in folders nested 30 deep. Here's what it eventually ended up like. I'm pretty damn sure this is NOT optimal with the grep and cut commands, but it did the job:
Code:
#!/bin/bash
Code:
mv "Base/Sub/Test File Name.ext,d/1.1" "Base/Sub/Test File Name.ext"
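For the spaces problem, a null-delimited find avoids the intermediate command file entirely. A space-safe sketch (a reconstruction, not the script above; `rename_dump` is a made-up name):

```shell
#!/bin/bash
# Space-safe variant: -print0 and "read -r -d ''" pass names containing
# spaces intact, so no echo-into-a-file step is needed.  sort -z (GNU
# coreutils, available in Cygwin) keeps 1.2 after 1.1 so the newer version
# overwrites the older one; note plain lexicographic order would mis-sort
# 1.10 before 1.9, which "sort -zV" would handle.
rename_dump() {
    local root="$1"
    while IFS= read -r -d '' file; do
        dir=$(dirname "$file")     # e.g. Base/Sub/Test File Name.ext,d
        mv -f -- "$file" "$(dirname "$dir")/$(basename "$dir" ,d)"
    done < <(find "$root" -type f -name "1.*" -print0 | sort -z)
}
```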