Quote:
Originally Posted by ericlindellnyc
I would like to shorten some long paths in a deeply nested directory structure. The directory structure itself starts several levels down from root.
So let's say I have a directory structure with path depths varying from 1 to 15 (counting the directory structure as level 1).
I want to find all paths that go to depth of at least 5.
& then snip each path at level 5 and move it (along with its subdirectories) to level 2.
The net result of all this will have been to reduce maxdepth by 3 (i.e., from level 15 down to 12).
I don't want to flatten the directory structure -- that's more flattening than I need. I just need to flatten it a little so that it doesn't create errors in certain operating systems doing certain procedures.
I'd also like to include echoing each step to the standard output to make sure I'm not doing this on the entire file system.
Is there a way to do this?
Thanks.
EVL
|
I believe the "solution" I used at the time of the original post, in 2017, was to move any directory at
-mindepth 15 -maxdepth 15
or something similar. But this posed 2 problems . .
1) It didn't check for duplicate names before moving to target directory.
Recently (5 years after OP) I learned how to append incrementing numbers in python.
2) It didn't determine a file's nesting depth. I could use find's awkward -maxdepth & -mindepth from bash, guessing at the depth and then seeing what happened, but that flattened only part of a huge directory.
More recently, I've researched how to find path length in python. This guy on reddit/learnpython describes my experience . .
Quote:
I tried googling "Python path length" but all the results are about Python max path length or stuff about physics.
Then I tried "Python get path size" but again everything was about how to get the total size of a directory in bytes.
|
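For what it's worth, counting path components is straightforward with pathlib; a sketch (the name path_depth is mine):

```python
from pathlib import PurePosixPath

def path_depth(path):
    """Number of components in a path, e.g. /a/b/c/FN2.ext -> 4."""
    parts = PurePosixPath(path).parts
    # Drop the leading "/" anchor so only real components are counted.
    return len([p for p in parts if p != "/"])
```

Note this counts the file itself as a component; subtract 1 if you only want the directory depth.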
Suppose I have a directory structure that includes /a/b/c/d/e/FN1.ext -- and it also includes several other files and subdirectories with a path like /a/b/c/FN2.ext -- but no longer than that. In that case, the first one, /a/b/c/d/e/FN1.ext, would be the longest -- of path length 5. (Or 6 if you count the file.)
I'd like this output . .
Code:
longest path in directory structure is 5
Knowing that, I can then "prune the tree" by moving all files with a path length greater than Maxdepth to a target directory, appending incrementing number.
This may make the directory structure easier to search, back up, remove duplicates, and transfer to other file systems.
There's also a variation where the number returned is not the integer number of components in the directory path -- but rather the number of characters (alphanumeric, punctuation, delimiters) in the entire path. So the character count for
is 13.
Then I can stay within path-length limits such as Linux's PATH_MAX of 4096 characters (Windows traditionally caps paths at 260 characters unless long paths are enabled).
I can use my script to move empty folders to a target folder. But instead of testing if folder is empty, I'd test for whether its length exceeds some limit.
Code:
import os
import shutil

counter = 0
for root, dirs, files in os.walk("."):
    for dirName in dirs:
        dirPath = os.path.join(root, dirName)
        # if len(os.listdir(dirPath)) == 0:  [THIS CONDITIONAL TO BE REPLACED WITH A TEST OF PATH LENGTH]
        counter += 1
        incrDN = dirName + '_' + str(counter)
        shutil.move(dirPath, "/Volumes/TM3gbBu/destDirsMTsPy/" + incrDN)