[SOLVED] shorten long paths in deeply nested directory structure
I would like to shorten some long paths in a deeply nested directory structure. The directory structure itself starts several levels down from root.
So let's say I have a directory structure with path depths varying from 1 to 15 (counting the top of the structure as level 1).
I want to find all paths that reach a depth of at least 5,
then snip each path at level 5 and move it (along with its subdirectories) to level 2.
The net result of all this will have been to reduce the maximum depth by 3 (i.e., from level 15 down to 12).
I don't want to completely flatten the directory structure -- that's more flattening than I need. I just need to flatten it a little so that it doesn't cause errors in certain operating systems doing certain procedures.
I'd also like to include echoing each step to the standard output to make sure I'm not doing this on the entire file system.
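Untested sketch of what that could look like, assuming plain POSIX sh plus find. "flatten_tree" is a made-up name, and the depth math assumes the top of the structure is level 1 (so level-5 directories are at find depth 4). Point it at the real top of the structure, never at /:

```shell
#!/bin/sh
# Sketch, not battle-tested: flatten_tree ROOT moves every directory found at
# level 5 of the tree rooted at ROOT (ROOT itself = level 1, i.e. find depth 4)
# directly under ROOT (level 2), subtree and all, echoing each move first.
# Name clashes get an incrementing numeric suffix.
flatten_tree() {
    root=$1
    list=$(mktemp)
    # Snapshot the deep directories first so the moves don't disturb find.
    find "$root" -mindepth 4 -maxdepth 4 -type d -print > "$list"
    while IFS= read -r dir; do
        base=$(basename "$dir")
        dest="$root/$base"
        n=1
        while [ -e "$dest" ]; do      # name clash: rename with a number
            dest="$root/$base.$n"
            n=$((n + 1))
        done
        echo "mv '$dir' -> '$dest'"   # echo each step to standard output
        mv "$dir" "$dest"
    done < "$list"
    rm -f "$list"
}
```

Comment out the `mv` line first to get a pure dry run from the echo output, and check that every printed source path lives under the directory you meant before letting it actually move anything.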
Or -- lets talk about the "errors in certain operating systems doing certain procedures" -- perhaps flattening the directory structure is not the best solution to the problem.
So if you have a directory /path/to/some/of/my/really/cool/computer/files you want to move of/my/really/cool/computer/files to /path/to/cool/computer/files? What if there's a directory with that name already there? Merge them, or rename it by adding 1, 2, 3 etc.?
Great idea .. how do I translate that into a solution?
BTW, my printout of the directory tree suggests nesting up to 20 levels deep.
I'd probly wanna flatten it to maybe 10 levels -- even if it's an OS problem I can circumvent.
Which procedures of which OS might be problematic with deeply nested directories?
Thanks much for your help.
Eric
I'm basically suggesting that the solution to your problems might not be to flatten the directory structure, but lie elsewhere. I could be wrong, of course, or you may not care to investigate another solution.
If you do, please explain what problems you are having that you're attributing to the depth of your directory structure. That is: Which procedures are causing errors because of the long path names? What are you trying to do that's not working because of long path names?
If there's a directory with that name already there, rename with incrementing number.
Quote:
Originally Posted by Laserbeak
So if you have a directory /path/to/some/of/my/really/cool/computer/files you want to move of/my/really/cool/computer/files to /path/to/cool/computer/files? What if there's a directory with that name already there? Merge them, or rename it by adding 1, 2, 3 etc.?
Good point .. Probly rename with incrementing number.
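For what it's worth, the "rename with incrementing number" part is easy to script. A hedged sketch ("next_free" is a made-up helper name):

```shell
#!/bin/sh
# Print the first variant of the given path that doesn't exist yet:
# the path itself, else path.1, path.2, ...
next_free() {
    dest=$1
    n=1
    while [ -e "$dest" ]; do
        dest="$1.$n"
        n=$((n + 1))
    done
    printf '%s\n' "$dest"
}
```

Then something like `mv "$dir" "$(next_free "/target/$(basename "$dir")")"` moves without clobbering an existing directory (the /target path is a placeholder).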
what problems you are having that you're attributing to the depth of your directory structure
Quote:
Originally Posted by scasey
I'm basically suggesting that the solution to your problems might not be to flatten the directory structure, but lie elsewhere. I could be wrong, of course, or you may not care to investigate another solution.
If you do, please explain what problems you are having that you're attributing to the depth of your directory structure. That is: Which procedures are causing errors because of the long path names? What are you trying to do that's not working because of long path names?
I see your point .. One thing is copying for the sake of backup; another is searching for files by content.
But even more, I just never know what I'm going to try to do with the files in future -- or on which system (I use mac/win/linux/android). So I just kinda wanna know that whatever I try in future will go without a hitch -- at least won't fail because of excessive nesting depth.
BTW -- the whole thing right now is on exfat file system hooked up to mac (easy to reconnect to linux, though)
Thank you.
Eric
Ahh yes. I have experienced those kinds of issues, but the problem has been the number of files the command tried to return, not the depth of the file structure. Flattening the structure won't reduce the number of files. Can you show us what happens?
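For reference: when the number of files is the problem, it usually surfaces as "Argument list too long" -- a shell glob expanding to more names than the kernel's argument-size limit allows. Letting find hand the names to the command sidesteps the limit entirely (paths here are placeholders; cp's -t option is GNU coreutils):

```shell
# A glob like  cp /path/to/tree/* /backup/  expands in the shell and can
# overflow the kernel's argument-size limit (ARG_MAX) on huge directories.
# find -exec ... {} + batches the arguments itself, so it works at any scale:
find /path/to/tree -type f -exec cp -t /backup/ {} +
```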
problem is number of files -- not depth of nesting
Quote:
Originally Posted by scasey
Ahh yes. I have experienced those kinds of issues, but the problem has been the number of files the command tried to return, not the depth of the file structure. Flattening the structure won't reduce the number of files. Can you show us what happens?
I'm going to try to remember or duplicate what happened & post it then .... For now, I seem to recall copying all files from a failing drive, but all files didn't copy, & I believe nesting may have been the issue. I'd like to toss the old drive, but not till all files have copied.
For now, I just have a general memory of something going wrong when data was deeply nested -- but not when i reduced the nesting.
I don't recall how many files there were in either case.
I'd be curious if there's a max number of files one can use in one operation on mac/linux/win etc .... If there is such limit, it would be a good thing to know.
I've heard of a limit on path length, i.e., number of characters. But I believe the old ~260-character MAX_PATH limit can be lifted on modern Windows, and on mac/linux the limit (PATH_MAX) is much larger -- on the order of 1024/4096 characters -- so it rarely bites there.
e.g. create a directory at level 5 called 'some-links' which holds symlinks pointing to the deeply nested "parent" directories.
It will ease your navigation through loads of files.
For example:-
Code:
mkdir ~/Desktop/Links
ls -l ~/Desktop/Links
lrwxrwxrwx 1 jeremy jeremy 18 Nov 26 2015 Link to avi -> /mnt/filestore/avi
lrwxrwxrwx 1 jeremy jeremy 24 Nov 26 2015 Link to documents -> /mnt/filestore/documents
lrwxrwxrwx 1 jeremy jeremy 37 Nov 26 2015 Link to Manuals -> /mnt/filestore/books/Computer Manuals
lrwxrwxrwx 1 jeremy jeremy 23 Nov 26 2015 Link to pictures -> /mnt/filestore/pictures
lrwxrwxrwx 1 jeremy jeremy 10 Nov 26 2015 Link to video -> /mnt/video
drwxr-xr-x 2 jeremy jeremy 4096 Feb 6 16:54 PC-problems
then,
Code:
ls -l ~/Desktop/Links/'Link to video/'
total 54980
drwxrwxr-x 28 nobody public 4096 Mar 25 23:35 iplayer
-r--r--r-- 1 jeremy public 45196229 May 10 2013 MVI_0357.mp4
drwxrwxr-x 10 nobody public 4096 Jul 3 00:27 radio
-r--r--r-- 1 jeremy public 1930185 May 10 2013 Vid 001.mp4
-r--r--r-- 1 jeremy public 1680817 May 10 2013 Vid 002.mp4
-r--r--r-- 1 jeremy public 2798262 May 10 2013 Vid 003.mp4
-r--r--r-- 1 jeremy public 2097162 May 10 2013 Vid 004.mp4
-r--r--r-- 1 jeremy public 2565052 May 10 2013 Vid 006.mp4
drwxrwxr-x 10 nobody public 4096 Jul 20 16:20 video
One of the problems for backups is that it is easier to take a backup than to do a proper restore!
rsync has quite a complex set of possible parameters...
I use backup2l which produces daily tar.gz files, so in a dire emergency, I could do a "manual" restore; even if I'd lost all details of the backup parameters.