LinuxQuestions.org > Linux - Newbie > shorten long paths in deeply nested directory structure
(https://www.linuxquestions.org/questions/linux-newbie-8/shorten-long-paths-in-deeply-nested-directory-structure-4175608661/)

ericlindellnyc 06-26-2017 02:17 PM

shorten long paths in deeply nested directory structure
 
I would like to shorten some long paths in a deeply nested directory structure. The directory structure itself starts several levels down from root.

So let's say I have a directory structure with path depths varying from 1 to 15 (counting the top of the directory structure as level 1).

I want to find all paths that go to a depth of at least 5,

then snip each path at level 5 and move it (along with its subdirectories) to level 2.

The net result will be to reduce the maximum depth by 3 (i.e., from level 15 down to 12).

I don't want to completely flatten the directory structure -- that's more flattening than I need. I just need to flatten it a little so that it doesn't cause errors in certain operating systems during certain procedures.

I'd also like each step echoed to standard output, to make sure I'm not doing this to the entire file system.
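
To make that concrete, here's the kind of untested dry-run sketch I'm imagining (all paths hypothetical; every step is echoed and nothing actually moves):

Code:

#!/bin/bash
# Untested dry run: echo every move instead of performing it.
TOP=/path/to/tree    # level 1 of the structure (hypothetical path)

# Directories 4 levels below TOP are at level 5 in my counting.
find "$TOP" -mindepth 4 -maxdepth 4 -type d | while IFS= read -r dir; do
    dest="$TOP/$(basename "$dir")"    # a direct child of TOP is level 2
    echo mv "$dir" "$dest"            # remove 'echo' only after checking the output
done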

Is there a way to do this?
Thanks.
EVL

Laserbeak 06-26-2017 02:23 PM

Is a Perl program acceptable to you?

scasey 06-27-2017 10:49 AM

Or -- let's talk about the "errors in certain operating systems doing certain procedures" -- perhaps flattening the directory structure is not the best solution to the problem.

ericlindellnyc 07-01-2017 03:36 PM

Perl Program to shorten directory paths
 
Quote:

Originally Posted by Laserbeak (Post 5727454)
Is a Perl program acceptable to you?

Yes, absolutely, as long as I know how to execute it. I'm not sure I've ever run a Perl program.
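
(For reference, running a Perl script from a shell generally looks like this -- the script name here is just a placeholder:)

Code:

perl flatten.pl      # run it through the interpreter directly
# or, if the script begins with a #!/usr/bin/perl line:
chmod +x flatten.pl
./flatten.pl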

ericlindellnyc 07-01-2017 03:40 PM

flatten directory structure vs errors in OS
 
Quote:

Originally Posted by scasey (Post 5727873)
Or -- let's talk about the "errors in certain operating systems doing certain procedures" -- perhaps flattening the directory structure is not the best solution to the problem.

Great idea .. how do I translate that into a solution?

BTW, my printout of the directory tree suggests nesting up to 20 levels deep.

I'd probably want to flatten it to maybe 10 levels -- even if it's an OS problem I can circumvent.

Which procedures of which OS might be problematic with deeply nested directories?

Thanks much for your help.

Eric

Laserbeak 07-01-2017 03:53 PM

So if you have a directory /path/to/some/of/my/really/cool/computer/files you want to move of/my/really/cool/computer/files to /path/to/cool/computer/files? What if there's a directory with that name already there? Merge them, or rename it by adding 1, 2, 3 etc.?

scasey 07-01-2017 03:54 PM

Quote:

Originally Posted by ericlindellnyc (Post 5729557)
Great idea .. how do I translate that into a solution?

BTW, my printout of the directory tree suggests nesting up to 20 levels deep.

I'd probably want to flatten it to maybe 10 levels -- even if it's an OS problem I can circumvent.

Which procedures of which OS might be problematic with deeply nested directories?

Thanks much for your help.

Eric

I'm basically suggesting that the solution to your problems might not be to flatten the directory structure, but may lie elsewhere. I could be wrong, of course, or you may not care to investigate another solution.

If you do, please explain what problems you're attributing to the depth of your directory structure. That is: which procedures are causing errors because of the long path names? What are you trying to do that's not working because of long path names?

If not, please excuse me...

ericlindellnyc 07-20-2017 05:10 PM

If there's a directory with that name already there, rename with incrementing number.
 
Quote:

Originally Posted by Laserbeak (Post 5729560)
So if you have a directory /path/to/some/of/my/really/cool/computer/files you want to move of/my/really/cool/computer/files to /path/to/cool/computer/files? What if there's a directory with that name already there? Merge them, or rename it by adding 1, 2, 3 etc.?

Good point... probably rename with an incrementing number.
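
Something like this untested shell sketch is the scheme I mean (the path is the one from your example):

Code:

dest=/path/to/cool/computer/files    # proposed target
# If the target already exists, append the first free numeric suffix.
if [ -e "$dest" ]; then
    n=1
    while [ -e "$dest.$n" ]; do
        n=$((n + 1))
    done
    dest="$dest.$n"
fi
echo "would move to: $dest"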

Thanks.

Eric

ericlindellnyc 07-20-2017 05:16 PM

what problems you are having that you're attributing to the depth of your directory structure
 
Quote:

Originally Posted by scasey (Post 5729562)
I'm basically suggesting that the solution to your problems might not be to flatten the directory structure, but may lie elsewhere. I could be wrong, of course, or you may not care to investigate another solution.

If you do, please explain what problems you're attributing to the depth of your directory structure. That is: which procedures are causing errors because of the long path names? What are you trying to do that's not working because of long path names?


I see your point. One thing is copying for the sake of backup; another is searching for files by content.

But more than that, I never know what I'm going to try to do with the files in the future -- or on which system (I use Mac/Windows/Linux/Android). So I just want to know that whatever I try in the future will go without a hitch -- at least not fail because of excessive nesting depth.

BTW, the whole thing right now is on an exFAT file system hooked up to a Mac (easy to reconnect to Linux, though).

Thank you.

Eric

scasey 07-20-2017 08:08 PM

Quote:

Originally Posted by ericlindellnyc (Post 5737795)
I see your point. One thing is copying for the sake of backup; another is searching for files by content.

But more than that, I never know what I'm going to try to do with the files in the future -- or on which system (I use Mac/Windows/Linux/Android). So I just want to know that whatever I try in the future will go without a hitch -- at least not fail because of excessive nesting depth.

BTW, the whole thing right now is on an exFAT file system hooked up to a Mac (easy to reconnect to Linux, though).

Thank you.

Eric

Ahh yes. I have experienced those kinds of issues, but the problem has been the number of files the command tried to return, not the depth of the file structure. Flattening the structure won't reduce the number of files. Can you show us what happens?

ericlindellnyc 07-21-2017 01:44 PM

problem is number of files -- not depth of nesting
 
Quote:

Originally Posted by scasey (Post 5737840)
Ahh yes. I have experienced those kinds of issues, but the problem has been the number of files the command tried to return, not the depth of the file structure. Flattening the structure won't reduce the number of files. Can you show us what happens?

I'm going to try to remember or duplicate what happened and post it then. For now, I seem to recall copying all the files from a failing drive, but not all of them copied, and I believe nesting may have been the issue. I'd like to toss the old drive, but not until all the files have copied.

For now, I just have a general memory of something going wrong when data was deeply nested -- but not when I reduced the nesting.
I don't recall how many files there were in either case.

I'd be curious whether there's a maximum number of files one can use in one operation on Mac/Linux/Windows, etc. If there is such a limit, it would be a good thing to know.

I've heard of a limit on path length, i.e., number of characters. But the old 260-character limit has been lifted on recent versions of Windows, and the limits on Mac and Linux are much higher, I believe.
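
(On Linux, the relevant limits for a given filesystem can be checked with getconf, e.g.:)

Code:

getconf NAME_MAX /    # longest single file name, typically 255
getconf PATH_MAX /    # longest path the kernel accepts, typically 4096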

Thanks for your continued feedback.

JeremyBoden 07-21-2017 05:25 PM

Why not use a symbolic link?

e.g. create a directory at level 5 called 'some-links' containing links that point to some deeply nested "parent directories".
That will ease your navigation through loads of files.

For example:-
Code:

mkdir ~/Desktop/Links
cd ~/Desktop/Links
ln -s /mnt/filestore/avi 'Link to avi'    # one ln -s per deep directory
ls -l ~/Desktop/Links
lrwxrwxrwx 1 jeremy jeremy  18 Nov 26  2015 Link to avi -> /mnt/filestore/avi
lrwxrwxrwx 1 jeremy jeremy  24 Nov 26  2015 Link to documents -> /mnt/filestore/documents
lrwxrwxrwx 1 jeremy jeremy  37 Nov 26  2015 Link to Manuals -> /mnt/filestore/books/Computer Manuals
lrwxrwxrwx 1 jeremy jeremy  23 Nov 26  2015 Link to pictures -> /mnt/filestore/pictures
lrwxrwxrwx 1 jeremy jeremy  10 Nov 26  2015 Link to video -> /mnt/video
drwxr-xr-x 2 jeremy jeremy 4096 Feb  6 16:54 PC-problems

then,
Code:

ls -l ~/Desktop/Links/'Link to video/'
total 54980
drwxrwxr-x 28 nobody public    4096 Mar 25 23:35 iplayer
-r--r--r--  1 jeremy public 45196229 May 10  2013 MVI_0357.mp4
drwxrwxr-x 10 nobody public    4096 Jul  3 00:27 radio
-r--r--r--  1 jeremy public  1930185 May 10  2013 Vid 001.mp4
-r--r--r--  1 jeremy public  1680817 May 10  2013 Vid 002.mp4
-r--r--r--  1 jeremy public  2798262 May 10  2013 Vid 003.mp4
-r--r--r--  1 jeremy public  2097162 May 10  2013 Vid 004.mp4
-r--r--r--  1 jeremy public  2565052 May 10  2013 Vid 006.mp4
drwxrwxr-x 10 nobody public    4096 Jul 20 16:20 video


Habitual 07-22-2017 08:14 AM

Quote:

Originally Posted by scasey (Post 5727873)
Or -- let's talk about the "errors in certain operating systems doing certain procedures" -- perhaps flattening the directory structure is not the best solution to the problem.

Let's talk about backups. ;)

scasey 07-22-2017 11:19 AM

Quote:

Originally Posted by Habitual (Post 5738527)
Let's talk about backups. ;)

Certainly. I suspect a
Code:

cp -R /* /some/destination
got "too many arguments" sometime in the past, but (as you know), that's probably not the best way to make a backup.

I use rsnapshot, which uses rsync, to make off-site backups of my production server and same-site backups of my backup server.
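
Under the hood that's just rsync doing the copying; a minimal sketch of a plain mirror (rsnapshot adds rotated, hard-linked snapshots on top; the paths are hypothetical):

Code:

# Mirror a tree, preserving permissions, times, links, and ownership.
rsync -a --delete /home/ /mnt/backup/home/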

JeremyBoden 07-22-2017 01:07 PM

One of the problems with backups is that it's easier to take a backup than to do a proper restore!

rsync has quite a complex set of possible parameters...

I use backup2l, which produces daily tar.gz files, so in a dire emergency I could do a "manual" restore even if I'd lost all details of the backup parameters.
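
That manual restore really is just tar, e.g. (archive name hypothetical):

Code:

mkdir -p /restore/target
tar -xzf all.1.tar.gz -C /restore/target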

