[SOLVED] merge 2 directories without deleting or overwriting files with same name
Normally I would just rsync the data around, but I have a bunch of files from school that live on both my laptop and my workstation. I'd like to merge them; many have the same name but are not the same file, since they are named by class and date.
example:
/path/to/class/call.08.06.2021.pdf
I'll have that file on both systems, but the two copies may not be true duplicates. rsync will treat them as the same file, keep the newer one, and overwrite the older one, potentially losing data.
After I have everything merged into a single location, I will spend the man-hours manually removing duplicate files.
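One way to merge without losing either copy is to let the copy tool make numbered backups instead of overwriting. A sketch, assuming GNU cp/rsync; the paths are placeholders and the laptop tree is assumed to be reachable locally (e.g. via a mount):
Code:
# --backup=numbered keeps the existing workstation copy as name.pdf.~1~
# instead of overwriting it, so nothing is lost when names collide.
cp -r --backup=numbered /mnt/laptop/school/. ~/school/

# rsync can do the same with -b/--backup; the suffix is just an example,
# any marker that is easy to grep for later will do.
rsync -a -b --suffix=.laptop /mnt/laptop/school/ ~/school/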
OK, it's been years since I mucked about with sshfs; I'll play with that, or set up NFS, and see which way I can copy these files around. End-of-semester cleanup time.
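A minimal sshfs sketch for reaching the laptop's files from the workstation (host, user, and paths are placeholders):
Code:
mkdir -p /mnt/laptop
sshfs user@laptop:/home/user /mnt/laptop   # mount the laptop's home over ssh
# ...run the backup-aware copy/merge against /mnt/laptop/...
fusermount -u /mnt/laptop                  # unmount when finished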
for i in ./*.pdf~1~*; do mv -- "$i" "${i//.pdf~1~/.pdf}" &> /dev/null; done
That is not working. Do I need to \ out the ~?
Or is it failing because it would create a file with the same name?
I have a rename script:
Code:
for i in ./*.pdf~1~*; do mv -- "$i" "${i//.pdf~1~/.pdf}" &> /dev/null; done

# renames files to 0000 - 9999; same idea for .jpg, .png, .mov, etc.
a=1
for i in *.pdf; do
    new=$(printf "%04d.pdf" "$a")   # %04d pads the number to a length of 4
    mv -- "$i" "$new" &> /dev/null
    let a=a+1
done

# prepend a class name to all files that are *.pdf, *.jpg, etc.
# (vim substitution for the same job: :%s/name./New.Name./g)
for f in *.pdf; do
    mv "$f" "Class.Foo.$f" &> /dev/null
done
If I were in this situation, I would rsync the entire set over to a completely new directory, then back up both sides, then do the deduping/merging into another new location.
Sure, that might mean five copies of some files, but if the data is valuable I'd rather spend the disk space than risk losing it from a typo/misunderstanding.
(I'd also prefer to redirect unwanted output to temporary file(s) - which can be deleted on verified success - instead of sending anything into the /dev/null abyss.)
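A sketch of that workflow, assuming GNU rsync and the fdupes utility are available; every host and path below is a placeholder:
Code:
# Pull each machine into its own subdirectory of a fresh merge area, so
# nothing can collide, and keep rsync's output in log files rather than
# throwing it into /dev/null.
mkdir -p ~/merge/laptop ~/merge/workstation
rsync -a user@laptop:/home/user/school/ ~/merge/laptop/ > ~/merge/laptop.log 2>&1
rsync -a ~/school/ ~/merge/workstation/ > ~/merge/workstation.log 2>&1

# fdupes lists byte-identical files across the two trees; those are the
# true duplicates that can be removed by hand afterwards.
fdupes -r ~/merge/laptop ~/merge/workstation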
From a quick look at the script, you're trying to move the file onto an already existing file in the same directory.
The real question is what you want to do with the files with duplicate names.
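(The ~ does not need escaping there: tilde expansion only happens at the start of a word, so the glob itself is fine.) One way around the collision is to rename the ~1~ copies to names that cannot already exist; the .v1.pdf suffix below is only an example:
Code:
for i in ./*.pdf~1~*; do
    # -n refuses to overwrite, so a remaining collision shows up as a
    # leftover ~1~ file instead of silently clobbering data.
    mv -n -- "$i" "${i//.pdf~1~/.v1.pdf}"
done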
Just add them to a numbered list that I can manually dig through to remove true duplicates. My bad for the naming scheme I use: I wget the files, then run that script. It's the same script on both the laptop and the workstation, hence the duplicate file names. Next semester I'll add a flag to my laptop files to make merging and cleanup easier.
I agree with boughtonp: until you determine how you want to dig through the files, make sure you have backups of both computers.
Not sure what you mean by adding them to a numbering list? You can use cmp or diff to compare the duplicate-named files and then deal with them as desired.
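For example, same-named files in two trees could be checked with cmp in a loop; a sketch with placeholder paths:
Code:
for f in ~/merge/laptop/*.pdf; do
    other=~/merge/workstation/$(basename "$f")
    if [ -f "$other" ]; then
        if cmp -s "$f" "$other"; then   # -s: compare silently, use exit status
            echo "identical: $(basename "$f")"
        else
            echo "differs:   $(basename "$f")"
        fi
    fi
done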