LinuxQuestions.org
Old 06-08-2021, 07:07 AM   #1
lleb
Senior Member
 
Registered: Dec 2005
Location: Florida
Distribution: CentOS/Fedora/Pop!_OS
Posts: 2,908

Rep: Reputation: 547
merge 2 directories without deleting or overwriting files with same name


Normally I would just rsync the data around, but I have a bunch of files from school that are on both my laptop and my workstation. I'd like to merge them; many have the same name but are not the same file. They are named by class and date.

example:

/path/to/class/call.08.06.2021.pdf

I'll have that file on both systems, but the two may not be true duplicates. rsync will treat them as the same file and keep the newer one, overwriting, and potentially losing data from, the older of the two.

After I have everything merged into a single location, I will spend the man-hours manually removing true duplicates.

I do not want to lose data from either end.

Thank you
 
Old 06-08-2021, 07:57 AM   #2
michaelk
Moderator
 
Registered: Aug 2002
Posts: 21,814

Rep: Reputation: 4251
The GNU cp command has a backup option, something like:

Code:
cp --backup=numbered /source/* /destination
or
Code:
cp --backup=existing --suffix=.orig /source/* /destination
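A quick local demonstration of what the numbered variant does (the directory names and file contents here are invented for the example):

```shell
# Two directories standing in for the two machines' copies.
mkdir -p demo/src demo/dst
echo "laptop copy"      > demo/src/call.08.06.2021.pdf
echo "workstation copy" > demo/dst/call.08.06.2021.pdf

# --backup=numbered renames the existing destination file to
# call.08.06.2021.pdf.~1~ before the new copy is written, so
# nothing is overwritten.
cp --backup=numbered demo/src/* demo/dst/

ls demo/dst
```

Afterwards demo/dst holds both versions: the laptop copy under the original name and the workstation copy with the numbered backup suffix.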

Last edited by michaelk; 06-08-2021 at 08:18 AM.
 
1 members found this post helpful.
Old 06-08-2021, 08:04 AM   #3
lleb
Senior Member
Original Poster
Thank you. If I use the first option, what would the output look like for a duplicate file name? I assume I can use scp with the same arguments?
 
Old 06-08-2021, 08:19 AM   #4
michaelk
Moderator
The first option appends .~1~ to the clashing file in the destination, incrementing the number for each additional backup.

Option two only appends .orig to the clashing file in the destination.

As far as I know scp does not have any backup options; however, you might be able to use sshfs in conjunction with cp.

Last edited by michaelk; 06-08-2021 at 08:26 AM. Reason: more info.
 
1 members found this post helpful.
Old 06-08-2021, 08:24 AM   #5
lleb
Senior Member
Original Poster
Sadly, I'm getting errors:

Code:
$ scp --backup=numbered /home/user/UCF/* user@user:/home/user/UCF
unknown option -- -
usage: scp [-346BCpqrTv] [-c cipher] [-F ssh_config] [-i identity_file]
            [-J destination] [-l limit] [-o ssh_option] [-P port]
            [-S program] source ... target
and cp will not allow the copy into user@user:
 
Old 06-08-2021, 08:30 AM   #6
michaelk
Moderator
Sorry, I updated my previous post.

While scp does not have a backup option, you might be able to use sshfs with cp. Otherwise you might need to use a backup utility like rsnapshot.
 
1 members found this post helpful.
Old 06-08-2021, 08:34 AM   #7
lleb
Senior Member
Original Poster
OK, it's been years since I mucked about with sshfs; I'll play with that, or set up NFS, and see which way I can copy these files around. End-of-semester cleanup time.
 
Old 06-08-2021, 08:38 AM   #8
michaelk
Moderator
I just confirmed that cp --backup does work with sshfs.
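For anyone finding this later, a sketch of that workflow. The host name is a placeholder, and because the sshfs mount needs a live remote, the mount/unmount lines are shown commented out with a plain local directory standing in for the mounted share:

```shell
MNT=demo_mnt        # stands in for the sshfs mount point
mkdir -p "$MNT" demo_laptop
# sshfs user@workstation:/home/user/UCF "$MNT"    # real mount (hypothetical host)

echo "remote version" > "$MNT/call.08.06.2021.pdf"
echo "local version"  > demo_laptop/call.08.06.2021.pdf

# A FUSE mount looks like an ordinary directory to cp, so --backup
# behaves exactly as it does locally: the clashing remote file is
# kept as call.08.06.2021.pdf.~1~ instead of being overwritten.
cp --backup=numbered demo_laptop/* "$MNT"/

# fusermount -u "$MNT"                            # unmount when done
```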
 
1 members found this post helpful.
Old 06-08-2021, 09:28 AM   #9
lleb
Senior Member
Original Poster
Code:
for i in ./*.pdf~1~*; do mv -- "$i" "${i//.pdf~1~/.pdf}" &> /dev/null; done
That is not working. Do I need to \-escape the ~?

Or is it failing because it would create a file with the same name?

I have a rename script:

Code:
for i in ./*.pdf~1~*; do mv -- "$i" "${i//.pdf~1~/.pdf}" &> /dev/null; done
# replaces file names with 0000 - 9999 for .jpg, .png, & .mov, etc
a=1
for i in *.pdf; do
  new=$(printf "%04d.pdf" "$a") #04 pad to length of 4
  mv -- "$i" "$new" &> /dev/null
  let a=a+1
done
# replace name. to prepend to all files that are *.jpg, etc...
# :%s/name./New.Name./g
for f in *.pdf
do
    mv "$f" "Class.Foo.$f" &> /dev/null
done
 
Old 06-08-2021, 10:07 AM   #10
boughtonp
Senior Member
 
Registered: Feb 2007
Location: UK
Distribution: Debian
Posts: 1,618

Rep: Reputation: 1315

If I were in this situation, I would rsync the entire set over to a completely new directory, then back up both sides, then do the deduping/merging into another new location.

Sure, that might mean five copies of some files, but if the data is valuable I'd rather spend the disk space than risk losing it to a typo/misunderstanding.

(I'd also prefer to redirect unwanted output to temporary file(s) - which can be deleted on verified success - instead of sending anything into the /dev/null abyss.)


Last edited by boughtonp; 06-08-2021 at 10:10 AM.
 
Old 06-08-2021, 10:23 AM   #11
michaelk
Moderator
From a quick look at the script, you're trying to move the file onto an already existing file in the same directory.
The real question is what you want to do with the files that have duplicate names.
 
Old 06-08-2021, 10:25 AM   #12
lleb
Senior Member
Original Poster
Quote:
Originally Posted by michaelk View Post
From a quick look at the script, you're trying to move the file onto an already existing file in the same directory.
The real question is what you want to do with the files that have duplicate names.
Just add them to a numbered list that I can manually dig through to remove true duplicates. My bad for the naming scheme I use: I wget the files, then run that script. It's the same script on both the laptop and the workstation, hence the duplicate file names. Next semester I'll add a flag to my laptop files to make merging and cleanup easier.
 
Old 06-08-2021, 10:33 AM   #13
michaelk
Moderator
I agree with boughtonp: until you determine how you want to dig through the files, make sure you have backups of both computers.

Not sure what you mean by adding them to a numbered list. You can use cmp or diff to compare the duplicate-named files and then deal with them as desired.
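For the comparison step, something along these lines (file names and contents invented for the example) separates true duplicates from real clashes; cmp -s exits 0 only when two files are byte-for-byte identical:

```shell
mkdir -p merged
printf 'same\n' > merged/a.pdf
printf 'same\n' > "merged/a.pdf.~1~"
printf 'one\n'  > merged/b.pdf
printf 'two\n'  > "merged/b.pdf.~1~"

# Compare every numbered backup against its current counterpart.
for b in merged/*.~1~; do
  f=${b%.~1~}                 # strip the backup suffix
  if cmp -s -- "$f" "$b"; then
    echo "duplicate: $b"      # safe to delete one copy
  else
    echo "differs:   $b"      # needs a manual look
  fi
done
```

Anything reported as a duplicate can be deleted mechanically; only the "differs" pairs need the manual pass.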
 
Old 06-08-2021, 10:34 AM   #14
lleb
Senior Member
Original Poster
OK all, thanks. I'll just mark this as solved and go from here. I have enough info now to at least address the files.
 
Old 06-08-2021, 02:19 PM   #15
MadeInGermany
Senior Member
 
Registered: Dec 2011
Location: Simplicity
Posts: 1,805

Rep: Reputation: 814
Quote:
Do I need to \-escape the ~?
Yes, some shells may try to expand a ~ that is not \-escaped or within quotes.
Code:
for i in ./*".pdf~1~"*; do mv -- "$i" "${i//.pdf~1~/.pdf}" &> /dev/null; done
Now all of the ~ characters are within quotes.
BTW, the *s are outside the quotes: you want the shell to expand those.
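A quick check of the quoted rename loop (file name invented for the test, and &>/dev/null dropped so any mv error stays visible):

```shell
mkdir -p renametest
touch "renametest/call.08.06.2021.pdf~1~"

# The literal .pdf~1~ parts sit inside quotes; only the * globs
# are left bare for the shell to expand.
for i in renametest/*".pdf~1~"*; do
  mv -- "$i" "${i//.pdf~1~/.pdf}"
done

ls renametest
```

After the loop, the backup-suffixed file has been renamed to call.08.06.2021.pdf. Note that mv will still complain (or clobber, depending on options) if a file with the target name already exists, which is the collision michaelk pointed out earlier.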
 
  

