Linux - Software: This forum is for Software issues.
Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.
04-30-2009, 07:52 PM
|
#1
|
Member
Registered: Aug 2008
Location: Yemen
Distribution: Fedora, CentOS, RedHat , OpenFiler, ESXI
Posts: 225
Rep:
|
Copy files
Hi to all.
I have a folder that is around 230 GB and need to copy it to an external drive.
The drive that holds the data is NTFS; the external drive is FAT32.
When I used the cp command it gave me errors because of the names of the files in that directory. Some file names are just strange and contain special characters such as & % $ _ ; as I remember, the error was "invalid argument".
I tried to tar the folder, but it stopped at 4 GB with the error "File larger than 4GB".
How can I do this?
Thanks.
|
|
|
04-30-2009, 08:07 PM
|
#2
|
LQ Veteran
Registered: Nov 2005
Location: Annapolis, MD
Distribution: Mint
Posts: 17,809
|
Any current version of Linux should have no problem copying from NTFS to any other filesystem. I suspect that the problem is the special characters.
The general rule is to put the filename in quotes, like so:
cp "/path/funn&#filename" /destination/path/
(Depending on what the special characters are, you might need single quotes.)
You can also "escape" special characters. Suppose you had a file named "fred$dog". If you do:
cp fred$dog /dest/path
the shell will attempt to insert the value of the variable named "dog". To prevent that, do:
cp fred\$dog /dest/path
Personally, I would first rename the offending files; that way, you will not keep running into the same issue. Post some actual examples of the "strange" filenames.
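A minimal sketch of that bulk-rename idea (the character set and the underscore replacement are assumptions; adjust them to the characters actually present):

```shell
# Replace shell-special characters (&, $, %, #) in every file name
# in the current directory with underscores.
for f in *; do
  clean=$(printf '%s' "$f" | tr '&$%#' '____')
  if [ "$f" != "$clean" ] && [ ! -e "$clean" ]; then
    mv -- "$f" "$clean"
  fi
done
```

The `--` guards against names that start with a dash, and the `-e` check avoids clobbering a file that already has the cleaned name.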
|
|
|
04-30-2009, 08:14 PM
|
#3
|
Member
Registered: Aug 2008
Location: Yemen
Distribution: Fedora, CentOS, RedHat , OpenFiler, ESXI
Posts: 225
Original Poster
Rep:
|
Quote:
Originally Posted by pixellany
Any current version of Linux should have no problem copying from NTFS to any other filesystem. I suspect that the problem is the special characters.
The general rule is to put the filename in quotes, like so:
cp "/path/funn&#filename" /destination/path/ (Depending on what the special characters are, you might need single quotes.)
You can also "escape" special characters. Suppose you had a file named "fred$dog". If you do:
cp fred$dog /dest/path
the shell will attempt to insert the value of the variable named "dog". To prevent that, do:
cp fred\$dog /dest/path
Personally, I would first rename the offending files. That way, you will not continue to have the same issue. Post some actual example of the "strange" filenames.
|
Thank you for the advice, but I tried that.
Now check this: I have a folder named TEST. It has sub-folders and files that add up to around 400 GB or so; you are looking at over 10,000 files. I just need to copy the whole folder to an external drive. There should be a way to copy the whole folder. Thanks.
|
|
|
04-30-2009, 08:17 PM
|
#4
|
Member
Registered: Aug 2008
Location: Yemen
Distribution: Fedora, CentOS, RedHat , OpenFiler, ESXI
Posts: 225
Original Poster
Rep:
|
Quote:
Originally Posted by maas187
thank you for the advice . but i tried that.
now check this.
i have a folder named TEST . it has sub-folders and files which are around 400GB or so, you are looking at over 10000 files. i just need to copy the whole folder to an external drive. there should be a way to copy the whole folder .. thankx
|
Well, imagine renaming and editing a thousand files? That would take so much time.
|
|
|
04-30-2009, 08:20 PM
|
#5
|
LQ Veteran
Registered: Nov 2005
Location: Annapolis, MD
Distribution: Mint
Posts: 17,809
|
What??--I suggested several things.
Quote:
there should be a way to copy the whole folder
|
Of course! cp -R foldername /destination/
The problem is that this command effectively means "find every individual file and copy it", so the special characters will still be an issue.
|
|
|
04-30-2009, 08:37 PM
|
#6
|
Member
Registered: Aug 2008
Location: Yemen
Distribution: Fedora, CentOS, RedHat , OpenFiler, ESXI
Posts: 225
Original Poster
Rep:
|
Quote:
Originally Posted by pixellany
What??--I suggested several things.
Of course!!----cp -R foldername /destination/
The problem is that this command actually means: "find every individual file and copy it." So---special characters will be an issue.
|
Yeah, I know. Is there any other way to copy those files without running into this problem?
Thanks.
|
|
|
04-30-2009, 08:58 PM
|
#7
|
LQ Guru
Registered: Aug 2004
Location: Sydney
Distribution: Rocky 9.2
Posts: 18,391
|
First thing you need to do is get a list of all the filenames with special chars; something like
Code:
for file in `ls`
do
echo $file |egrep -e '&' -e '@' # add in all the special chars you need
done
That'll give you a list of all the filenames (with special chars), which you can redirect to a file.
You then need to decide what amendments you want to make and extend the script to do that for you.
|
|
|
04-30-2009, 09:10 PM
|
#8
|
LQ Newbie
Registered: Apr 2009
Location: República de Tejas, Centro
Distribution: Ubunut, Xubuntu, Dotsch/UX
Posts: 19
Rep:
|
Did I read that the target drive is FAT32? Does it have a file size limit, or was it FAT16 that had something like a 4 GB file size limit?
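For what it's worth, FAT32 does cap individual files at 4 GB (2^32 - 1 bytes), which matches the tar error earlier in the thread. One common workaround, not mentioned above (the paths here are made up), is to stream the archive through split so that no single piece hits the cap:

```shell
# Hypothetical source and destination paths. Write the archive in
# pieces of at most 3900 MB each, safely under FAT32's 4 GB limit.
tar -cf - /home/test/data | split -b 3900m - /mnt/external/data.tar.part.

# To restore, concatenate the pieces back into tar:
# cat /mnt/external/data.tar.part.* | tar -xf -
```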
|
|
|
04-30-2009, 09:29 PM
|
#9
|
Member
Registered: Aug 2008
Location: Yemen
Distribution: Fedora, CentOS, RedHat , OpenFiler, ESXI
Posts: 225
Original Poster
Rep:
|
Quote:
Originally Posted by chrism01
First thing you need to do is get a list of all the filenames with special chars; something like
Code:
for file in `ls`
do
echo $file |egrep -e '&' -e '@' # add in all the special chars you need
done
That'll give you a list of all the filenames (with special chars), which you can redirect to a file.
You then need to decide what amendments you want to make and extend the script to do that for you.
|
Thank you very much. But again, I wanted something quick, in and out: a single command that would just copy the whole thing in one shot.
Thanks for the script.
Regards,
MaaS
|
|
|
04-30-2009, 09:55 PM
|
#11
|
Member
Registered: Aug 2008
Location: Yemen
Distribution: Fedora, CentOS, RedHat , OpenFiler, ESXI
Posts: 225
Original Poster
Rep:
|
I have another question.
Could I use rsync for backup instead of cp or tar? Or would it also give errors on special characters or huge amounts of data such as 400 GB?
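Since rsync came up: it can copy the whole tree in one command and, unlike cp, re-running it resumes an interrupted copy. It will not get around the FAT32 limits, though: files over 4 GB, and names containing characters FAT32 forbids, will still fail. A sketch with assumed paths:

```shell
# -a  archive mode: recurse and preserve as much metadata as the
#     destination filesystem (FAT32 here) can actually store
# -v  print each file as it is copied
rsync -av /home/test/data/ /mnt/external/data/
# Re-running the same command skips files already copied.
```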
|
|
|
04-30-2009, 10:26 PM
|
#12
|
Member
Registered: Aug 2007
Posts: 73
Rep:
|
I run 'dd' when copying a really large partition, then resize it with 'PM', or .....
|
|
|
04-30-2009, 10:33 PM
|
#13
|
Member
Registered: Aug 2008
Location: Yemen
Distribution: Fedora, CentOS, RedHat , OpenFiler, ESXI
Posts: 225
Original Poster
Rep:
|
Quote:
Originally Posted by ozminh
i run 'dd' when copy a really large partition. then resize it with 'PM', or .....
|
I tried dd, but dd only copies partitions (or whole devices); it does not copy folders. E.g.:
dd if=/home/test/data of=/mnt/external
This won't work.
|
|
|
04-30-2009, 10:39 PM
|
#14
|
LQ Veteran
Registered: Nov 2005
Location: Annapolis, MD
Distribution: Mint
Posts: 17,809
|
Quote:
But again i wanted something in and out. just like a command that would just copy the whole thing one shot.
|
I think at least two of us have suggested that that was not going to work, and explained why.
|
|
|
04-30-2009, 10:43 PM
|
#15
|
Senior Member
Registered: Aug 2006
Posts: 2,697
|
Quote:
Originally Posted by chrism01
Code:
for file in `ls`
do
echo $file |egrep -e '&' -e '@' # add in all the special chars you need
done
|
No need for the "ls"; it's useless here and will break on file names with spaces. There is also no need to call egrep (if using bash):
Code:
for file in *
do
mv "$file" "${file//[@^&]/}"
done
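For readers unfamiliar with that bash expansion: `${file//[@^&]/}` deletes every `@`, `^`, and `&` from the value. A quick illustration (bash):

```shell
#!/bin/bash
f='a&b@c^d'
echo "${f//[@^&]/}"   # prints: abcd
```

Note that inside the bracket expression, `^` is literal because it is not the first character.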
|
|
|