Old 09-17-2008, 03:46 PM   #1
jg1
LQ Newbie
 
Registered: Nov 2007
Location: UK
Posts: 6

Rep: Reputation: 0
Writing large archives to NTFS or ext3 volumes


Hi - this is a first post from a new newbie. Apologies if it's been asked before, but my attempts at searching have left me more confused.
Let me begin at the beginning...
For six months I have been running a "pilot" system on a ten-year-old home build running Xubuntu Gutsy. A really good experience, but I'm still no good at working on the command line. My WinXP box has a serious hardware problem, so I've offloaded everything onto a Dell Inspiron running Ubuntu Hardy (8.04). So far so good. Most of the stuff works and I'm sure I'll crack the rest.
But when I come to do a backup I run into problems.
I've been used to using WinZip, RAR and HJSplit to create, unpack and join files, but when I try to use the archiver in the standard Ubuntu Hardy distro it runs out of steam at 1.3 GB, saying "File too large" or "Max file size exceeded".

If I want to back up my personal data, which sits in a TrueCrypt repository, I need to deal with archives between 4 and 8 GB. I've tried writing to the internal hard drive (ext3 - encrypted and non-encrypted), and to external Freecom drives of 250 GB and 500 GB formatted as NTFS or FAT32, both encrypted and unencrypted.
I've searched all sorts of forums and tried various things, but most demand command-line work that I am not confident using. I have also tried to download a number of non-standard packages that are supposed to work as a back end to the standard app (File Roller 2.22.3), but either I've done something wrong or I'm barking up the wrong tree.

Can anybody help, pls?

Tks in anticipation/desperation(!)
jg
 
Old 09-17-2008, 07:52 PM   #2
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Centos 7.7 (?), Centos 8.1
Posts: 18,169

Rep: Reputation: 2680
I'd have to say that learning the cmd line will definitely pay off, esp. for situations like this. It's not hard if you take it slowly, i.e. one thing at a time.
This is a very good tutorial about Linux at the cmd line: http://rute.2038bug.com/index.html.gz

Also:
http://tldp.org/LDP/Bash-Beginners-G...tml/index.html
http://www.tldp.org/LDP/abs/html/

Normally, if you've got large files that you need to move in pieces, you'd use the 'split' cmd to break them up, then cp (copy) or scp (secure copy between machines), then 'cat' to join them up again.
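For example, something like this (the archive name, chunk size and mount point are just examples - adjust to suit; keep each piece under 4 GB if the target drive is FAT32, since that's the FAT32 per-file limit):

Code:
# break the big file into 1 GB pieces: backup.tar.gz.aa, .ab, ...
split -b 1024m backup.tar.gz backup.tar.gz.

# copy the pieces to the external drive (mount point is an example)
cp backup.tar.gz.* /media/freecom/

# later, join the pieces back up in order
cat backup.tar.gz.* > backup.tar.gz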
There is online help (man pages on your system), plus lots of examples on Google.
To read a man page about a cmd, it's just:

man <cmd_you_want_to_know>

e.g.

man cp

or

man split

HTH

PS Welcome to LQ
 
Old 09-18-2008, 04:14 AM   #3
jg1
LQ Newbie
 
Registered: Nov 2007
Location: UK
Posts: 6

Original Poster
Rep: Reputation: 0

Quote:
Originally Posted by chrism01
I'd have to say that learning the cmd line will definitely pay off, esp. for situations like this. It's not hard if you take it slowly, i.e. one thing at a time.
This is a very good tutorial about Linux at the cmd line: http://rute.2038bug.com/index.html.gz
Chris
Tks for the quick response and for the refs. Rute looks really good, and on the strength of this I will certainly persevere with command-line working. No doubt my skills will improve with time. I remember some of it from my days working with Unix in the 1980s, but that's a long time ago!
Tonight I will try the archiving using tar and gzip from the cmd line to see if that helps, and obviously the cp/scp and cat route will provide a temporary workaround.
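From what I've read so far, I'm guessing it's something like this (the data path and archive name are just my guesses - corrections welcome!):

Code:
# pack and compress the data, splitting the stream into 1 GB pieces
tar czf - /home/jg/data | split -b 1024m - backup.tar.gz.

# to restore: join the pieces in order and unpack
cat backup.tar.gz.* | tar xzf -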
My suspicion is that the problem stems from limitations in File Roller and the way it works, which seems strange to me. It appears to create the archive in 650 MB chunks, adding them into the archive as it goes. It does the first two fine but then craters when it tries to add the third one.
Any thoughts?
jg
 
  


