Linux - Newbie This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's this is the place!


Old 12-06-2010, 07:18 AM   #1
LQ Newbie
Registered: Oct 2009
Posts: 27

Rep: Reputation: 0
tar not working with large number of files

Hi all, I have a problem.

In the middle of my script I use tar to archive about 1000 images, each around 5 MB in size.

All the images are passed as arguments:
tar -cvf file.tar <all images as arguments>

But the resulting file.tar does not contain all the images.

Can anyone help me?
Thanks in advance.
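A minimal sketch of how to check what actually landed in the archive (paths under /tmp and the .jpg pattern are placeholders; the real script's filenames will differ):

```shell
set -e
mkdir -p /tmp/tar_demo && cd /tmp/tar_demo
rm -f file.tar
# Stand-ins for the real images
for i in 1 2 3 4 5; do printf 'x' > "img_$i.jpg"; done

# Same shape as the command in the script, with a small file set
tar -cf file.tar *.jpg

# Count the members of the archive and compare with the input count
tar -tf file.tar | wc -l   # prints 5
```

If the member count is lower than the number of images on disk, the next step is finding out where the list gets truncated, e.g. a shell argument-length limit (see `getconf ARG_MAX`).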
Old 12-06-2010, 07:23 AM   #2
LQ Guru
Registered: Sep 2009
Location: Perth
Distribution: Manjaro
Posts: 9,564

Rep: Reputation: 2901
How many does it contain?
How do you know all the images have been accounted for?
Are they all manual entries or is it somehow calculated?
If calculated, how?

hmmm ... maybe we need some more information??
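One way to answer those questions precisely: diff the file list on disk against the archive's member list. This is a self-contained sketch (the /tmp path and '*.jpg' pattern are placeholders, and one file is deliberately omitted to show what the output looks like):

```shell
set -e
mkdir -p /tmp/tar_diag && cd /tmp/tar_diag
for i in 1 2 3; do : > "img_$i.jpg"; done
tar -cf file.tar img_1.jpg img_2.jpg   # img_3.jpg deliberately left out

# Which files on disk are missing from the archive?
ls -1 *.jpg | sort > on_disk.txt
tar -tf file.tar | sort > in_tar.txt
comm -23 on_disk.txt in_tar.txt        # prints img_3.jpg
```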
Old 12-06-2010, 07:39 AM   #3
Registered: Jul 2008
Location: Chennai, India
Distribution: RHEL5, Ubuntu
Posts: 191

Rep: Reputation: 37

Which tar version are you using?

Some tar implementations cannot create archives larger than 2 GB (e.g. the old Solaris tar). If that's your case, search for and install GNU tar.
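Checking which implementation you have is one command; GNU tar identifies itself in the first line of the version banner:

```shell
# GNU tar reports something like "tar (GNU tar) 1.34";
# other implementations (BSD tar, busybox) name themselves here too
tar --version | head -n 1
```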
Old 12-06-2010, 08:01 AM   #4
Registered: Nov 2008
Location: Lower Saxony, Germany
Distribution: CentOS, RHEL, Solaris 10, AIX, HP-UX
Posts: 731

Rep: Reputation: 137

tar should handle large files on any current distribution. Did you try passing the folder containing the images as the argument to tar, rather than the individual filenames?
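A sketch of that approach, with placeholder paths under /tmp: archiving the directory keeps the command line short no matter how many images it holds.

```shell
set -e
mkdir -p /tmp/tar_dir/images && cd /tmp/tar_dir
for i in 1 2 3 4; do : > "images/img_$i.jpg"; done

# One short argument instead of thousands of filenames
tar -cf file.tar images/

tar -tf file.tar | wc -l   # prints 5: the directory entry plus 4 files
```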
Old 12-06-2010, 08:03 AM   #5
Registered: Feb 2004
Location: Outside Paris
Distribution: Solaris 11.3, Oracle Linux, Mint
Posts: 9,724

Rep: Reputation: 434
Actually, current Solaris tar is large-file aware, but assuming you are running GNU/Linux and hence GNU tar, you might want to create a file containing the list of files to archive and then use
tar -cvf file.tar -T list.txt
That would rule out a command-line/environment overflow.

The file system you use to store the tar file might also be causing the issue. For example, FAT32 cannot store files larger than 4 GB, so only about 800 of your 1000 5 MB images would fit in a single archive.
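A self-contained sketch of the -T approach (placeholder paths under /tmp): the list is built with find, so no image filename ever appears on the command line.

```shell
set -e
mkdir -p /tmp/tar_list/images && cd /tmp/tar_list
for i in 1 2 3 4 5 6; do : > "images/img_$i.jpg"; done

# find writes one path per line; tar reads the list from the file
find images -name '*.jpg' > list.txt
tar -cf file.tar -T list.txt

tar -tf file.tar | wc -l   # prints 6
```

If any filenames could contain newlines, GNU tar's --null option combined with `find -print0` is the robust variant of the same idea.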


