Linux - General: This Linux forum is for general Linux questions and discussion. If it is Linux related and doesn't seem to fit in any other forum, then this is the place.
10-05-2004, 07:09 AM | #1 | LQ Newbie | Registered: Oct 2004 | Distribution: RedHat 9 | Posts: 11
File size limit exceeded
Here is my problem. I am attempting to create an image of a 20 GB hard drive (via the dd command). However, during the imaging process, the dd command craps out after 16 GB and gives me a "File size limit exceeded" error. I have tried a number of different things, like reformatting the drive I am writing to and moving partitions around on it, but I get the same error. I have plenty of space available (currently 53 GB of free space that I can write to). I found a post talking about being able to extend this limit using the "ulimit" command, but was curious if anyone has run into the same problem and how they went about solving it. Here is a link to the post that I referred to above. Thanks!
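For reference, the kind of invocation described would look roughly like this (the source device and output path are assumptions for illustration, not taken from the post):

dd if=/dev/hda of=/mnt/backup/disk.img bs=64k

and the shell's per-process file size limit can be checked with ulimit -f, which bash reports in 1024-byte blocks (or "unlimited").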
10-05-2004, 11:12 AM | #2 | LQ Newbie | Registered: Aug 2002 | Location: Pittsburgh | Distribution: Debian | Posts: 24
What filesystem are you using? If the filesystem's block addresses are too small, a single file cannot address any more data, hence the limit.
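For what it's worth, the block size of an ext2 partition can be checked with dumpe2fs or tune2fs (the device name below is only an example):

dumpe2fs -h /dev/hdb1 | grep -i "block size"
tune2fs -l /dev/hdb1 | grep -i "block size"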
10-05-2004, 11:31 AM | #3 | LQ Newbie (Original Poster) | Registered: Oct 2004 | Distribution: RedHat 9 | Posts: 11
I am using Linux Ext2.
10-05-2004, 12:51 PM | #4 | LQ Newbie | Registered: Aug 2002 | Location: Pittsburgh | Distribution: Debian | Posts: 24
Basically, the metadata for each file can only store so many block numbers, so if your file needs more blocks than that, you're sunk. I suggest you reformat that partition with a larger block size so that it can hold more data per file.
A quick Google search claims that the maximum ext2 file size is 2 GB or 4 GB, depending on the website, but none of them mention the block size. If you're already at 16 GB for one file, then those websites' figures must not take large block sizes into account.
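If you do go that route, something along these lines would create an ext2 filesystem with a 4 KB block size; note that this destroys everything on the partition, and the device name is only an example:

mke2fs -b 4096 /dev/hdb1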
10-06-2004, 07:34 AM | #5 | Member | Registered: Aug 2004 | Distribution: debian, SuSE | Posts: 365
Type ulimit -a and see if you are restricted from creating files over a certain size.
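The relevant line in the output looks something like this (the value shown is illustrative, not from any particular machine):

file size             (blocks, -f) unlimited

If a number appears there instead of "unlimited", the soft limit can be raised for the current shell with ulimit -f unlimited, up to whatever hard limit is in place (only root can raise the hard limit).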