LinuxQuestions.org
Old 07-27-2007, 01:44 AM   #1
Ratheeshshenoy
LQ Newbie
 
Registered: Jul 2007
Posts: 4

Rep: Reputation: 0
Help me -- 'File size limit exceeded' error


Hi,

I am running a tool whose output is a text file. The tool stops with the error 'file size limit exceeded' once the file reaches 2GB.
The file system is ext3 and the OS is Red Hat Enterprise Linux AS.
Please suggest a version of Linux that supports LFS.
 
Old 07-27-2007, 05:41 AM   #2
Tinkster
Moderator
 
Registered: Apr 2002
Location: in a fallen world
Distribution: slackware by choice, others too :} ... android.
Posts: 23,005
Blog Entries: 11

Rep: Reputation: 903
Hi, and welcome to LQ,

And what do you mean when you say LFS, Linux from Scratch?
And what is the tool you're using?



Cheers,
Tink
 
Old 07-27-2007, 08:23 AM   #3
Ratheeshshenoy
LQ Newbie
 
Registered: Jul 2007
Posts: 4

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by Tinkster
Hi, and welcome to LQ,

And what do you mean when you say LFS, Linux from Scratch?
And what is the tool you're using?



Cheers,
Tink
I meant Large File Support. The tool I'm running is a third-party tool.
The ulimit command says the limit is unlimited, there is no quota set, and I am logged in as root.

I know the output file will be more than 4GB, but it stops after 2GB with the error 'file size limit exceeded'.
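
For reference, here is how the limit can be checked (a minimal check using the bash builtin; bare ulimit defaults to the file size limit, -f):

Code:
# soft file size limit for this shell, in 1024-byte blocks
ulimit -f

# hard limit, which the soft limit cannot exceed
ulimit -H -f

# all limits at a glance
ulimit -a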

Thanks
Ratheesh
 
Old 07-27-2007, 08:29 AM   #4
wjevans_7d1@yahoo.co
Member
 
Registered: Jun 2006
Location: Mariposa
Distribution: Slackware 9.1
Posts: 938

Rep: Reputation: 30
If ulimit is not the problem, then the problem is with the third-party tool.

If it was written in C (which is quite likely), then it was probably not compiled with LFS (Large File Support) enabled. Many developers either are unaware that they need to do something special to allow files larger than 2GB, or (worse) are unaware that their customers might need to deal with files that large.
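
If the tool can be rebuilt from source, glibc enables large file support at compile time; a minimal sketch, assuming you have the source (tool.c is just a placeholder name):

Code:
# ask the system for the recommended LFS flags;
# on Linux/glibc this typically prints -D_FILE_OFFSET_BITS=64
getconf LFS_CFLAGS

# rebuild so that open()/lseek()/write() use 64-bit file offsets
cc $(getconf LFS_CFLAGS) $(getconf LFS_LDFLAGS) tool.c -o tool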
 
Old 07-27-2007, 08:38 AM   #5
Ratheeshshenoy
LQ Newbie
 
Registered: Jul 2007
Posts: 4

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by wjevans_7d1@yahoo.co
If ulimit is not the problem, then the problem is with the third-party tool.

If it was written in C (which is quite likely), then it was probably not compiled with LFS (Large File Support) enabled. Many developers either are unaware that they need to do something special to allow files larger than 2GB, or (worse) are unaware that their customers might need to deal with files that large.
I am using Red Hat Enterprise Linux AS. Does it support LFS? If not, could you let me know some versions of Linux that have LFS support?
 
Old 07-27-2007, 08:58 AM   #6
wjevans_7d1@yahoo.co
Member
 
Registered: Jun 2006
Location: Mariposa
Distribution: Slackware 9.1
Posts: 938

Rep: Reputation: 30
It is almost certain that your system has Large File Support. But "almost certain" isn't good enough, I'm sure. (grin)

Run the following shell script. It will create a C program, compile it, and run it. The C program attempts to create a file called data that is slightly over 2GB. If _FILE_OFFSET_BITS were not defined as 64, the program would fail at the 2GB mark.

Hope this helps.

Code:
#!/bin/sh

cat > testlarge.c <<EOD
/* _FILE_OFFSET_BITS must be defined before any header is included */
#define _FILE_OFFSET_BITS 64

#include <sys/stat.h>
#include <sys/types.h>
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(void)
{
  int     jndex;
  int     phyle;

  ssize_t write_result;

  char    kbuffer[1024] = {0};

  /* start from scratch on each run */
  unlink("data");

  phyle = open("data", O_RDWR|O_CREAT, 0600);

  if (phyle == -1)
  {
    perror("data");
    exit(1);
  }

  /* write 1KB blocks until just past the 2GB mark */
  for (jndex = 0; jndex <= 1024*1024*2; jndex++)
  {
    write_result = write(phyle, kbuffer, sizeof(kbuffer));

    if (write_result != (ssize_t)sizeof(kbuffer))
    {
      fprintf(stderr,
              "\nwrite result was %zd, not %zu\n",
              write_result, sizeof(kbuffer));

      break;   /* <--------- a non-LFS build stops here at 2GB */
    }

    /* countdown progress indicator, one number per 128MB written */
    if (jndex % (128*1024) == 0)
    {
      printf("%d ", (1024*1024*2 - jndex)/(128*1024));
      fflush(stdout);
    }
  }

  printf("\n");

  if (close(phyle))
  {
    perror("close");
    exit(1);
  }

  return 0;
} /* main() */
EOD

cc testlarge.c -o testlarge

status=$?

if [ "$status" -ne 0 ]
then
  echo "oops -- compile errors"
else
  ./testlarge
fi
 
Old 07-30-2007, 06:47 AM   #7
Ratheeshshenoy
LQ Newbie
 
Registered: Jul 2007
Posts: 4

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by wjevans_7d1@yahoo.co
It is almost certain that your system has Large File Support. But "almost certain" isn't good enough, I'm sure. (grin)

Run the following shell script. It will create a C program, compile it, and run it. The C program attempts to create a file called data that is slightly over 2GB. If _FILE_OFFSET_BITS were not defined as 64, the program would fail at the 2GB mark.

Hope this helps.

Hi,

Thanks to all. I ran the command

Code:
dd if=/dev/zero of=largefile bs=100M count=21

which created a file larger than 2GB (21 blocks of 100MB is about 2.05GB, just past the 2GB limit), so I can confirm that the problem is with the tool itself.
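
For anyone checking the same thing, the resulting size can be confirmed directly; a minimal check, assuming GNU coreutils (the file name matches the dd command above):

Code:
# human-readable size; should show roughly 2.1G
ls -lh largefile

# exact size in bytes; anything above 2147483647 shows large files work
stat -c %s largefile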

Again, thanks to all for the kind help, and please let me know if there is more information on this.

Thanks and Regards
Ratheesh
 
  

