fgetc - segmentation fault
I did a search for fgetc and this is the first link it returned:
http://www.linuxquestions.org/questi...ighlight=fgetc
I am wondering why I get a segmentation fault when argv[1] is a 7 gigabyte file. This is basically the same as cat-ing everything under ASCII 127: Code:
#include "stdio.h" |
Well, you're not checking to see if you managed to open the file.
|
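A minimal sketch of the kind of program being described (read argv[1] with fgetc() and print every byte below ASCII 127), including the fopen() check suggested above. The usage check and variable names are assumptions, not the original poster's code: Code:
#include <stdio.h>

int main(int argc, char *argv[])
{
    FILE *fp;
    int c;                              /* int, not char, so EOF compares correctly */

    if (argc < 2)
    {
        fprintf(stderr, "usage: %s file\n", argv[0]);
        return 1;
    }
    fp = fopen(argv[1], "r");
    if (fp == NULL)                     /* the check pointed out above */
    {
        perror(argv[1]);
        return 1;
    }
    while ((c = fgetc(fp)) != EOF)
        if (c < 127)                    /* "everything under ascii 127" */
            putchar(c);
    fclose(fp);
    return 0;
}
If fopen() fails on the big file and the return value is never checked, the very first fgetc() call dereferences a null pointer, which would explain a segmentation fault.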
Could it be a >2GB file problem? What are the HW and OS versions?
We still have apps which don't go beyond 2GB files. ppanyam |
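As an aside, and only an assumption about the platform: on a 32-bit Linux/glibc build, fopen() itself refuses files over 2GB unless large-file support is enabled, which is done by defining a macro before any include (or by passing -D_FILE_OFFSET_BITS=64 on the compile line): Code:
/* Assumption, not from the thread: on a 32-bit Linux/glibc build, fopen()
 * fails on files larger than 2GB unless large-file support is enabled.
 * This define must appear before any #include; it switches to a 64-bit off_t. */
#define _FILE_OFFSET_BITS 64
#include <stdio.h>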
Code:
schneidz@lq:/home/schneidz> uname -a |
I don't really want to comment on this, but since no one has replied, I'll take the liberty to say what I think.
Did you compile with the 64-bit option? I think you have to use the -n64 switch with cc while compiling. Sorry again, I haven't worked on AIX for 4 years now, but recently some of my friends told me their "old programs having 2GB file limitations will work if they are recompiled for the 64-bit option." I haven't tested it myself. Best of luck. ppanyam |
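A quick way to check whether a given compiler/switch combination actually gives 64-bit file offsets is to print sizeof(off_t); this diagnostic is just a sketch, not something posted in the thread: Code:
#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    /* 4 here means 31-bit file offsets (the 2GB ceiling); 8 means 64-bit offsets */
    printf("sizeof(off_t) = %lu\n", (unsigned long) sizeof(off_t));
    return 0;
}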
why is int argc there without being used? newbie question, sorry
|
argc tells you the number of command line arguments that were passed in.
argv is a list of the actual arguments that were passed in. The first entry is always the name of the program. Hence, for ls -l: ls is the name of the program and -l is the first argument, so argv[0] = ls and argv[1] = -l. |
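A tiny example that just echoes its own arguments makes this concrete; the name echo_args is made up for illustration: Code:
#include <stdio.h>

int main(int argc, char *argv[])
{
    int i;

    /* argv[0] is the program name; argv[1] .. argv[argc-1] are the arguments */
    for (i = 0; i < argc; i++)
        printf("argv[%d] = %s\n", i, argv[i]);
    return 0;
}
Compiled and run as ./echo_args -l, it prints argv[0] = ./echo_args and argv[1] = -l.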
FYI, my stopgap solution was to Code:
split -b 1000000000 file.txt |
Might be a waste, but oh well. |
this is what I got: Code:
schneidz@lq:/home/schneidz> cc -g -n64 file.c -o file.x
I think it's Code:
cc -q64 |