LinuxQuestions.org
Programming This forum is for all programming questions.
The question does not have to be directly related to Linux and any language is fair game.

Old 12-06-2007, 12:20 PM   #1
asimov
Member
 
Registered: Oct 2004
Location: Israel
Distribution: Debian
Posts: 37

Rep: Reputation: 15
[C/C++] large files (>4gb) and 32bits machines


Hi,

My program needs to handle very large binary files (bigger than 4 GB), and I can't find a way to open those files for reading/writing on my old 32-bit laptop.

When I try FILE *h = fopen64(filename, "rb"), I still get only a 4-byte pointer.

Is there some sort of FILE64*, or another way to overcome this issue on 32-bit machines?
 
Old 12-06-2007, 12:35 PM   #2
orgcandman
Member
 
Registered: May 2002
Location: new hampshire
Distribution: Fedora, RHEL
Posts: 600

Rep: Reputation: 110
Quote:
Originally Posted by asimov
Hi,

My program needs to handle very large binary files (bigger than 4 GB), and I can't find a way to open those files for reading/writing on my old 32-bit laptop.

When I try FILE *h = fopen64(filename, "rb"), I still get only a 4-byte pointer.

Is there some sort of FILE64*, or another way to overcome this issue on 32-bit machines?
I'm fighting a fire with pressure at 4 gal/second. I've changed my pressure gauge to 8 gal/second. Why didn't I get a new hose?

Analogy aside, the FILE pointer is just a handle that carries stream information; its size has nothing to do with the size of the file it refers to. The thing that overcomes the 32-bit limitation is buried deep within the read/seek/write API that is used with that file descriptor (I think there are separate fread64 and fwrite64 functions that exist, but I could be wrong here).

-Aaron
 
Old 12-06-2007, 01:00 PM   #3
asimov
Member
 
Registered: Oct 2004
Location: Israel
Distribution: Debian
Posts: 37

Original Poster
Rep: Reputation: 15
nice analogy

I thought the file handle/descriptor is a pointer to a position on the file. I'll look into fread64 and fwrite64.

thank you for your answer.

BTW, sometimes overpressure might explode the hose, then you'll really need a new one
 
Old 12-06-2007, 02:19 PM   #4
schneidz
LQ Guru
 
Registered: May 2005
Location: boston, usa
Distribution: fedora-35
Posts: 5,313

Rep: Reputation: 918
By the by, fgetc() seems to play nicely with fopen64().
 
Old 12-06-2007, 02:55 PM   #5
asimov
Member
 
Registered: Oct 2004
Location: Israel
Distribution: Debian
Posts: 37

Original Poster
Rep: Reputation: 15
Isn't fread64() more efficient than fgetc() for reading more than one character?

I need to read only portions of files. With performance as the top priority, is there any more efficient way than using fopen64, fsetpos64/fseek, fread64/memcpy, fclose sequences?
 
Old 12-07-2007, 05:39 AM   #6
asimov
Member
 
Registered: Oct 2004
Location: Israel
Distribution: Debian
Posts: 37

Original Poster
Rep: Reputation: 15
And another question:

On Windows you can speed up file reads by using sector-aligned reads (reading full sectors from disk). How can that be achieved on Linux?

What is the equivalent of Windows' GetDiskFreeSpace(), to find out how many bytes per sector my HD has?
 
Old 12-07-2007, 06:07 AM   #7
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 65
This might be useful information about handling large files:

https://www.linuxquestions.org/quest...3/#post2951079
 
Old 12-07-2007, 09:11 AM   #8
bigearsbilly
Senior Member
 
Registered: Mar 2004
Location: england
Distribution: Mint, Armbian, NetBSD, Puppy, Raspbian
Posts: 3,515

Rep: Reputation: 239
You can speed up reading using the mmap system call, which maps a file directly into your address space.

It's easy too.

I've mmap'd a 160 MB file recently.

4 GB on a laptop should be fun.
 
Tags
32bit, c++, file, fopen
