LinuxQuestions.org
Slackware This Forum is for the discussion of Slackware Linux.

Old 03-25-2005, 07:25 AM   #1
carboncopy
Senior Member
 
Registered: Jan 2003
Location: Malaysia
Distribution: Fedora Core, Slackware, Mac OS X, Debian, OpenSUSE
Posts: 1,210
Blog Entries: 4

Rep: Reputation: 45
bittorrent 4.0 handling files larger than 4.0 GB?


Hi!

I am using bittorrent 4.0 on Slackware-current. Is the slackware-current package compiled to handle files larger than 4.0 GB?

I was downloading a file that Azureus reports as 3.99 GB, but at 92.4% Azureus kept crashing. So I fired up btdownloadcurses.py instead. While bittorrent was verifying my file, it crashed after about 2-3 GB (not sure, I wasn't watching) and the error given was "file is too large".

However, I managed to finish downloading the file with Azureus on Mac OS X.

I'm wondering whether there is a compile-time switch/option to handle very large files.
 
Old 03-25-2005, 10:44 AM   #2
tank728
Member
 
Registered: Sep 2003
Posts: 142

Rep: Reputation: 16
The bittorrent package is a noarch package, so there was no
compile; it is just a big script. Open it up and snoop around,
or go to the bittorrent homepage and find out. Or perhaps you
downloaded a corrupted copy; check the md5sum against the
original.

-tank
 
Old 03-25-2005, 10:49 AM   #3
carboncopy
Senior Member
 
Registered: Jan 2003
Location: Malaysia
Distribution: Fedora Core, Slackware, Mac OS X, Debian, OpenSUSE
Posts: 1,210
Blog Entries: 4

Original Poster
Rep: Reputation: 45
Thanks for the info.

Now you are shedding more light on it.

So the bittorrent scripts are Python scripts. Does that mean Python can't handle files larger than 3-4 GB?

I thought my partially downloaded copy was corrupted, but I could continue the download on Mac OS X using Azureus, and the finished file is in perfect condition.
 
Old 04-04-2005, 11:26 AM   #4
carboncopy
Senior Member
 
Registered: Jan 2003
Location: Malaysia
Distribution: Fedora Core, Slackware, Mac OS X, Debian, OpenSUSE
Posts: 1,210
Blog Entries: 4

Original Poster
Rep: Reputation: 45
Sigh, I realize it has nothing to do with file size. I am getting that error now even though the downloaded files are only around 700 MB and 300+ MB.

The exact error I get is
[21:18:41] IO Error [Errno 27] File too large

And my Azureus has been crashing as well. Is anything wrong with my Slackware?
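Not from the thread, but errno 27 (EFBIG) is also what a process gets when it hits its per-process file size limit (shell: ulimit -f), independent of the filesystem. A minimal check of that limit from Python:

```python
import resource

# RLIMIT_FSIZE is the maximum file size this process may create.
# RLIM_INFINITY means no limit; anything else is a size in bytes
# that, once exceeded, produces EFBIG ("File too large").
soft, hard = resource.getrlimit(resource.RLIMIT_FSIZE)
for name, val in (("soft", soft), ("hard", hard)):
    print(name, "unlimited" if val == resource.RLIM_INFINITY else val)
```

If the soft limit is anything other than "unlimited", that would explain crashes at a fixed size regardless of the torrent.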
 
Old 04-04-2005, 11:41 AM   #5
keefaz
Senior Member
 
Registered: Mar 2004
Distribution: Slackware
Posts: 4,605

Rep: Reputation: 134
Just curious, what is the filesystem on the partition where you downloaded the > 4 GB file?
 
Old 04-04-2005, 10:14 PM   #6
carboncopy
Senior Member
 
Registered: Jan 2003
Location: Malaysia
Distribution: Fedora Core, Slackware, Mac OS X, Debian, OpenSUSE
Posts: 1,210
Blog Entries: 4

Original Poster
Rep: Reputation: 45
All my filesystems are ext3.

Does it have anything to do with the disk cache? How can I clear the disk cache without rebooting?

Azureus and bittorrent crash on any download, but seeding alone is OK.
 
Old 12-09-2005, 03:59 AM   #7
ReaperMan
LQ Newbie
 
Registered: Feb 2004
Location: Ontario, Canada
Distribution: Gentoo
Posts: 4

Rep: Reputation: 0
I am having the exact same issue

"IO Error [Errno 27] File too large"

Hello,

I'm having the exact same issue with more than one torrent file.

My root filesystem is ext3, but I am downloading to an NTFS partition mounted via SMB.

After the torrent does the initial file check and begins to download, it gets killed with this error. It's a 3 GB file, approximately 70% complete.

I am currently using Python 2.4.2.

I thought at first the problem lay within Python itself, so I ended up recompiling Python with large file support. I am still getting this error; does the problem actually lie with Python?

Was there any resolution you came to for this issue?
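One way to separate Python from the filesystem (a sketch, not from the thread): seek past the 4 GB mark in a sparse temporary file and write a single byte. If the Python build or the target filesystem lacks large file support, this raises OSError with EFBIG or EOVERFLOW. Run it from the directory you download to; the file is sparse, so it uses almost no real disk space.

```python
import os
import tempfile

# Create a throwaway file in the current directory so the test hits
# the same filesystem the torrent writes to.
tmp = tempfile.NamedTemporaryFile(delete=False, dir=".")
try:
    tmp.seek(5 * 1024 ** 3)   # byte offset well past 4 GiB
    tmp.write(b"\0")          # fails here if LFS is missing
    tmp.close()
    print("large file offsets OK")
finally:
    tmp.close()
    os.remove(tmp.name)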
 
  



