LinuxQuestions.org
Linux - Software This forum is for Software issues.
Old 07-09-2016, 04:42 AM   #1
Glenn D.
Member
 
Registered: May 2009
Location: ACT - Australia
Distribution: Opensuse x86_64 (Latest)
Posts: 132

Rep: Reputation: 26
Question: Share large file/image.


Hello,

Can anyone recommend a provider I can upload a large file/disk image to, so that people can then access it via a URL?

Thanks
--Glenn

# fdisk -l

Disk /dev/sdb: 2000.4 GB, 2000398934016 bytes, 3907029168 sectors
Units = sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes

# dd if=/dev/sdb | /usr/bin/bzip2 -9 > disk-volume-2016-06-12.iso.bz2
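For what it's worth, a slightly more robust variant of that pipeline might look like the sketch below. It assumes GNU dd and coreutils; the bs=4M and status=progress options and the checksum step are additions, not from the original post, and .img.bz2 is used instead of .iso.bz2 since a raw dd image is not an ISO.

```shell
# Sketch: image the disk, compress on the fly, and record a SHA-256
# so downloaders can verify what they received.
SRC=/dev/sdb
OUT=disk-volume-2016-06-12.img.bz2

dd if="$SRC" bs=4M status=progress | bzip2 -9 > "$OUT"
sha256sum "$OUT" > "$OUT.sha256"

# Downloaders then verify with:
#   sha256sum -c disk-volume-2016-06-12.img.bz2.sha256
```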
 
Old 07-09-2016, 08:38 PM   #2
frankbell
LQ Guru
 
Registered: Jan 2006
Location: Virginia, USA
Distribution: Slackware, Ubuntu MATE, Mageia, and whatever VMs I happen to be playing with
Posts: 19,324
Blog Entries: 28

Rep: Reputation: 6142
Quote:
Disk /dev/sdb: 2000.4 GB, 2000398934016 bytes, 3907029168 sectors
Am I understanding correctly that you wish to make a two terabyte file available for download?
 
1 member found this post helpful.
Old 07-10-2016, 04:11 PM   #3
jefro
Moderator
 
Registered: Mar 2008
Posts: 21,982

Rep: Reputation: 3625
I doubt there is a free provider that offers that much space yet. Pretty much any storage provider that allows both high traffic and large volumes will sell you space. Users should be warned to use download tools that can resume interrupted downloads.

If you have the upload speed, you can host it on your own system. I don't normally use torrents, but many Linux browsers can open a torrent URL. FTP and other ways to share work too. wget, maybe.
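If you do host it yourself, Python's built-in HTTP server is the quickest way to try it out. This is only a sketch: for a 2 TB public download you would want a real server such as nginx, and note that Python's stock request handler does not honor Range headers, so downloads cannot be resumed — exactly the problem warned about above.

```python
# Minimal sketch: serve the current directory over HTTP on port 8000,
# making the image reachable at http://<your-ip>:8000/<filename>.
from http.server import ThreadingHTTPServer, SimpleHTTPRequestHandler

def serve(port=8000):
    # ThreadingHTTPServer handles each request in its own thread,
    # so one slow download does not block the others.
    httpd = ThreadingHTTPServer(("", port), SimpleHTTPRequestHandler)
    print(f"Serving on port {port} ...")
    httpd.serve_forever()

if __name__ == "__main__":
    serve()
```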
 
Old 07-10-2016, 08:11 PM   #4
AwesomeMachine
LQ Guru
 
Registered: Jan 2005
Location: USA and Italy
Distribution: Debian testing/sid; OpenSuSE; Fedora; Mint
Posts: 5,524

Rep: Reputation: 1015
There is probably a flaw in the method you have selected to distribute the information. Think about what you are trying to do. I doubt that you really require 2TB of storage space.
 
1 member found this post helpful.
Old 07-10-2016, 08:36 PM   #5
frankbell
LQ Guru
 
Registered: Jan 2006
Location: Virginia, USA
Distribution: Slackware, Ubuntu MATE, Mageia, and whatever VMs I happen to be playing with
Posts: 19,324
Blog Entries: 28

Rep: Reputation: 6142
Ignoring the question of a free provider, I consider it highly unlikely that any internet connection will allow a 2TB download to complete successfully.

My own ISP limits home users to 350GB a month (recently upped from 250GB after they upgraded some of their infrastructure).
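To put rough numbers on that, here is a back-of-envelope calculation. The speeds are illustrative assumptions, not from the thread; the size is the /dev/sdb figure from the first post.

```python
# How long does a 2 TB download take at a given sustained speed?
def transfer_days(size_bytes, bytes_per_sec):
    return size_bytes / bytes_per_sec / 86400  # 86400 seconds per day

SIZE = 2_000_398_934_016  # /dev/sdb size from the first post

for label, mbit in [("10 Mbit/s", 10), ("50 Mbit/s", 50), ("100 Mbit/s", 100)]:
    days = transfer_days(SIZE, mbit * 1_000_000 / 8)
    print(f"{label}: {days:.1f} days")
```

Even at a sustained 100 Mbit/s that is nearly two days of uninterrupted transfer, and at 10 Mbit/s about 18.5 days — before any monthly cap is considered.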
 
1 member found this post helpful.
Old 07-10-2016, 08:48 PM   #6
notKlaatu
Senior Member
 
Registered: Sep 2010
Location: Lawrence, New Zealand
Distribution: Slackware
Posts: 1,077

Rep: Reputation: 732
You could cut the image into 20GB slices and let people download the 100 slices via BitTorrent, then stitch them back together with cat at home. For free, I guess you could open 100 Gmail/Drive accounts and put one slice in each account...

As everyone else is saying, though, there's basically no way anyone's internet provider is going to allow them to download 2TB of data, and I don't know why anyone would be crazy enough to try. It would literally be faster to use the postal service and have people mail hard drives back and forth for imaging.

I would look at what you actually need to distribute and maybe come up with a different scheme for bundling it.
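The slice-and-reassemble idea above can be sketched with GNU split and cat (a sketch, assuming GNU coreutils; the chunk prefix and checksum file names are illustrative):

```shell
# Slice the compressed image into 20 GB chunks with numeric suffixes
# (.00, .01, ...) and checksum each chunk so downloaders can verify.
IMG=disk-volume-2016-06-12.img.bz2
split -b 20G -d "$IMG" "$IMG."
sha256sum "$IMG".[0-9][0-9] > chunks.sha256

# On the receiving end, after downloading all chunks:
#   sha256sum -c chunks.sha256
#   cat disk-volume-2016-06-12.img.bz2.[0-9][0-9] > disk-volume-2016-06-12.img.bz2
```

The numeric suffixes from -d sort lexically, so the cat glob reassembles the chunks in the right order.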
 
1 member found this post helpful.
Old 07-10-2016, 11:58 PM   #7
John VV
LQ Muse
 
Registered: Aug 2005
Location: A2 area Mi.
Posts: 17,624

Rep: Reputation: 2651
Set up a torrent and get 8 to 12 people (minimum) to seed it.
 
  

