LinuxQuestions.org
Old 11-03-2006, 04:21 AM   #1
Dannt
LQ Newbie
 
Registered: Oct 2006
Location: London, United Kingdom
Distribution: Ubuntu 6.10
Posts: 26

Rep: Reputation: 15
What is the best way of uploading a large site using FTP?


Hi all,

I recently started using Linux (Fedora Core 5) for editing and creating websites with Bluefish 1.0.4. One of the sites I run is still stored on my hard disk, and I need to upload the whole thing to the remote server. I tried uploading it with the mput command, but I am not sure how to make it upload everything at once. After googling for a while I found that it can be handy to tar and gzip the entire site and upload it as a single file, and the file transferred to the server without problems.
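
For reference, this is roughly what I did locally to build the archive (the directory name ~/mysite and the file name site.tar.gz are just placeholders):

# create a gzip-compressed tar archive of the whole site
tar -czf site.tar.gz -C "$HOME" mysite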

Now my problem is that I don't know how to untar and unzip it on the remote server; I tried gunzip and tar -xf without success. My questions are: is this the best way to upload a big site with many folders and files, is there actually a way to uncompress it once it is there, and are there better ways to do this?

Any help or guidance would be highly appreciated...

Thanks

Last edited by Dannt; 11-03-2006 at 04:23 AM.
 
Old 11-03-2006, 02:36 PM   #2
jyoung4
LQ Newbie
 
Registered: Apr 2006
Location: Minneapolis, Minnesota, USA
Posts: 16

Rep: Reputation: 1
You're doing it the way I would have done it. Does the tar command on the remote web server support the -z option for compressing and decompressing with gzip? Most Linux systems I've used do, and most UNIX systems don't. If your remote server doesn't support -z and doesn't have gzip/gunzip installed, you could check whether it has compress/uncompress. compress is an older, somewhat less efficient way to shrink files that is supported by most Linux and UNIX systems. If that doesn't work, just send it uncompressed. It will take a bit longer to transfer, but then you won't have to worry about compression incompatibilities.
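
Roughly, the things to try on the remote server, in that order, using the archive name from your post:

# if the remote tar is GNU tar (or otherwise supports -z):
tar -xzf site.tar.gz

# otherwise, decompress first, then extract:
gunzip site.tar.gz
tar -xf site.tar

# if only compress/uncompress is available, re-create the archive
# locally with compress (it produces site.tar.Z), upload that, then:
uncompress site.tar.Z
tar -xf site.tar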

One other thing to check, unrelated to compression, is that you transferred the file in binary mode rather than ascii mode with ftp. Most UNIX ftp daemons default to ascii mode, which corrupts binary files such as a gzipped tarball.
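
From the command-line ftp client that looks something like this (the server name is just a placeholder):

ftp ftp.example.com
ftp> binary
ftp> put site.tar.gz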
 
Old 11-04-2006, 09:26 AM   #3
Dannt
LQ Newbie
 
Registered: Oct 2006
Location: London, United Kingdom
Distribution: Ubuntu 6.10
Posts: 26

Original Poster
Rep: Reputation: 15
Dear jyoung4

Thanks for your reply. My main problem is that I am not able to run any of the commands you mentioned (gunzip, tar, etc.) when I am on the server; when I try, I get an "Invalid Command name" message. So it seems I need to have those packages installed there, or something like that. Do you have any idea how I can get around this issue?

Many thanks for your help..

Dannt

Last edited by Dannt; 11-04-2006 at 12:15 PM.
 
Old 11-06-2006, 04:02 AM   #4
Dannt
LQ Newbie
 
Registered: Oct 2006
Location: London, United Kingdom
Distribution: Ubuntu 6.10
Posts: 26

Original Poster
Rep: Reputation: 15
This is solved now! I started using gFTP, a graphical FTP application that lets you upload and download as many files as necessary...

Cheers
 
Old 02-08-2011, 10:56 PM   #5
phm
LQ Newbie
 
Registered: Feb 2011
Posts: 3

Rep: Reputation: 0
I have the same problem

Hi. I have this very same problem and notice this thread is old -- dated 2006. What's the latest news on this issue?

I have a VERY large site of millions of files that I need to somehow upload to a server. In fact, I don't even have a server yet. Is it possible to get a server and just hand-deliver a hard drive or DVD of the site to the hosting company?

Any help is much appreciated.

By the way, I'm not much of a programmer and have never used Linux, just a Mac and HTML.
 
Old 02-09-2011, 12:55 AM   #6
an15wn
LQ Newbie
 
Registered: Dec 2010
Location: Jakarta, ID
Distribution: Fedora, Ubuntu
Posts: 18

Rep: Reputation: Disabled
hi phm,

MILLIONS of files?

If that is all plain HTML, I think there is 'something' off in your site design. Usually that number of 'files' would be stored in a database and served dynamically via PHP or the like.

If you want to tar + compress those files, I think the size will still give you some trouble; uploading a single huge file is a challenging task in itself. Maybe you can group the files by directory and tar each directory before uploading.
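
A rough sketch of that, assuming the site's top-level directories all sit under the current directory:

# create one gzipped tarball per top-level directory
for d in */; do
    tar -czf "${d%/}.tar.gz" "$d"
done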

Before that, you had better make sure that the total size of ALL the files fits within the server's quota. Whether you can just send in your hard disk will depend on the hosting provider.
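
To check the total size (and file count) before deciding, something like this from a terminal, with the path as a placeholder:

# total size of the site, human-readable
du -sh /path/to/site

# number of files
find /path/to/site -type f | wc -l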

Good luck...
 
Old 02-09-2011, 10:20 AM   #7
phm
LQ Newbie
 
Registered: Feb 2011
Posts: 3

Rep: Reputation: 0
Thanks. Yes, eight million to be exact. I don't even know what tar is but I'll look it up.
 
Old 02-09-2011, 10:28 AM   #8
jyoung4
LQ Newbie
 
Registered: Apr 2006
Location: Minneapolis, Minnesota, USA
Posts: 16

Rep: Reputation: 1
Uploading millions of files

I would agree with an15wn that millions of files is "a lot", and you would more commonly see a web site design that stores all of this in a database or uses some sort of dynamic HTML generator to create the output, but ... that's a different question.

As for the DVD - yes, you could do that, depending on what server you're loading up, whether the host will allow it, whether it will all fit on one DVD, etc. However, it might be simpler to go with ftp. As Dannt mentioned back in 2006, he used a GUI ftp tool that did the upload pretty easily. I would look into that possibility first. I use a freeware tool (WinSCP) on Windows; I would imagine there are Mac versions of that tool or something similar you could find. Then just start the transfer and let it run overnight.
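
If you end up preferring the command line, one option (not mentioned earlier in this thread) is lftp, which can mirror a local tree up to the server and, if re-run, skips files that are already there. A rough sketch, with the user name, host, and paths as placeholders:

# upload (reverse-mirror) the local site to the remote document root
lftp -u username -e "mirror -R /local/site /public_html; quit" ftp.example.com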
 
Old 02-09-2011, 11:32 AM   #9
phm
LQ Newbie
 
Registered: Feb 2011
Posts: 3

Rep: Reputation: 0
Thanks so much for your response. I have used Dreamweaver a lot and it's a program I love and I've used it for ftp transfers. I guess my question/problem is how does one get a fast upload connection? Perhaps this is not the right forum for that question but ... I guess if I have a fast broadband connection I can just use ftp as you say and let it run overnight. I've heard that cable modems have fast upload speeds.
 
  

