Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
11-03-2006, 04:21 AM | #1
LQ Newbie | Registered: Oct 2006 | Location: London, United Kingdom | Distribution: Ubuntu 6.10 | Posts: 26
What is the best way of uploading a large site using FTP?
Hi all,
I recently started using Linux (Fedora Core 5) for editing and creating websites with Bluefish 1.0.4. One of the sites I run is still stored on my hard disk, and I need to upload it in its entirety to the remote server. I tried uploading it with the mput command, but I am not sure how to make it upload the whole thing at once. After googling for a while I found that it can be handy to tar and gzip the entire site and upload it as a single file, and the file transferred to the server without problems.
Now my problem is that I don't know how to untar and unzip it on the remote server; I tried gunzip and tar -xf without success. My questions are: is this the best way to upload a big site with many folders and files, and is there actually a way to uncompress it once it is there? Or are there better ways to do this?
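To spell out what I did locally and what I was trying on the server (site/ and the file names are just placeholders here, and the extract step presumably only works if the host gives you a shell, e.g. over SSH):

# on the local machine: bundle the whole site into one compressed archive
tar czf site.tar.gz site/

# after uploading site.tar.gz in binary mode, extract it on the server:
tar xzf site.tar.gz
# or, if the server's tar has no -z support but gzip is installed:
gunzip site.tar.gz && tar xf site.tar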
Any help or guidance would be highly appreciated...
Thanks
Last edited by Dannt; 11-03-2006 at 04:23 AM.
11-03-2006, 02:36 PM | #2
LQ Newbie | Registered: Apr 2006 | Location: Minneapolis, Minnesota, USA | Posts: 16
You're doing it the way I would have done it. Does the tar command on the remote web server support the -z option for zipping and unzipping files? Most Linux systems I've used do and most UNIX systems don't. If your remote server doesn't support -z and doesn't have gzip/gunzip installed, you could check if it has compress/uncompress. Compress is an older, somewhat less efficient way to shrink files that is supported by most Linux and UNIX systems. If that doesn't work, just send it uncompressed. It will take a bit longer to transfer but you won't have to worry about compression incompatibilities then.
One other thing to check, unrelated to compression, is that you transferred the file in binary mode rather than ASCII mode with ftp. Most UNIX ftp daemons default to ASCII mode, which causes them to trash non-ASCII files such as a gzipped tar archive.
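In command form, the compress fallback and the binary-mode transfer would look roughly like this (the host name and file names are only placeholders, and this assumes compress/uncompress is available on both ends):

# on the local machine, if the server has compress/uncompress but no gzip:
tar cf site.tar site/
compress site.tar                      # produces site.tar.Z

# upload in binary mode so the archive does not get mangled:
ftp ftp.example.com
ftp> binary
ftp> put site.tar.Z

# then, on the server:
uncompress site.tar.Z && tar xf site.tar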
11-04-2006, 09:26 AM | #3
LQ Newbie (Original Poster) | Registered: Oct 2006 | Location: London, United Kingdom | Distribution: Ubuntu 6.10 | Posts: 26
Dear jyoung4,
Thanks for your reply. My main problem is that I am not able to run any of the commands you mentioned above (gunzip, tar, etc.) when I am on the server; when I try, I get an "Invalid command name" message. So it seems like I need to have those packages installed there, or something like that. Do you have any idea how I can get around this issue?
Many thanks for your help.
Dannt
Last edited by Dannt; 11-04-2006 at 12:15 PM.
11-06-2006, 04:02 AM | #4
LQ Newbie (Original Poster) | Registered: Oct 2006 | Location: London, United Kingdom | Distribution: Ubuntu 6.10 | Posts: 26
This is solved now! I started using gFTP, which is a graphical FTP application that lets you upload and download as many files as necessary...
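For anyone who prefers the command line, something like lftp's mirror mode can do the same kind of recursive upload in one shot (the host name, user name, and paths below are only placeholders):

# recursively upload the local site/ directory into public_html/ on the server
lftp -u username ftp.example.com -e "mirror -R site/ public_html/; quit"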
Cheers
02-08-2011, 10:56 PM | #5
LQ Newbie | Registered: Feb 2011 | Posts: 3
I have the same problem
Hi. I have this very same problem and notice this thread is old -- dated 2006. What's the latest news on this issue?
I have a VERY large site of millions of files that I need to somehow upload to a server. In fact, I don't even have a server yet. Is it possible to get a server and just hand deliver to them a hard drive or DVD of one's site?
Any help is much appreciated.
By the way, I'm not much of a programmer and have never used Linux, just a Mac and HTML.
02-09-2011, 12:55 AM | #6
LQ Newbie | Registered: Dec 2010 | Location: Jakarta, ID | Distribution: Fedora, Ubuntu | Posts: 18
Hi phm,
MILLIONS of files?
If that is all plain HTML, I think there is 'something' off in your site design. Usually, that number of files would be kept in a database and served dynamically via PHP or the like.
If you want to tar and compress those files, I think the size will still give you some trouble; uploading a single huge file is a challenging task in itself. Maybe you can group the files by directory and tar each directory before uploading, along the lines of the sketch below.
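A rough sketch of that approach, with the path below as a placeholder:

# create one compressed archive per top-level directory of the site
cd /path/to/site
for d in */ ; do
    tar czf "${d%/}.tar.gz" "$d"
done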
Before that, you had better make sure that the total size of ALL the files fits within the server's quota. Whether you can simply send them your hard disk will depend on the hosting provider.
Good luck...
02-09-2011, 10:20 AM | #7
LQ Newbie | Registered: Feb 2011 | Posts: 3
Thanks. Yes, eight million to be exact. I don't even know what tar is but I'll look it up.
02-09-2011, 10:28 AM | #8
LQ Newbie | Registered: Apr 2006 | Location: Minneapolis, Minnesota, USA | Posts: 16
Uploading millions of files
I would agree with an15wn that millions of files is "a lot", and you would more commonly see a web site design that stores all of this in a database or uses some sort of dynamic HTML generator to create the output, but ... that's a different question.
As for the DVD - yes, you could do that, depending on which server you are loading it onto, whether they will allow it, whether it will all fit on one DVD, etc. However, it might be simpler to go with FTP. As Dannt mentioned back in 2006, he used a GUI FTP tool that did the upload pretty easily. I would look into that possibility first. I use a freeware tool (WinSCP) on Windows; I would imagine there are Mac versions of that tool, or something similar you could find. Then just start the transfer and let it run overnight.
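If the upload ends up being done from a command line instead, a client that can resume an interrupted run helps with that many files; a rough sketch with lftp (host, user, and paths are placeholders, assuming the host accepts plain FTP):

# recursive upload that resumes partial files and runs several transfers in parallel
lftp -u username ftp.example.com -e "mirror -R --continue --parallel=4 site/ public_html/; quit"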
02-09-2011, 11:32 AM | #9
LQ Newbie | Registered: Feb 2011 | Posts: 3
Thanks so much for your response. I have used Dreamweaver a lot (it's a program I love), and I've used it for FTP transfers. I guess my question/problem is how one gets a fast upload connection. Perhaps this is not the right forum for that question, but I guess if I have a fast broadband connection I can just use FTP as you say and let it run overnight. I've heard that cable modems have fast upload speeds.