Help downloading files recursively from an http site
I feel like an idiot here because I cannot manage to do this and I should know better. Anyway, how do I download all the files from here: http://openmetaverse.org/viewvc/inde...anches/stable/
I am on FreeBSD 7.0 and I tried wget with the -r switch, but it gives me URLs only. Maybe this is simply not an FTP site, I don't know. Any suggestions on how I can download all those files with the same directory structure would be appreciated. Thanks.
Try this:
wget -r -l5 -H -t1 -nd -N -np -A.jpg,.gif -erobots=off <url_here>
Looking at that link you seem to be after the latest stable code. If this is indeed the case then the simplest way is to get it using subversion.
Details for the URLs are on the download page: http://openmv.org/projects/libopenmetaverse/download The commands below will either check out or export 0.7.0 of the stable branch. There is a link for the development branch too if you need it; it's on the downloads page above. Lee
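(The actual commands didn't survive the quote above, but a typical checkout/export pair looks like the sketch below. The repository URL here is hypothetical; copy the real one from the download page linked above. The commands are prefixed with echo as a dry run, so drop the echo to actually run them.)

```shell
# Hypothetical repo URL -- substitute the real one from the download page.
REPO="http://openmetaverse.org/svn/libopenmetaverse/branches/stable"

# Check out a working copy (keeps .svn metadata so you can "svn update" later):
echo svn checkout "$REPO" libopenmetaverse-stable

# Or export a clean source tree with no .svn directories:
echo svn export "$REPO" libopenmetaverse-stable-export
```

svn export is handy if you only want the sources once, since there is no working-copy metadata to clean up afterwards.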
Hey Cardy, I think I would have to install svn, since it's not part of the FreeBSD base system. I know I can install svn on FreeBSD, but I am not sure I want to at this point. Maybe I will have to, I don't know?
roreilly, I don't want to download the actual web files but the source code files that are in that repository. That wget command will have me downloading JPEGs and GIFs, won't it?
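(For what it's worth, a mirror-style wget sketch without the image-only -A filter might look like the following. The URL is a placeholder, since the link in the first post is truncated; the command is prefixed with echo as a dry run, so remove the echo to actually download.)

```shell
# Placeholder -- substitute the real viewvc path to the stable branch.
URL="http://openmetaverse.org/viewvc/"

# -r  recurse into subdirectories
# -np never ascend to the parent directory
# -nH drop the hostname directory level locally
# -N  only re-fetch files newer than the local copy
# -R  skip the generated index pages
# -e robots=off  ignore robots.txt
echo wget -r -np -nH -N -R "index.html*" -e robots=off "$URL"
```

Note that viewvc serves HTML views of the files rather than the raw sources, which is exactly why an svn checkout gives a cleaner result.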
Mmmmmm, I can see what you mean. However, as all the files are being viewed through some form of web-based viewer, there is a chance of the files being modified in some way by that viewer. svn would at least give you a clean code set to work with. I believe you can always uninstall the svn packages afterwards if you are only likely to run the command once.
Yeah, I'm thinking I might do that now; that package is not in the ports collection anyway, and if I ever do get this running on this box I'll let it sit in this state forever.