LinuxQuestions.org (/questions/)
-   Programming (https://www.linuxquestions.org/questions/programming-9/)
-   -   wget --spider $webdir (https://www.linuxquestions.org/questions/programming-9/wget-spider-%24webdir-945400/)

masavini 05-16-2012 07:41 PM

wget --spider $webdir
 
hi,
is it possible to use wget --spider to check whether a web directory exists?

plain command doesn't seem to work:
Code:

$ wget -q --spider "www.li-tech-shop.it/tmp/"
$ echo $?
8

it seems to give $? == 0 only if some sort of index file exists.
how can I get $? == 0 when no index file is present but the directory actually exists?

thanks...

cliffordw 05-17-2012 01:33 AM

Hi there,

wget returns 8 (Server issued an error response) for both a non-existent directory and a directory which exists but you don't have access to (like your example where there is no index and directory listings are not allowed). They return different HTTP response codes, though.

I'd suggest you don't use the "-q" option. Without it, wget prints the server's HTTP response, which you can interpret. For the URL in your example, the response code is 403 (Forbidden), while for a non-existent directory it would be 404 (Not Found).
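For example, the status code could be pulled out of wget's header output along these lines (just a sketch, not tested against your server; the `http_status` helper name and the awk pattern are my own, based on the "  HTTP/1.1 403 Forbidden" lines that wget -S prints):

```shell
# Sketch: run wget with -S so the server's response headers are
# printed (to stderr), then extract the numeric HTTP status code.
# With redirects there can be several "HTTP/..." lines; keep the last.
http_status() {
    awk '/HTTP\// { code = $2 } END { print code }'
}

# Hypothetical usage (requires network access):
# wget -S --spider "www.li-tech-shop.it/tmp/" 2>&1 | http_status
# 403 would mean the directory exists but listing is forbidden,
# 404 that it does not exist.
```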

Using curl with the --head option instead of wget might be easier for this particular purpose.
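With curl, one way (again only a sketch; -w '%{http_code}' prints the numeric status, and treating 403 as "exists" follows the reasoning above, which may not suit every server):

```shell
# Sketch: decide whether the directory "exists" from the numeric
# HTTP status code that curl reports.
dir_exists() {
    case "$1" in
        2??|3??|403) return 0 ;;  # reachable, or exists but listing forbidden
        *)           return 1 ;;  # e.g. 404 Not Found
    esac
}

# Hypothetical usage (requires network access):
# code=$(curl -s -o /dev/null --head -w '%{http_code}' "http://www.li-tech-shop.it/tmp/")
# dir_exists "$code" && echo "directory exists"
```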

I hope this helps.

masavini 05-17-2012 03:25 AM

thank you!

Code:

_dirCheck=$(curl -s --head "$_remoteDir")
if [[ $_dirCheck =~ "404 Not Found" ]]; then
        echo "open -u $global__ftpUsr,$global__ftpPwd $global__ftpHost" > "$_script"
        echo "mkdir -p $_remoteDir" >> "$_script"
        echo "exit" >> "$_script"
        lftp -q -f "$_script" &> /dev/null
fi


