Ubuntu's linkchecker stops crawling my local site in under 0.1 seconds, and I can't see what's causing it.
Code:
Check time 0.046 seconds
Size 299B
Result Error: 503 Service Unavailable
Statistics:
Downloaded: 0B.
Content types: 0 image, 1 text, 0 video, 0 audio, 0 application, 0 mail and 0 other.
URL lengths: min=26, max=26, avg=26.
I can wget the home page of the site (which is 119K) without difficulty, but linkchecker stops cold after reading, as far as I can tell, only the 299 bytes shown above. I've tried disabling threading, and sharply curtailing it, via command-line options, with no effect. Basically, I can't get enough of a handle on the problem to even begin diagnosing it.