Proxy caching (Squid)
Hello! Please help.
I'm having trouble caching the inner links of an HTML page. I'd like to know, in detail, how to download the inner links of an HTML page level by level onto our server using the 'wget' command, and where to put those downloaded files in our proxy server (Squid) locations. Can anybody help me out?
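For the "level by level" download part, wget's recursive options are the usual tool. A minimal sketch, assuming a placeholder URL and depth; example.com, the depth of 2, and the /var/www/mirror directory are illustrative values, not anything from this thread:

```shell
# Hypothetical level-by-level download with wget.
wget --recursive --level=2 \
     --page-requisites \
     --convert-links \
     --no-parent \
     --directory-prefix=/var/www/mirror \
     http://example.com/index.html
# --level caps how many link levels deep wget follows; raise it with care,
# since the page count grows roughly exponentially with depth.
# --page-requisites also fetches the images/CSS each page needs;
# --convert-links rewrites links so the mirror is browsable locally;
# --no-parent keeps wget from climbing above the starting URL.
```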
What does this have to do with Squid? Are you after help with wget or with Squid?
More levels? Squid doesn't do any prefetching by default... Wanting to prefetch at all is a debatable tactic; you might get a 10 to 20% hit return. At 16 levels, I'd be very surprised at more than a 0.1% hit ratio.
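If prefetching is really the goal, the usual approach is not to copy downloaded files into Squid's cache directory by hand (Squid manages its cache_dir itself), but to fetch the pages *through* the proxy so Squid caches the responses. A hedged sketch, assuming Squid is listening on its default port 3128 on this host and example.com stands in for the real site:

```shell
# Hypothetical cache-warming run: wget honours the http_proxy environment
# variable, so every request below passes through Squid and gets cached there.
export http_proxy=http://localhost:3128/
wget --recursive --level=2 --no-parent --delete-after \
     http://example.com/
# --delete-after discards the local copies once fetched; the point is only
# that Squid has now stored cacheable responses for later clients.
```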
|
I need the commands for downloading.
Quote:
would really be grateful to you if you could help me
You're just repeating yourself, which is really not much use, as you've not been able to explain effectively what you want to do in the first place. The only thing I can imagine you mean is to follow the links of a given page to a depth of 8 to 16 levels. If an average page contains 10 links, that means downloading on the order of 10^8 to 10^16 pages for every single link you click...
And now that I've actually bothered to do the math, I'll downgrade my 0.1% estimate to about 0.0000000001%!
Re:
So what should I do? What do you suggest I do to make browsing faster? Should I leave my proxy server as it is, or do I need to modify it? Please advise on this topic.
What should you do? You should explain your question correctly. We can't help you if we can't understand your problem.
Quote:
does this sound like a n00b to you? hey, nics, :study: