LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - Software (https://www.linuxquestions.org/questions/linux-software-2/)
-   -   proxy caching(squid) (https://www.linuxquestions.org/questions/linux-software-2/proxy-caching-squid-545018/)

nics 04-10-2007 11:44 PM

proxy caching(squid)
 
hello! please help
I'm having problems caching the inner links of an HTML page. I want to know, in detail, how to download the inner links of an HTML page level by level onto our server using the 'wget' command, and where to put those downloaded pages in our Squid proxy's cache locations.
Can anybody please help me out?
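For reference, the "level by level" download being asked about maps onto wget's recursive mode, which follows links on a page down to a fixed depth. A minimal sketch, assuming a placeholder URL and depth; the flags shown are wget's standard recursion options:

```shell
# Recursively fetch a page and the pages it links to, up to 8 levels deep.
# example.com and the depth are illustrative placeholders.
wget --recursive \
     --level=8 \
     --no-parent \
     --page-requisites \
     --convert-links \
     --directory-prefix=/var/www/mirror \
     http://example.com/index.html
```

Note that this only produces a directory of files under `--directory-prefix`; it does not populate Squid's cache, which stores objects in its own internal format as clients request them.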

acid_kewpie 04-11-2007 01:44 AM

what does this have to do with squid? are you after help with wget or squid?

nics 04-12-2007 11:27 AM

Quote:

Originally Posted by acid_kewpie
what does this have to do with squid? are you after help with wget or squid?

i want help caching more levels, up to 8 or 16 levels deep

acid_kewpie 04-12-2007 12:11 PM

more levels? squid doesn't do any prefetching by default... wanting to prefetch at all is a debatable tactic; you might see a 10 to 20% hit rate in return. on 16 levels? i'd be very surprised at more than a 0.1% hit ratio
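For context, Squid caches objects on demand as clients request them; its cache store is configured in squid.conf rather than populated by hand. An illustrative fragment (the path, size, and patterns are examples, not recommendations):

```
# cache store: 1024 MB under /var/spool/squid, 16 top-level dirs of 256 subdirs
cache_dir ufs /var/spool/squid 1024 16 256

# keep static assets fresh for at least a day, at most a week
refresh_pattern -i \.(gif|png|jpg|css|js)$ 1440 20% 10080

# default rule for everything else
refresh_pattern . 0 20% 4320
```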

nics 04-16-2007 05:44 AM

need the commands for downloading
 
Quote:

Originally Posted by acid_kewpie
more levels? squid doesn't do any prefetching by default... wanting to prefetch at all is a debatable tactic; you might see a 10 to 20% hit rate in return. on 16 levels? i'd be very surprised at more than a 0.1% hit ratio

hello, please help me by sending the commands for downloading the inner links of an HTML page... and please tell me where in Squid to place them...
I would really be grateful if you could help me

acid_kewpie 04-16-2007 05:58 AM

you're just repeating yourself, which is really not much use, as you've not been able to effectively explain what you want to do in the first place. the only thing i can imagine you mean is to follow the links of a given page to a depth of 8 to 16 levels. if an average page contains 10 links, you are saying you wish to download between 191739600 and 3216856876693200 pages for every single link you click...

and now i've actually bothered to do the math, i'll downgrade my 0.1% estimate, to about 0.0000000001%!
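The blow-up acid_kewpie is describing is geometric: with L links per page followed to depth D, a crawler fetches on the order of L + L² + ... + L^D pages. A quick sketch of that sum (L and D here are illustrative; the exact figures quoted above rest on their own assumptions):

```shell
# total pages fetched when each page links to L pages, followed to depth D
L=10   # links per page (illustrative)
D=8    # recursion depth
total=0
n=1
for i in $(seq 1 "$D"); do
  n=$((n * L))          # pages at this depth: L^i
  total=$((total + n))  # running sum L + L^2 + ... + L^i
done
echo "$total"   # 111111110 for L=10, D=8
```

Doubling the depth to 16 multiplies the total by roughly L^8, which is why the hit-ratio payoff collapses.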

nics 04-17-2007 11:07 PM

re
 
so what should I do? what do you suggest I do to make browsing faster? should I leave my proxy server as it is, or do I need to modify it?
please advise on this topic...

acid_kewpie 04-18-2007 02:04 AM

what should you do? you should explain your question clearly. we can't help you if we can't understand your problem.

cylent 05-23-2007 10:25 AM

Quote:

Originally Posted by acid_kewpie
you're just repeating yourself, which is really not much use, as you've not been able to effectively explain what you want to do in the first place. the only thing i can imagine you mean is to follow the links of a given page to a depth of 8 to 16 levels. if an average page contains 10 links, you are saying you wish to download between 191739600 and 3216856876693200 pages for every single link you click...

and now i've actually bothered to do the math, i'll downgrade my 0.1% estimate, to about 0.0000000001%!

LOL!!! that's funny that you actually did the math.

does this sound like a n00b to you?

hey, nics, :study:


All times are GMT -5. The time now is 11:59 PM.