I am not sure if I can do this but here goes...
My setup is this: a mate and I each have our own network and internet connection, but the two networks are linked by wifi. The wifi cards sit in Linux routers, which also route onto the internet, and each runs Squid. However, as these are domestic internet connections, we cannot bond them into a single faster link.
My solution is to get Squid to distribute the load between the two connections. What I would like is:
- Local squid receives the request
- Check the local cache; if the object is there, return it
- Check the other squid's cache; if it's there, fetch it from the peer and return it
- Otherwise, use a weighted round-robin to either fetch the object directly or forward the request to the other squid (which would recognise that it came from its peer and so skip this step)
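For what it's worth, a first stab at the list above in squid.conf might look something like this. This is only a sketch: the addresses and ports are assumptions (10.0.0.1/10.0.0.2 for the two routers, HTTP on 3128, ICP on 3130), and the `random` ACL needs Squid 3.2 or later. Note that `random` is re-evaluated on every lookup, so it's safest to use it in only one directive.

```
# squid.conf on router A -- addresses/ports are assumptions, adjust to suit
# 10.0.0.2 = the mate's squid, HTTP on 3128, ICP on 3130

# Declare the other squid as a parent. The ICP port lets us detect hits
# in its cache (steps 2-3); proxy-only stops us re-caching objects it
# already holds.
cache_peer 10.0.0.2 parent 3128 3130 proxy-only

# Rough 50/50 split for cache misses (change the fraction to weight it).
# Matching requests are forced through the peer; the rest go direct.
acl coinflip random 1/2
never_direct allow coinflip
cache_peer_access 10.0.0.2 allow all

# On router B, the mirror-image config, plus: requests arriving from
# the peer always go direct, so they never bounce back (step 4).
# acl frommate src 10.0.0.1
# always_direct allow frommate
```

Squid's Via-header loop detection should also catch forwarding loops, but the `always_direct` rule on each side makes the "don't re-distribute a peer's request" behaviour explicit.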
The main use for this would be 'segmented downloads' (e.g. from GetRight/aria2), pulling each segment over a different link. It would also give fault tolerance (it's unlikely both ISPs will go down at once). I would also set up QoS so each of us keeps priority on our own link: I can max out mine and use the spare capacity on theirs without them noticing any serious slowdown, and vice versa.
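The QoS part could be a simple two-class HTB setup on each router, classifying traffic arriving from the peer into a lower-priority class. Everything here is an assumption to illustrate the idea: the interface name, the 8mbit uplink rate, and the peer address would all need adjusting.

```
#!/bin/sh
# Sketch only: eth0, the 8mbit uplink and the peer IP are assumptions.
# Local traffic gets priority; traffic sourced from the wifi peer can
# borrow up to the full rate but yields under contention.
tc qdisc add dev eth0 root handle 1: htb default 10
tc class add dev eth0 parent 1:  classid 1:1  htb rate 8mbit
tc class add dev eth0 parent 1:1 classid 1:10 htb rate 6mbit ceil 8mbit prio 0
tc class add dev eth0 parent 1:1 classid 1:20 htb rate 2mbit ceil 8mbit prio 1
# Put the peer's traffic (10.0.0.2, an assumption) into the low class.
tc filter add dev eth0 protocol ip parent 1: prio 1 u32 \
    match ip src 10.0.0.2/32 flowid 1:20
```

With `ceil 8mbit` on both classes, either side can still use the whole link when the other is idle, which is the behaviour described above.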
Any ideas? I could do something similar with redirectors, but those rewrite URLs, so I don't know whether objects would be cached correctly (under the original URL), and resuming downloads might also be a problem. I also want a way to exempt certain sites: ones that tie logins to the client IP, and anything else that breaks with this configuration.
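The per-site exemption at the end could be handled with a destination-domain ACL that pins those sites to the local link. The domain names below are placeholders, not a real list:

```
# squid.conf fragment: sites that tie logins to the client IP always go
# out over our own link, never via the peer (domains are illustrative).
acl ip_sensitive dstdomain .examplebank.com .picky-login-site.example
always_direct allow ip_sensitive
cache_peer_access 10.0.0.2 deny ip_sensitive
```

The `always_direct` rule takes precedence over any `never_direct` split, so these requests would bypass the round-robin entirely.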