Quote:
Originally Posted by the_gripmaster
I have a number of machines on my home LAN running Fedora. Every time I update them, the machines download the same set of files from the yum/dnf repos. What kind of web caching solution can I use so that the files are cached preferably in-memory on the caching server?
I know I can use squid. I can also set up a local yum/dnf repo. But I am looking for a web caching solution.
-------------------------------
Isn't Squid a proxy server that can be configured to cache exactly this?
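If you go the Squid route, something along these lines should get the packages cached. This is only a rough, untested sketch; the LAN range, the cache sizes, and the hostname "cacheserver" are placeholders for whatever your setup actually uses.

Code:
# /etc/squid/squid.conf -- minimal caching sketch, tune sizes to your box
http_port 3128
cache_mem 2048 MB                              # hot objects kept in RAM
cache_dir aufs /var/spool/squid 20000 16 256   # 20 GB on disk for the rest
maximum_object_size 512 MB                     # the default 4 MB is too small for RPMs
# keep packages and repo metadata around longer than the defaults
refresh_pattern -i \.rpm$    1440 100% 10080
refresh_pattern /repodata/     60  20%  1440
# only the home LAN may use the proxy
acl homelan src 192.168.1.0/24
http_access allow homelan
http_access deny all

# /etc/dnf/dnf.conf on each client, under [main] -- send dnf through the proxy
proxy=http://cacheserver:3128

One thing to watch: Fedora's default metalink/mirrorlist setup can send each machine to a different mirror, so the same RPM arrives from different URLs and misses the cache. Pinning a single baseurl in the .repo files improves the hit rate a lot.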
Local repository: not a bad idea. What if you need a fix while your ISP connection is down and can't work without it? A local copy would still be there.
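If you do want the local repository, the reposync plugin plus createrepo_c does most of the work. Rough sketch only; /srv/repo and the repo ID are just examples, sync whatever repos you actually use.

Code:
# on the mirror box (reposync comes from dnf-plugins-core)
sudo dnf install dnf-plugins-core createrepo_c
# pull the updates repo, newest packages only, repo metadata included
sudo dnf reposync --repoid=updates --newest-only --download-metadata -p /srv/repo
# re-run from cron to stay current; createrepo_c is only needed if you
# skip --download-metadata and have to generate the metadata yourself:
#   createrepo_c /srv/repo/updates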
Your ISP/telco already caches the more popular web content anyway.
Or are you asking whether there is a free cloud service on the internet (Google-ish speed, size, and infrastructure) that would let you specify a URL to cache, so you avoid pulling downloads from Red Hat? Or a mirror site closer to you? I don't know of one.
Or maybe some kind of peer-to-peer file sharing / BitTorrent?
If not, I can't think of what else you want. You already have telco caching and Squid, and you said you have the ability to download the files and run your own mirror with Apache (rough sketch below).
(Wish I did. I'd need to go buy hardware, and time and money are short.)
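For the Apache mirror route, exporting the synced tree and pointing the clients at it is only a few lines. Again just a sketch; the paths, the hostname "cacheserver", and the repo ID are placeholders.

Code:
# /etc/httpd/conf.d/localrepo.conf -- serve /srv/repo over the LAN
Alias /repo /srv/repo
<Directory "/srv/repo">
    Options Indexes FollowSymLinks
    Require all granted
</Directory>

# /etc/yum.repos.d/local-updates.repo on each client
[local-updates]
name=Local Fedora updates mirror
baseurl=http://cacheserver/repo/updates
enabled=1
gpgcheck=1

On the clients you would also disable (or raise the cost of) the stock updates repo so they prefer the LAN copy.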