Web caching solutions?
I have a number of machines on my home LAN running Fedora. Every time I update them, each machine downloads the same set of files from the yum/dnf repos. What kind of web caching solution can I use so that the files are cached, preferably in-memory, on the caching server?
I know I can use squid, and I can also set up a local yum/dnf repo, but I am specifically looking for a web caching solution. |
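For what it's worth, the squid route the question mentions can be sketched roughly like this. These are real squid and dnf directives, but the sizes, paths, and the host name `cacheserver.lan` are illustrative assumptions, not a tested setup:

```
# /etc/squid/squid.conf on the caching server -- illustrative values
http_port 3128
cache_mem 2048 MB                     # favor in-memory caching
maximum_object_size_in_memory 512 MB  # allow large RPMs to stay in RAM
maximum_object_size 1 GB
cache_dir ufs /var/spool/squid 10000 16 256   # disk spillover
# RPM files are immutable once published, so cache them for a long time
refresh_pattern -i \.rpm$ 129600 100% 129600 refresh-ims

# /etc/dnf/dnf.conf on each client -- point dnf at the proxy
# proxy=http://cacheserver.lan:3128
```

One caveat: Fedora's default metalink/mirrorlist setup can send each request to a different mirror, so for consistent cache hits the clients may also need a fixed `baseurl` in their `.repo` files.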
I don't know Fedora, but I would guess there is a package-cache solution. You will probably have to set up a package cache master on your LAN, then point all your other hosts to update from that master.
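The "package cache master" idea above can be sketched with dnf's reposync plugin (from dnf-plugins-core). The host name `master.lan`, the repo id, and the paths are assumptions for illustration:

```shell
# On the cache master: mirror the updates repo into a web root
sudo dnf install dnf-plugins-core        # provides the reposync plugin
sudo dnf reposync --repoid=updates \
    --download-metadata --newest-only \
    -p /var/www/html/mirror
# then serve /var/www/html with any web server (e.g. httpd)

# On each client, /etc/yum.repos.d/local-updates.repo:
# [local-updates]
# name=Local updates mirror
# baseurl=http://master.lan/mirror/updates/
# enabled=1
# gpgcheck=1
```

Note this is a full local mirror rather than an on-demand cache, so it downloads everything up front rather than only the packages your machines actually request.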
|
Quote:
How is a web caching solution relevant to yum updates? https://www.varnish-cache.org/ |
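Varnish does fit the in-memory part of the question, since its default malloc storage keeps cached objects in RAM. A minimal VCL sketch, where the backend host and the 30-day TTL are assumptions:

```
# /etc/varnish/default.vcl -- minimal sketch, not a tested config
vcl 4.0;

backend mirror {
    .host = "download.fedoraproject.org";
    .port = "80";
}

sub vcl_backend_response {
    # RPMs are immutable once published; cache them for a long time
    if (bereq.url ~ "\.rpm$") {
        set beresp.ttl = 30d;
    }
}
```

Since Varnish is a reverse proxy rather than a forward proxy, the clients would point their repo `baseurl` at the Varnish host instead of setting a proxy in dnf.conf.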
Quote:
Isn't squid a proxy that can be configured to cache? A local repository is not a bad idea; but what if you need a fix when your ISP's connection is down and you can't work without it? Your ISP/telco does cache the more popular web hits. :) Are you asking whether there is a free cloud server on the internet (Google-ish speed, size, and infrastructure) that would let you specify a URL for caching, to avoid download pulls on Red Hat? Or a mirror site closer to you? I don't know. Or maybe file sharing / BitTorrent? If not, I can't think of what you want. You already have telco caching and squid, and you said you have the ability to download files and run your own mirror using Apache. (Wish I did; I'd need to go buy hardware, and time and money are short.) |