I'd like to set up Linux to cache some commonly requested URLs so that it doesn't have to go out to the net to get them every time. This would be a system-wide URL cache, not just in a particular browser. For example, a program might request http://www.w3.org/TR/html4/sgml/loosedtd.html a few thousand times per day. There are many other URLs I'd like to cache as well.

Is there an app for that?

Travis J Webb
1 Answer

This kind of application is called a "caching proxy". Take a look at this Wikipedia entry for a quick overview of what a caching proxy is: http://en.wikipedia.org/wiki/Proxy_server#Caching You may also want to look at the squid-cache.org project: http://www.squid-cache.org/
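
As a rough sketch of what a Squid setup might look like, a minimal configuration along these lines caches frequently fetched pages for local clients. The port, cache path, and sizes below are illustrative assumptions; adjust them to your machine:

```
# /etc/squid/squid.conf -- minimal caching-proxy sketch (values are examples)

http_port 3128                       # port clients will point their proxy settings at

# On-disk cache: 1000 MB under /var/spool/squid with 16 first-level
# and 256 second-level subdirectories (Squid's usual layout).
cache_dir ufs /var/spool/squid 1000 16 256

cache_mem 256 MB                     # memory reserved for hot objects
maximum_object_size 10 MB            # don't cache anything larger than this

# Only allow the local machine (and optionally the LAN) to use the proxy.
acl localnet src 192.168.0.0/16
http_access allow localhost
http_access allow localnet
http_access deny all
```

After editing the config, restart Squid (for example `sudo systemctl restart squid` on most systemd distributions) and repeated requests for the same URL should be served from the local cache.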

m0ntassar
  • This is the opposite of what I want. This isn't for a server, it's for a client machine. – Travis J Webb Mar 30 '12 at 16:49
  • 1
    Keep reading. You set up Squid on a server and either configure clients to use it as their proxy server or just have your firewall silently redirect all web traffic through the proxy. This will result in an office-wide cache, not just a system-wide cache. On a Linux client, you can set the `http_proxy` environment variable so that command line clients like `curl` and `wget` will all use it. – Ladadadada Mar 30 '12 at 17:12
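
A quick sketch of that last suggestion, with the proxy host and port being assumptions (substitute wherever Squid is actually listening):

```
# Point command-line clients at the caching proxy.
export http_proxy="http://proxy.example.lan:3128"
export https_proxy="http://proxy.example.lan:3128"
export no_proxy="localhost,127.0.0.1"   # bypass the proxy for local addresses

# curl and wget both honour these variables, so repeated fetches of the
# same URL are answered from Squid's cache instead of the network:
curl -s http://www.w3.org/TR/html4/sgml/loosedtd.html > /dev/null
wget -q http://www.w3.org/TR/html4/sgml/loosedtd.html -O /dev/null
```

To make this apply system-wide rather than per shell, you could put the exports in a profile script such as `/etc/profile.d/proxy.sh` (path is an example) so every login shell picks them up.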