Has anyone given any thought to the problem of multiple download sites?
For example, downloads of Microsoft's Internet Explorer can be handled by
any of a gazillion web sites, like msvaus.www.connxion.com or
mskuys.www.connxion.com. Squid treats these as different objects, thereby
increasing the bandwidth demands if two users download them from different
sites. In IE4, this could be tweaked by setting up a redirector for the
file IE5SITES.DAT, and replacing it with a stripped down copy to make
everyone use the same download site. However, with IE5 Microsoft turned to
redirects from their main download site, busting this workaround. Of
course, rewriting the URL from a redirector is still possible, but that
sort of defeats the purpose of allowing the user to pick a "good" download
site.
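For readers who haven't used the old workaround: a Squid redirector is just an external program that reads request lines on stdin ("URL client/fqdn ident method") and writes the URL to fetch on stdout. Here's a minimal sketch in Python (Squid itself is C, and the mirror hostname pattern below is made up for illustration; the real site list came from IE5SITES.DAT):

```python
import re
import sys

# Hypothetical pattern matching the family of mirror hostnames;
# collapse them all onto one canonical mirror.
MIRROR_RE = re.compile(r"^http://ms[a-z]+\.www\.connxion\.com/")
CANONICAL = "http://msvaus.www.connxion.com/"

def rewrite(request_line):
    """Take one redirector input line ("URL client/fqdn ident method")
    and return the URL Squid should fetch instead."""
    url = request_line.split()[0]
    return MIRROR_RE.sub(CANONICAL, url)

if __name__ == "__main__":
    # Redirector protocol: one line in, one line out, flushed immediately.
    for line in sys.stdin:
        sys.stdout.write(rewrite(line) + "\n")
        sys.stdout.flush()
```

URLs that don't match the pattern pass through untouched, so only the mirrored downloads are affected.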
I've considered adding a feature to Squid that uses regexes to normalize
the URL before computing its MD5 cache hash for such sites, so that once a
copy from msvaus has been cached and a request is then made for the
identical URI at mskuys, it would still get the cached copy.
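The key point of that feature is that only the cache key is normalized, not the URL actually fetched, so the user's chosen download site is still honored on a miss. A rough sketch of the idea (again Python rather than Squid's C, with an invented mirror pattern and canonical hostname):

```python
import hashlib
import re

# Made-up rule: map every mirror of the form msXXXX.www.connxion.com
# onto one canonical hostname before hashing, so all the mirrors
# share a single cache key.
MIRROR_RE = re.compile(r"^http://ms[a-z]+\.www\.connxion\.com/")
CANONICAL = "http://msdownload.www.connxion.com/"

def cache_key(url):
    # Normalize first, then hash. The object would still be fetched
    # from (and served as) whatever URL the client requested.
    normalized = MIRROR_RE.sub(CANONICAL, url)
    return hashlib.md5(normalized.encode()).hexdigest()
```

Two requests for the same path on different mirrors then yield identical keys, so the second one hits the copy cached by the first.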
Thoughts? Am I once again overlooking features in post-2.2 Squids? :-)
Cheers,
-- Bert
Bert Driehuis, MIS -- bert_driehuis@nl.compuware.com -- +31-20-3116119
Every nonzero finite dimensional inner product space has an
orthonormal basis. It makes sense, when you don't think about it.
Received on Sun Apr 09 2000 - 17:12:28 MDT