I have no idea. Alex mentioned a couple of options in his reply on this
topic. And it wouldn't be difficult to script something in Perl that
reads an access log and re-fetches every object requested during the day.
Probably 5-10 minutes of work if you just want to test hit ratio (and
don't care about latency or throughput issues).
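A rough sketch of that log-replay idea (in Python rather than the Perl Joe
suggests; the proxy address and the Squid native log field positions are my
assumptions, not anything from this thread):

```python
import urllib.request

def parse_line(line):
    """Return (result_code, url) from a Squid native-format access.log line.
    Assumed fields: time elapsed client code/status bytes method url ..."""
    fields = line.split()
    code = fields[3].split("/")[0]   # e.g. TCP_MISS, TCP_HIT
    url = fields[6]
    return code, url

def hit_ratio(lines):
    """Fraction of logged requests whose result code counts as a cache hit."""
    hits = total = 0
    for line in lines:
        code, _ = parse_line(line)
        total += 1
        if "HIT" in code:            # TCP_HIT, TCP_MEM_HIT, TCP_IMS_HIT, ...
            hits += 1
    return hits / total if total else 0.0

def replay(urls, proxy="http://localhost:3128"):
    """Fetch every URL through the cache (proxy address is an assumption)."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy}))
    for url in urls:
        try:
            opener.open(url, timeout=10).read()
        except OSError:
            pass                     # dead origin servers are expected

# Two hypothetical log lines: one miss, then a hit on the same object.
sample = [
    "1018400000.000 23 10.0.0.1 TCP_MISS/200 4512 GET http://example.com/a -",
    "1018400001.000  2 10.0.0.1 TCP_HIT/200 4512 GET http://example.com/a -",
]
print(hit_ratio(sample))             # 0.5 for this two-line sample
```

Replaying the day's URLs with `replay()` and then running `hit_ratio()` over
the fresh log gives the crude hit-ratio number Joe is describing.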
That said, Polygraph can provide quite a realistic workload without
relying on the quirks of the real internet--as far as I can see,
PolyMix-4 realistically reproduces the workload a production cache
would see. Have you read the paper from the Hewlett-Packard folks
who first implemented the heap-based policies? They've already done a
pretty solid body of research into the impact of the various policies,
so how the policies perform is mostly a known quantity.
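For reference, the heap-based policies rank objects by a priority key: in
GDSF the key is roughly K = L + F * C / S (reference count F, retrieval cost
C, object size S, and an aging term L set from the last evicted key), while
LFUDA drops the size term. A minimal sketch of heap-based GDSF eviction (the
class and field names are my own illustration, not Squid's implementation):

```python
import heapq

def gdsf_key(L, freq, cost, size):
    # GDSF: frequency-weighted value per byte, inflated by the aging term L
    return L + freq * cost / size

class GDSFCache:
    """Illustrative heap-based GDSF replacement; not production code."""
    def __init__(self, capacity):
        self.capacity = capacity     # total bytes the cache may hold
        self.used = 0
        self.L = 0.0                 # aging ("inflation") factor
        self.entries = {}            # url -> [key, freq, size]
        self.heap = []               # (key, url) min-heap; may hold stale items

    def access(self, url, size, cost=1.0):
        """Record a request; return True on hit, False on miss."""
        if url in self.entries:
            e = self.entries[url]
            e[1] += 1
            e[0] = gdsf_key(self.L, e[1], cost, e[2])
            heapq.heappush(self.heap, (e[0], url))
            return True
        # Evict lowest-key objects until the new one fits.
        while self.used + size > self.capacity and self.heap:
            key, victim = heapq.heappop(self.heap)
            e = self.entries.get(victim)
            if e is None or e[0] != key:
                continue             # stale heap entry; skip it
            self.L = key             # age the cache up to the evicted key
            self.used -= e[2]
            del self.entries[victim]
        key = gdsf_key(self.L, 1, cost, size)
        self.entries[url] = [key, 1, size]
        heapq.heappush(self.heap, (key, url))
        self.used += size
        return False
```

Because small, frequently-referenced objects get large keys, GDSF tends to
maximize object hit ratio; LFUDA (size term removed) favors byte hit ratio.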
maer727@sohu.com wrote:
> Thanks, Joe pal!
>
> Your reply has clarified my doubts. I still have a question. If I want to use
> historical log files to test the performance of my cache, which do you think
> is the best tool for me to use? My aim is to test the LRU, GDSF and LFUDA
> algorithms (the hit rate of each).
>
> Best regards,
> George, Ma
--
Joe Cooper <joe@swelltech.com>
http://www.swelltech.com
Web Caching Appliances and Support

Received on Wed Apr 10 2002 - 00:11:17 MDT

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:15:00 MST