Re: [SQU] Interesting Question

From: Devin Teske <devinteske@dont-contact.us>
Date: Tue, 06 Mar 2001 23:04:52 -0800

I am thinking of going the wget route, because the other route sounds
like it is not for me. I don't want to turn off outside access for every
user. It would be more ideal if I could toggle offline_mode based on the
authentication credentials (since a teacher is obviously allowed to go
anywhere (s)he wants, while a student would be limited to offline
browsing of the cache).

The wget route sounds very nice. I would let the pages be 'gotten' with
wget and saved on the server; the students would not be allowed to view
anything but what is in the mirror where the pages are stored.
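
If I understand the suggestion, the fetch could be as simple as
something like the following (the URL and target directory are just
placeholders, and the exact flags would need testing):

    # mirror the site five links deep into the web server's docroot,
    # rewriting links so the copy can be browsed locally
    wget -r -l 5 -k -P /var/www/mirror http://www.example.com/lesson/

The students' browsers would then be pointed only at that directory.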

Thanks a lot for the help. Any further comments?
Thanks,
Devin Teske

>Thanks Joe,
>
>As an addendum:
>a) with squid, don't forget to run squid -z after clearing the cache
>(sketched below),
>b) you'll need to find some way to add new pages without turning
>offline_mode off during school hours.
>
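>That is, after clearing out the cache directory:
>
>    squid -z    # recreate the cache swap directories before restarting
>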
>for wget
>a) wget can rewrite absolute page references within the local site to be
>relative.
>b) to get a single page and all graphics, you'll want -r -l 1 (follow
>links one level deep; see the example after this list)
>c) a potential problem is dynamically generated URLs - wget won't run
>JavaScript or whatever, so those links won't be sucked down.
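>
>For example, to grab a single page plus everything one link away (the
>URL is just a placeholder):
>
>    wget -r -l 1 -k http://www.example.com/lesson.html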
>
>And as Joe said, both have problems, but should work fairly well with
>some setup effort/testing.
>Rob
>
>----- Original Message -----
>From: "Joe Cooper" <joe@swelltech.com>
>To: "Devin Teske" <devinteske@hotmail.com>
>Cc: <squid-users@ircache.net>
>Sent: Wednesday, March 07, 2001 4:18 AM
>Subject: Re: [SQU] Interesting Question
>
>
> > One of the methods I suggested was to pre-fill the cache, and then go
> > into offline mode.
> >
 > This /may/ require modification of the no_cache settings in order to
 > cache the results of CGI scripts and other things which are generally
 > not cacheable. Experience will have to tell you what to do there, as
 > I've never personally done anything like what you want.
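 >
 > For illustration, the default squid.conf ships with something along
 > these lines; commenting the no_cache line out lets query and cgi-bin
 > results be cached (check your own config, since defaults vary by
 > version):
 >
 >     acl QUERY urlpath_regex cgi-bin \?
 >     no_cache deny QUERY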
> >
 > In short, here are the steps to a pre-fill:
> >
> > Clear your cache (rm -rf /cachedir/*, or just format the partition).
> > Start Squid. Visit the pages that are needed for the class or
 > whatever.
> > Turn on offline_mode in the squid.conf file.
> > Restart Squid.
> > Browse those pages. offline_mode will prevent any other pages from
> > being visited.
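 >
 > As a shell sketch of the above (the cache path is illustrative):
 >
 >     squid -k shutdown
 >     rm -rf /cachedir/*
 >     squid -z                    # recreate the swap directories
 >     squid                       # start Squid and browse the pages
 >     # set "offline_mode on" in squid.conf, then restart:
 >     squid -k shutdown && squid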
> >
 > If all content is static, and none of the content has aggressive
 > expiry times, this will work fine, and it is probably the easiest
 > for your teachers to use. You could put up a small CGI script on
 > each system that, when called, puts the cache into offline mode, and
 > another to empty the cache and put it back into online mode. Then
 > the teachers could click a button to start filling and a button to
 > allow the offline browsing.
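 >
 > A minimal sketch of such a script (entirely hypothetical; the config
 > path, the sed edit, and whether the web server user has the needed
 > privileges are all assumptions to verify):
 >
 >     #!/bin/sh
 >     # offline-on.cgi: flip the cache into offline mode
 >     # (assumes squid.conf already contains an offline_mode line)
 >     echo "Content-type: text/plain"
 >     echo ""
 >     CONF=/etc/squid/squid.conf
 >     sed 's/^offline_mode .*/offline_mode on/' $CONF > $CONF.new
 >     mv $CONF.new $CONF
 >     squid -k reconfigure && echo "The cache is now in offline mode."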
> >
> > Next is Robert's suggestion for using wget to create a local mirror.
> > Also a good option, but also with some potential problems to be worked
> > around.
> >
> > With wget, you can do what is called a recursive web suck (option
> > -r)...by default this will suck down copies of every link down to 5
> > levels of links (so each link on each page will be pulled down into a
> > local directory). You can then browse this local directory (you could
> > even put it onto a local webserver if you needed it to be shareable
> > across the whole school). The potential problems include absolute
 > links
> > in the pages (http://www.yahoo.com/some/page...will jump out of your
> > local mirror...whereas some/page will not).
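 >
 > The link rewriting Robert mentioned helps here: with -k, wget
 > converts links to pages it actually downloaded into relative links
 > (links to pages outside the mirror stay absolute), e.g.:
 >
 >     wget -r -l 5 -k http://www.example.com/start.html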
> >
 > Note that both have their problems...but both will do what you want
 > with a little work. There is no magic button to push that limits the
 > internet in such a strict way. Because resources are so very
 > distributed, it is very hard to pick one 'page' and say let's only
 > allow this one page. That page possibly links to and pulls from a
 > hundred servers. Maybe not...but it could.
> >
> > Devin Teske wrote:
> >
> > >>>>> Forwarded to Squidusers
> > >>>>
> > >
> > >> Joe's pointer (squid cache retention times) & mine (wget from a
 > >> full
> > >> access account to make mirrors) will work.
> > >>
> > >> Squid ACLs, Squid redirectors, WILL NOT.
> > >
> > >
> > > Can you explain in more detail how I would implement either Joe's or
> > > Rob's scenario? How would they both work?
> > >
> > > Thanks,
> > > Devin Teske
> >
> >
> > --
> > Joe Cooper <joe@swelltech.com>
> > Affordable Web Caching Proxy Appliances
> > http://www.swelltech.com
> >

--
To unsubscribe, see http://www.squid-cache.org/mailing-lists.html
Received on Wed Mar 07 2001 - 00:08:24 MST
