Re: [squid-users] Blocking Certain URLS

From: William Carty <admin@dont-contact.us>
Date: Wed, 6 Feb 2002 23:04:26 -0500

Doh. The actual page I was having problems with was something like
www.real-hot-porn.com. Somehow I had managed to put "-porn" in the
exclusion file... well, glad to see that the exclusion list is working
:-)
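
In case anyone else trips over this: the exclusion entries are regexes
too, so a stray line like

    -porn

matches any URL containing that substring, which is why
www.real-hot-porn.com was never being blocked in the first place.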

Hey, thanks for the reply. Appreciate it!

----- Original Message -----
From: "Colin Campbell" <sgcccdc@citec.qld.gov.au>
To: "William Carty" <admin@thinktankdecoy.com>
Cc: <squid-users@squid-cache.org>
Sent: Wednesday, February 06, 2002 9:10 PM
Subject: Re: [squid-users] Blocking Certain URLS

> Hi,
>
> On Wed, 6 Feb 2002, William Carty wrote:
>
> > The only problem I have is blocking URLs that contain dashes. For
> > example:
> >
> > www.screw-for-sex.com
> >
> > In the wordlist file, I have 'screw' & 'sex'. Had the URL been:
> >
> > www.screwforsex.com - it would have gotten blocked.
> >
> > I've tried adding a few URLs with dashes directly to the wordlist
> > file like this:
> >
> > screw-for-sex.\com
>
> Typo? You want "screw-for-sex\.com" methinks.
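>
> In a regex an unescaped "." matches any single character, so the
> backslash belongs in front of the dot:
>
> screw-for-sex\.com
>
> Note the mis-placed version still happens to match, since "." matches
> a literal dot too, so the typo alone wouldn't explain the miss.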
>
> > That doesn't work either.
>
> Are you sure the regex isn't matching fine but the request is then
> getting passed by "notporn"? Try adding 28,9 to debug_options.
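>
> Something like this in squid.conf will do it (section 28 covers the
> ACL code; ALL,1 keeps everything else quiet):
>
> debug_options ALL,1 28,9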
>
> > Does anyone know how I could input URLs with dashes into this text
> > file so that they get blocked, too?
>
> I don't believe it's necessary. I just ran a small test.
>
> acl baddies url_regex -i "/tmp/baddies"
> http_access deny baddies
>
> /tmp/baddies contained two lines:
>
> screw
> sex
>
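> The request itself doesn't need a browser; squidclient, which ships
> with Squid, works too (localhost and port 3128 are just the defaults
> for my install):
>
> squidclient -h localhost -p 3128 http://wwww.nuts-bolts-and-screws.com/
>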
> Here's my cache.log output:
>
> | 2002/02/07 11:58:35| aclCheckFast: list: 0x81c7120
> | 2002/02/07 11:58:35| aclMatchAclList: checking all
> | 2002/02/07 11:58:35| aclMatchAcl: checking 'acl all src 0.0.0.0/0.0.0.0'
> | 2002/02/07 11:58:35| aclMatchIp: '127.0.0.1' found
> | 2002/02/07 11:58:35| aclMatchAclList: returning 1
> | 2002/02/07 11:58:35| aclCheck: checking 'http_access deny baddies'
> | 2002/02/07 11:58:35| aclMatchAclList: checking baddies
> | 2002/02/07 11:58:35| aclMatchAcl: checking 'acl baddies url_regex -i "/tmp/baddies"'
> | 2002/02/07 11:58:35| aclMatchRegex: checking 'http://wwww.nuts-bolts-and-screws.com/'
> | 2002/02/07 11:58:35| aclMatchRegex: looking for 'screw'
> | 2002/02/07 11:58:35| aclMatchAclList: returning 1
> | 2002/02/07 11:58:35| aclCheck: match found, returning 0
> | 2002/02/07 11:58:35| aclCheckCallback: answer=0
>
> and access was denied as the browser says:
>
> | ERROR
> |
> | The requested URL could not be retrieved
> |
> |
> |
> | While trying to retrieve the URL: http://wwww.nuts-bolts-and-screws.com/
> |
> | The following error was encountered:
> |
> | Access Denied.
> |
> | Access control configuration prevents your request from being allowed
> | at this time. Please contact your service provider if you feel this
> | is incorrect.
>
> Of course I have now blocked access to any site with the letters
> "s-c-r-e-w" anywhere in the URL. I guess that's why you have the
> "excluded" list. :-)
>
> Colin
> --
> Colin Campbell
> Unix Support/Postmaster/Hostmaster
> CITEC
> +61 7 3006 4710
>