Re: [squid-users] Concurrent Connection Limit

From: <Rafael.Almeida@dont-contact.us>
Date: Tue, 12 Jul 2005 10:55:14 -0300

I would take a look at the file descriptor allocation in cachemgr.cgi and
find out what is going on with your file descriptors.
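For example, assuming squidclient is installed alongside Squid and that
cache.log lives in the usual place (adjust paths to your install), a
quick sketch of what to check:

    # Ask the cache manager for runtime info, including file
    # descriptor usage - the same data cachemgr.cgi reports:
    squidclient mgr:info | grep -i 'file desc'

    # Squid also warns in cache.log when descriptors run low:
    grep -i 'running out of filedescriptors' /var/log/squid/cache.log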

Rafael Sarres de Almeida
Seção de Gerenciamento de Rede
Superior Tribunal de Justiça
Tel: (61) 319-9342

Jeffrey Ng <jeffreyn@gmail.com>
12/07/2005 02:44
Please reply to
Jeffrey Ng <jeffreyn@gmail.com>

To
"Rafael.Almeida@stj.gov.br" <Rafael.Almeida@stj.gov.br>
cc
squid-users@squid-cache.org
Subject
Re: [squid-users] Concurrent Connection Limit

I recompiled Squid and re-installed, and cache.log showed the 2048 file
descriptors there, but z19 still didn't work quite right - it showed
about 1400 network connections.

So I rebuilt for 25088 file descriptors (way overkill, hehe).
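(For reference, the usual Squid 2.5-era recipe, per the Squid FAQ, is to
raise the shell's descriptor limit before running configure so the build
picks up the new maximum; the 8192 value and the prefix below are only
examples:)

    # As root, raise hard and soft fd limits in the build shell:
    ulimit -HSn 8192
    # Then configure/build so Squid detects the new maximum:
    ./configure --prefix=/usr/local/squid
    make && make install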

I recompiled again and it's live now. I don't know if it's perfect, but
(and I'm not kidding) it's showing 29322 open network sockets right now.

I'm baffled. Something doesn't jibe. And the CPU load spiked to 6.0 with
that many sockets open.

So I set ulimit to 999999 (1 million, basically) and rebuilt Squid again.

According to cache.log it locks at 32768 file descriptors; it won't load
the 999999, so it appears to be a hard limit indeed! Anyway, netstat -vatn
shows only 3352 open sockets right now, but z19 isn't responding well, and
within a few minutes I stopped it and went back to http mode...
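It may be worth checking where that 32768 cap comes from; these are
standard Linux knobs (exact values and behaviour vary by kernel and
shell):

    # Kernel-wide limit on open files:
    cat /proc/sys/fs/file-max

    # Per-process hard and soft limits in the current shell:
    ulimit -Hn
    ulimit -Sn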

What should I do next?

On 7/12/05, Rafael.Almeida@stj.gov.br <Rafael.Almeida@stj.gov.br> wrote:
> Have you checked your file descriptors? Each socket uses one file
> descriptor, and there is a limit of 1024 per process on Linux by
> default. You can increase it if you need to.
> Take a look here:
> http://www.onlamp.com/pub/a/onlamp/2004/03/25/squid.html?page=2
>
> Hope that helps.
>
> Rafael Sarres de Almeida
> Seção de Gerenciamento de Rede
> Superior Tribunal de Justiça
> Tel: (61) 319-9342
>
>
> Jeffrey Ng <jeffreyn@gmail.com>
> 11/07/2005 15:12
> Please reply to
> Jeffrey Ng <jeffreyn@gmail.com>
>
>
> To
> squid-users@squid-cache.org
> cc
>
> Subject
> Re: [squid-users] Concurrent Connection Limit
>
>
> Hello? Does anybody know what's wrong?
>
> On 7/10/05, Joshua Goodall <joshua@roughtrade.net> wrote:
> > On Sun, Jul 10, 2005 at 02:04:36PM +0800, Jeffrey Ng wrote:
> > > Hi, I have a problem with the Squid web accelerator on my site. My
> > > site is a photo-sharing site like Webshots. It has a pretty busy
> > > load, so I decided that Squid might be able to ease the load on my
> > > image server by caching some of the images. We have set everything
> > > up and it uses 1GB of RAM. It was fine at first, but suddenly all
> > > the images stopped loading after 6 hours. I checked netstat and
> > > found that there are 1000 connections from outside, and Squid stops
> > > responding whenever the connections hit that number. I am pretty
> > > sure that Squid has a concurrent connection limit of 1000. How
> > > could I increase that limit? Any help is appreciated. Thank you!
> >
> > Sounds like you're running out of file descriptors.
> > See http://www.squid-cache.org/Doc/FAQ/FAQ-11.html#ss11.4
> >
> > - Joshua.
> >
> > --
> > Joshua Goodall  "as modern as tomorrow afternoon"
> > joshua@roughtrade.net - FW109
> >
> >
>
>
>
Received on Tue Jul 12 2005 - 08:00:35 MDT

This archive was generated by hypermail pre-2.1.9 : Mon Aug 01 2005 - 12:00:02 MDT