Why does my Perl TCP server script hang with many TCP connections?

This probably isn't the solution to your problem, but it may prevent one you'll hit in the future: don't forget to close() the sockets when you're done with them. shutdown() disconnects the stream, but the file descriptor stays allocated until close() is called. Since you said strace shows processes stuck in read(), the immediate problem seems to be that the client isn't sending the data the server expects.
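A minimal sketch of the shutdown-then-close cleanup described above, assuming a forking server that hands each child an accepted socket; the handle name $client and sub name finish_client are illustrative, not from the original server:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Tear down a finished client connection. shutdown() ends the TCP stream
# in both directions, but only close() releases the file descriptor back
# to the process -- skipping it leaks one descriptor per connection.
sub finish_client {
    my ($client) = @_;
    shutdown($client, 2);    # 2 = SHUT_RDWR: stop both reading and writing
    close($client) or warn "close failed: $!";
}
```

With a 1024-descriptor ulimit, a server that only calls shutdown() will run out of descriptors after roughly a thousand connections, which is one way long-running forked workers end up wedged.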

You should either fix your client or add an alarm() to your server processes so that they can survive dead clients.
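The alarm() guard can be sketched with the standard eval/die pattern from perldoc; the 5-second timeout, handle name $client, and sub name read_with_timeout are illustrative assumptions:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Read one line from the client, giving up after $timeout seconds so a
# dead or stalled client cannot pin the worker process in read() forever.
sub read_with_timeout {
    my ($client, $timeout) = @_;
    my $line;
    eval {
        local $SIG{ALRM} = sub { die "read timeout\n" };
        alarm($timeout);     # SIGALRM fires if the read blocks too long
        $line = <$client>;   # blocking read on the client socket
        alarm(0);            # cancel the pending alarm on success
    };
    if ($@) {
        die $@ unless $@ eq "read timeout\n";  # propagate unexpected errors
        return undef;        # timed out: let the caller drop this client
    }
    return $line;
}
```

On timeout the worker gets control back and can close the connection (or exit and be respawned) instead of blocking indefinitely.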

If so, you may not have your system's max-open-files limit set high enough.

The open-files ulimit is 1024. The server never has more than ~100 dead (TIME_WAIT) connections and never more than 10 live connections (1 per forked process). When the blocking starts, it affects about ¼ of connections: 3 go through, then one blocks for ~5 seconds until the timeout protection kicks in and respawns the process.

– viraptor Apr 20 '10 at 0:17.
