Socket accept - “Too many open files”

情歌与酒  2020-11-28 02:48

I am working on a school project where I had to write a multi-threaded server, and now I am comparing it to Apache by running some tests against it. I am using autobench to …

13 Answers
  •  误落风尘
    2020-11-28 03:22

    I had a similar problem. A quick solution is:

    ulimit -n 4096
    

    The explanation is as follows: each server connection consumes a file descriptor. On CentOS, Red Hat, and Fedora (and probably other distributions), the per-user open-file limit defaults to 1024 (no idea why). You can check the current limit by typing: ulimit -n

    Note that this is largely unrelated to the system-wide maximum number of open files (/proc/sys/fs/file-max).
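
    If you would rather raise the limit from inside the program instead of with ulimit, here is a minimal sketch using the POSIX getrlimit/setrlimit calls (the 4096 value is just an example; an unprivileged process cannot go above its hard limit):

        #include <stdio.h>
        #include <sys/resource.h>

        int main(void) {
            struct rlimit rl;

            /* Read the current soft/hard limits on open file descriptors. */
            if (getrlimit(RLIMIT_NOFILE, &rl) != 0) {
                perror("getrlimit");
                return 1;
            }
            printf("soft limit: %llu, hard limit: %llu\n",
                   (unsigned long long)rl.rlim_cur,
                   (unsigned long long)rl.rlim_max);

            /* Raise the soft limit; 4096 is an arbitrary example value and an
               unprivileged process cannot exceed the hard limit (rlim_max). */
            rl.rlim_cur = (rl.rlim_max < 4096) ? rl.rlim_max : 4096;
            if (setrlimit(RLIMIT_NOFILE, &rl) != 0) {
                perror("setrlimit");
                return 1;
            }
            return 0;
        }

    This only raises the soft limit for the calling process; raising the hard limit itself still requires root, typically via /etc/security/limits.conf on these distributions.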

    In my case it was a problem with Redis, so I did:

    ulimit -n 4096
    redis-server -c xxxx
    

    In your case, instead of redis, you would start your own server.
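
    Since the original question is about accept() failing with "Too many open files", it can also help to handle that error explicitly instead of letting the server die. Below is a minimal sketch (the accept_loop/listen_fd names are just placeholders, not from the question) that backs off briefly on EMFILE/ENFILE:

        #include <errno.h>
        #include <stdio.h>
        #include <sys/socket.h>
        #include <unistd.h>

        /* listen_fd is assumed to be a socket already bound and in listen(). */
        void accept_loop(int listen_fd) {
            for (;;) {
                int client_fd = accept(listen_fd, NULL, NULL);
                if (client_fd < 0) {
                    if (errno == EMFILE || errno == ENFILE) {
                        /* Per-process (EMFILE) or system-wide (ENFILE) descriptor
                           limit hit: back off briefly instead of spinning. */
                        fprintf(stderr, "accept: too many open files, backing off\n");
                        usleep(100 * 1000);
                        continue;
                    }
                    perror("accept");
                    break;
                }
                /* ... normally hand client_fd to a worker thread here ... */
                close(client_fd); /* placeholder so the sketch stands alone */
            }
        }

    Backing off gives existing connections a chance to close and release descriptors, so the listener keeps running even while the limit is being hit.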
