Too many open files in Python

Backend · unresolved · 3 answers · 1276 views

离开以前 asked 2020-12-05 23:52

I wrote kind of a test suite which is heavily file-intensive. After some time (about 2 hours) I get an IOError: [Errno 24] Too many open files: '/tmp/tmpxsqYPm'. I doubl…

3 Answers
  • 2020-12-06 00:16

    resource.RLIMIT_NOFILE is indeed 7, but that value is just the constant identifying the resource, not the limit itself. resource.getrlimit(resource.RLIMIT_NOFILE) returns the actual (soft, hard) limit pair, and the soft limit is what your top range() should use.

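    A minimal sketch of the distinction, using only the standard resource module:

    ```python
    import resource

    # RLIMIT_NOFILE is a selector constant (a small integer), not a limit.
    print(resource.RLIMIT_NOFILE)

    # getrlimit() returns the real limits as a (soft, hard) pair.
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    print(soft, hard)
    ```

    The soft limit is what the kernel enforces against the process, so that is the bound to iterate descriptors up to.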
  • 2020-12-06 00:20

    Your test script rebinds f on each iteration, so in CPython the previous file object is reclaimed by reference counting and its descriptor is closed each time. Be aware, though, that both logging to files and subprocess with pipes also consume descriptors, which can contribute to the exhaustion.

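    To make the closing deterministic rather than relying on garbage collection, a with block is the usual fix. A minimal sketch using a temporary file (the loop count is arbitrary):

    ```python
    import tempfile

    for _ in range(100):
        # The descriptor is released when the `with` block exits,
        # so the number of open files never grows with the loop.
        with tempfile.NamedTemporaryFile() as f:
            f.write(b"some test data")
            f.flush()
        assert f.closed  # closed deterministically, not by the GC
    ```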
  • 2020-12-06 00:25

    The corrected code is:

    import fcntl
    import os
    import resource

    def get_open_fds():
        """Return a list of file descriptors currently open in this process."""
        fds = []
        soft, _hard = resource.getrlimit(resource.RLIMIT_NOFILE)
        for fd in range(soft):
            try:
                # F_GETFD succeeds only if fd refers to an open descriptor.
                fcntl.fcntl(fd, fcntl.F_GETFD)
            except OSError:
                continue
            fds.append(fd)
        return fds

    def get_file_names_from_file_number(fds):
        """Map descriptors to the paths they point at (Linux-only: uses /proc)."""
        names = []
        for fd in fds:
            names.append(os.readlink('/proc/self/fd/%d' % fd))
        return names

    fds = get_open_fds()
    print(get_file_names_from_file_number(fds))
    
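    If the descriptors turn out to be legitimately needed rather than leaked, the soft limit can also be raised toward the hard limit with resource.setrlimit. A sketch under the assumption of an unprivileged Unix process (raising the hard limit itself requires root, and some platforms cap the value):

    ```python
    import resource

    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    # An unprivileged process may only raise the soft limit,
    # and only up to the current hard limit.
    if soft < hard:
        resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
    ```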