python - Too many open files with multiprocessing.Pool
I have a Python script that runs a multiprocessing.Pool to process a lot of files separately. I have a CPU limit of 8. The problem is that after running for a while I always get "IOError: [Errno 24] Too many open files". Each child process opens a few files for reading with file.open(). These file handlers are then passed to multiple functions to retrieve data. At the end of each child process these files are closed with file.close(). I tried the with statement as well, but it did not fix the issue. Does anyone have any idea what's wrong? I googled around but failed to find any answers. I am closing the files, and the functions are returning properly, so what keeps the file handlers around?
My setup is Mac OS X 10.5 with Python 2.6.
Thanks,
Ogan
from custom import func1, func2  # func1 and func2 seek, read and return values from a file
                                 # however, they do not close the file
import multiprocessing

def worker(*args):
    f1 = open("db1.txt")
    f2 = open("db2.txt")
    for each in args[1]:
        # many stuff
        x = func1(f1)
        y = func2(f2)
    f1.close()
    f2.close()
    return

data = {1: [2], 2: [3]}
jobP = multiprocessing.Pool(8)
jobP.map_async(worker, data.items())
jobP.close()
jobP.join()
You're being limited by the operating system's open-file limit. See "How do I change the number of open files limit in Linux?" for more information. I prefer to change the settings in /etc/security/limits.conf.
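On POSIX systems you can also inspect and raise the soft limit from within the Python process itself via the standard-library `resource` module; a minimal sketch (the target of 4096 is an arbitrary example value, and an unprivileged process can only raise its soft limit up to the hard limit):

```python
import resource

# Query the current per-process open-file limits.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("soft limit:", soft, "hard limit:", hard)

# Pick a new soft limit, capped at the hard limit where one is set.
new_soft = 4096 if hard == resource.RLIM_INFINITY else min(4096, hard)

# Raise (or lower) the soft limit; the hard limit is left unchanged.
resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))
```

This only affects the current process and its children; a system-wide change still goes through /etc/security/limits.conf or `ulimit -n` in the shell.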