Python glob.glob(dir) MemoryError

I'm running into memory problems when searching a folder that contains millions of files. Does anyone know how to work around this? Is there a way to limit how many files glob will search, so the search can be done in chunks?
Traceback (most recent call last):
File "./lb2_lmanager", line 533, in <module>
main(sys.argv[1:])
File "./lb2_lmanager", line 318, in main
matched = match_files(policy.directory, policy.file_patterns)
File "./lb2_lmanager", line 32, in wrapper
res = func(*args, **kwargs)
File "./lb2_lmanager", line 380, in match_files
listing = glob.glob(directory)
File "/usr/lib/python2.6/glob.py", line 16, in glob
return list(iglob(pathname))
File "/usr/lib/python2.6/glob.py", line 43, in iglob
yield os.path.join(dirname, name)
File "/usr/lib/python2.6/posixpath.py", line 70, in join
path += '/' + b
MemoryError
Have you tried iglob instead of glob? – LexyStardust
Please give a code example! –
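A sketch of the suggestion above: `glob.iglob` returns a lazy iterator instead of building the whole list in memory at once (the traceback shows the failure happens inside `list(iglob(pathname))`), and `itertools.islice` can slice that iterator into fixed-size chunks. The function name `iter_in_chunks` and the chunk size are illustrative, not from the original post:

```python
import glob
import itertools

def iter_in_chunks(pattern, chunk_size=1000):
    """Yield lists of at most chunk_size paths matching pattern.

    glob.iglob produces matches lazily, one at a time, so the full
    result set is never held in memory; only the current chunk is.
    """
    it = glob.iglob(pattern)
    while True:
        # islice pulls at most chunk_size items from the iterator.
        chunk = list(itertools.islice(it, chunk_size))
        if not chunk:
            break
        yield chunk

# Usage: process matches 1000 at a time instead of all at once.
# for chunk in iter_in_chunks('/some/dir/*'):
#     process(chunk)
```

Note that `iglob` still lists each individual directory internally, so a single directory with millions of entries can still be expensive; on Python 3.5+, `os.scandir` offers a fully streaming alternative for that case.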