2017-03-08 258 views

Python Pillow - ValueError: Decompressed Data Too Large

I am using the Pillow library to create thumbnails. I have created a lot of them, actually more than 10,000.

The program works fine, but after processing around 1,500 of them I get the following error:

Traceback (most recent call last):
  File "thumb.py", line 15, in <module>
    im = Image.open('/Users/Marcel/images/07032017/' + infile)
  File "/Users/Marcel/product-/PIL/Image.py", line 2339, in open
    im = _open_core(fp, filename, prefix)
  File "/Users/Marcel/product-/PIL/Image.py", line 2329, in _open_core
    im = factory(fp, filename)
  File "/Users/Marcel/product-/PIL/ImageFile.py", line 97, in __init__
    self._open()
  File "/Users/Marcel/product-/PIL/PngImagePlugin.py", line 538, in _open
    s = self.png.call(cid, pos, length)
  File "/Users/Marcel/product-/PIL/PngImagePlugin.py", line 136, in call
    return getattr(self, "chunk_" + cid.decode('ascii'))(pos, length)
  File "/Users/Marcel/product-/PIL/PngImagePlugin.py", line 319, in chunk_iCCP
    icc_profile = _safe_zlib_decompress(s[i+2:])
  File "/Users/Marcel/product-/PIL/PngImagePlugin.py", line 90, in _safe_zlib_decompress
    raise ValueError("Decompressed Data Too Large")
ValueError: Decompressed Data Too Large

My program is very simple:

import os, sys 
import PIL 
from PIL import Image 

size = 235, 210 
reviewedProductsList = open('products.txt', 'r') 
reviewedProducts = reviewedProductsList.readlines() 
t = map(lambda s: s.strip(), reviewedProducts) 

print "Thumbs to create: '%s'" % len(reviewedProducts) 

for infile in t:
    outfile = infile
    try:
        im = Image.open('/Users/Marcel/images/07032017/' + infile)
        im.thumbnail(size, Image.ANTIALIAS)
        print "thumb created"
        im.save('/Users/Marcel/product-/thumbs/' + outfile, "JPEG")
    except IOError, e:
        print "cannot create thumbnail for '%s'" % infile
        print "error: '%s'" % e

I am running this locally on my MacBook Pro.


I'm not quite sure what's causing this error, but you could always add a second error-handling block to at least log which files trigger it and carry on through your list, e.g. `except ValueError, e: print "cannot create thumbnail for '%s'" % infile; print "error: '%s'" % e` –
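The second error handler suggested in the comment above could be folded into the loop like this (a sketch in Python 3 syntax; the paths mirror the asker's, and `Image.LANCZOS` stands in for the deprecated `Image.ANTIALIAS`):

```python
from PIL import Image

SRC = '/Users/Marcel/images/07032017/'   # asker's source directory
DST = '/Users/Marcel/product-/thumbs/'   # asker's output directory
SIZE = (235, 210)

def make_thumb(infile):
    """Create one thumbnail; log and skip any file that fails."""
    try:
        im = Image.open(SRC + infile)
        im.thumbnail(SIZE, Image.LANCZOS)
        im.save(DST + infile, "JPEG")
        return True
    except IOError as e:        # unreadable or missing file
        print("cannot create thumbnail for %r, error: %s" % (infile, e))
    except ValueError as e:     # e.g. "Decompressed Data Too Large"
        print("cannot create thumbnail for %r, error: %s" % (infile, e))
    return False
```

With this shape the loop simply calls `make_thumb(infile)` for each name and keeps going, so one oversized image no longer aborts the whole run.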


I wonder whether some buffering might prevent this? It sounds like the program is running out of memory. Are you also sure it doesn't crash at the same point every time? – celestialroad


@celestialroad Interestingly, it works fine for the first 1,500 or so photos. I will now test the logging option from James Kent to find out more – Marcel

Answers


This is intended to protect servers running Pillow from a potential DoS attack via decompression bombs. The error occurs when a decompressed image turns out to have overly large metadata. See http://pillow.readthedocs.io/en/4.0.x/handbook/image-file-formats.html?highlight=decompression#png

Here is the CVE report: https://www.cvedetails.com/cve/CVE-2014-9601/

From a recently reported issue:

If you set ImageFile.LOAD_TRUNCATED_IMAGES to true, it will suppress the error (but still not read the large metadata). Alternately, you can change the values here: https://github.com/python-pillow/Pillow/blob/master/PIL/PngImagePlugin.py#L74

https://github.com/python-pillow/Pillow/issues/2445
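Putting both workarounds from the linked issue into code (a sketch; the 10 MiB cap is an arbitrary example value chosen here, not a Pillow recommendation):

```python
from PIL import ImageFile, PngImagePlugin

# Workaround 1: suppress the ValueError. The oversized iCCP profile is
# still not read, but Image.open() no longer raises.
ImageFile.LOAD_TRUNCATED_IMAGES = True

# Workaround 2: raise the caps that _safe_zlib_decompress() enforces.
# Pillow's defaults are ImageFile.SAFEBLOCK (1 MiB) per chunk and 64x
# that in total; the 10 MiB below is an arbitrary example value.
PngImagePlugin.MAX_TEXT_CHUNK = 10 * 1024 * 1024
PngImagePlugin.MAX_TEXT_MEMORY = 64 * PngImagePlugin.MAX_TEXT_CHUNK
```

Note that raising the caps trades away the decompression-bomb protection that CVE-2014-9601 introduced, so it is only appropriate when the input images are trusted.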


"ImageFile.LOAD_TRUNCATED_IMAGES to true" – I tried that and it does not work. – notalentgeek
