
I have a program that downloads files from a server; they are anywhere from 2 MB to 8 MB. It runs a loop and grabs however many files I request. The problem is that my internet out here, in the middle of the desert, is unreliable: most of the time everything works fine, but sometimes the connection drops in the middle of a urllib.request.urlretrieve call and freezes the program. I need a way for urllib to detect that the network has gone down and then retry the file until the connection comes back. Any help is appreciated! Here is what I am doing (Python 3, urllib, urlretrieve, retrying a failed download).

Example:

import os
import urllib.request
import urllib.error

for curimg, imgs in enumerate(imgsToGet, start=1):    # the download loop; loop variables inferred from the snippet
    try:
        numimgs = len(imgsToGet)

        path1 = "LEVEL II"                             # highest-level folder
        self.fn = imgs.split('/')[-1]                  # split out the file name
        path2 = self.fn[:4]                            # split out KICX (station ID)
        path3 = self.fn.split('_')[1]                  # split out the date
        savepath = os.path.join(path1, path2, path3)   # LEVEL II/RADAR/DATE path

        if not os.path.isdir(savepath):                # see if it exists
            os.makedirs(savepath)                      # if not, make it

        fileSavePath = os.path.join(path1, path2, path3, self.fn)

        if os.path.isfile(fileSavePath):               # check whether the file already exists
            self.time['text'] = self.fn + ' exists \n'
            continue

        # DOWNLOAD PROGRESS
        def reporthook(blocknum, blocksize, totalsize):
            percent = 0
            readsofar = blocknum * blocksize
            if totalsize > 0:
                percent = readsofar * 1e2 / totalsize
                if percent >= 100:
                    percent = 100

                s = "\r%5.1f%% %*d/%d" % (
                    percent, len(str(totalsize)), readsofar, totalsize)

                self.time['text'] = ('Downloading File: ' + str(curimg) + ' of '
                                     + str(numimgs) + ' ' + self.fn + s)

                if readsofar >= totalsize:             # near the end
                    self.time['text'] = "Saving File..."
            else:                                      # total size is unknown
                self.time['text'] = "read %d\n" % readsofar

            # UPDATE PROGRESS BAR
            self.pb.config(mode="determinate")
            if percent > 0:
                self.dl_p = round(percent, 0)
                self.pb['value'] = self.dl_p
                self.pb.update()
            if percent > 100:
                self.pb['value'] = 0
                self.pb.update()

        urllib.request.urlretrieve(imgs, fileSavePath, reporthook)

    except urllib.error.HTTPError as err:              # catch 404 Not Found and continue
        if err.code == 404:
            self.time['text'] = ' Not Found'
            continue

Cheers,

David

Answer


You can put your code in a try/except block inside a loop, with a retry counter. Here is what I did:

import time
from urllib.request import urlretrieve

remaining_download_tries = 15

while remaining_download_tries > 0:
    try:
        urlretrieve(CaseLawURL, File_Path_and_Name)    # attempt the download
        print("successfully downloaded: " + CaseLawURL)
        time.sleep(0.1)
    except:                                            # any failure: count the attempt and retry
        print("error downloading " + CaseLawURL + " on trial no: "
              + str(16 - remaining_download_tries))
        remaining_download_tries = remaining_download_tries - 1
        continue
    else:                                              # no exception: stop retrying
        break

I hope the code is self-explanatory. Regards
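
One note on this approach, since the original problem was the program freezing: urllib.request.urlretrieve takes no timeout argument, so if the connection drops mid-transfer the call can hang rather than raise. Below is a minimal sketch of how the retry counter could be combined with a default socket timeout; the function download_with_retry and the tries/wait/timeout values are illustrative and not part of the original answer.

import socket
import time
import urllib.request
import urllib.error

# Sketch only: retry a download a bounded number of times, and set a default
# socket timeout so a dropped connection raises an error instead of hanging.
# The function name and the tries/wait/timeout values are assumptions.
socket.setdefaulttimeout(30)              # seconds before a stalled transfer errors out

def download_with_retry(url, dest, reporthook=None, tries=15, wait=5):
    for attempt in range(1, tries + 1):
        try:
            urllib.request.urlretrieve(url, dest, reporthook)
            return True                                   # success
        except urllib.error.HTTPError:
            raise                                         # e.g. 404: retrying will not help
        except (urllib.error.URLError, urllib.error.ContentTooShortError,
                socket.timeout) as err:
            print("attempt %d of %d failed: %s" % (attempt, tries, err))
            time.sleep(wait)                              # give the connection time to come back
    return False                                          # all attempts failed

In the question's loop, the direct urlretrieve call could then be replaced with something like: if not download_with_retry(imgs, fileSavePath, reporthook): log the failure and continue to the next file.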