I have written a script to grab scan results from Qualys, to be run each week for metrics gathering, but the PyCurl request it makes sometimes hangs indefinitely when executed.

The first part of the script involves fetching a list of references to each scan that was run in the past week, for further processing.

The problem is that while this works fine some of the time, at other times the script hangs on the c.perform() line. This is manageable when running the script manually, since it can simply be re-run until it works. However, I am looking to run this as a scheduled task every week without any manual interaction.

Is there a foolproof way I can detect if a hang has occurred and resend the PyCurl request until it works?

I have tried setting the c.TIMEOUT and c.CONNECTTIMEOUT options, but these don't seem to be effective. Also, as no exception is thrown, simply wrapping it in a try-except block won't fly either.
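For illustration, here is the kind of retry wrapper I would expect to work if the timeout actually fired, since pycurl reports a timeout by raising pycurl.error; in my case c.perform() simply never returns, so the except branch never gets a chance to run (performWithRetry is just a made-up name for this sketch):

import pycurl

# Sketch only: re-run c.perform() until it succeeds or attempts run out.
# A real version would also reset the WRITEDATA buffer between attempts
# so partial responses don't accumulate.
def performWithRetry(c, max_attempts=5):
    for attempt in range(max_attempts):
        try:
            c.perform()
            return True
        except pycurl.error as exc:
            print("Attempt %d failed: %s" % (attempt + 1, exc))
    return False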

The function in question is below:

import datetime as DT
from io import BytesIO

import certifi
import pycurl

# Retrieve a list of all scans conducted in the past week
# Save this to refs_raw.txt
def getScanRefs(usr, pwd):

    print("getting scan references...")

    with open('refs_raw.txt', 'wb') as refsraw:
        today = DT.date.today()
        week_ago = today - DT.timedelta(days=7)
        strtoday = str(today)
        strweek_ago = str(week_ago)

        c = pycurl.Curl()

        c.setopt(c.URL, 'https://qualysapi.qualys.eu/api/2.0/fo/scan/?action=list&launched_after_datetime=' + strweek_ago + '&launched_before_datetime=' + strtoday)
        c.setopt(c.HTTPHEADER, ['X-Requested-With: pycurl', 'Content-Type: text/xml'])
        c.setopt(c.USERPWD, usr + ':' + pwd)
        c.setopt(c.POST, 1)
        c.setopt(c.PROXY, 'companyproxy.net:8080')
        c.setopt(c.CAINFO, certifi.where())
        c.setopt(c.SSL_VERIFYPEER, 0)
        c.setopt(c.SSL_VERIFYHOST, 0)
        c.setopt(c.CONNECTTIMEOUT, 3)
        c.setopt(c.TIMEOUT, 3)

        refsbuffer = BytesIO()
        c.setopt(c.WRITEDATA, refsbuffer)
        c.perform()

        body = refsbuffer.getvalue()
        refsraw.write(body)
        c.close()

    print("Got em!")

I realise now that when naming variables I have used a horrible mix of camelCase, under_scores, and nothingatall. Please don't judge me too harshly.

Answer


I fixed the issue myself by using multiprocessing to launch the API call in a separate process, killing and relaunching that process if it runs for longer than 5 seconds. It's not pretty, but it is cross-platform. For those looking for a solution that is more elegant but works only on *nix, look into the signal library, specifically SIGALRM (sketched after my code below).

The code:

import datetime as DT
import multiprocessing
import time
from io import BytesIO

import certifi
import pycurl

# As this request for scan references sometimes hangs, it is run in a separate process here
# The process is terminated and relaunched if no response is received within 5 seconds
def performRequest(usr, pwd):
    today = DT.date.today()
    week_ago = today - DT.timedelta(days=7)
    strtoday = str(today)
    strweek_ago = str(week_ago)

    c = pycurl.Curl()

    c.setopt(c.URL, 'https://qualysapi.qualys.eu/api/2.0/fo/scan/?action=list&launched_after_datetime=' + strweek_ago + '&launched_before_datetime=' + strtoday)
    c.setopt(c.HTTPHEADER, ['X-Requested-With: pycurl', 'Content-Type: text/xml'])
    c.setopt(c.USERPWD, usr + ':' + pwd)
    c.setopt(c.POST, 1)
    c.setopt(c.PROXY, 'companyproxy.net:8080')
    c.setopt(c.CAINFO, certifi.where())
    c.setopt(c.SSL_VERIFYPEER, 0)
    c.setopt(c.SSL_VERIFYHOST, 0)

    refsBuffer = BytesIO()
    c.setopt(c.WRITEDATA, refsBuffer)
    c.perform()
    c.close()
    body = refsBuffer.getvalue()
    with open('refs_raw.txt', 'wb') as refsraw:
        refsraw.write(body)

# Retrieve a list of all scans conducted in the past week
# Save this to refs_raw.txt
def getScanRefs(usr, pwd):

    print("Getting scan references...")

    # Occasionally the request hangs indefinitely. Launch it in a separate
    # process and retry if no response arrives within 5 seconds.
    success = False
    while not success:
        sendRequest = multiprocessing.Process(target=performRequest, args=(usr, pwd))
        sendRequest.start()

        for _ in range(5):
            print("...")
            time.sleep(1)

        if sendRequest.is_alive():
            print("Maximum allocated time reached... Resending request")
            sendRequest.terminate()
            del sendRequest
        else:
            success = True

    print("Got em!")