I'm trying to write a simple workload generator for a Tornado server. Here's a simplified version of it: how do I get Tornado to execute code concurrently?
    import json

    from tornado import gen, queues
    from tornado.httpclient import AsyncHTTPClient, HTTPRequest
    from tornado.ioloop import IOLoop


    class EventsLoader(object):
        generate_num_requests = 1000
        generate_concurrency = 32
        server_port = 8001

        def __init__(self, conf_file):
            self.parse_config(conf_file)
            self.client = AsyncHTTPClient()

        def generate(self):
            IOLoop.current().run_sync(self.generate_work)

        @gen.coroutine
        def generate_work(self):
            self.queue = queues.Queue()
            IOLoop.current().spawn_callback(self.fetch_requests)
            for i in range(self.generate_concurrency):
                yield self.generate_requests(i)
            print 'before join queue size: %s' % self.queue.qsize()
            yield self.queue.join()

        @gen.coroutine
        def generate_requests(self, i):
            load = self.generate_num_requests / self.generate_concurrency
            for j in range(load):
                request = self.generate_request(i * 1000 + j)
                self.queue.put(request)

        @gen.coroutine
        def fetch_requests(self):
            while True:
                try:
                    request = yield self.queue.get()
                    yield self.client.fetch(request)
                except Exception as e:
                    print 'failed fetching: %s: %s' % (request.body, e)
                finally:
                    print 'fetched: %s' % json.loads(request.body)['seq']
                    self.queue.task_done()

        def generate_request(self, seq):
            event = {
                'seq': seq,
                # ... more fields here ...
            }
            return HTTPRequest(
                'http://localhost:%s/events' % self.server_port,
                method='POST',
                body=json.dumps(event),
            )
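For reference, the producer/consumer structure of the class above can be reduced to a small, self-contained sketch. This is my own illustration, not code from the post, and it uses stdlib `asyncio` (whose `Queue` mirrors `tornado.queues.Queue`) so it runs without Tornado installed. Producers fill a queue with non-blocking puts, consumer coroutines drain it, and `queue.join()` blocks until every item has been marked done:

```python
import asyncio


async def produce(queue, worker_id, load):
    # non-blocking puts into an unbounded queue, like generate_requests above
    for j in range(load):
        await queue.put(worker_id * 1000 + j)


async def consume(queue, log):
    # each consumer processes items strictly one at a time
    while True:
        item = await queue.get()
        await asyncio.sleep(0.01)  # stand-in for an HTTP fetch
        log.append(item)
        queue.task_done()


async def run(num_consumers):
    queue = asyncio.Queue()
    log = []
    consumers = [asyncio.ensure_future(consume(queue, log))
                 for _ in range(num_consumers)]
    # 4 producers, 2 items each, mimicking the i * 1000 + j numbering
    await asyncio.gather(*(produce(queue, i, 2) for i in range(4)))
    await queue.join()  # block until every item is task_done()
    for c in consumers:
        c.cancel()
    return log


log = asyncio.get_event_loop().run_until_complete(run(num_consumers=4))
print(sorted(log))  # all 8 items were processed
```

Note that the number of *consumers* is a separate knob from the number of producers; with `num_consumers=1` the items are still all processed, just strictly one after another.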
What I see happening is that all the `fetched: xxxx` messages appear in sequential order, which should be absolutely impossible if the generators were really working concurrently. How do I make it run concurrently? There must be something huge I'm missing in my understanding of what the I/O loop is and what `@gen.coroutine` does. That is, no matter what I set `generate_concurrency` to, the performance doesn't change.
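One general point worth keeping in mind (my own sketch, not from the post): in a coroutine, `yield`ing (or `await`ing) futures one at a time in a loop waits for each to finish before starting the next, whereas yielding them as a group runs them concurrently. The stdlib `asyncio` demo below makes the difference visible by timing both variants; the Tornado `gen.coroutine` equivalent of the concurrent form is yielding a list of futures, e.g. `yield [self.generate_requests(i) for i in range(self.generate_concurrency)]`:

```python
import asyncio
import time


async def work():
    # stand-in for one asynchronous request
    await asyncio.sleep(0.05)


async def sequential(n):
    for _ in range(n):
        await work()  # each call finishes before the next starts


async def concurrent(n):
    await asyncio.gather(*(work() for _ in range(n)))  # all run together


loop = asyncio.get_event_loop()

t0 = time.perf_counter()
loop.run_until_complete(sequential(4))
seq = time.perf_counter() - t0

t0 = time.perf_counter()
loop.run_until_complete(concurrent(4))
conc = time.perf_counter() - t0

print('sequential: %.2fs, concurrent: %.2fs' % (seq, conc))
```

The sequential variant takes roughly `n` times the single-task latency, while the concurrent variant takes roughly one task's latency.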
Possible duplicate of [Async function call with tornado python](https://stackoverflow.com/questions/44139848/async-function-call-with-tornado-python) –