
docker-compose connecting Celery

I'm composing Django, Celery, Postgres and RabbitMQ with docker-compose, using the following docker-compose.yml:

version: '2'

services:
  # PostgreSQL database
  db:
    image: postgres:9.4
    hostname: db
    environment:
      - POSTGRES_USER=<XXX>
      - POSTGRES_PASSWORD=<XXX>
      - POSTGRES_DB=<XXX>
    ports:
      - "5431:5432"

  rabbit:
    hostname: rabbit
    image: rabbitmq:3-management
    environment:
      - RABBITMQ_DEFAULT_USER=<XXX>
      - RABBITMQ_DEFAULT_PASS=<XXX>
    ports:
      - "5672:5672"
      - "15672:15672"

  # Django web server
  web:
    build:
      context: .
      dockerfile: Dockerfile
    hostname: web
    command: /srv/www/run_web.sh
    volumes:
      - .:/srv/www
    ports:
      - "8000:8000"
    links:
      - db
      - rabbit
    depends_on:
      - db

  # Celery worker
  worker:
    hostname: celery
    build:
      context: .
      dockerfile: Dockerfile
    command: /srv/www/run_celery.sh
    volumes:
      - .:/srv/www
    links:
      - db
      - rabbit
    depends_on:
      - rabbit

In the Django view I delegate a Celery task, which does some processing and then tries to post the result to another web service:

#views.py
from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt

from .tasks import delegate_celery_task  # task defined in tasks.py below


@csrf_exempt
def process_data(request):
    if request.method == 'POST':

        #
        # Processing to retrieve data here
        #

        delegate_celery_task.delay(data)
    return HttpResponse(status=200)

#tasks.py
# app is the project's Celery application instance
@app.task
def delegate_celery_task(in_data):
    from extractorService.settings import MASTER_NODE
    import json
    import urllib.request

    #
    # Some processing on in_data here to give out_data
    #

    data = {'data': out_data}
    params = json.dumps(data).encode('utf8')

    req = urllib.request.Request('http://%s/api/data/' % MASTER_NODE, data=params,
                                 headers={'content-type': 'application/json'})

    urllib.request.urlopen(req)

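The definition of MASTER_NODE is not shown in the question; a minimal sketch of what is assumed here (a plain host:port string, overridable via the environment) would be:

# extractorService/settings.py (assumed sketch; the actual definition is not in the question)
import os

# host:port of the external web service that receives the processed data
MASTER_NODE = os.environ.get('MASTER_NODE', 'localhost:8001')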
Now MASTER_NODE is just localhost:8001, where I am running the other web service. When I run everything outside Docker, the setup works. When starting it in Docker, though, the worker process gives:

worker_1 | [2016-11-28 12:20:17,527: WARNING/PoolWorker-2] unable to cache TLDs in file /usr/local/lib/python3.5/site-packages/tldextract/.tld_set: [Errno 13] Permission denied: '/usr/local/lib/python3.5/site-packages/tldextract/.tld_set'

and then, when I post to the Django view, the Celery worker starts the task, but the urlopen call gives an error:

worker_1 | Traceback (most recent call last):
worker_1 |   File "/usr/local/lib/python3.5/site-packages/celery/app/trace.py", line 368, in trace_task
worker_1 |     R = retval = fun(*args, **kwargs)
worker_1 |   File "/usr/local/lib/python3.5/site-packages/celery/app/trace.py", line 623, in protected_call
worker_1 |     return self.run(*args, **kwargs)
worker_1 |   File "/srv/extractor_django/extractorService/tasks.py", line 25, in extract_entities
worker_1 |     urllib.request.urlopen(req)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 162, in urlopen
worker_1 |     return opener.open(url, data, timeout)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 465, in open
worker_1 |     response = self._open(req, data)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 483, in _open
worker_1 |     '_open', req)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 443, in _call_chain
worker_1 |     result = func(*args)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 1268, in http_open
worker_1 |     return self.do_open(http.client.HTTPConnection, req)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 1242, in do_open
worker_1 |     raise URLError(err)
worker_1 | urllib.error.URLError:

The Celery configuration in settings.py is:

import os
from kombu import Exchange, Queue

RABBIT_HOSTNAME = os.environ.get('RABBIT_PORT_5672_TCP', 'rabbit')
if RABBIT_HOSTNAME.startswith('tcp://'):
    RABBIT_HOSTNAME = RABBIT_HOSTNAME.split('//')[1]

BROKER_URL = os.environ.get('BROKER_URL', '')
if not BROKER_URL:
    BROKER_URL = 'amqp://{user}:{password}@{hostname}'.format(
        user=os.environ.get('RABBIT_ENV_USER', '<XXX>'),
        password=os.environ.get('RABBIT_ENV_RABBITMQ_PASS', '<XXX>'),
        hostname=RABBIT_HOSTNAME)

BROKER_HEARTBEAT = '?heartbeat=30'
if not BROKER_URL.endswith(BROKER_HEARTBEAT):
    BROKER_URL += BROKER_HEARTBEAT

BROKER_POOL_LIMIT = 1
BROKER_CONNECTION_TIMEOUT = 10

CELERY_DEFAULT_QUEUE = 'default'
CELERY_QUEUES = (
    Queue('default', Exchange('default'), routing_key='default'),
)

CELERY_ALWAYS_EAGER = False 
CELERY_ACKS_LATE = True 
CELERY_TASK_PUBLISH_RETRY = True 
CELERY_DISABLE_RATE_LIMITS = False 

CELERY_IGNORE_RESULT = True 
CELERY_SEND_TASK_ERROR_EMAILS = False 
CELERY_TASK_RESULT_EXPIRES = 600 

CELERYD_HIJACK_ROOT_LOGGER = False 
CELERYD_PREFETCH_MULTIPLIER = 1 
CELERYD_MAX_TASKS_PER_CHILD = 1000 
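For context, old-style uppercase names like these are normally picked up by the project's Celery app module, which the question doesn't show. A minimal sketch, assuming the usual Celery 3.x / Django layout and module names:

# extractorService/celery.py (assumed sketch; not shown in the question)
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'extractorService.settings')

app = Celery('extractorService')
# Without a namespace argument, Celery 3.x reads the uppercase settings above
# (BROKER_URL, CELERY_DEFAULT_QUEUE, ...) straight from the Django settings module.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks()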

Does anyone have any idea how this can be fixed?


Where is your celery trying to post to after updating celery from v3.1 to v4? It should be making the request against your rabbit container, not localhost. – user3012759


It's trying to post to an external site running outside of Docker. In development that is my machine's localhost on port 8001, where the other web service runs, but in production it's a site running on AWS. – mohevi


What I'm more interested in is the backend configuration for the celery task you're creating? – user3012759

Answer


You didn't mention the Celery version, but from the posting date I can guess it is v4.

I just had a similar problem: according to this tutorial, you need to change BROKER_URL to CELERY_BROKER_URL in settings.py.
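A minimal sketch of that change, assuming the standard Celery 4 + Django setup where the app is configured from Django settings with a CELERY_ namespace:

# settings.py (Celery 4): the broker key needs the CELERY_ prefix so that
# config_from_object(..., namespace='CELERY') will pick it up.
CELERY_BROKER_URL = 'amqp://<XXX>:<XXX>@rabbit?heartbeat=30'

# celery.py: the namespace argument is what maps CELERY_BROKER_URL onto broker_url.
app.config_from_object('django.conf:settings', namespace='CELERY')

The other uppercase settings shown in the question have lowercase Celery 4 equivalents (broker_pool_limit, task_acks_late, worker_prefetch_multiplier, and so on); the old names still work, but old-style and new-style keys can't be mixed in one configuration.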