2017-04-03

I have Celery tasks running in my application. I set everything up in development without any trouble, and it works perfectly with Redis as the broker. Yesterday I moved the code to my server and set up Redis there, but Celery cannot discover the tasks. The code is identical. Celery beat does not find the tasks.

celery_conf.py (originally celery.py):

# coding: utf-8 
from __future__ import absolute_import, unicode_literals 

import os 
from celery import Celery 
from django.conf import settings 


# set the default Django settings module for the 'celery' program. 
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'vertNews.settings') 
app = Celery('vertNews') 

app.config_from_object('django.conf:settings', namespace='CELERY') 
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS) 


@app.task(bind=True) 
def debug_task(self): 
    print('Request: {0!r}'.format(self.request)) 

Celery configuration in settings:

# Celery Configuration 

CELERY_TASK_ALWAYS_EAGER = False 
CELERY_BROKER_URL = SECRETS['celery']['broker_url'] 
CELERY_RESULT_BACKEND = SECRETS['celery']['result_backend'] 
CELERY_ACCEPT_CONTENT = ['application/json'] 
CELERY_TASK_SERIALIZER = 'json' 
CELERY_RESULT_SERIALIZER = 'json' 
CELERY_TIMEZONE = TIME_ZONE 

__init__.py of the root application:

# coding: utf-8 
from __future__ import absolute_import, unicode_literals 

from .celery_conf import app as celery_app 

__all__ = ['celery_app'] 

My tasks:

# coding=utf-8 
from __future__ import unicode_literals, absolute_import 

import logging 
from celery.schedules import crontab 
from celery.task import periodic_task 
from .api import fetch_tweets, delete_tweets 


logger = logging.getLogger(__name__) 


@periodic_task(
    run_every=(crontab(minute=10, hour='0, 6, 12, 18, 23')), 
    name="fetch_tweets_task", 
    ignore_result=True) 
def fetch_tweets_task(): 
    logger.info("Tweet download started") 
    fetch_tweets() 
    logger.info("Tweet download and summarization finished") 


@periodic_task(
    run_every=(crontab(minute=13, hour=13)), 
    name="delete_tweets_task", 
    ignore_result=True) 
def delete_tweets_task(): 
    logger.info("Tweet deletion started") 
    delete_tweets() 
    logger.info("Tweet deletion finished") 
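As an aside, the `celery.task.periodic_task` decorator used above is deprecated in Celery 4; the same schedule can instead be declared centrally in the Django settings via `beat_schedule`, which beat reads directly rather than relying on task autodiscovery. A minimal sketch under that assumption (the task names mirror the `name="..."` arguments above; everything else is illustrative):

```python
# settings.py (sketch) -- equivalent schedule declared via beat_schedule.
# With app.config_from_object('django.conf:settings', namespace='CELERY'),
# this setting is picked up as beat_schedule.
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'fetch_tweets_task': {
        'task': 'fetch_tweets_task',  # matches name= in the task decorator
        'schedule': crontab(minute=10, hour='0,6,12,18,23'),
    },
    'delete_tweets_task': {
        'task': 'delete_tweets_task',
        'schedule': crontab(minute=13, hour=13),
    },
}
```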

Output when I run beat on the remote server (not working):

(trendiz) [email protected]:~/projects/verticals-news/src$ celery -A vertNews beat -l debug 
Trying import production.py settings... 
celery beat v4.0.2 (latentcall) is starting. 
__ - ... __ -  _ 
LocalTime -> 2017-04-03 13:55:49 
Configuration -> 
    . broker -> redis://localhost:6379// 
    . loader -> celery.loaders.app.AppLoader 
    . scheduler -> celery.beat.PersistentScheduler 
    . db -> celerybeat-schedule 
    . logfile -> [stderr]@%DEBUG 
    . maxinterval -> 5.00 minutes (300s) 
[2017-04-03 13:55:49,770: DEBUG/MainProcess] Setting default socket timeout to 30 
[2017-04-03 13:55:49,771: INFO/MainProcess] beat: Starting... 
[2017-04-03 13:55:49,785: DEBUG/MainProcess] Current schedule: 

[2017-04-03 13:55:49,785: DEBUG/MainProcess] beat: Ticking with max interval->5.00 minutes 
[2017-04-03 13:55:49,785: DEBUG/MainProcess] beat: Waking up in 5.00 minutes. 

Output on the development server (working):

LocalTime -> 2017-04-03 14:16:19 
Configuration -> 
    . broker -> redis://localhost:6379// 
    . loader -> celery.loaders.app.AppLoader 
    . scheduler -> celery.beat.PersistentScheduler 
    . db -> celerybeat-schedule 
    . logfile -> [stderr]@%DEBUG 
    . maxinterval -> 5.00 minutes (300s) 
[2017-04-03 14:16:19,919: DEBUG/MainProcess] Setting default socket timeout to 30 
[2017-04-03 14:16:19,919: INFO/MainProcess] beat: Starting... 
[2017-04-03 14:16:19,952: DEBUG/MainProcess] Current schedule: 
<ScheduleEntry: fetch_tweets_task fetch_tweets_task() <crontab: 36 0, 6, 12, 18, 22 * * * (m/h/d/dM/MY)> 
<ScheduleEntry: delete_tweets_task delete_tweets_task() <crontab: 13 13 * * * (m/h/d/dM/MY)> 
[2017-04-03 14:16:19,952: DEBUG/MainProcess] beat: Ticking with max interval->5.00 minutes 
[2017-04-03 14:16:19,953: DEBUG/MainProcess] beat: Waking up in 5.00 minutes. 

I run Python 3.5 and Celery 4.0.2 in both environments.

Answer


I don't know exactly what the problem was, but clearing all the *.pyc files in the project got rid of it.
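For reference, stale bytecode can be cleared with something like the following, run from the project root (paths are an assumption; adjust for your layout):

```shell
# Remove stale compiled bytecode so Python re-imports fresh sources.
find . -name "*.pyc" -delete
# Python 3 caches bytecode in __pycache__ directories; clear those too.
find . -type d -name "__pycache__" -prune -exec rm -rf {} +
```

After clearing, restart the beat and worker processes so they re-import the task modules.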