Run an Airflow DAG every X minutes

I am running Airflow on an EC2 instance using the LocalScheduler option. I have started airflow scheduler and airflow webserver, and everything appears to run fine. However, after giving schedule_interval the cron string '*/10 * * * *' ("run every 10 minutes"), the job keeps defaulting to running once every 24 hours. The header of the code is below:

from datetime import datetime 
import os 
import sys 

from airflow.models import DAG 
from airflow.operators.python_operator import PythonOperator 

import ds_dependencies 

SCRIPT_PATH = os.getenv('PREPROC_PATH') 

if SCRIPT_PATH: 
    sys.path.insert(0, SCRIPT_PATH) 
    import workers 
else: 
    print('Define PREPROC_PATH value in environmental variables') 
    sys.exit(1) 

default_args = { 
    'start_date': datetime(2017, 9, 9, 10, 0, 0, 0), #..EC2 time. Equal to 11pm hora México 
    'max_active_runs': 1, 
    'concurrency': 4, 
    'schedule_interval': '*/10 * * * *' #..every 10 minutes 
} 

DAG = DAG(
    dag_id='dash_update', 
    default_args=default_args 
) 

... 

Answer

default_args only exists to fill in the params that are passed to the operators inside a DAG. max_active_runs, concurrency and schedule_interval are all parameters for initializing the DAG itself, not the operators, so when they are left in default_args the DAG never sees them and falls back to its default schedule_interval of once per day, which is the every-24-hours behaviour you are seeing. This is what you want:

DAG = DAG(
    dag_id='dash_update', 
    start_date=datetime(2017, 9, 9, 10, 0, 0, 0), #..EC2 time. Equal to 11pm hora México 
    max_active_runs=1, 
    concurrency=4, 
    schedule_interval='*/10 * * * *', #..every 10 minutes 
    default_args=default_args, 
) 

I have mixed them up before as well, so for reference (note that there is some overlap):

DAG parameters: https://airflow.incubator.apache.org/code.html?highlight=dag#airflow.models.DAG
Operator parameters: https://airflow.incubator.apache.org/code.html#baseoperator
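As a quick illustration of that split, here is a minimal sketch against the Airflow 1.x API used above; the update_dash callable and its body are hypothetical stand-ins for the question's workers module. Operator-level defaults such as owner and retries go into default_args, scheduling options go straight to the DAG constructor, and the task inherits the defaults automatically.

from datetime import datetime, timedelta

from airflow.models import DAG
from airflow.operators.python_operator import PythonOperator

# Operator-level defaults: every task created in this DAG inherits these.
default_args = {
    'owner': 'airflow',
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

# Scheduling and concurrency settings go directly to the DAG constructor.
dag = DAG(
    dag_id='dash_update',
    start_date=datetime(2017, 9, 9, 10, 0, 0),
    schedule_interval='*/10 * * * *',  # every 10 minutes
    max_active_runs=1,
    concurrency=4,
    default_args=default_args,
)

def update_dash():
    # Placeholder for the real work, e.g. a call into the workers module.
    pass

update_task = PythonOperator(
    task_id='update_dash',
    python_callable=update_dash,
    dag=dag,
)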


Makes total sense, I completely missed that. Thanks @Daniel – Aaron