2017-07-23

I built the following DAG in Airflow, which executes a set of EMR steps to run my pipeline.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.exceptions import AirflowException
from airflow.contrib.operators.emr_add_steps_operator import EmrAddStepsOperator
from airflow.contrib.sensors.emr_step_sensor import EmrStepSensor

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2017, 7, 20, 10, 0),
    'email': ['[email protected]'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 5,
    'retry_delay': timedelta(minutes=2),
}

dag = DAG('dag_import_match_hourly',
          default_args=default_args,
          description='Fancy Description',
          schedule_interval=timedelta(hours=1),
          dagrun_timeout=timedelta(hours=2))

# cluster_id is assumed to be defined elsewhere (e.g. read from configuration
# or pulled from a cluster-creation task); create_step() is my own helper that
# builds the EMR step definition for the given name.

try:
    merge_s3_match_step = EmrAddStepsOperator(
        task_id='merge_s3_match_step',
        job_flow_id=cluster_id,
        aws_conn_id='aws_default',
        steps=create_step('Merge S3 Match'),
        dag=dag
    )

    mapreduce_step = EmrAddStepsOperator(
        task_id='mapreduce_match_step',
        job_flow_id=cluster_id,
        aws_conn_id='aws_default',
        steps=create_step('MapReduce Match Hourly'),
        dag=dag
    )

    merge_hdfs_step = EmrAddStepsOperator(
        task_id='merge_hdfs_step',
        job_flow_id=cluster_id,
        aws_conn_id='aws_default',
        steps=create_step('Merge HDFS Match Hourly'),
        dag=dag
    )

    ## Sensors
    check_merge_s3 = EmrStepSensor(
        task_id='watch_merge_s3',
        job_flow_id=cluster_id,
        step_id="{{ task_instance.xcom_pull('merge_s3_match_step', key='return_value')[0] }}",
        aws_conn_id='aws_default',
        dag=dag
    )

    check_mapreduce = EmrStepSensor(
        task_id='watch_mapreduce',
        job_flow_id=cluster_id,
        step_id="{{ task_instance.xcom_pull('mapreduce_match_step', key='return_value')[0] }}",
        aws_conn_id='aws_default',
        dag=dag
    )

    check_merge_hdfs = EmrStepSensor(
        task_id='watch_merge_hdfs',
        job_flow_id=cluster_id,
        step_id="{{ task_instance.xcom_pull('merge_hdfs_step', key='return_value')[0] }}",
        aws_conn_id='aws_default',
        dag=dag
    )

    mapreduce_step.set_upstream(merge_s3_match_step)
    merge_s3_match_step.set_downstream(check_merge_s3)

    mapreduce_step.set_downstream(check_mapreduce)

    merge_hdfs_step.set_upstream(mapreduce_step)
    merge_hdfs_step.set_downstream(check_merge_hdfs)

except AirflowException as ae:
    print(ae)

The DAG runs fine, but I want to use the sensors so that the next step is executed if and only if the preceding EMR step has completed successfully. I have tried a few things, but none of them work; the code above does not do this correctly. Does anyone know how to use EmrStepSensor to achieve this?


Are those custom sensors you wrote? I am setting up Airflow with EMR and need a way to check the status of steps in the cluster. – luckytaxi


No, those sensors are part of the `contrib` package; see [https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/sensors/emr_step_sensor.py](https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/sensors/emr_step_sensor.py) – davideberdin


Thx ... I didn't realize there were more of these in the `contrib` directory. – luckytaxi

Answer


It looks like your EmrStepSensor tasks need their dependencies set correctly. For example, if you want merge_hdfs_step to wait until check_mapreduce has completed, you should use merge_hdfs_step.set_upstream(check_mapreduce) or, equivalently, check_mapreduce.set_downstream(merge_hdfs_step). In your code, each later step currently depends on the previous step rather than on its sensor. The overall chain should be TaskA >> SensorA >> TaskB >> SensorB >> TaskC >> SensorC; try wiring the dependencies that way.
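Applying that TaskA >> SensorA >> TaskB >> SensorB pattern to the task names from the question, the wiring would look like the sketch below. A minimal stand-in class replaces the Airflow operators here so the dependency logic can be checked without an Airflow installation; in the real DAG these names would be the EmrAddStepsOperator and EmrStepSensor instances defined in the question.

```python
class Task:
    """Minimal stand-in for an Airflow operator: records upstream tasks."""
    def __init__(self, task_id):
        self.task_id = task_id
        self.upstream = []

    def set_upstream(self, other):
        self.upstream.append(other)

    def set_downstream(self, other):
        other.upstream.append(self)


merge_s3_match_step = Task('merge_s3_match_step')
check_merge_s3 = Task('watch_merge_s3')
mapreduce_step = Task('mapreduce_match_step')
check_mapreduce = Task('watch_mapreduce')
merge_hdfs_step = Task('merge_hdfs_step')
check_merge_hdfs = Task('watch_merge_hdfs')

# Step -> its sensor -> next step -> its sensor -> ...
merge_s3_match_step.set_downstream(check_merge_s3)
mapreduce_step.set_upstream(check_merge_s3)    # wait on the sensor, not the step
mapreduce_step.set_downstream(check_mapreduce)
merge_hdfs_step.set_upstream(check_mapreduce)  # likewise here
merge_hdfs_step.set_downstream(check_merge_hdfs)

# Each later step now depends on the preceding sensor:
assert mapreduce_step.upstream == [check_merge_s3]
assert merge_hdfs_step.upstream == [check_mapreduce]
```

The key change from the question's code is that mapreduce_step and merge_hdfs_step take their upstream from the sensors (check_merge_s3, check_mapreduce) instead of directly from the previous EmrAddStepsOperator tasks.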


That works perfectly, thank you! – davideberdin