2017-04-27

Is there a way for a Cloud Function to confirm whether a Dataflow job succeeded?

In my Cloud Function, I tried:

const google = require('googleapis');

exports.statusJob = function (event, callback) {
  const file = event.data;
  if (file.resourceState === 'exists' && file.name) {
    console.log(file.name);
    console.log(event.data);
    google.auth.getApplicationDefault(function (err, authClient, projectId) {
      if (err) {
        throw err;
      }

      if (authClient.createScopedRequired && authClient.createScopedRequired()) {
        authClient = authClient.createScoped([
          'https://www.googleapis.com/auth/cloud-platform',
          'https://www.googleapis.com/auth/userinfo.email'
        ]);
      }

      const dataflow = google.dataflow({ version: 'v1b3', auth: authClient });

      dataflow.projects.jobs.get({
        projectId: 'my-project-id',
        resource: {
          jobId: 'some_number'
        }
      }, function (err, response) {
        if (err) {
          console.error("problem running dataflow template, error was: ", err);
        }
        console.log("Dataflow template response: ", response);
        callback();
      });
    });
  }
};

package.json:

{
  "name": "test",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "googleapis": "^18.0.0"
  }
}

The above worked perfectly for me ONCE. The response I got was:

Dataflow template response: { id: 'some_number', projectId: 'my-project-id', name: 'cloud-fn', type: 'JOB_TYPE_BATCH', environment: { userAgent: { name: 'Google Cloud Dataflow SDK for Java', support: [Object], 'build.date': '2017-05-23 19:46', version: '2.0.0' }, version: { major: '6', job_type: 'JAVA_BATCH_AUTOSCALING' } }, currentState: 'JOB_STATE_DONE',........ 

Then, every time after that, it gave an error:

problem running dataflow template, error was: Error: Missing required parameters: jobId
    at createAPIRequest (/user_code/node_modules/googleapis/lib/apirequest.js:110:14)
    at Object.get (/user_code/node_modules/googleapis/apis/dataflow/v1b3.js:670:16)
    at /user_code/index.js:22:29
    at callback (/user_code/node_modules/googleapis/node_modules/google-auth-library/lib/auth/googleauth.js:42:14)
    at /user_code/node_modules/googleapis/node_modules/google-auth-library/lib/auth/googleauth.js:289:13
    at _combinedTickCallback (internal/process/next_tick.js:73:7)
    at process._tickDomainCallback (internal/process/next_tick.js:128:9)

Does anyone know what is going on here?

Thanks

Answer


You can use the Dataflow CLI to determine whether a job failed or succeeded. It lets you list jobs and check their failed/succeeded/running/cancelled status.

Specifically, to check the status of a single job, you can run:

gcloud beta dataflow jobs describe <JOB_ID> 

For more information, check the documentation:

https://cloud.google.com/dataflow/pipelines/dataflow-command-line-intf


Note: a successful job will show "Done", and a failed job will show "Failed" in the status field of the output. –
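To check the same thing from JavaScript rather than from the CLI, you can compare the job's `currentState` field (as returned by the v1b3 `projects.jobs.get` call) against the terminal state names. A minimal sketch, with a helper name of my own invention; the `JOB_STATE_*` constants are the ones the v1b3 API returns:

```javascript
// Hypothetical helper: maps a Dataflow currentState value to a simple verdict.
// JOB_STATE_DONE / JOB_STATE_FAILED / JOB_STATE_CANCELLED are terminal states;
// anything else is treated as still in progress.
function classifyJobState(currentState) {
  const terminal = {
    'JOB_STATE_DONE': 'succeeded',
    'JOB_STATE_FAILED': 'failed',
    'JOB_STATE_CANCELLED': 'cancelled'
  };
  return terminal[currentState] || 'running';
}

console.log(classifyJobState('JOB_STATE_DONE'));    // succeeded
console.log(classifyJobState('JOB_STATE_RUNNING')); // running
```

The sample response in the question (`currentState: 'JOB_STATE_DONE'`) would classify as succeeded here.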


OK... but what if I want to confirm the same thing from Cloud Function code, in JavaScript? Say I've invoked a Dataflow job, and after it finishes I want to know whether it succeeded... – rish0097


I assume you are triggering the Dataflow job by invoking a Cloud Function, following the guide here? https://shinesolutions.com/2017/03/23/triggering-dataflow-pipelines-with-cloud-functions/ In that example they access the Google APIs from JavaScript. You could write another Cloud Function that calls the Google API, invoking dataflow.projects.templates.get. Unfortunately I couldn't find documentation/code samples for the JS API, but I think you can call it with projectId and jobId parameters. –