WSO2 AM 1.9.1 + BAM 2.5 issue on Linux

I installed WSO2 AM 1.9.1 and WSO2 BAM 2.5 on the same Linux machine and configured them as described in https://docs.wso2.com/display/AM190/Publishing+API+Runtime+Statistics. When I start WSO2 BAM, the am_stats_analyzer script runs again and again, and no errors are reported. Yet in the WSO2 AM publisher, it still shows that statistics are not configured.
The Java version is Oracle JDK 1.7.0_80, and both servers run as root. The log is below; it keeps printing repeatedly. Please help!
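For context, the statistics setup in the linked guide is driven by the APIUsageTracking section of AM's repository/conf/api-manager.xml. Below is a rough sketch of what mine looks like after following the guide; the Thrift port, credentials, and datasource name here are assumptions based on defaults and have to match the actual BAM setup (the port is 7611 plus BAM's carbon port offset):

```xml
<!-- repository/conf/api-manager.xml (AM side) - sketch, values are assumptions -->
<APIUsageTracking>
    <!-- Must be true for AM to publish events to BAM -->
    <Enabled>true</Enabled>
    <PublisherClass>org.wso2.carbon.apimgt.usage.publisher.APIMgtUsageDataBridgeDataPublisher</PublisherClass>
    <!-- BAM's Thrift data receiver: 7611 + port offset; 7612 assumes <Offset>1</Offset> in BAM's carbon.xml -->
    <ThriftPort>7612</ThriftPort>
    <BAMServerURL>tcp://localhost:7612/</BAMServerURL>
    <BAMUsername>admin</BAMUsername>
    <BAMPassword>admin</BAMPassword>
    <!-- Datasource pointing at the summary DB the Hive script writes to;
         the same DB must be defined as WSO2AM_STATS_DB in master-datasources.xml
         on both the AM and BAM sides -->
    <DataSourceName>jdbc/WSO2AM_STATS_DB</DataSourceName>
</APIUsageTracking>
```

As far as I understand, if this section is correct and the Hive script runs cleanly, the "statistics not configured" message usually points at the AM side being unable to read the summary tables through that datasource.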
Log:
[2015-12-21 02:22:00,005] INFO {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask} - Running script executor task for script **am_stats_analyzer**.
[Mon Dec 21 02:22:00 CST 2015]Hive history file=/home/wso2bam-2.5.0/tmp/hive/root-querylogs/hive_job_log_root_201512210222_2145444007.txt
OK
OK
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks not specified. Estimated from input data size: 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapred.reduce.tasks=<number>
log4j:WARN No appenders could be found for logger (org.apache.axiom.util.stax.dialect.StAXDialectDetector).
log4j:WARN Please initialize the log4j system properly.
Execution log at: /home/wso2bam-2.5.0/repository/logs//wso2carbon.log
[2015-12-21 02:22:07,801] WARN {org.apache.hadoop.mapred.JobClient} - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
Job running in-process (local Hadoop)
Hadoop job information for null: number of mappers: 0; number of reducers: 0
2015-12-21 02:22:10,999 null map = 0%, reduce = 0%
2015-12-21 02:22:14,001 null map = 100%, reduce = 0%
2015-12-21 02:22:20,004 null map = 100%, reduce = 100%
Ended Job = job_local_0001
Execution completed successfully
Mapred Local Task Succeeded . Convert the Join into MapJoin
(the same sequence then repeats: two more Hive MapReduce jobs with identical output except for the timestamps, after which the whole block starts over from "Total MapReduce jobs = 1")