2017-04-07

Sqoop export of a local CSV to MySQL fails with a MapReduce error. I am trying to export a local CSV file to the MySQL table "test":

$ sqoop export -fs local -jt local --connect jdbc:mysql://172.16.21.64:3306/cf_ae07c762_41a9_4b46_af6c_a29ecb050204 --username username --password password --table test --export-dir file:///home/username/test.csv 

However, I get a strange error saying mapreduce.tar.gz was not found:

Warning: /usr/hdp/2.5.0.0-1245/hbase does not exist! HBase imports will fail. 
Please set $HBASE_HOME to the root of your HBase installation. 
Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail. 
Please set $ACCUMULO_HOME to the root of your Accumulo installation. 
17/04/07 14:22:14 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245 
17/04/07 14:22:14 WARN fs.FileSystem: "local" is a deprecated filesystem name. Use "file:///" instead. 
17/04/07 14:22:14 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead. 
17/04/07 14:22:15 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset. 
17/04/07 14:22:15 INFO tool.CodeGenTool: Beginning code generation 
17/04/07 14:22:15 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test2` AS t LIMIT 1 
17/04/07 14:22:15 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test2` AS t LIMIT 1 
17/04/07 14:22:15 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.5.0.0-1245/hadoop-mapreduce 
Note: /tmp/sqoop-bedrock/compile/009603476b0dfc767b1b94c0607bf6fa/test2.java uses or overrides a deprecated API. 
Note: Recompile with -Xlint:deprecation for details. 
17/04/07 14:22:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-bedrock/compile/009603476b0dfc767b1b94c0607bf6fa/test2.jar 
17/04/07 14:22:17 INFO mapreduce.ExportJobBase: Beginning export of test2 
17/04/07 14:22:17 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId= 
17/04/07 14:22:17 ERROR tool.ExportTool: Encountered IOException running export job: java.io.FileNotFoundException: File file:/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz does not exist 

The file is, however, available on my local machine:

/usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz 

/data/hadoop/yarn/local/filecache/13/mapreduce.tar.gz 

Does anyone know what the problem is? I was just following this guide:

http://ingest.tips/2015/02/06/use-sqoop-transfer-csv-data-local-filesystem-relational-database/


Your 'export' command is fine; the problem is with the location '/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz'. You have to find where Sqoop is picking up this path, which is incorrect. – franklinsijo


Yes, that's the hard part, since I can't figure out how to track down that path variable. Where could it potentially be coming from? –


Think I found it. – franklinsijo

Answer


The property mapreduce.application.framework.path is set in mapred-site.xml with the value /hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz. This is the path of the MapReduce framework archive, and it points to a file in HDFS.
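To see where this value is coming from on your node, you can grep the client configuration for the property. A minimal self-contained sketch (it writes a sample file under /tmp purely for illustration; on a real HDP node you would grep the actual config instead, which typically lives under /etc/hadoop/conf/):

```shell
# Illustrative copy of the mapred-site.xml entry described above;
# on a real node, grep /etc/hadoop/conf/mapred-site.xml instead.
cat > /tmp/mapred-site-sample.xml <<'EOF'
<configuration>
  <property>
    <name>mapreduce.application.framework.path</name>
    <value>/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz</value>
  </property>
</configuration>
EOF

# Print the property name together with the value on the following line.
grep -A1 'mapreduce.application.framework.path' /tmp/mapred-site-sample.xml
```

The value printed here is the HDFS-style path that the failing export resolved against the local filesystem, which is why the FileNotFoundException shows file:/hdp/apps/....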

Here, since Sqoop is launched with -fs local, this property needs to be set to a LocalFS path. Try overriding the property value with the local path of the MapReduce archive file.

$ sqoop export -fs local -jt local -D 'mapreduce.application.framework.path=/usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz' --connect jdbc:mysql://172.16.21.64:3306/cf_ae07c762_41a9_4b46_af6c_a29ecb050204 --username username --password password --table test --export-dir file:///home/username/test.csv 
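Before running the override, it may help to confirm that the archive actually exists at the local path you pass in. A minimal sketch, assuming the HDP 2.5 layout from the question (adjust ARCHIVE to your own installation):

```shell
# Local path taken from the question; substitute your own HDP version/path.
ARCHIVE=/usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz

if [ -f "$ARCHIVE" ]; then
  echo "found: $ARCHIVE"
else
  # Fall back to searching the install tree for the framework archive.
  echo "missing: $ARCHIVE"
  find /usr/hdp -name 'mapreduce.tar.gz' 2>/dev/null || true
fi
```

Either of the two locations listed in the question should work for the -D override, as long as the file is readable by the user running Sqoop. Note also that Hadoop generic options (-fs, -jt, -D) must appear before the Sqoop tool-specific arguments, as in the command above.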

Thank you, this worked. But there is another follow-up question here: http://stackoverflow.com/questions/43328725/sqoop-export-to-mysql-export-job-failed-tool-exporttool-but-got-records –