Cannot export data from Hive to MySQL with Sqoop

I am using Sqoop to export processed data from HDFS (stored in Hive's text format) to a MySQL server. The setup seems straightforward, but no matter what I do, Sqoop fails to recognize the field delimiter correctly. What could be the problem?

Here is the Hive table definition:
hive> show create table database.weblog_ag;
OK
CREATE TABLE database.weblog_ag(
visitor_id string,
time array<string>,
url array<string>,
client_time array<string>,
resolution array<string>,
browser array<string>,
os array<string>,
devicetype array<string>,
devicemodel array<string>,
ipinfo array<string>)
CLUSTERED BY (
visitor_id)
SORTED BY (
time ASC)
INTO 32 BUCKETS
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
STORED AS INPUTFORMAT
'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT
'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
'hdfs://poc/apps/hive/warehouse/database.db/weblog_ag'
TBLPROPERTIES (
'numPartitions'='0',
'numFiles'='96',
'transient_lastDdlTime'='1390411893',
'totalSize'='59633487',
'numRows'='0',
'rawDataSize'='0')
Time taken: 1.871 seconds, Fetched: 31 row(s)
When I check the files in HDFS, the fields are correctly separated by the \t (TAB) character. Here is a sample row I grabbed from HDFS:
101009a36b3113fa 2014-01-06 08:59:58 http://someurl 2014-01-06 08:56:53 1280x800 Chrome Windows XP General_Desktop Other 115.74.215.116
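For completeness, this is how I verified the delimiter really is a TAB and not spaces. A minimal sketch, assuming the file has been copied locally as `weblog_sample.txt` (a hypothetical name for illustration; with the real data you would pipe `hdfs dfs -cat` into `od` instead):

```shell
# Recreate a line shaped like the sample above (hypothetical local copy).
printf '101009a36b3113fa\t2014-01-06 08:59:58\thttp://someurl\n' > weblog_sample.txt

# od -c renders control characters explicitly; real TABs show up as \t.
od -c weblog_sample.txt | head
```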
And here is my Sqoop options file:
export
--connect
jdbc:mysql://webserver/fprofile_db
--username
username
--password
password
--table
weblog
--direct
--export-dir
/apps/hive/warehouse/database.db/weblog_ag
--input-fields-terminated-by
'\011'
--columns
visitor_id, time, url, client_time, resolution, browser, os, devicetype, devicemodel, ipinfo
I have tried both '\011' and '\t' for the --input-fields-terminated-by argument, but neither worked. The result exported into MySQL looks like this:
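To rule out a mistake on my side with the escape itself: '\011' and '\t' should denote the same byte, since TAB is octal 011. That much can be checked in any shell:

```shell
# Both escapes produce the same single byte, 0x09 (TAB).
printf '\011' | od -An -tx1   # prints 09
printf '\t'   | od -An -tx1   # prints 09
```

So the two values I passed to --input-fields-terminated-by are equivalent, which makes it stranger that neither works.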
What is going wrong here?