1

Cannot connect to Spark Thrift Server using JDBC

I followed the Spark instructions to start the Thrift JDBC server:

 
$ ./spark-2.1.1-bin-hadoop2.7/sbin/start-thriftserver.sh 
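
The same script accepts Hive configuration properties on the command line (per the Spark SQL documentation), e.g. to pick the listening host and port explicitly:

$ ./spark-2.1.1-bin-hadoop2.7/sbin/start-thriftserver.sh \
  --hiveconf hive.server2.thrift.port=10000 \
  --hiveconf hive.server2.thrift.bind.host=localhost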

I can connect to it from beeline just fine:

 
$ ./spark-2.1.1-bin-hadoop2.7/bin/beeline -u 'jdbc:hive2://localhost:10000' 
Connecting to jdbc:hive2://localhost:10000 
log4j:WARN No appenders could be found for logger (org.apache.hive.jdbc.Utils). 
log4j:WARN Please initialize the log4j system properly. 
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. 
Connected to: Spark SQL (version 2.1.1) 
Driver: Hive JDBC (version 1.2.1.spark2) 
Transaction isolation: TRANSACTION_REPEATABLE_READ 
Beeline version 1.2.1.spark2 by Apache Hive 
0: jdbc:hive2://localhost:10000> 
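
From that prompt, any simple statement confirms the session works end to end; SHOW TABLES is just an arbitrary example:

0: jdbc:hive2://localhost:10000> SHOW TABLES;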

However, trying to connect from DataGrip using JDBC and the same connection string, I get an error:

 
[2017-07-07 16:46:57] java.lang.ClassNotFoundException: org.apache.thrift.transport.TTransportException 
[2017-07-07 16:46:57] at java.net.URLClassLoader.findClass(URLClassLoader.java:381) 
[2017-07-07 16:46:57] at java.lang.ClassLoader.loadClass(ClassLoader.java:424) 
[2017-07-07 16:46:57] at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331) 
[2017-07-07 16:46:57] at java.lang.ClassLoader.loadClass(ClassLoader.java:357) 
[2017-07-07 16:46:57] at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) 
[2017-07-07 16:46:57] at com.intellij.database.remote.jdbc.impl.RemoteDriverImpl.connect(RemoteDriverImpl.java:27) 
[2017-07-07 16:46:57] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
[2017-07-07 16:46:57] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
[2017-07-07 16:46:57] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
[2017-07-07 16:46:57] at java.lang.reflect.Method.invoke(Method.java:498) 
[2017-07-07 16:46:57] at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:324) 
[2017-07-07 16:46:57] at sun.rmi.transport.Transport$1.run(Transport.java:200) 
[2017-07-07 16:46:57] at sun.rmi.transport.Transport$1.run(Transport.java:197) 
[2017-07-07 16:46:57] at java.security.AccessController.doPrivileged(Native Method) 
[2017-07-07 16:46:57] at sun.rmi.transport.Transport.serviceCall(Transport.java:196) 
[2017-07-07 16:46:57] at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:568) 
[2017-07-07 16:46:57] at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:826) 
[2017-07-07 16:46:57] at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.lambda$run$0(TCPTransport.java:683) 
[2017-07-07 16:46:57] at java.security.AccessController.doPrivileged(Native Method) 
[2017-07-07 16:46:57] at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:682) 
[2017-07-07 16:46:57] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
[2017-07-07 16:46:57] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
[2017-07-07 16:46:57] at java.lang.Thread.run(Thread.java:745) (no stack trace) 

I configured DataGrip to use the JDBC library hive-jdbc-1.2.1.spark2.jar from the Spark installation folder.

Answers

2

The ClassNotFoundException shows that hive-jdbc-1.2.1.spark2.jar alone is not enough: it is not a standalone driver jar, so its dependencies (including libthrift, which provides org.apache.thrift.transport.TTransportException) must be on the driver classpath as well. From the Spark 2.2.1 distribution you need the following jar files (a sketch for collecting them follows the list):

commons-logging-1.1.3.jar 
hadoop-common-2.7.3.jar 
hive-exec-1.2.1.spark2.jar 
hive-jdbc-1.2.1.spark2.jar 
hive-metastore-1.2.1.spark2.jar 
httpclient-4.5.2.jar 
httpcore-4.4.4.jar 
libthrift-0.9.3.jar 
slf4j-api-1.7.16.jar 
spark-hive-thriftserver_2.11-2.2.1.jar 
spark-network-common_2.11-2.2.1.jar 
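
For convenience, a minimal sketch that collects exactly those jars into one folder to point DataGrip at; the unpack location spark-2.2.1-bin-hadoop2.7 and the target folder ~/datagrip-spark-jdbc are assumptions, so adjust them to your layout:

$ mkdir -p ~/datagrip-spark-jdbc
$ cd spark-2.2.1-bin-hadoop2.7/jars
$ cp commons-logging-1.1.3.jar hadoop-common-2.7.3.jar \
  hive-exec-1.2.1.spark2.jar hive-jdbc-1.2.1.spark2.jar \
  hive-metastore-1.2.1.spark2.jar httpclient-4.5.2.jar \
  httpcore-4.4.4.jar libthrift-0.9.3.jar slf4j-api-1.7.16.jar \
  spark-hive-thriftserver_2.11-2.2.1.jar \
  spark-network-common_2.11-2.2.1.jar ~/datagrip-spark-jdbc/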

In DataGrip, select the class org.apache.hive.jdbc.HiveDriver and set Tx (transaction control) to Manual (Spark does not support auto-commit).

You should now be able to connect using the URL jdbc:hive2://hostname:10000/

0

It works after adding all the *.jar files from the spark/jars folder to the "JDBC drivers" window in DataGrip! Not sure which of the libraries are required, but trial and error told me that many of them are!
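
For reference, the brute-force equivalent as a shell sketch, assuming the distribution is unpacked at spark-2.1.1-bin-hadoop2.7 and reusing a hypothetical ~/datagrip-spark-jdbc folder for the driver files:

$ mkdir -p ~/datagrip-spark-jdbc
$ cp spark-2.1.1-bin-hadoop2.7/jars/*.jar ~/datagrip-spark-jdbc/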