
PolyBase external table access fails - Permission denied

I am trying to connect SQL Server 2016 to Hadoop through PolyBase. My code is:

CREATE EXTERNAL DATA SOURCE MyHadoopCluster WITH (
    TYPE = HADOOP,
    LOCATION = 'hdfs://192.168.114.20:8020',
    CREDENTIAL = HadoopUser1
);
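
For reference, the HadoopUser1 credential referenced above was created roughly like this (a minimal sketch; the master key password, identity, and secret are placeholders, not my real values):

-- A database master key must exist before a database scoped credential can be created
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword1!>';  -- placeholder

-- The credential named in CREDENTIAL = HadoopUser1 above; identity/secret are placeholders
CREATE DATABASE SCOPED CREDENTIAL HadoopUser1
WITH IDENTITY = '<hadoop_login>', SECRET = '<hadoop_password>';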


CREATE EXTERNAL FILE FORMAT TextFileFormat WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (
        FIELD_TERMINATOR = '\001',
        USE_TYPE_DEFAULT = TRUE
    )
);


CREATE EXTERNAL TABLE [dbo].[test_hadoop] (
    [Market_Name] int NOT NULL,
    [Claim_GID] int NOT NULL,
    [Completion_Flag] int NULL,
    [Diag_CDE] float NOT NULL,
    [Patient_GID] int NOT NULL,
    [Record_ID] int NOT NULL,
    [SRVC_FROM_DTE] int NOT NULL
)
WITH (
    LOCATION = '/applications/gidr/processing/lnd/sha/clm/cf/claim_diagnosis',
    DATA_SOURCE = MyHadoopCluster,
    FILE_FORMAT = TextFileFormat
);
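
(For completeness: once all three statements succeed, the external table would be queried like any ordinary table.)

-- Sanity-check query once the table exists
SELECT TOP 10 * FROM [dbo].[test_hadoop];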

When I run the DDL above, I get this error:

EXTERNAL TABLE access failed due to internal error: 'Java exception raised on call to HdfsBridge_GetDirectoryFiles: Error [Permission denied: user=pdw_user, access=READ_EXECUTE, inode="/applications/gidr/processing/lnd/sha/clm/cf/claim_diagnosis":root:supergroup:drwxrwxr--
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:281)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:262)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:175)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6590)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6572)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:6497)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListingInt(FSNamesystem.java:5034)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListing(FSNamesystem.java:4995)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getListing(NameNodeRpcServer.java:882)
    at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getListing(AuthorizationProviderProxyClientProtocol.java:335)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getListing(ClientNamenodeProtocolServerSideTranslatorPB.java:615)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)
] occurred while accessing external file.'

The problem is that the latest version of PolyBase has no configuration file where you can specify the default Hadoop login and password. So even though I created a database scoped credential, PolyBase still connects as the default pdw_user (and, as the error shows, pdw_user has no read/execute access to the directory, which is owned by root:supergroup with mode drwxrwxr--). I even tried creating a pdw_user account on the Hadoop side, but I still get this error. Any ideas?
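
As a sanity check on the SQL side, the catalog views below show whether the credential exists and whether the data source is actually bound to it (a minimal diagnostic sketch using standard SQL Server 2016 catalog views):

-- Does the scoped credential exist, and what identity does it carry?
SELECT name, credential_identity FROM sys.database_scoped_credentials;

-- Is the data source pointing at the right location, and is a credential attached?
SELECT name, location, credential_id FROM sys.external_data_sources;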

Answer