Phoenix Error (2-2): AccessDeniedException: Insufficient permissions
Published: 2019-06-20


  • Solution (for the errors quoted further down)

grant 'bdmp_test', 'RWXCA', 'SYSTEM:CATALOG'
grant 'bdmp_test', 'RWXCA', 'SYSTEM:FUNCTION'
grant 'bdmp_test', 'RWXCA', 'SYSTEM:SEQUENCE'
grant 'bdmp_test', 'RWXCA', 'SYSTEM:STATS'

I originally thought granting permissions on these four system tables would be enough, but it still wasn't, so I also granted permissions on the namespace itself:
grant 'bdmp_test', 'RWXCA', '@SYSTEM'
With that, the problem was solved.
Here the permission letters stand for: READ ('R'), WRITE ('W'), EXEC ('X'), CREATE ('C'), ADMIN ('A').
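These grants have to be issued from the HBase shell by a user that already holds ADMIN rights (typically the hbase superuser). As a quick sanity check afterwards, the shell's user_permission command can list what bdmp_test now holds; this is a generic verification sketch, assuming the HBase AccessController coprocessor is what enforces these ACLs:

user_permission '@SYSTEM'
user_permission 'SYSTEM:CATALOG'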
In addition, this problem involves two configuration parameters:

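The screenshot that named the two parameters did not survive this repost. Most likely they are Phoenix's namespace-mapping switches (an assumption based on the SYSTEM namespace being used above, not confirmed by the original), which are set in hbase-site.xml on both the client and every HBase server:

<!-- Assumed parameters: enable mapping of Phoenix schemas and SYSTEM tables to HBase namespaces -->
<property>
  <name>phoenix.schema.isNamespaceMappingEnabled</name>
  <value>true</value>
</property>
<property>
  <name>phoenix.schema.mapSystemTablesToNamespace</name>
  <value>true</value>
</property>

Both values must agree between the Phoenix client and the servers; otherwise the client fails at connection time with a namespace-mapping mismatch error.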

  • The error messages were as follows. Two exceptions show up: the first is thrown while the Phoenix client looks up the SYSTEM namespace on the HMaster (which requires ADMIN on @SYSTEM), and the second, despite the misleading "Incompatible jars" wording of ERROR 2006 (INT08), is thrown while the client calls the coprocessor endpoint on SYSTEM:CATALOG without EXEC permission. That is why both the table-level grants and the @SYSTEM namespace grant above were needed.
Exception in thread "main" org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=bdmp_test@HADOOP.TEST, scope=SYSTEM, params=[namespace=SYSTEM],action=ADMIN)
    at org.apache.hadoop.hbase.security.access.AccessController.requireNamespacePermission(AccessController.java:603)
    at org.apache.hadoop.hbase.security.access.AccessController.preGetNamespaceDescriptor(AccessController.java:1384)
    at org.apache.hadoop.hbase.master.MasterCoprocessorHost$7.call(MasterCoprocessorHost.java:175)
    at org.apache.hadoop.hbase.master.MasterCoprocessorHost.execOperation(MasterCoprocessorHost.java:1151)
    at org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetNamespaceDescriptor(MasterCoprocessorHost.java:171)
    at org.apache.hadoop.hbase.master.HMaster.getNamespaceDescriptor(HMaster.java:2598)
    at org.apache.hadoop.hbase.master.MasterRpcServices.getNamespaceDescriptor(MasterRpcServices.java:818)
    at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:55732)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2183)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:183)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:163)
    at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:111)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureNamespaceCreated(ConnectionQueryServicesImpl.java:980)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.access$1700(ConnectionQueryServicesImpl.java:219)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.ensureSystemTablesUpgraded(ConnectionQueryServicesImpl.java:2605)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2341)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2300)
    at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2300)
    at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:231)
    at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:144)
    at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:215)
    at scala.com.chinalife.GetPhoenix$.connectionPhoenic(GetPhoenix.scala:58)
    at scala.com.chinalife.GetPhoenix$.main(GetPhoenix.scala:16)
    at scala.com.chinalife.GetPhoenix.main(GetPhoenix.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:730)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hadoop.hbase.security.AccessDeniedException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=bdmp_test@HADOOP.TEST, scope=SYSTEM, params=[namespace=SYSTEM],action=ADMIN)
    at org.apache.hadoop.hbase.security.access.AccessController.requireNamespacePermission(AccessController.java:603)
    at org.apache.hadoop.hbase.security.access.AccessController.preGetNamespaceDescriptor(AccessController.java:1384)
    at org.apache.hadoop.hbase.master.MasterCoprocessorHost$7.call(MasterCoprocessorHost.java:175)
    at org.apache.hadoop.hbase.master.MasterCoprocessorHost.execOperation(MasterCoprocessorHost.java:1151)
    at org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetNamespaceDescriptor(MasterCoprocessorHost.java:171)
    at org.apache.hadoop.hbase.master.HMaster.getNamespaceDescriptor(HMaster.java:2598)
    at org.apache.hadoop.hbase.master.MasterRpcServices.getNamespaceDescriptor(MasterRpcServices.java:818)
    at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:55732)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2183)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:183)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:163)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:236)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:254)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:150)
    at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4313)
    at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4305)
    at org.apache.hadoop.hbase.client.HBaseAdmin.getNamespaceDescriptor(HBaseAdmin.java:3025)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureNamespaceCreated(ConnectionQueryServicesImpl.java:970)
    ... 23 more
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException): org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=bdmp_test@HADOOP.TEST, scope=SYSTEM, params=[namespace=SYSTEM],action=ADMIN)
    at org.apache.hadoop.hbase.security.access.AccessController.requireNamespacePermission(AccessController.java:603)
    at org.apache.hadoop.hbase.security.access.AccessController.preGetNamespaceDescriptor(AccessController.java:1384)
    at org.apache.hadoop.hbase.master.MasterCoprocessorHost$7.call(MasterCoprocessorHost.java:175)
    at org.apache.hadoop.hbase.master.MasterCoprocessorHost.execOperation(MasterCoprocessorHost.java:1151)
    at org.apache.hadoop.hbase.master.MasterCoprocessorHost.preGetNamespaceDescriptor(MasterCoprocessorHost.java:171)
    at org.apache.hadoop.hbase.master.HMaster.getNamespaceDescriptor(HMaster.java:2598)
    at org.apache.hadoop.hbase.master.MasterRpcServices.getNamespaceDescriptor(MasterRpcServices.java:818)
    at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:55732)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2183)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)

Exception in thread "main" java.sql.SQLException: ERROR 2006 (INT08): Incompatible jars detected between client and server. Ensure that phoenix.jar is put on the classpath of HBase in every region server: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=bdmp_test@HADOOP.TEST, scope=SYSTEM:CATALOG, params=[table=SYSTEM:CATALOG],action=EXEC)
    at org.apache.hadoop.hbase.security.access.AccessController.requirePermission(AccessController.java:448)
    at org.apache.hadoop.hbase.security.access.AccessController.preEndpointInvocation(AccessController.java:2185)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$67.call(RegionCoprocessorHost.java:1628)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$EndpointOperation.call(RegionCoprocessorHost.java:1693)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1749)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperationWithResult(RegionCoprocessorHost.java:1732)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.preEndpointInvocation(RegionCoprocessorHost.java:1623)
    at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7854)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1968)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1950)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33652)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2183)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:183)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:163)
    at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:454)
    at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:145)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:1214)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1063)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1396)
    at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2302)
    at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:922)
    at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:194)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:343)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:331)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:329)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1421)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2353)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2300)
    at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2300)
    at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:231)
    at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:144)
    at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:215)
    at scala.com.chinalife.GetPhoenix$.connectionPhoenic(GetPhoenix.scala:58)
    at scala.com.chinalife.GetPhoenix$.main(GetPhoenix.scala:16)
    at scala.com.chinalife.GetPhoenix.main(GetPhoenix.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:730)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hadoop.hbase.security.AccessDeniedException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=bdmp_test@HADOOP.TEST, scope=SYSTEM:CATALOG, params=[table=SYSTEM:CATALOG],action=EXEC)
    at org.apache.hadoop.hbase.security.access.AccessController.requirePermission(AccessController.java:448)
    at org.apache.hadoop.hbase.security.access.AccessController.preEndpointInvocation(AccessController.java:2185)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$67.call(RegionCoprocessorHost.java:1628)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$EndpointOperation.call(RegionCoprocessorHost.java:1693)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1749)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperationWithResult(RegionCoprocessorHost.java:1732)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.preEndpointInvocation(RegionCoprocessorHost.java:1623)
    at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7854)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1968)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1950)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33652)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2183)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:183)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:163)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:332)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1637)
    at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:104)
    at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:94)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
    at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:107)
    at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:56)
    at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.getVersion(MetaDataProtos.java:16460)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$6.call(ConnectionQueryServicesImpl.java:1179)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$6.call(ConnectionQueryServicesImpl.java:1171)
    at org.apache.hadoop.hbase.client.HTable$15.call(HTable.java:1800)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException): org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=bdmp_test@HADOOP.TEST, scope=SYSTEM:CATALOG, params=[table=SYSTEM:CATALOG],action=EXEC)
    at org.apache.hadoop.hbase.security.access.AccessController.requirePermission(AccessController.java:448)
    at org.apache.hadoop.hbase.security.access.AccessController.preEndpointInvocation(AccessController.java:2185)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$67.call(RegionCoprocessorHost.java:1628)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$EndpointOperation.call(RegionCoprocessorHost.java:1693)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1749)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperationWithResult(RegionCoprocessorHost.java:1732)
    at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.preEndpointInvocation(RegionCoprocessorHost.java:1623)
    at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7854)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1968)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1950)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33652)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2183)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:183)
    at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:163)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1269)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:34118)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1633)

Reposted from: http://qkpox.baihongyu.com/
