
Example 1 with Type

Use of org.apache.hadoop.hbase.shaded.protobuf.generated.AccessControlProtos.Permission.Type in project hbase by apache.

From the class MasterRpcServices, method getUserPermissions.
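For orientation, here is a minimal, hypothetical sketch of how a caller might assemble the table-scoped variant of this request. The builder setters (setType, setTableName, setColumnFamily, setUserName) are assumed to be the standard protobuf-generated counterparts of the has*/get* accessors used in the handler, and ProtobufUtil.toProtoTableName is assumed for the table-name conversion; this is an illustration, not code taken from the HBase client.

import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil;
import org.apache.hadoop.hbase.shaded.protobuf.generated.AccessControlProtos.GetUserPermissionsRequest;
import org.apache.hadoop.hbase.shaded.protobuf.generated.AccessControlProtos.Permission.Type;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hbase.thirdparty.com.google.protobuf.ByteString;

// Hypothetical helper (not part of HBase): builds a table-scoped query of the kind the
// handler below unpacks: Type.Table plus the table, an optional column family, and an
// optional filter user.
static GetUserPermissionsRequest buildTableRequest(TableName table, String family, String filterUser) {
    GetUserPermissionsRequest.Builder builder = GetUserPermissionsRequest.newBuilder()
        .setType(Type.Table)
        .setTableName(ProtobufUtil.toProtoTableName(table));
    if (family != null) {
        builder.setColumnFamily(ByteString.copyFrom(Bytes.toBytes(family)));
    }
    if (filterUser != null) {
        builder.setUserName(ByteString.copyFromUtf8(filterUser));
    }
    return builder.build();
}

The handler itself: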

@Override
public GetUserPermissionsResponse getUserPermissions(RpcController controller, GetUserPermissionsRequest request) throws ServiceException {
    try {
        server.checkInitialized();
        if (server.cpHost != null && hasAccessControlServiceCoprocessor(server.cpHost)) {
            final String userName = request.hasUserName() ? request.getUserName().toStringUtf8() : null;
            String namespace = request.hasNamespaceName() ? request.getNamespaceName().toStringUtf8() : null;
            TableName table = request.hasTableName() ? ProtobufUtil.toTableName(request.getTableName()) : null;
            byte[] cf = request.hasColumnFamily() ? request.getColumnFamily().toByteArray() : null;
            byte[] cq = request.hasColumnQualifier() ? request.getColumnQualifier().toByteArray() : null;
            Type permissionType = request.hasType() ? request.getType() : null;
            server.getMasterCoprocessorHost().preGetUserPermissions(userName, namespace, table, cf, cq);
            List<UserPermission> perms = null;
            if (permissionType == Type.Table) {
                boolean filter = (cf != null || userName != null);
                perms = PermissionStorage.getUserTablePermissions(server.getConfiguration(), table, cf, cq, userName, filter);
            } else if (permissionType == Type.Namespace) {
                perms = PermissionStorage.getUserNamespacePermissions(server.getConfiguration(), namespace, userName, userName != null);
            } else {
                perms = PermissionStorage.getUserPermissions(server.getConfiguration(), null, null, null, userName, userName != null);
                // Skip super users when a filter user is specified.
                if (userName == null) {
                    // Superusers are not kept in permission storage, so list them explicitly here;
                    // adding them only for unfiltered requests avoids leaking which users are superusers.
                    for (String user : Superusers.getSuperUsers()) {
                        perms.add(new UserPermission(user, Permission.newBuilder().withActions(Action.values()).build()));
                    }
                }
            }
            server.getMasterCoprocessorHost().postGetUserPermissions(userName, namespace, table, cf, cq);
            AccessControlProtos.GetUserPermissionsResponse response = ShadedAccessControlUtil.buildGetUserPermissionsResponse(perms);
            return response;
        } else {
            throw new DoNotRetryIOException(new UnsupportedOperationException(AccessController.class.getName() + " is not loaded"));
        }
    } catch (IOException ioe) {
        throw new ServiceException(ioe);
    }
}
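The dispatch above maps Permission.Type onto the three permission scopes: Type.Table and Type.Namespace read table- and namespace-scoped entries from PermissionStorage, while any other (or absent) type falls through to the global listing, where superusers are appended only when no filter user was given. A hypothetical caller-side helper that mirrors this dispatch (Global, Namespace and Table being the values of AccessControlProtos.Permission.Type) might look like:

// Hypothetical helper (not part of HBase): pick the Permission.Type to put on a request
// from the scope the caller supplies, mirroring the dispatch in the handler above.
static Type permissionTypeFor(TableName table, String namespace) {
    if (table != null) {
        return Type.Table;      // table-level ACLs, optionally narrowed by cf/cq
    } else if (namespace != null) {
        return Type.Namespace;  // namespace-level ACLs
    } else {
        return Type.Global;     // global ACLs; superusers appended when unfiltered
    }
}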
Also used:
DoNotRetryIOException (org.apache.hadoop.hbase.DoNotRetryIOException)
ByteString (org.apache.hbase.thirdparty.com.google.protobuf.ByteString)
IOException (java.io.IOException)
GetUserPermissionsResponse (org.apache.hadoop.hbase.shaded.protobuf.generated.AccessControlProtos.GetUserPermissionsResponse)
TableName (org.apache.hadoop.hbase.TableName)
AccessControlProtos (org.apache.hadoop.hbase.shaded.protobuf.generated.AccessControlProtos)
RegionSpecifierType (org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.RegionSpecifier.RegionSpecifierType)
MasterSwitchType (org.apache.hadoop.hbase.client.MasterSwitchType)
LockType (org.apache.hadoop.hbase.procedure2.LockType)
ServerType (org.apache.hadoop.hbase.util.DNS.ServerType)
Type (org.apache.hadoop.hbase.shaded.protobuf.generated.AccessControlProtos.Permission.Type)
AccessController (org.apache.hadoop.hbase.security.access.AccessController)
ServiceException (org.apache.hbase.thirdparty.com.google.protobuf.ServiceException)
UserPermission (org.apache.hadoop.hbase.security.access.UserPermission)

Aggregations

IOException (java.io.IOException): 1
DoNotRetryIOException (org.apache.hadoop.hbase.DoNotRetryIOException): 1
TableName (org.apache.hadoop.hbase.TableName): 1
MasterSwitchType (org.apache.hadoop.hbase.client.MasterSwitchType): 1
LockType (org.apache.hadoop.hbase.procedure2.LockType): 1
AccessController (org.apache.hadoop.hbase.security.access.AccessController): 1
UserPermission (org.apache.hadoop.hbase.security.access.UserPermission): 1
AccessControlProtos (org.apache.hadoop.hbase.shaded.protobuf.generated.AccessControlProtos): 1
GetUserPermissionsResponse (org.apache.hadoop.hbase.shaded.protobuf.generated.AccessControlProtos.GetUserPermissionsResponse): 1
Type (org.apache.hadoop.hbase.shaded.protobuf.generated.AccessControlProtos.Permission.Type): 1
RegionSpecifierType (org.apache.hadoop.hbase.shaded.protobuf.generated.HBaseProtos.RegionSpecifier.RegionSpecifierType): 1
ServerType (org.apache.hadoop.hbase.util.DNS.ServerType): 1
ByteString (org.apache.hbase.thirdparty.com.google.protobuf.ByteString): 1
ServiceException (org.apache.hbase.thirdparty.com.google.protobuf.ServiceException): 1