
Example 21 with ExploreException

use of co.cask.cdap.explore.service.ExploreException in project cdap by caskdata.

the class ExploreHttpClient method getQueries.

@Override
public List<QueryInfo> getQueries(NamespaceId namespace) throws ExploreException, SQLException {
    String resource = String.format("namespaces/%s/data/explore/queries/", namespace.getEntityName());
    HttpResponse response = doGet(resource);
    if (response.getResponseCode() == HttpURLConnection.HTTP_OK) {
        return parseJson(response, QUERY_INFO_LIST_TYPE);
    }
    throw new ExploreException("Cannot get list of queries. Reason: " + response);
}
Also used : HttpResponse(co.cask.common.http.HttpResponse) ExploreException(co.cask.cdap.explore.service.ExploreException)
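
A minimal caller sketch for the method above, assuming a client instance exposing getQueries is already available. The printQueries helper and the way the client is obtained are hypothetical; the types follow the "Also used" packages listed above.

// Hypothetical helper: lists the Explore queries in a namespace and prints a short summary.
// "client" is any ExploreHttpClient (or compatible) instance; constructing it is out of scope here.
static void printQueries(ExploreHttpClient client, NamespaceId namespace) {
    try {
        List<QueryInfo> queries = client.getQueries(namespace);
        System.out.printf("Namespace %s has %d Explore queries%n", namespace.getEntityName(), queries.size());
    } catch (ExploreException e) {
        // getQueries throws ExploreException when the HTTP response code is not 200 OK
        System.err.println("Could not list queries: " + e.getMessage());
    } catch (SQLException e) {
        System.err.println("SQL error while listing queries: " + e.getMessage());
    }
}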

Example 22 with ExploreException

use of co.cask.cdap.explore.service.ExploreException in project cdap by caskdata.

the class BaseHiveExploreService method createNamespace.

public QueryHandle createNamespace(NamespaceMeta namespaceMeta) throws ExploreException, SQLException {
    startAndWait();
    try {
        // This check prevents the extra warn log.
        if (NamespaceId.DEFAULT.equals(namespaceMeta.getNamespaceId())) {
            return QueryHandle.NO_OP;
        }
        Map<String, String> sessionConf = startSession();
        SessionHandle sessionHandle = null;
        OperationHandle operationHandle = null;
        try {
            sessionHandle = openHiveSession(sessionConf);
            QueryHandle handle;
            if (Strings.isNullOrEmpty(namespaceMeta.getConfig().getHiveDatabase())) {
                // If no custom Hive database was provided, derive the database name from the CDAP naming
                // convention and create it if it does not exist, since CDAP manages the lifecycle of such databases.
                String database = createHiveDBName(namespaceMeta.getName());
                // "IF NOT EXISTS" so that this operation is idempotent.
                String statement = String.format("CREATE DATABASE IF NOT EXISTS %s", database);
                operationHandle = executeAsync(sessionHandle, statement);
                handle = saveReadOnlyOperation(operationHandle, sessionHandle, sessionConf, statement, database);
                LOG.info("Creating database {} with handle {}", database, handle);
            } else {
                // A custom database name was provided, so check that it exists.
                // The only reliable way to check whether a Hive database exists is to try to use it, so run a
                // "USE databaseName" statement and see whether it throws an exception. Listing all databases would
                // also work, but USE additionally verifies that the user can actually access the database once
                // impersonation is in place.
                String statement = String.format("USE %s", namespaceMeta.getConfig().getHiveDatabase());
                // if the database does not exist, the statement below will throw an exception from Hive
                try {
                    operationHandle = executeAsync(sessionHandle, statement);
                } catch (HiveSQLException e) {
                    // Hive reports a missing database with error code 10072 (DATABASE_NOT_EXISTS)
                    if (e.toTStatus().getErrorCode() == ErrorMsg.DATABASE_NOT_EXISTS.getErrorCode()) {
                        //TODO: Add username here
                        throw new ExploreException(String.format("A custom Hive Database %s was provided for namespace %s " + "which does not exists. Please create the database in hive " + "for the user and try creating the namespace again.", namespaceMeta.getConfig().getHiveDatabase(), namespaceMeta.getName()), e);
                    } else {
                        // some other exception occurred while checking the existence of the database
                        throw new ExploreException(String.format("Failed to check existence of given custom hive database " + "%s for namespace %s", namespaceMeta.getConfig().getHiveDatabase(), namespaceMeta.getName()), e);
                    }
                }
                // if the call above did not throw, the database exists
                handle = saveReadOnlyOperation(operationHandle, sessionHandle, sessionConf, statement, namespaceMeta.getConfig().getHiveDatabase());
                LOG.debug("Custom database {} existence verified with handle {}", namespaceMeta.getConfig().getHiveDatabase(), handle);
            }
            return handle;
        } catch (Throwable e) {
            closeInternal(getQueryHandle(sessionConf), new ReadOnlyOperationInfo(sessionHandle, operationHandle, sessionConf, "", ""));
            throw e;
        }
    } catch (HiveSQLException e) {
        throw getSqlException(e);
    } catch (Throwable e) {
        throw new ExploreException(e);
    }
}
Also used : HiveSQLException(org.apache.hive.service.cli.HiveSQLException) SessionHandle(org.apache.hive.service.cli.SessionHandle) QueryHandle(co.cask.cdap.proto.QueryHandle) OperationHandle(org.apache.hive.service.cli.OperationHandle) ExploreException(co.cask.cdap.explore.service.ExploreException)
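
A minimal caller sketch for createNamespace, assuming an initialized BaseHiveExploreService is available. The ensureNamespaceDatabase helper and the exploreService parameter are hypothetical; it only exercises behavior shown above, namely the NO_OP handle returned for the default namespace.

// Hypothetical helper: asks the Explore service to create (or verify) the Hive database for a namespace.
// Results for the returned QueryHandle would be polled elsewhere; that part is not shown in this example.
static QueryHandle ensureNamespaceDatabase(BaseHiveExploreService exploreService, NamespaceMeta meta)
    throws ExploreException, SQLException {
    QueryHandle handle = exploreService.createNamespace(meta);
    if (QueryHandle.NO_OP.equals(handle)) {
        // the default namespace is skipped by createNamespace, so there is no Hive statement to wait for
        System.out.println("No Hive database work needed for namespace " + meta.getName());
    }
    return handle;
}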

Example 23 with ExploreException

use of co.cask.cdap.explore.service.ExploreException in project cdap by caskdata.

the class BaseHiveExploreService method getTables.

@Override
public QueryHandle getTables(String catalog, String schemaPattern, String tableNamePattern, List<String> tableTypes) throws ExploreException, SQLException {
    startAndWait();
    try {
        SessionHandle sessionHandle = null;
        OperationHandle operationHandle = null;
        Map<String, String> sessionConf = startSession();
        String database = getHiveDatabase(schemaPattern);
        try {
            sessionHandle = openHiveSession(sessionConf);
            operationHandle = cliService.getTables(sessionHandle, catalog, database, tableNamePattern, tableTypes);
            QueryHandle handle = saveReadOnlyOperation(operationHandle, sessionHandle, sessionConf, "", database);
            LOG.trace("Retrieving tables: catalog {}, schemaNamePattern {}, tableNamePattern {}, tableTypes {}", catalog, database, tableNamePattern, tableTypes);
            return handle;
        } catch (Throwable e) {
            closeInternal(getQueryHandle(sessionConf), new ReadOnlyOperationInfo(sessionHandle, operationHandle, sessionConf, "", database));
            throw e;
        }
    } catch (HiveSQLException e) {
        throw getSqlException(e);
    } catch (Throwable e) {
        throw new ExploreException(e);
    }
}
Also used : HiveSQLException(org.apache.hive.service.cli.HiveSQLException) SessionHandle(org.apache.hive.service.cli.SessionHandle) QueryHandle(co.cask.cdap.proto.QueryHandle) OperationHandle(org.apache.hive.service.cli.OperationHandle) ExploreException(co.cask.cdap.explore.service.ExploreException)
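
A minimal caller sketch for getTables, assuming an initialized BaseHiveExploreService. The listTables helper is hypothetical, and passing null for the catalog and table-type filters is assumed to mean "no restriction", as is common for the underlying Hive call.

// Hypothetical helper: submits an asynchronous table-listing request and returns the query handle.
// Fetching the actual result rows through the handle is a separate step not shown in this example.
static QueryHandle listTables(BaseHiveExploreService exploreService, String schemaPattern)
    throws ExploreException, SQLException {
    // "%" matches every table name; null catalog and table types are treated here as unrestricted filters
    return exploreService.getTables(null, schemaPattern, "%", null);
}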

Example 24 with ExploreException

use of co.cask.cdap.explore.service.ExploreException in project cdap by caskdata.

the class BaseHiveExploreService method getTableTypes.

@Override
public QueryHandle getTableTypes() throws ExploreException, SQLException {
    startAndWait();
    try {
        SessionHandle sessionHandle = null;
        OperationHandle operationHandle = null;
        Map<String, String> sessionConf = startSession();
        try {
            sessionHandle = openHiveSession(sessionConf);
            operationHandle = cliService.getTableTypes(sessionHandle);
            QueryHandle handle = saveReadOnlyOperation(operationHandle, sessionHandle, sessionConf, "", "");
            LOG.trace("Retrieving table types");
            return handle;
        } catch (Throwable e) {
            closeInternal(getQueryHandle(sessionConf), new ReadOnlyOperationInfo(sessionHandle, operationHandle, sessionConf, "", ""));
            throw e;
        }
    } catch (HiveSQLException e) {
        throw getSqlException(e);
    } catch (Throwable e) {
        throw new ExploreException(e);
    }
}
Also used : HiveSQLException(org.apache.hive.service.cli.HiveSQLException) SessionHandle(org.apache.hive.service.cli.SessionHandle) QueryHandle(co.cask.cdap.proto.QueryHandle) OperationHandle(org.apache.hive.service.cli.OperationHandle) ExploreException(co.cask.cdap.explore.service.ExploreException)
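
A minimal caller sketch for getTableTypes, again assuming an initialized BaseHiveExploreService; the fetchTableTypes helper is hypothetical.

// Hypothetical helper: asynchronously retrieves the table types known to Hive.
static QueryHandle fetchTableTypes(BaseHiveExploreService exploreService)
    throws ExploreException, SQLException {
    // the returned handle is later used to poll the status and fetch the result rows
    return exploreService.getTableTypes();
}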

Example 25 with ExploreException

use of co.cask.cdap.explore.service.ExploreException in project cdap by caskdata.

the class BaseHiveExploreService method getTableInfo.

@Override
public TableInfo getTableInfo(String namespace, @Nullable String databaseName, String table) throws ExploreException, TableNotFoundException {
    startAndWait();
    // TODO check if the database user is allowed to access if security is enabled
    try {
        String db = databaseName != null ? databaseName : getHiveDatabase(namespace);
        Table tableInfo = getMetaStoreClient().getTable(db, table);
        List<FieldSchema> tableFields = tableInfo.getSd().getCols();
        // Some tables may not list their columns in the storage descriptor. If columns are missing, do a separate call for the schema.
        if (tableFields == null || tableFields.isEmpty()) {
            // don't call .getSchema() here; doing so triggers a ClassNotFoundException in the Thrift code
            tableFields = getMetaStoreClient().getFields(db, table);
        }
        ImmutableList.Builder<TableInfo.ColumnInfo> schemaBuilder = ImmutableList.builder();
        Set<String> fieldNames = Sets.newHashSet();
        for (FieldSchema column : tableFields) {
            schemaBuilder.add(new TableInfo.ColumnInfo(column.getName(), column.getType(), column.getComment()));
            fieldNames.add(column.getName());
        }
        ImmutableList.Builder<TableInfo.ColumnInfo> partitionKeysBuilder = ImmutableList.builder();
        for (FieldSchema column : tableInfo.getPartitionKeys()) {
            TableInfo.ColumnInfo columnInfo = new TableInfo.ColumnInfo(column.getName(), column.getType(), column.getComment());
            partitionKeysBuilder.add(columnInfo);
            // also add partition keys to the schema if they are not there already, since they show up when you do a 'describe <table>' command.
            if (!fieldNames.contains(column.getName())) {
                schemaBuilder.add(columnInfo);
            }
        }
        // it's a CDAP-generated table if it uses our storage handler, or if the CDAP name property is set on the table.
        String cdapName = null;
        Map<String, String> tableParameters = tableInfo.getParameters();
        if (tableParameters != null) {
            cdapName = tableParameters.get(Constants.Explore.CDAP_NAME);
        }
        // tables created after CDAP 2.6 should set the "cdap.name" property, but older ones
        // do not. So also check if it uses a cdap storage handler.
        String storageHandler = tableInfo.getParameters().get("storage_handler");
        boolean isDatasetTable = cdapName != null || DatasetStorageHandler.class.getName().equals(storageHandler) || StreamStorageHandler.class.getName().equals(storageHandler);
        return new TableInfo(tableInfo.getTableName(), tableInfo.getDbName(), tableInfo.getOwner(), (long) tableInfo.getCreateTime() * 1000, (long) tableInfo.getLastAccessTime() * 1000, tableInfo.getRetention(), partitionKeysBuilder.build(), tableInfo.getParameters(), tableInfo.getTableType(), schemaBuilder.build(), tableInfo.getSd().getLocation(), tableInfo.getSd().getInputFormat(), tableInfo.getSd().getOutputFormat(), tableInfo.getSd().isCompressed(), tableInfo.getSd().getNumBuckets(), tableInfo.getSd().getSerdeInfo().getSerializationLib(), tableInfo.getSd().getSerdeInfo().getParameters(), isDatasetTable);
    } catch (NoSuchObjectException e) {
        throw new TableNotFoundException(e);
    } catch (TException e) {
        throw new ExploreException(e);
    }
}
Also used : TException(org.apache.thrift.TException) Table(org.apache.hadoop.hive.metastore.api.Table) ImmutableList(com.google.common.collect.ImmutableList) FieldSchema(org.apache.hadoop.hive.metastore.api.FieldSchema) ExploreException(co.cask.cdap.explore.service.ExploreException) TableNotFoundException(co.cask.cdap.explore.service.TableNotFoundException) DatasetStorageHandler(co.cask.cdap.hive.datasets.DatasetStorageHandler) TableInfo(co.cask.cdap.proto.TableInfo) NoSuchObjectException(org.apache.hadoop.hive.metastore.api.NoSuchObjectException)
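
A minimal caller sketch for getTableInfo, assuming an initialized BaseHiveExploreService. The describeTable helper is hypothetical; it relies only on the signature and exceptions shown above.

// Hypothetical helper: looks up table metadata, returning null when the table does not exist.
static TableInfo describeTable(BaseHiveExploreService exploreService, String namespace, String table)
    throws ExploreException {
    try {
        // passing null for databaseName lets getTableInfo derive the Hive database from the namespace
        return exploreService.getTableInfo(namespace, null, table);
    } catch (TableNotFoundException e) {
        System.err.println("Table " + table + " not found in namespace " + namespace);
        return null;
    }
}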

Aggregations

ExploreException (co.cask.cdap.explore.service.ExploreException): 41
QueryHandle (co.cask.cdap.proto.QueryHandle): 16
HiveSQLException (org.apache.hive.service.cli.HiveSQLException): 14
HttpResponse (co.cask.common.http.HttpResponse): 12
OperationHandle (org.apache.hive.service.cli.OperationHandle): 12
SessionHandle (org.apache.hive.service.cli.SessionHandle): 12
SQLException (java.sql.SQLException): 11
IOException (java.io.IOException): 9
HandleNotFoundException (co.cask.cdap.explore.service.HandleNotFoundException): 6
TableNotFoundException (co.cask.cdap.explore.service.TableNotFoundException): 4
Path (javax.ws.rs.Path): 4
QueryStatus (co.cask.cdap.proto.QueryStatus): 3
FileNotFoundException (java.io.FileNotFoundException): 3
TException (org.apache.thrift.TException): 3
UnsupportedTypeException (co.cask.cdap.api.data.schema.UnsupportedTypeException): 2
NamespaceNotFoundException (co.cask.cdap.common.NamespaceNotFoundException): 2
HBaseDDLExecutor (co.cask.cdap.spi.hbase.HBaseDDLExecutor): 2
ImmutableList (com.google.common.collect.ImmutableList): 2
JsonObject (com.google.gson.JsonObject): 2
HashMap (java.util.HashMap): 2