Example 16 with InvalidObjectException

use of org.apache.hadoop.hive.metastore.api.InvalidObjectException in project hive by apache.

the class HiveAlterHandler method updatePartColumnStats.

private void updatePartColumnStats(RawStore msdb, String dbName, String tableName, List<String> partVals, Partition newPart) throws MetaException, InvalidObjectException {
    dbName = HiveStringUtils.normalizeIdentifier(dbName);
    tableName = HiveStringUtils.normalizeIdentifier(tableName);
    String newDbName = HiveStringUtils.normalizeIdentifier(newPart.getDbName());
    String newTableName = HiveStringUtils.normalizeIdentifier(newPart.getTableName());
    Table oldTable = msdb.getTable(dbName, tableName);
    if (oldTable == null) {
        return;
    }
    try {
        String oldPartName = Warehouse.makePartName(oldTable.getPartitionKeys(), partVals);
        String newPartName = Warehouse.makePartName(oldTable.getPartitionKeys(), newPart.getValues());
        if (!dbName.equals(newDbName) || !tableName.equals(newTableName) || !oldPartName.equals(newPartName)) {
            msdb.deletePartitionColumnStatistics(dbName, tableName, oldPartName, partVals, null);
        } else {
            Partition oldPartition = msdb.getPartition(dbName, tableName, partVals);
            if (oldPartition == null) {
                return;
            }
            if (oldPartition.getSd() != null && newPart.getSd() != null) {
                List<FieldSchema> oldCols = oldPartition.getSd().getCols();
                if (!MetaStoreUtils.columnsIncluded(oldCols, newPart.getSd().getCols())) {
                    updatePartColumnStatsForAlterColumns(msdb, oldPartition, oldPartName, partVals, oldCols, newPart);
                }
            }
        }
    } catch (NoSuchObjectException nsoe) {
        LOG.debug("Could not find db entry." + nsoe);
    //ignore
    } catch (InvalidInputException iie) {
        throw new InvalidObjectException("Invalid input to update partition column stats." + iie);
    }
}
Also used : Partition(org.apache.hadoop.hive.metastore.api.Partition) InvalidInputException(org.apache.hadoop.hive.metastore.api.InvalidInputException) Table(org.apache.hadoop.hive.metastore.api.Table) FieldSchema(org.apache.hadoop.hive.metastore.api.FieldSchema) NoSuchObjectException(org.apache.hadoop.hive.metastore.api.NoSuchObjectException) InvalidObjectException(org.apache.hadoop.hive.metastore.api.InvalidObjectException)
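
The two catch blocks above translate lower-level failures: a missing database or partition is logged and ignored, while bad input is rethrown as InvalidObjectException. Because the Thrift-generated InvalidObjectException carries only a message, the original exception has to be folded into the message text, as the example does. A minimal, self-contained sketch of that wrapping pattern follows; the readStats helper and its error message are hypothetical and not part of HiveAlterHandler:

import org.apache.hadoop.hive.metastore.api.InvalidInputException;
import org.apache.hadoop.hive.metastore.api.InvalidObjectException;

public class StatsUpdateSketch {

    // Hypothetical stand-in for a RawStore statistics call that rejects its input.
    private static void readStats(String partName) throws InvalidInputException {
        throw new InvalidInputException("Unknown column in partition " + partName);
    }

    public static void main(String[] args) throws InvalidObjectException {
        try {
            readStats("ds=2017-01-01");
        } catch (InvalidInputException iie) {
            // Same translation as updatePartColumnStats: fold the original failure into the message.
            throw new InvalidObjectException("Invalid input to update partition column stats. " + iie);
        }
    }
}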

Example 17 with InvalidObjectException

use of org.apache.hadoop.hive.metastore.api.InvalidObjectException in project hive by apache.

the class HBaseStore method addRole.

@Override
public boolean addRole(String roleName, String ownerName) throws InvalidObjectException, MetaException, NoSuchObjectException {
    int now = (int) (System.currentTimeMillis() / 1000);
    Role role = new Role(roleName, now, ownerName);
    boolean commit = false;
    openTransaction();
    try {
        if (getHBase().getRole(roleName) != null) {
            throw new InvalidObjectException("Role " + roleName + " already exists");
        }
        getHBase().putRole(role);
        commit = true;
        return true;
    } catch (IOException e) {
        LOG.error("Unable to create role ", e);
        throw new MetaException("Unable to read from or write to hbase " + e.getMessage());
    } finally {
        commitOrRoleBack(commit);
    }
}
Also used : Role(org.apache.hadoop.hive.metastore.api.Role) InvalidObjectException(org.apache.hadoop.hive.metastore.api.InvalidObjectException) IOException(java.io.IOException) MetaException(org.apache.hadoop.hive.metastore.api.MetaException)
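
From a caller's point of view, the notable behaviour here is that a duplicate role name surfaces as InvalidObjectException rather than a dedicated "already exists" exception. A minimal sketch of calling addRole through the RawStore interface; obtaining a configured RawStore (for example an HBaseStore) is elided, and the role and owner names are placeholders:

import org.apache.hadoop.hive.metastore.RawStore;
import org.apache.hadoop.hive.metastore.api.InvalidObjectException;
import org.apache.hadoop.hive.metastore.api.MetaException;
import org.apache.hadoop.hive.metastore.api.NoSuchObjectException;

public class AddRoleSketch {

    // Returns true if the role was created, false if it already existed.
    static boolean addRoleIfAbsent(RawStore store, String roleName, String owner) throws MetaException, NoSuchObjectException {
        try {
            return store.addRole(roleName, owner);
        } catch (InvalidObjectException e) {
            // HBaseStore signals a duplicate role with InvalidObjectException.
            return false;
        }
    }
}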

Example 18 with InvalidObjectException

use of org.apache.hadoop.hive.metastore.api.InvalidObjectException in project metacat by Netflix.

the class HiveConnectorTableService method create.

/**
     * Create a table.
     *
     * @param requestContext The request context
     * @param tableInfo      The resource metadata
     */
@Override
public void create(@Nonnull @NonNull final ConnectorContext requestContext, @Nonnull @NonNull final TableInfo tableInfo) {
    final QualifiedName tableName = tableInfo.getName();
    try {
        final Table table = hiveMetacatConverters.fromTableInfo(tableInfo);
        updateTable(requestContext, table, tableInfo);
        metacatHiveClient.createTable(table);
    } catch (AlreadyExistsException exception) {
        throw new TableAlreadyExistsException(tableName, exception);
    } catch (MetaException exception) {
        throw new InvalidMetaException(tableName, exception);
    } catch (NoSuchObjectException | InvalidObjectException exception) {
        throw new DatabaseNotFoundException(QualifiedName.ofDatabase(tableName.getCatalogName(), tableName.getDatabaseName()), exception);
    } catch (TException exception) {
        throw new ConnectorException(String.format("Failed create hive table %s", tableName), exception);
    }
}
Also used : TException(org.apache.thrift.TException) TableAlreadyExistsException(com.netflix.metacat.common.server.connectors.exception.TableAlreadyExistsException) Table(org.apache.hadoop.hive.metastore.api.Table) AlreadyExistsException(org.apache.hadoop.hive.metastore.api.AlreadyExistsException) QualifiedName(com.netflix.metacat.common.QualifiedName) DatabaseNotFoundException(com.netflix.metacat.common.server.connectors.exception.DatabaseNotFoundException) ConnectorException(com.netflix.metacat.common.server.connectors.exception.ConnectorException) NoSuchObjectException(org.apache.hadoop.hive.metastore.api.NoSuchObjectException) InvalidObjectException(org.apache.hadoop.hive.metastore.api.InvalidObjectException) InvalidMetaException(com.netflix.metacat.common.server.connectors.exception.InvalidMetaException) MetaException(org.apache.hadoop.hive.metastore.api.MetaException)
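
The catch chain above converts Hive metastore Thrift exceptions into Metacat's connector exceptions, which are unchecked since create declares no throws clause. A minimal sketch of the same mapping written as a standalone helper; the translate method is hypothetical and simply mirrors the catch blocks of create:

import com.netflix.metacat.common.QualifiedName;
import com.netflix.metacat.common.server.connectors.exception.ConnectorException;
import com.netflix.metacat.common.server.connectors.exception.DatabaseNotFoundException;
import com.netflix.metacat.common.server.connectors.exception.InvalidMetaException;
import com.netflix.metacat.common.server.connectors.exception.TableAlreadyExistsException;
import org.apache.hadoop.hive.metastore.api.AlreadyExistsException;
import org.apache.hadoop.hive.metastore.api.InvalidObjectException;
import org.apache.hadoop.hive.metastore.api.MetaException;
import org.apache.hadoop.hive.metastore.api.NoSuchObjectException;
import org.apache.thrift.TException;

public class ExceptionTranslationSketch {

    // Mirrors the catch chain of HiveConnectorTableService.create: map metastore
    // Thrift exceptions onto Metacat connector exceptions for the given table name.
    static RuntimeException translate(QualifiedName tableName, TException exception) {
        if (exception instanceof AlreadyExistsException) {
            return new TableAlreadyExistsException(tableName, exception);
        } else if (exception instanceof MetaException) {
            return new InvalidMetaException(tableName, exception);
        } else if (exception instanceof NoSuchObjectException || exception instanceof InvalidObjectException) {
            return new DatabaseNotFoundException(QualifiedName.ofDatabase(tableName.getCatalogName(), tableName.getDatabaseName()), exception);
        } else {
            return new ConnectorException(String.format("Failed create hive table %s", tableName), exception);
        }
    }
}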

Example 19 with InvalidObjectException

use of org.apache.hadoop.hive.metastore.api.InvalidObjectException in project metacat by Netflix.

the class MetacatHiveClient method addDropPartitions.

/**
     * {@inheritDoc}.
     */
@Override
public void addDropPartitions(final String dbName, final String tableName, final List<Partition> partitions, final List<String> delPartitionNames) throws TException {
    try (HiveMetastoreClient client = createMetastoreClient()) {
        try {
            dropHivePartitions(client, dbName, tableName, delPartitionNames);
            client.add_partitions(partitions);
        } catch (MetaException | InvalidObjectException e) {
            throw new InvalidMetaException("One or more partitions are invalid.", e);
        } catch (TException e) {
            throw new TException(String.format("Internal server error adding/dropping partitions for table %s.%s", dbName, tableName), e);
        }
    }
}
Also used : TException(org.apache.thrift.TException) InvalidObjectException(org.apache.hadoop.hive.metastore.api.InvalidObjectException) InvalidMetaException(com.netflix.metacat.common.server.connectors.exception.InvalidMetaException) MetaException(org.apache.hadoop.hive.metastore.api.MetaException)
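
For a caller, the add and the drop run against one metastore client, and invalid partitions come back as Metacat's InvalidMetaException instead of the raw MetaException or InvalidObjectException. A minimal caller sketch; PartitionSwapper is a hypothetical interface standing in for MetacatHiveClient, whose package is not shown in this excerpt:

import java.util.Collections;
import java.util.List;
import com.netflix.metacat.common.server.connectors.exception.InvalidMetaException;
import org.apache.hadoop.hive.metastore.api.Partition;
import org.apache.thrift.TException;

public class AddDropPartitionsSketch {

    // Hypothetical stand-in for MetacatHiveClient#addDropPartitions; only the call shape is assumed.
    interface PartitionSwapper {
        void addDropPartitions(String dbName, String tableName, List<Partition> add, List<String> dropNames) throws TException;
    }

    static void swapPartition(PartitionSwapper client, Partition newPartition, String oldPartitionName) throws TException {
        try {
            client.addDropPartitions(newPartition.getDbName(), newPartition.getTableName(),
                    Collections.singletonList(newPartition), Collections.singletonList(oldPartitionName));
        } catch (InvalidMetaException e) {
            // The metastore rejected one or more partitions as invalid.
            throw e;
        }
    }
}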

Example 20 with InvalidObjectException

use of org.apache.hadoop.hive.metastore.api.InvalidObjectException in project metacat by Netflix.

the class MetacatHMSHandler method add_drop_partitions.

/**
     * Adds and drops partitions in one transaction.
     *
     * @param databaseName database name
     * @param tableName    table name
     * @param addParts     list of partitions
     * @param dropParts    list of partition values
     * @param deleteData   if true, deletes the data
     * @return true if successful
     * @throws NoSuchObjectException Exception if the table does not exist
     * @throws MetaException         Exception if an internal metastore error occurs
     * @throws TException            any internal exception
     */
@SuppressWarnings({ "checkstyle:methodname" })
public boolean add_drop_partitions(final String databaseName, final String tableName, final List<Partition> addParts, final List<List<String>> dropParts, final boolean deleteData) throws NoSuchObjectException, MetaException, TException {
    startFunction("add_drop_partitions : db=" + databaseName + " tbl=" + tableName);
    if (addParts.size() == 0 && dropParts.size() == 0) {
        return true;
    }
    for (List<String> partVals : dropParts) {
        LOG.info("Drop Partition values:" + partVals);
    }
    for (Partition part : addParts) {
        LOG.info("Add Partition values:" + part);
    }
    boolean ret = false;
    Exception ex = null;
    try {
        ret = addDropPartitionsCore(getMS(), databaseName, tableName, addParts, dropParts, false, null);
    } catch (Exception e) {
        ex = e;
        if (e instanceof MetaException) {
            throw (MetaException) e;
        } else if (e instanceof InvalidObjectException) {
            throw (InvalidObjectException) e;
        } else if (e instanceof AlreadyExistsException) {
            throw (AlreadyExistsException) e;
        } else if (e instanceof NoSuchObjectException) {
            throw (NoSuchObjectException) e;
        } else {
            throw newMetaException(e);
        }
    } finally {
        endFunction("drop_partitions", ret, ex, tableName);
    }
    return ret;
}
Also used : Partition(org.apache.hadoop.hive.metastore.api.Partition) AlreadyExistsException(org.apache.hadoop.hive.metastore.api.AlreadyExistsException) InvalidObjectException(org.apache.hadoop.hive.metastore.api.InvalidObjectException) NoSuchObjectException(org.apache.hadoop.hive.metastore.api.NoSuchObjectException) MetaException(org.apache.hadoop.hive.metastore.api.MetaException) InvalidInputException(org.apache.hadoop.hive.metastore.api.InvalidInputException) TException(org.apache.thrift.TException) IOException(java.io.IOException)
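
Unlike the Metacat connector layer above, this Thrift handler keeps the metastore's own checked exceptions: known types are rethrown untouched and anything else is wrapped in a MetaException. A minimal sketch of that dispatch; since the handler's newMetaException helper is not shown here, a plain MetaException stands in for it:

import org.apache.hadoop.hive.metastore.api.AlreadyExistsException;
import org.apache.hadoop.hive.metastore.api.InvalidObjectException;
import org.apache.hadoop.hive.metastore.api.MetaException;
import org.apache.hadoop.hive.metastore.api.NoSuchObjectException;
import org.apache.thrift.TException;

public class RethrowSketch {

    // Mirrors the catch block of add_drop_partitions: known metastore exception types are
    // returned as-is, anything else is wrapped in a MetaException built from the message.
    static TException toThriftException(Exception e) {
        if (e instanceof MetaException) {
            return (MetaException) e;
        } else if (e instanceof InvalidObjectException) {
            return (InvalidObjectException) e;
        } else if (e instanceof AlreadyExistsException) {
            return (AlreadyExistsException) e;
        } else if (e instanceof NoSuchObjectException) {
            return (NoSuchObjectException) e;
        } else {
            return new MetaException(e.getMessage());
        }
    }
}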

Aggregations

InvalidObjectException (org.apache.hadoop.hive.metastore.api.InvalidObjectException): 36 usages
MetaException (org.apache.hadoop.hive.metastore.api.MetaException): 21 usages
NoSuchObjectException (org.apache.hadoop.hive.metastore.api.NoSuchObjectException): 21 usages
Table (org.apache.hadoop.hive.metastore.api.Table): 14 usages
TException (org.apache.thrift.TException): 14 usages
ArrayList (java.util.ArrayList): 13 usages
Partition (org.apache.hadoop.hive.metastore.api.Partition): 11 usages
IOException (java.io.IOException): 8 usages
AlreadyExistsException (org.apache.hadoop.hive.metastore.api.AlreadyExistsException): 8 usages
InvalidInputException (org.apache.hadoop.hive.metastore.api.InvalidInputException): 8 usages
MTable (org.apache.hadoop.hive.metastore.model.MTable): 8 usages
FieldSchema (org.apache.hadoop.hive.metastore.api.FieldSchema): 7 usages
InvalidMetaException (com.netflix.metacat.common.server.connectors.exception.InvalidMetaException): 6 usages
ConnectorException (com.netflix.metacat.common.server.connectors.exception.ConnectorException): 5 usages
List (java.util.List): 5 usages
TableNotFoundException (com.netflix.metacat.common.server.connectors.exception.TableNotFoundException): 4 usages
UnknownDBException (org.apache.hadoop.hive.metastore.api.UnknownDBException): 4 usages
MConstraint (org.apache.hadoop.hive.metastore.model.MConstraint): 4 usages
InvalidOperationException (org.apache.hadoop.hive.metastore.api.InvalidOperationException): 3 usages
SerDeInfo (org.apache.hadoop.hive.metastore.api.SerDeInfo): 3 usages