
Example 1 with PartitionSpecInvalidException

Use of org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException in the Apache Flink project.

From the class HiveCatalog, method getPartitionColumnStatistics.

@Override
public CatalogColumnStatistics getPartitionColumnStatistics(ObjectPath tablePath, CatalogPartitionSpec partitionSpec) throws PartitionNotExistException, CatalogException {
    try {
        Partition partition = getHivePartition(tablePath, partitionSpec);
        Table hiveTable = getHiveTable(tablePath);
        String partName = getEscapedPartitionName(tablePath, partitionSpec, hiveTable);
        List<String> partNames = new ArrayList<>();
        partNames.add(partName);
        Map<String, List<ColumnStatisticsObj>> partitionColumnStatistics = client.getPartitionColumnStatistics(partition.getDbName(), partition.getTableName(), partNames, getFieldNames(partition.getSd().getCols()));
        List<ColumnStatisticsObj> columnStatisticsObjs = partitionColumnStatistics.get(partName);
        if (columnStatisticsObjs != null && !columnStatisticsObjs.isEmpty()) {
            return new CatalogColumnStatistics(HiveStatsUtil.createCatalogColumnStats(columnStatisticsObjs, hiveVersion));
        } else {
            return CatalogColumnStatistics.UNKNOWN;
        }
    } catch (TableNotExistException | PartitionSpecInvalidException e) {
        throw new PartitionNotExistException(getName(), tablePath, partitionSpec);
    } catch (TException e) {
        throw new CatalogException(String.format("Failed to get table stats of table %s 's partition %s", tablePath.getFullName(), String.valueOf(partitionSpec)), e);
    }
}
Also used: TException (org.apache.thrift.TException), Partition (org.apache.hadoop.hive.metastore.api.Partition), CatalogPartition (org.apache.flink.table.catalog.CatalogPartition), CatalogTable (org.apache.flink.table.catalog.CatalogTable), SqlCreateHiveTable (org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable), Table (org.apache.hadoop.hive.metastore.api.Table), CatalogBaseTable (org.apache.flink.table.catalog.CatalogBaseTable), TableNotExistException (org.apache.flink.table.catalog.exceptions.TableNotExistException), ArrayList (java.util.ArrayList), CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException), CatalogColumnStatistics (org.apache.flink.table.catalog.stats.CatalogColumnStatistics), ColumnStatisticsObj (org.apache.hadoop.hive.metastore.api.ColumnStatisticsObj), List (java.util.List), PartitionNotExistException (org.apache.flink.table.catalog.exceptions.PartitionNotExistException), PartitionSpecInvalidException (org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException)
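
For context, here is a minimal, hedged sketch of how a caller might exercise this method through the Catalog interface. The `catalog` instance, database, table, and partition values are assumptions, and the required imports are the Flink catalog classes listed above plus java.util.

// Hypothetical caller-side usage; `catalog` is assumed to be an already
// configured HiveCatalog (or any org.apache.flink.table.catalog.Catalog).
ObjectPath tablePath = new ObjectPath("mydb", "orders");
Map<String, String> spec = new LinkedHashMap<>();
spec.put("dt", "2021-01-01");
CatalogPartitionSpec partitionSpec = new CatalogPartitionSpec(spec);
try {
    CatalogColumnStatistics stats = catalog.getPartitionColumnStatistics(tablePath, partitionSpec);
    // CatalogColumnStatistics.UNKNOWN is returned when no column stats are stored
    stats.getColumnStatisticsData().forEach((col, data) -> System.out.println(col + " -> " + data));
} catch (PartitionNotExistException e) {
    // a missing table and an invalid partition spec both surface here
}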

Example 2 with PartitionSpecInvalidException

Use of org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException in the Apache Flink project.

From the class HiveCatalog, method getPartition.

@Override
public CatalogPartition getPartition(ObjectPath tablePath, CatalogPartitionSpec partitionSpec) throws PartitionNotExistException, CatalogException {
    checkNotNull(tablePath, "Table path cannot be null");
    checkNotNull(partitionSpec, "CatalogPartitionSpec cannot be null");
    try {
        Partition hivePartition = getHivePartition(tablePath, partitionSpec);
        Map<String, String> properties = hivePartition.getParameters();
        properties.put(SqlCreateHiveTable.TABLE_LOCATION_URI, hivePartition.getSd().getLocation());
        String comment = properties.remove(HiveCatalogConfig.COMMENT);
        return new CatalogPartitionImpl(properties, comment);
    } catch (NoSuchObjectException | MetaException | TableNotExistException | PartitionSpecInvalidException e) {
        throw new PartitionNotExistException(getName(), tablePath, partitionSpec, e);
    } catch (TException e) {
        throw new CatalogException(String.format("Failed to get partition %s of table %s", partitionSpec, tablePath), e);
    }
}
Also used: TException (org.apache.thrift.TException), Partition (org.apache.hadoop.hive.metastore.api.Partition), CatalogPartition (org.apache.flink.table.catalog.CatalogPartition), TableNotExistException (org.apache.flink.table.catalog.exceptions.TableNotExistException), CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException), NoSuchObjectException (org.apache.hadoop.hive.metastore.api.NoSuchObjectException), PartitionNotExistException (org.apache.flink.table.catalog.exceptions.PartitionNotExistException), PartitionSpecInvalidException (org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException), CatalogPartitionImpl (org.apache.flink.table.catalog.CatalogPartitionImpl), MetaException (org.apache.hadoop.hive.metastore.api.MetaException)
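
A hedged caller-side sketch of the same API: it fetches a partition and reads back the storage location that the method above copies into the partition properties. The `catalog` instance and all names are hypothetical.

// Hypothetical usage: read a partition's location and comment via the Catalog API.
ObjectPath tablePath = new ObjectPath("mydb", "orders");
CatalogPartitionSpec partitionSpec =
        new CatalogPartitionSpec(Collections.singletonMap("dt", "2021-01-01"));
try {
    CatalogPartition partition = catalog.getPartition(tablePath, partitionSpec);
    // getPartition exposes the partition's storage location under this property key
    String location = partition.getProperties().get(SqlCreateHiveTable.TABLE_LOCATION_URI);
    String comment = partition.getComment();
} catch (PartitionNotExistException e) {
    // NoSuchObjectException, MetaException, TableNotExistException and
    // PartitionSpecInvalidException are all mapped to this exception
}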

Example 3 with PartitionSpecInvalidException

Use of org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException in the Apache Flink project.

From the class HiveCatalog, method instantiateHivePartition.

private Partition instantiateHivePartition(Table hiveTable, CatalogPartitionSpec partitionSpec, CatalogPartition catalogPartition) throws PartitionSpecInvalidException {
    List<String> partCols = getFieldNames(hiveTable.getPartitionKeys());
    List<String> partValues = getOrderedFullPartitionValues(partitionSpec, partCols, new ObjectPath(hiveTable.getDbName(), hiveTable.getTableName()));
    // validate partition values
    for (int i = 0; i < partCols.size(); i++) {
        if (isNullOrWhitespaceOnly(partValues.get(i))) {
            throw new PartitionSpecInvalidException(getName(), partCols, new ObjectPath(hiveTable.getDbName(), hiveTable.getTableName()), partitionSpec);
        }
    }
    // TODO: handle GenericCatalogPartition
    StorageDescriptor sd = hiveTable.getSd().deepCopy();
    sd.setLocation(catalogPartition.getProperties().remove(SqlCreateHiveTable.TABLE_LOCATION_URI));
    Map<String, String> properties = new HashMap<>(catalogPartition.getProperties());
    String comment = catalogPartition.getComment();
    if (comment != null) {
        properties.put(HiveCatalogConfig.COMMENT, comment);
    }
    return HiveTableUtil.createHivePartition(hiveTable.getDbName(), hiveTable.getTableName(), partValues, sd, properties);
}
Also used: ObjectPath (org.apache.flink.table.catalog.ObjectPath), HashMap (java.util.HashMap), StorageDescriptor (org.apache.hadoop.hive.metastore.api.StorageDescriptor), UniqueConstraint (org.apache.flink.table.api.constraints.UniqueConstraint), PartitionSpecInvalidException (org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException)
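
instantiateHivePartition is private and is reached through the public createPartition path. The sketch below, with hypothetical names and a preconfigured `catalog`, shows how a blank partition value would trigger the validation shown above; it is an illustration, not the project's own test code.

// Hypothetical usage of the public createPartition path, which performs the
// partition-value validation shown in instantiateHivePartition.
Map<String, String> spec = new LinkedHashMap<>();
spec.put("dt", "");  // blank value: rejected as an invalid partition spec
CatalogPartitionSpec partitionSpec = new CatalogPartitionSpec(spec);
CatalogPartition partition = new CatalogPartitionImpl(new HashMap<>(), "example partition");
try {
    catalog.createPartition(new ObjectPath("mydb", "orders"), partitionSpec, partition, false);
} catch (PartitionSpecInvalidException e) {
    // thrown by the validation above when a partition value is null or whitespace-only
} catch (Exception e) {
    // other declared failures: TableNotExistException, TableNotPartitionedException,
    // PartitionAlreadyExistsException, CatalogException
}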

Example 4 with PartitionSpecInvalidException

Use of org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException in the Apache Flink project.

From the class HiveCatalog, method alterPartitionColumnStatistics.

@Override
public void alterPartitionColumnStatistics(ObjectPath tablePath, CatalogPartitionSpec partitionSpec, CatalogColumnStatistics columnStatistics, boolean ignoreIfNotExists) throws PartitionNotExistException, CatalogException {
    try {
        Partition hivePartition = getHivePartition(tablePath, partitionSpec);
        Table hiveTable = getHiveTable(tablePath);
        String partName = getEscapedPartitionName(tablePath, partitionSpec, hiveTable);
        client.updatePartitionColumnStatistics(HiveStatsUtil.createPartitionColumnStats(hivePartition, partName, columnStatistics.getColumnStatisticsData(), hiveVersion));
    } catch (TableNotExistException | PartitionSpecInvalidException e) {
        if (!ignoreIfNotExists) {
            throw new PartitionNotExistException(getName(), tablePath, partitionSpec, e);
        }
    } catch (TException e) {
        throw new CatalogException(String.format("Failed to alter table column stats of table %s 's partition %s", tablePath.getFullName(), String.valueOf(partitionSpec)), e);
    }
}
Also used: TException (org.apache.thrift.TException), Partition (org.apache.hadoop.hive.metastore.api.Partition), CatalogPartition (org.apache.flink.table.catalog.CatalogPartition), CatalogTable (org.apache.flink.table.catalog.CatalogTable), SqlCreateHiveTable (org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable), Table (org.apache.hadoop.hive.metastore.api.Table), CatalogBaseTable (org.apache.flink.table.catalog.CatalogBaseTable), TableNotExistException (org.apache.flink.table.catalog.exceptions.TableNotExistException), CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException), PartitionNotExistException (org.apache.flink.table.catalog.exceptions.PartitionNotExistException), PartitionSpecInvalidException (org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException)
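
A sketch of how a caller might build the statistics written here, using the CatalogColumnStatisticsDataLong type from org.apache.flink.table.catalog.stats. The column name, values, and `catalog` instance are hypothetical.

// Hypothetical usage: write long-column statistics for one partition.
Map<String, CatalogColumnStatisticsDataBase> colStats = new HashMap<>();
colStats.put("user_id", new CatalogColumnStatisticsDataLong(1L, 10000L, 9500L, 0L));  // min, max, ndv, nullCount
CatalogColumnStatistics statistics = new CatalogColumnStatistics(colStats);
try {
    catalog.alterPartitionColumnStatistics(
            new ObjectPath("mydb", "orders"),
            new CatalogPartitionSpec(Collections.singletonMap("dt", "2021-01-01")),
            statistics,
            /* ignoreIfNotExists */ true);
} catch (PartitionNotExistException e) {
    // with ignoreIfNotExists = true, a missing partition is silently ignored above
}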

Example 5 with PartitionSpecInvalidException

Use of org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException in the Apache Flink project.

From the class HiveCatalog, method partitionExists.

// ------ partitions ------
@Override
public boolean partitionExists(ObjectPath tablePath, CatalogPartitionSpec partitionSpec) throws CatalogException {
    checkNotNull(tablePath, "Table path cannot be null");
    checkNotNull(partitionSpec, "CatalogPartitionSpec cannot be null");
    try {
        return getHivePartition(tablePath, partitionSpec) != null;
    } catch (NoSuchObjectException | TableNotExistException | PartitionSpecInvalidException e) {
        return false;
    } catch (TException e) {
        throw new CatalogException(String.format("Failed to get partition %s of table %s", partitionSpec, tablePath), e);
    }
}
Also used: TException (org.apache.thrift.TException), TableNotExistException (org.apache.flink.table.catalog.exceptions.TableNotExistException), CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException), NoSuchObjectException (org.apache.hadoop.hive.metastore.api.NoSuchObjectException), PartitionSpecInvalidException (org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException)
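
A final hedged sketch: because invalid or unknown specs make partitionExists return false rather than throw, it works as a cheap guard before mutating calls. Names and the `catalog` instance are again hypothetical.

// Hypothetical existence check before dropping a partition.
ObjectPath tablePath = new ObjectPath("mydb", "orders");
CatalogPartitionSpec partitionSpec =
        new CatalogPartitionSpec(Collections.singletonMap("dt", "2021-01-01"));
try {
    if (catalog.partitionExists(tablePath, partitionSpec)) {
        catalog.dropPartition(tablePath, partitionSpec, /* ignoreIfNotExists */ false);
    }
} catch (PartitionNotExistException e) {
    // possible if the partition disappears between the check and the drop
}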

Aggregations

Classes aggregated across the matched sources, with usage counts:

PartitionSpecInvalidException (org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException): 8
CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException): 7
TableNotExistException (org.apache.flink.table.catalog.exceptions.TableNotExistException): 7
TException (org.apache.thrift.TException): 7
PartitionNotExistException (org.apache.flink.table.catalog.exceptions.PartitionNotExistException): 6
CatalogPartition (org.apache.flink.table.catalog.CatalogPartition): 5
Partition (org.apache.hadoop.hive.metastore.api.Partition): 5
SqlCreateHiveTable (org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable): 4
CatalogBaseTable (org.apache.flink.table.catalog.CatalogBaseTable): 4
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 4
NoSuchObjectException (org.apache.hadoop.hive.metastore.api.NoSuchObjectException): 4
Table (org.apache.hadoop.hive.metastore.api.Table): 4
MetaException (org.apache.hadoop.hive.metastore.api.MetaException): 3
ArrayList (java.util.ArrayList): 1
HashMap (java.util.HashMap): 1
List (java.util.List): 1
AlterTableOp (org.apache.flink.sql.parser.hive.ddl.SqlAlterHiveTable.AlterTableOp): 1
UniqueConstraint (org.apache.flink.table.api.constraints.UniqueConstraint): 1
CatalogPartitionImpl (org.apache.flink.table.catalog.CatalogPartitionImpl): 1
ObjectPath (org.apache.flink.table.catalog.ObjectPath): 1