Example 6 with CatalogException

Use of org.apache.flink.table.catalog.exceptions.CatalogException in project flink by apache.

From the class HiveCatalog, method createDatabase:

@Override
public void createDatabase(String databaseName, CatalogDatabase database, boolean ignoreIfExists) throws DatabaseAlreadyExistException, CatalogException {
    checkArgument(!isNullOrWhitespaceOnly(databaseName), "databaseName cannot be null or empty");
    checkNotNull(database, "database cannot be null");
    Map<String, String> properties = database.getProperties();
    String dbLocationUri = properties.remove(SqlCreateHiveDatabase.DATABASE_LOCATION_URI);
    Database hiveDatabase = new Database(databaseName, database.getComment(), dbLocationUri, properties);
    try {
        client.createDatabase(hiveDatabase);
    } catch (AlreadyExistsException e) {
        if (!ignoreIfExists) {
            throw new DatabaseAlreadyExistException(getName(), hiveDatabase.getName());
        }
    } catch (TException e) {
        throw new CatalogException(String.format("Failed to create database %s", hiveDatabase.getName()), e);
    }
}
Also used: TException(org.apache.thrift.TException) AlreadyExistsException(org.apache.hadoop.hive.metastore.api.AlreadyExistsException) PartitionAlreadyExistsException(org.apache.flink.table.catalog.exceptions.PartitionAlreadyExistsException) CatalogDatabase(org.apache.flink.table.catalog.CatalogDatabase) SqlAlterHiveDatabase(org.apache.flink.sql.parser.hive.ddl.SqlAlterHiveDatabase) SqlCreateHiveDatabase(org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveDatabase) Database(org.apache.hadoop.hive.metastore.api.Database) CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException) DatabaseAlreadyExistException(org.apache.flink.table.catalog.exceptions.DatabaseAlreadyExistException)
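
For orientation, a minimal caller-side sketch of this method follows. This is a sketch under stated assumptions, not Flink documentation: the HiveCatalog constructor arguments (catalog name, default database, Hive conf directory, Hive version string) and the database name and location URI are placeholders, while CatalogDatabaseImpl and SqlCreateHiveDatabase.DATABASE_LOCATION_URI are the actual Flink classes the method above consumes.

import java.util.HashMap;
import java.util.Map;

import org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveDatabase;
import org.apache.flink.table.catalog.CatalogDatabase;
import org.apache.flink.table.catalog.CatalogDatabaseImpl;
import org.apache.flink.table.catalog.exceptions.DatabaseAlreadyExistException;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public class CreateDatabaseSketch {
    public static void main(String[] args) throws DatabaseAlreadyExistException {
        // Placeholder connection details; point these at a real metastore.
        HiveCatalog catalog = new HiveCatalog("myhive", "default", "/etc/hive/conf", "2.3.4");
        catalog.open();
        try {
            Map<String, String> props = new HashMap<>();
            // createDatabase removes this key from the property map and passes
            // its value to the metastore as the database's location URI.
            props.put(SqlCreateHiveDatabase.DATABASE_LOCATION_URI, "hdfs:///warehouse/mydb.db");
            CatalogDatabase db = new CatalogDatabaseImpl(props, "example database");
            // ignoreIfExists = true: an existing database is tolerated silently.
            catalog.createDatabase("mydb", db, true);
        } finally {
            catalog.close();
        }
    }
}

Because ignoreIfExists is true, an AlreadyExistsException from the metastore is swallowed by the implementation above; only genuine Thrift failures reach the caller, as an unchecked CatalogException.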

Example 7 with CatalogException

Use of org.apache.flink.table.catalog.exceptions.CatalogException in project flink by apache.

From the class HiveCatalog, method listPartitions:

@Override
public List<CatalogPartitionSpec> listPartitions(ObjectPath tablePath, CatalogPartitionSpec partitionSpec) throws TableNotExistException, TableNotPartitionedException, PartitionSpecInvalidException, CatalogException {
    checkNotNull(tablePath, "Table path cannot be null");
    checkNotNull(partitionSpec, "CatalogPartitionSpec cannot be null");
    Table hiveTable = getHiveTable(tablePath);
    ensurePartitionedTable(tablePath, hiveTable);
    checkValidPartitionSpec(partitionSpec, getFieldNames(hiveTable.getPartitionKeys()), tablePath);
    try {
        // partition spec can be partial
        List<String> partialVals = HiveReflectionUtils.getPvals(hiveShim, hiveTable.getPartitionKeys(), partitionSpec.getPartitionSpec());
        return client
                .listPartitionNames(
                        tablePath.getDatabaseName(),
                        tablePath.getObjectName(),
                        partialVals,
                        (short) -1)
                .stream()
                .map(HiveCatalog::createPartitionSpec)
                .collect(Collectors.toList());
    } catch (TException e) {
        throw new CatalogException(String.format("Failed to list partitions of table %s", tablePath), e);
    }
}
Also used: TException(org.apache.thrift.TException) CatalogTable(org.apache.flink.table.catalog.CatalogTable) SqlCreateHiveTable(org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable) Table(org.apache.hadoop.hive.metastore.api.Table) CatalogBaseTable(org.apache.flink.table.catalog.CatalogBaseTable) CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException)
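
A partial spec only needs to pin some of the partition keys. As a hedged sketch, assuming an already-opened HiveCatalog and a hypothetical table db1.part_tbl partitioned by (dt, region), listing with only dt fixed returns one CatalogPartitionSpec per matching region:

import java.util.Collections;
import java.util.List;

import org.apache.flink.table.catalog.CatalogPartitionSpec;
import org.apache.flink.table.catalog.ObjectPath;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public class ListPartitionsSketch {
    static void listByDate(HiveCatalog catalog) throws Exception {
        ObjectPath path = new ObjectPath("db1", "part_tbl");
        // Only dt is specified; region is left open, so every region
        // under dt=2024-01-01 matches.
        CatalogPartitionSpec partial =
                new CatalogPartitionSpec(Collections.singletonMap("dt", "2024-01-01"));
        List<CatalogPartitionSpec> specs = catalog.listPartitions(path, partial);
        specs.forEach(s -> System.out.println(s.getPartitionSpec()));
    }
}

Since client.listPartitionNames is called with max parts (short) -1, all matching partitions are returned rather than a bounded page.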

Example 8 with CatalogException

Use of org.apache.flink.table.catalog.exceptions.CatalogException in project flink by apache.

From the class HiveShimV100, method listBuiltInFunctions:

@Override
public Set<String> listBuiltInFunctions() {
    try {
        Method method = FunctionRegistry.class.getDeclaredMethod("getFunctionNames", boolean.class);
        method.setAccessible(true);
        // don't search HMS cause we're only interested in built-in functions
        Set<String> names = (Set<String>) method.invoke(null, false);
        return names.stream().filter(n -> getBuiltInFunctionInfo(n).isPresent()).collect(Collectors.toSet());
    } catch (Exception ex) {
        throw new CatalogException("Failed to invoke FunctionRegistry.getFunctionNames()", ex);
    }
}
Also used: MetaException(org.apache.hadoop.hive.metastore.api.MetaException) Text(org.apache.hadoop.io.Text) DateWritable(org.apache.hadoop.hive.serde2.io.DateWritable) Writable(org.apache.hadoop.io.Writable) SemanticException(org.apache.hadoop.hive.ql.parse.SemanticException) FunctionRegistry(org.apache.hadoop.hive.ql.exec.FunctionRegistry) LongWritable(org.apache.hadoop.io.LongWritable) FileSinkOperator(org.apache.hadoop.hive.ql.exec.FileSinkOperator) HiveChar(org.apache.hadoop.hive.common.type.HiveChar) BigDecimal(java.math.BigDecimal) Configuration(org.apache.hadoop.conf.Configuration) Path(org.apache.hadoop.fs.Path) ObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector) ShortWritable(org.apache.hadoop.hive.serde2.io.ShortWritable) Method(java.lang.reflect.Method) IntWritable(org.apache.hadoop.io.IntWritable) FlinkHiveException(org.apache.flink.connectors.hive.FlinkHiveException) HiveVarcharWritable(org.apache.hadoop.hive.serde2.io.HiveVarcharWritable) Timestamp(java.sql.Timestamp) Set(java.util.Set) FunctionInfo(org.apache.hadoop.hive.ql.exec.FunctionInfo) HiveOutputFormat(org.apache.hadoop.hive.ql.io.HiveOutputFormat) Preconditions(org.apache.flink.util.Preconditions) Collectors(java.util.stream.Collectors) InvocationTargetException(java.lang.reflect.InvocationTargetException) List(java.util.List) BooleanWritable(org.apache.hadoop.io.BooleanWritable) HiveReflectionUtils(org.apache.flink.table.catalog.hive.util.HiveReflectionUtils) LogicalType(org.apache.flink.table.types.logical.LogicalType) IMetaStoreClient(org.apache.hadoop.hive.metastore.IMetaStoreClient) LocalDate(java.time.LocalDate) Optional(java.util.Optional) UniqueConstraint(org.apache.flink.table.api.constraints.UniqueConstraint) InvalidOperationException(org.apache.hadoop.hive.metastore.api.InvalidOperationException) HiveMetaStoreClient(org.apache.hadoop.hive.metastore.HiveMetaStoreClient) ByteWritable(org.apache.hadoop.hive.serde2.io.ByteWritable) LocalDateTime(java.time.LocalDateTime) CatalogColumnStatisticsDataDate(org.apache.flink.table.catalog.stats.CatalogColumnStatisticsDataDate) Partition(org.apache.hadoop.hive.metastore.api.Partition) Constructor(java.lang.reflect.Constructor) HiveCharWritable(org.apache.hadoop.hive.serde2.io.HiveCharWritable) ArrayList(java.util.ArrayList) HiveVarchar(org.apache.hadoop.hive.common.type.HiveVarchar) SimpleGenericUDAFParameterInfo(org.apache.hadoop.hive.ql.udf.generic.SimpleGenericUDAFParameterInfo) BytesWritable(org.apache.hadoop.io.BytesWritable) TimestampWritable(org.apache.hadoop.hive.serde2.io.TimestampWritable) DoubleWritable(org.apache.hadoop.hive.serde2.io.DoubleWritable) Nonnull(javax.annotation.Nonnull) RowData(org.apache.flink.table.data.RowData) Properties(java.util.Properties) UnknownDBException(org.apache.hadoop.hive.metastore.api.UnknownDBException) BulkWriter(org.apache.flink.api.common.serialization.BulkWriter) Reporter(org.apache.hadoop.mapred.Reporter) FunctionUtils(org.apache.hadoop.hive.ql.exec.FunctionUtils) HiveConf(org.apache.hadoop.hive.conf.HiveConf) TException(org.apache.thrift.TException) Table(org.apache.hadoop.hive.metastore.api.Table) Date(java.sql.Date) OrcNoHiveBulkWriterFactory(org.apache.flink.orc.nohive.OrcNoHiveBulkWriterFactory) JobConf(org.apache.hadoop.mapred.JobConf) FieldSchema(org.apache.hadoop.hive.metastore.api.FieldSchema) HiveDecimal(org.apache.hadoop.hive.common.type.HiveDecimal) HiveFileFormatUtils(org.apache.hadoop.hive.ql.io.HiveFileFormatUtils) Deserializer(org.apache.hadoop.hive.serde2.Deserializer) HiveDecimalWritable(org.apache.hadoop.hive.serde2.io.HiveDecimalWritable) CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException) FloatWritable(org.apache.hadoop.io.FloatWritable) Collections(java.util.Collections) ColumnStatisticsData(org.apache.hadoop.hive.metastore.api.ColumnStatisticsData)
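
The same reflect-then-wrap pattern generalizes to any Hive API that is private or version-dependent. A minimal standalone sketch of the shape (the owner class and method name here are illustrative; the FunctionRegistry call above is the real use):

import java.lang.reflect.Method;
import java.util.Set;

import org.apache.flink.table.catalog.exceptions.CatalogException;

public class ReflectiveShimSketch {
    @SuppressWarnings("unchecked")
    static Set<String> invokeStaticNameGetter(Class<?> owner, String methodName) {
        try {
            // The target method is package-private in some Hive versions,
            // hence getDeclaredMethod + setAccessible.
            Method method = owner.getDeclaredMethod(methodName, boolean.class);
            method.setAccessible(true);
            // First argument null: the method is static, there is no receiver.
            return (Set<String>) method.invoke(null, false);
        } catch (Exception ex) {
            // Reflection failures and cast failures alike become the
            // unchecked CatalogException that catalog callers expect.
            throw new CatalogException(
                    "Failed to invoke " + owner.getName() + "#" + methodName, ex);
        }
    }
}

Catching the broad Exception and rethrowing as CatalogException keeps the Catalog SPI free of reflection-specific checked exceptions.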

Example 9 with CatalogException

Use of org.apache.flink.table.catalog.exceptions.CatalogException in project flink by apache.

From the class HiveShimV100, method alterPartition:

@Override
public void alterPartition(IMetaStoreClient client, String databaseName, String tableName, Partition partition) throws InvalidOperationException, MetaException, TException {
    String errorMsg = "Failed to alter partition for table %s in database %s";
    try {
        Method method = client.getClass().getMethod("alter_partition", String.class, String.class, Partition.class);
        method.invoke(client, databaseName, tableName, partition);
    } catch (InvocationTargetException ite) {
        Throwable targetEx = ite.getTargetException();
        if (targetEx instanceof TException) {
            throw (TException) targetEx;
        } else {
            throw new CatalogException(String.format(errorMsg, tableName, databaseName), targetEx);
        }
    } catch (NoSuchMethodException | IllegalAccessException e) {
        throw new CatalogException(String.format(errorMsg, tableName, databaseName), e);
    }
}
Also used: TException(org.apache.thrift.TException) CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException) Method(java.lang.reflect.Method) InvocationTargetException(java.lang.reflect.InvocationTargetException)
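
The key detail is the InvocationTargetException unwrap: Method.invoke wraps whatever the target throws, so the shim re-throws Thrift exceptions with their original type and converts everything else. A hedged, generic sketch of that pattern (the helper name is illustrative):

import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

import org.apache.flink.table.catalog.exceptions.CatalogException;
import org.apache.thrift.TException;

public class UnwrapSketch {
    static void invokePreservingTException(Object target, Method method, Object... args)
            throws TException {
        try {
            method.invoke(target, args);
        } catch (InvocationTargetException ite) {
            Throwable cause = ite.getTargetException();
            if (cause instanceof TException) {
                // Metastore errors keep their declared checked type.
                throw (TException) cause;
            }
            throw new CatalogException("Reflective call failed: " + method.getName(), cause);
        } catch (IllegalAccessException e) {
            throw new CatalogException("Cannot access " + method.getName(), e);
        }
    }
}

Without the unwrap, callers would see every metastore error as a generic reflection failure instead of the TException the method signature promises.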

Example 10 with CatalogException

Use of org.apache.flink.table.catalog.exceptions.CatalogException in project flink by apache.

From the class HiveCatalog, method alterPartitionColumnStatistics:

@Override
public void alterPartitionColumnStatistics(ObjectPath tablePath, CatalogPartitionSpec partitionSpec, CatalogColumnStatistics columnStatistics, boolean ignoreIfNotExists) throws PartitionNotExistException, CatalogException {
    try {
        Partition hivePartition = getHivePartition(tablePath, partitionSpec);
        Table hiveTable = getHiveTable(tablePath);
        String partName = getEscapedPartitionName(tablePath, partitionSpec, hiveTable);
        client.updatePartitionColumnStatistics(
                HiveStatsUtil.createPartitionColumnStats(
                        hivePartition,
                        partName,
                        columnStatistics.getColumnStatisticsData(),
                        hiveVersion));
    } catch (TableNotExistException | PartitionSpecInvalidException e) {
        if (!ignoreIfNotExists) {
            throw new PartitionNotExistException(getName(), tablePath, partitionSpec, e);
        }
    } catch (TException e) {
        throw new CatalogException(String.format("Failed to alter table column stats of table %s 's partition %s", tablePath.getFullName(), String.valueOf(partitionSpec)), e);
    }
}
Also used: TException(org.apache.thrift.TException) Partition(org.apache.hadoop.hive.metastore.api.Partition) CatalogPartition(org.apache.flink.table.catalog.CatalogPartition) CatalogTable(org.apache.flink.table.catalog.CatalogTable) SqlCreateHiveTable(org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable) Table(org.apache.hadoop.hive.metastore.api.Table) CatalogBaseTable(org.apache.flink.table.catalog.CatalogBaseTable) TableNotExistException(org.apache.flink.table.catalog.exceptions.TableNotExistException) CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException) PartitionNotExistException(org.apache.flink.table.catalog.exceptions.PartitionNotExistException) PartitionSpecInvalidException(org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException)
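
As a caller-side sketch, assuming an opened HiveCatalog and the same hypothetical db1.part_tbl table as above. Unlike listPartitions, getHivePartition needs a full spec here, so both partition keys are pinned; the empty column-statistics map is a placeholder that real callers populate per column:

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

import org.apache.flink.table.catalog.CatalogPartitionSpec;
import org.apache.flink.table.catalog.ObjectPath;
import org.apache.flink.table.catalog.hive.HiveCatalog;
import org.apache.flink.table.catalog.stats.CatalogColumnStatistics;

public class AlterPartitionStatsSketch {
    static void updateStats(HiveCatalog catalog) throws Exception {
        ObjectPath path = new ObjectPath("db1", "part_tbl");
        // A full spec is required: every partition key must have a value.
        Map<String, String> full = new HashMap<>();
        full.put("dt", "2024-01-01");
        full.put("region", "us");
        CatalogPartitionSpec spec = new CatalogPartitionSpec(full);
        // An empty map is syntactically valid; populate it with per-column
        // CatalogColumnStatisticsDataBase values in real code.
        CatalogColumnStatistics stats = new CatalogColumnStatistics(Collections.emptyMap());
        // ignoreIfNotExists = true: a missing partition is ignored rather
        // than raising PartitionNotExistException.
        catalog.alterPartitionColumnStatistics(path, spec, stats, true);
    }
}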

Aggregations

CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException): 53 usages
TException (org.apache.thrift.TException): 28 usages
TableNotExistException (org.apache.flink.table.catalog.exceptions.TableNotExistException): 16 usages
Table (org.apache.hadoop.hive.metastore.api.Table): 15 usages
InvocationTargetException (java.lang.reflect.InvocationTargetException): 14 usages
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 14 usages
Method (java.lang.reflect.Method): 13 usages
CatalogBaseTable (org.apache.flink.table.catalog.CatalogBaseTable): 13 usages
SqlCreateHiveTable (org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable): 12 usages
PartitionNotExistException (org.apache.flink.table.catalog.exceptions.PartitionNotExistException): 9 usages
PartitionSpecInvalidException (org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException): 9 usages
MetaException (org.apache.hadoop.hive.metastore.api.MetaException): 9 usages
ArrayList (java.util.ArrayList): 8 usages
List (java.util.List): 8 usages
FlinkHiveException (org.apache.flink.connectors.hive.FlinkHiveException): 8 usages
DatabaseNotExistException (org.apache.flink.table.catalog.exceptions.DatabaseNotExistException): 8 usages
CatalogPartition (org.apache.flink.table.catalog.CatalogPartition): 7 usages
NoSuchObjectException (org.apache.hadoop.hive.metastore.api.NoSuchObjectException): 7 usages
PartitionAlreadyExistsException (org.apache.flink.table.catalog.exceptions.PartitionAlreadyExistsException): 6 usages
InvalidOperationException (org.apache.hadoop.hive.metastore.api.InvalidOperationException): 6 usages