Search in sources:

Example 41 with CatalogException

Use of org.apache.flink.table.catalog.exceptions.CatalogException in project flink by apache.

From the class HiveShimV120, method listBuiltInFunctions.

@Override
public Set<String> listBuiltInFunctions() {
    try {
        Method method = FunctionRegistry.class.getMethod("getFunctionNames");
        // getFunctionNames is a static method
        Set<String> names = (Set<String>) method.invoke(null);
        return names.stream().filter(n -> getBuiltInFunctionInfo(n).isPresent()).collect(Collectors.toSet());
    } catch (Exception ex) {
        throw new CatalogException("Failed to invoke FunctionRegistry.getFunctionNames()", ex);
    }
}
Also used: FlinkHiveException(org.apache.flink.connectors.hive.FlinkHiveException) Date(org.apache.flink.table.catalog.stats.Date) Array(java.lang.reflect.Array) MetaException(org.apache.hadoop.hive.metastore.api.MetaException) TypeInfoFactory(org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory) HiveConf(org.apache.hadoop.hive.conf.HiveConf) TException(org.apache.thrift.TException) Set(java.util.Set) FunctionInfo(org.apache.hadoop.hive.ql.exec.FunctionInfo) FunctionRegistry(org.apache.hadoop.hive.ql.exec.FunctionRegistry) Field(java.lang.reflect.Field) CatalogColumnStatisticsDataDate(org.apache.flink.table.catalog.stats.CatalogColumnStatisticsDataDate) Constructor(java.lang.reflect.Constructor) Collectors(java.util.stream.Collectors) Table(org.apache.hadoop.hive.metastore.api.Table) InvocationTargetException(java.lang.reflect.InvocationTargetException) IMetaStoreClient(org.apache.hadoop.hive.metastore.IMetaStoreClient) PrimitiveObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector) RetryingMetaStoreClient(org.apache.hadoop.hive.metastore.RetryingMetaStoreClient) PrimitiveTypeInfo(org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo) CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException) Method(java.lang.reflect.Method) ColumnStatisticsData(org.apache.hadoop.hive.metastore.api.ColumnStatisticsData) InvalidOperationException(org.apache.hadoop.hive.metastore.api.InvalidOperationException)
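
The example above reduces to one general reflection pattern: resolve a static, zero-argument method by name and invoke it with a null receiver. Below is a minimal, self-contained sketch of that pattern; the helper name and the use of System.lineSeparator() as a stand-in target are illustrative, not part of Flink or Hive.

import java.lang.reflect.Method;

public class StaticInvokeSketch {

    // Generic helper: look up a public static, zero-argument method and invoke it.
    @SuppressWarnings("unchecked")
    static <T> T invokeStatic(Class<?> clazz, String methodName) throws Exception {
        Method method = clazz.getMethod(methodName);
        // A null receiver is how static methods are invoked reflectively.
        return (T) method.invoke(null);
    }

    public static void main(String[] args) throws Exception {
        // System.lineSeparator() stands in for FunctionRegistry.getFunctionNames().
        String separator = invokeStatic(System.class, "lineSeparator");
        System.out.println(separator.equals(System.lineSeparator())); // prints true
    }
}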

Example 42 with CatalogException

Use of org.apache.flink.table.catalog.exceptions.CatalogException in project flink by apache.

From the class HiveShimV310, method getNotNullColumns.

@Override
public Set<String> getNotNullColumns(IMetaStoreClient client, Configuration conf, String dbName, String tableName) {
    try {
        String hiveDefaultCatalog = getHMSDefaultCatalog(conf);
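        // The NOT NULL constraint classes only exist in Hive 3.x (older shims reject
        // NOT NULL outright), so they are resolved and invoked reflectively here.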
        Class<?> requestClz = Class.forName("org.apache.hadoop.hive.metastore.api.NotNullConstraintsRequest");
        Object request = requestClz.getDeclaredConstructor(String.class, String.class, String.class).newInstance(hiveDefaultCatalog, dbName, tableName);
        List<?> constraints = (List<?>) HiveReflectionUtils.invokeMethod(client.getClass(), client, "getNotNullConstraints", new Class[] { requestClz }, new Object[] { request });
        Class<?> constraintClz = Class.forName("org.apache.hadoop.hive.metastore.api.SQLNotNullConstraint");
        Method colNameMethod = constraintClz.getDeclaredMethod("getColumn_name");
        Method isRelyMethod = constraintClz.getDeclaredMethod("isRely_cstr");
        Set<String> res = new HashSet<>();
        for (Object constraint : constraints) {
            if ((boolean) isRelyMethod.invoke(constraint)) {
                res.add((String) colNameMethod.invoke(constraint));
            }
        }
        return res;
    } catch (Exception e) {
        throw new CatalogException("Failed to get NOT NULL constraints", e);
    }
}
Also used: CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException) ArrayList(java.util.ArrayList) List(java.util.List) Method(java.lang.reflect.Method) FlinkHiveException(org.apache.flink.connectors.hive.FlinkHiveException) InvocationTargetException(java.lang.reflect.InvocationTargetException) HashSet(java.util.HashSet)
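
Everything version-specific in the snippet above flows through three reflective steps: Class.forName to resolve a class that may be absent from the classpath, getDeclaredConstructor(...).newInstance(...) to build the request, and getDeclaredMethod(...).invoke(...) to read the results. Here is a minimal sketch of the same three steps, using java.util.ArrayList as an illustrative stand-in for the Hive 3.x request and constraint classes.

import java.lang.reflect.Method;

public class ForNameSketch {
    public static void main(String[] args) throws Exception {
        // Resolve the class by name, as HiveShimV310 does for NotNullConstraintsRequest.
        Class<?> clz = Class.forName("java.util.ArrayList");
        // Construct an instance through a declared constructor.
        Object list = clz.getDeclaredConstructor().newInstance();
        // Invoke methods by name, mirroring the getColumn_name/isRely_cstr calls.
        Method add = clz.getDeclaredMethod("add", Object.class);
        add.invoke(list, "not_null_col");
        Method size = clz.getDeclaredMethod("size");
        System.out.println((int) size.invoke(list)); // prints 1
    }
}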

Example 43 with CatalogException

Use of org.apache.flink.table.catalog.exceptions.CatalogException in project flink by apache.

From the class HiveShimV110, method getHiveRecordWriter.

@Override
public FileSinkOperator.RecordWriter getHiveRecordWriter(JobConf jobConf, Class outputFormatClz, Class<? extends Writable> outValClz, boolean isCompressed, Properties tableProps, Path outPath) {
    try {
        Class<?> utilClass = HiveFileFormatUtils.class;
        OutputFormat outputFormat = (OutputFormat) outputFormatClz.newInstance();
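        // Resolve the static getRecordWriter by its exact parameter list, then
        // invoke it with a null receiver (it is a static utility method).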
        Method utilMethod = utilClass.getDeclaredMethod("getRecordWriter", JobConf.class, OutputFormat.class, Class.class, boolean.class, Properties.class, Path.class, Reporter.class);
        return (FileSinkOperator.RecordWriter) utilMethod.invoke(null, jobConf, outputFormat, outValClz, isCompressed, tableProps, outPath, Reporter.NULL);
    } catch (Exception e) {
        throw new CatalogException("Failed to create Hive RecordWriter", e);
    }
}
Also used: HiveFileFormatUtils(org.apache.hadoop.hive.ql.io.HiveFileFormatUtils) OutputFormat(org.apache.hadoop.mapred.OutputFormat) CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException) Method(java.lang.reflect.Method) FlinkHiveException(org.apache.flink.connectors.hive.FlinkHiveException) InvocationTargetException(java.lang.reflect.InvocationTargetException)
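
The core of this shim is locating a static utility method by its exact parameter list and invoking it with a null receiver, which is how shims cope with utility signatures that differ between dependency versions. A minimal, self-contained sketch, using Integer.parseInt(String, int) as an illustrative stand-in for HiveFileFormatUtils.getRecordWriter:

import java.lang.reflect.Method;

public class StaticUtilSketch {
    public static void main(String[] args) throws Exception {
        // Resolve the static method by name and exact parameter types.
        Method parseInt = Integer.class.getDeclaredMethod("parseInt", String.class, int.class);
        // Static method, so the receiver is null; arguments follow in declared order.
        int value = (int) parseInt.invoke(null, "ff", 16);
        System.out.println(value); // prints 255
    }
}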

Example 44 with CatalogException

Use of org.apache.flink.table.catalog.exceptions.CatalogException in project flink by apache.

From the class HiveShimV210, method createTableWithConstraints.

@Override
public void createTableWithConstraints(IMetaStoreClient client, Table table, Configuration conf, UniqueConstraint pk, List<Byte> pkTraits, List<String> notNullCols, List<Byte> nnTraits) {
    if (!notNullCols.isEmpty()) {
        throw new UnsupportedOperationException("NOT NULL constraints not supported until 3.0.0");
    }
    try {
        List<Object> hivePKs = createHivePKs(table, pk, pkTraits);
        // createTableWithConstraints takes PK and FK lists
        HiveReflectionUtils.invokeMethod(client.getClass(), client, "createTableWithConstraints", new Class[] { Table.class, List.class, List.class }, new Object[] { table, hivePKs, Collections.emptyList() });
    } catch (Exception e) {
        throw new CatalogException("Failed to create Hive table with constraints", e);
    }
}
Also used: CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException) MetaException(org.apache.hadoop.hive.metastore.api.MetaException) TException(org.apache.thrift.TException) InvocationTargetException(java.lang.reflect.InvocationTargetException) TApplicationException(org.apache.thrift.TApplicationException) InvalidOperationException(org.apache.hadoop.hive.metastore.api.InvalidOperationException)
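
HiveReflectionUtils.invokeMethod is Flink's own helper and its body is not shown on this page; a plausible minimal sketch (an assumption for illustration, not the real implementation) resolves a method on the given class and invokes it on the receiver, as below.

import java.lang.reflect.Method;

public class InvokeMethodSketch {

    // Hypothetical stand-in for HiveReflectionUtils.invokeMethod; the real Flink
    // utility may differ, e.g. in accessibility handling and exception wrapping.
    static Object invokeMethod(Class<?> clz, Object obj, String name, Class<?>[] argClz, Object[] args)
            throws Exception {
        Method method = clz.getMethod(name, argClz);
        return method.invoke(obj, args);
    }

    public static void main(String[] args) throws Exception {
        Object result = invokeMethod(String.class, "hello", "substring",
                new Class<?>[] { int.class }, new Object[] { 1 });
        System.out.println(result); // prints ello
    }
}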

Example 45 with CatalogException

Use of org.apache.flink.table.catalog.exceptions.CatalogException in project flink by apache.

From the class HiveShimV210, method alterPartition.

@Override
public void alterPartition(IMetaStoreClient client, String databaseName, String tableName, Partition partition) throws InvalidOperationException, MetaException, TException {
    String errorMsg = "Failed to alter partition for table %s in database %s";
    try {
        Method method = client.getClass().getMethod("alter_partition", String.class, String.class, Partition.class, EnvironmentContext.class);
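        // Pass null for the EnvironmentContext argument, which this shim does not use.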
        method.invoke(client, databaseName, tableName, partition, null);
    } catch (InvocationTargetException ite) {
        Throwable targetEx = ite.getTargetException();
        if (targetEx instanceof TException) {
            throw (TException) targetEx;
        } else {
            throw new CatalogException(String.format(errorMsg, tableName, databaseName), targetEx);
        }
    } catch (NoSuchMethodException | IllegalAccessException e) {
        throw new CatalogException(String.format(errorMsg, tableName, databaseName), e);
    }
}
Also used: TException(org.apache.thrift.TException) CatalogException(org.apache.flink.table.catalog.exceptions.CatalogException) Method(java.lang.reflect.Method) InvocationTargetException(java.lang.reflect.InvocationTargetException)
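
The catch blocks above follow a standard unwrapping discipline: InvocationTargetException wraps whatever the reflected method actually threw, so the shim rethrows the expected checked exception (TException here) and wraps everything else. A minimal sketch of the same discipline with illustrative names, using IOException in place of TException:

import java.io.IOException;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

public class UnwrapSketch {

    // Invoke reflectively, surfacing IOException (the "expected" checked
    // exception in this sketch) and wrapping every other failure as unchecked.
    static Object invokeUnwrapping(Method method, Object receiver, Object... args) throws IOException {
        try {
            return method.invoke(receiver, args);
        } catch (InvocationTargetException ite) {
            Throwable target = ite.getTargetException();
            if (target instanceof IOException) {
                throw (IOException) target; // rethrow the real cause
            }
            throw new RuntimeException("Reflective call failed", target);
        } catch (IllegalAccessException e) {
            throw new RuntimeException("Method not accessible", e);
        }
    }
}

The key point is getTargetException(): the interesting failure is the cause carried inside the InvocationTargetException, not the wrapper itself.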

Aggregations

CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException): 53
TException (org.apache.thrift.TException): 28
TableNotExistException (org.apache.flink.table.catalog.exceptions.TableNotExistException): 16
Table (org.apache.hadoop.hive.metastore.api.Table): 15
InvocationTargetException (java.lang.reflect.InvocationTargetException): 14
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 14
Method (java.lang.reflect.Method): 13
CatalogBaseTable (org.apache.flink.table.catalog.CatalogBaseTable): 13
SqlCreateHiveTable (org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable): 12
PartitionNotExistException (org.apache.flink.table.catalog.exceptions.PartitionNotExistException): 9
PartitionSpecInvalidException (org.apache.flink.table.catalog.exceptions.PartitionSpecInvalidException): 9
MetaException (org.apache.hadoop.hive.metastore.api.MetaException): 9
ArrayList (java.util.ArrayList): 8
List (java.util.List): 8
FlinkHiveException (org.apache.flink.connectors.hive.FlinkHiveException): 8
DatabaseNotExistException (org.apache.flink.table.catalog.exceptions.DatabaseNotExistException): 8
CatalogPartition (org.apache.flink.table.catalog.CatalogPartition): 7
NoSuchObjectException (org.apache.hadoop.hive.metastore.api.NoSuchObjectException): 7
PartitionAlreadyExistsException (org.apache.flink.table.catalog.exceptions.PartitionAlreadyExistsException): 6
InvalidOperationException (org.apache.hadoop.hive.metastore.api.InvalidOperationException): 6