
Example 1 with CatalogFunctionImpl

Use of org.apache.flink.table.catalog.CatalogFunctionImpl in the Apache Flink project.

Class HiveCatalog, method createFunction:

// ------ functions ------
@Override
public void createFunction(ObjectPath functionPath, CatalogFunction function, boolean ignoreIfExists) throws FunctionAlreadyExistException, DatabaseNotExistException, CatalogException {
    checkNotNull(functionPath, "functionPath cannot be null");
    checkNotNull(function, "function cannot be null");
    Function hiveFunction;
    if (function instanceof CatalogFunctionImpl) {
        hiveFunction = instantiateHiveFunction(functionPath, function);
    } else {
        throw new CatalogException(String.format("Unsupported catalog function type %s", function.getClass().getName()));
    }
    try {
        client.createFunction(hiveFunction);
    } catch (NoSuchObjectException e) {
        throw new DatabaseNotExistException(getName(), functionPath.getDatabaseName(), e);
    } catch (AlreadyExistsException e) {
        if (!ignoreIfExists) {
            throw new FunctionAlreadyExistException(getName(), functionPath, e);
        }
    } catch (TException e) {
        throw new CatalogException(String.format("Failed to create function %s", functionPath.getFullName()), e);
    }
}
Imports used:
org.apache.thrift.TException
org.apache.flink.table.catalog.CatalogFunction
org.apache.hadoop.hive.metastore.api.Function
org.apache.flink.table.catalog.exceptions.FunctionAlreadyExistException
org.apache.hadoop.hive.metastore.api.AlreadyExistsException
org.apache.flink.table.catalog.exceptions.PartitionAlreadyExistsException
org.apache.flink.table.catalog.exceptions.CatalogException
org.apache.hadoop.hive.metastore.api.NoSuchObjectException
org.apache.flink.table.catalog.exceptions.DatabaseNotExistException
org.apache.flink.table.catalog.CatalogFunctionImpl
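The core pattern in createFunction is translating backend exceptions into catalog-level exceptions, swallowing the duplicate only when ignoreIfExists is set. A minimal stand-alone sketch of that pattern, with no Flink or Hive dependencies (all class names here are hypothetical stand-ins, not Flink's actual types):

```java
import java.util.HashSet;
import java.util.Set;

public class CreateFunctionSketch {
    // Stand-ins for the backend and catalog exception types.
    static class AlreadyExistsException extends Exception {}
    static class FunctionAlreadyExistException extends Exception {
        FunctionAlreadyExistException(String name, Throwable cause) { super(name, cause); }
    }

    // Stands in for the metastore client's state.
    private final Set<String> backend = new HashSet<>();

    void createFunction(String name, boolean ignoreIfExists) throws FunctionAlreadyExistException {
        try {
            backendCreate(name);
        } catch (AlreadyExistsException e) {
            // Mirror HiveCatalog: rethrow as a catalog exception unless asked to ignore.
            if (!ignoreIfExists) {
                throw new FunctionAlreadyExistException(name, e);
            }
        }
    }

    private void backendCreate(String name) throws AlreadyExistsException {
        if (!backend.add(name)) {
            throw new AlreadyExistsException();
        }
    }

    public static void main(String[] args) throws Exception {
        CreateFunctionSketch catalog = new CreateFunctionSketch();
        catalog.createFunction("myudf", false); // first creation succeeds
        catalog.createFunction("myudf", true);  // duplicate silently ignored
        try {
            catalog.createFunction("myudf", false);
        } catch (FunctionAlreadyExistException e) {
            System.out.println("rejected duplicate: " + e.getMessage());
        }
    }
}
```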

Example 2 with CatalogFunctionImpl

Use of org.apache.flink.table.catalog.CatalogFunctionImpl in the Apache Flink project.

Class SqlToOperationConverter, method convertCreateFunction:

/**
 * Convert CREATE FUNCTION statement.
 */
private Operation convertCreateFunction(SqlCreateFunction sqlCreateFunction) {
    UnresolvedIdentifier unresolvedIdentifier = UnresolvedIdentifier.of(sqlCreateFunction.getFunctionIdentifier());
    if (sqlCreateFunction.isSystemFunction()) {
        return new CreateTempSystemFunctionOperation(unresolvedIdentifier.getObjectName(), sqlCreateFunction.getFunctionClassName().getValueAs(String.class), sqlCreateFunction.isIfNotExists(), parseLanguage(sqlCreateFunction.getFunctionLanguage()));
    } else {
        FunctionLanguage language = parseLanguage(sqlCreateFunction.getFunctionLanguage());
        CatalogFunction catalogFunction = new CatalogFunctionImpl(sqlCreateFunction.getFunctionClassName().getValueAs(String.class), language);
        ObjectIdentifier identifier = catalogManager.qualifyIdentifier(unresolvedIdentifier);
        return new CreateCatalogFunctionOperation(identifier, catalogFunction, sqlCreateFunction.isIfNotExists(), sqlCreateFunction.isTemporary());
    }
}
Imports used:
org.apache.flink.table.operations.ddl.CreateCatalogFunctionOperation
org.apache.flink.table.catalog.UnresolvedIdentifier
org.apache.flink.table.catalog.FunctionLanguage
org.apache.flink.table.catalog.CatalogFunction
org.apache.flink.table.operations.ddl.CreateTempSystemFunctionOperation
org.apache.flink.table.catalog.CatalogFunctionImpl
org.apache.flink.table.catalog.ObjectIdentifier
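The converter branches on whether the statement declares a temporary system function (unqualified name) or a catalog function (identifier qualified via the catalog manager). A dependency-free sketch of that branch, using hypothetical stand-in classes rather than Flink's actual operation types:

```java
public class ConvertCreateFunctionSketch {
    interface Operation {}

    static class CreateTempSystemFunctionOperation implements Operation {
        final String name, className;
        CreateTempSystemFunctionOperation(String name, String className) {
            this.name = name;
            this.className = className;
        }
    }

    static class CreateCatalogFunctionOperation implements Operation {
        final String identifier, className;
        final boolean ifNotExists;
        CreateCatalogFunctionOperation(String identifier, String className, boolean ifNotExists) {
            this.identifier = identifier;
            this.className = className;
            this.ifNotExists = ifNotExists;
        }
    }

    // Stands in for catalogManager.qualifyIdentifier(...), which fills in the
    // current catalog and database (names here are assumptions).
    static String qualify(String name) {
        return "default_catalog.default_database." + name;
    }

    static Operation convert(String name, String className, boolean isSystem, boolean ifNotExists) {
        if (isSystem) {
            // System functions are registered under the bare object name.
            return new CreateTempSystemFunctionOperation(name, className);
        }
        return new CreateCatalogFunctionOperation(qualify(name), className, ifNotExists);
    }

    public static void main(String[] args) {
        Operation op = convert("myudf", "com.example.MyUdf", false, true);
        System.out.println(op instanceof CreateCatalogFunctionOperation); // true
    }
}
```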

Example 3 with CatalogFunctionImpl

Use of org.apache.flink.table.catalog.CatalogFunctionImpl in the Apache Flink project.

Class SqlToOperationConverter, method convertAlterFunction:

/**
 * Convert ALTER FUNCTION statement.
 */
private Operation convertAlterFunction(SqlAlterFunction sqlAlterFunction) {
    if (sqlAlterFunction.isSystemFunction()) {
        throw new ValidationException("Alter temporary system function is not supported");
    }
    FunctionLanguage language = parseLanguage(sqlAlterFunction.getFunctionLanguage());
    CatalogFunction catalogFunction = new CatalogFunctionImpl(sqlAlterFunction.getFunctionClassName().getValueAs(String.class), language);
    UnresolvedIdentifier unresolvedIdentifier = UnresolvedIdentifier.of(sqlAlterFunction.getFunctionIdentifier());
    ObjectIdentifier identifier = catalogManager.qualifyIdentifier(unresolvedIdentifier);
    return new AlterCatalogFunctionOperation(identifier, catalogFunction, sqlAlterFunction.isIfExists(), sqlAlterFunction.isTemporary());
}
Imports used:
org.apache.flink.table.api.ValidationException
org.apache.flink.table.operations.ddl.AlterCatalogFunctionOperation
org.apache.flink.table.catalog.UnresolvedIdentifier
org.apache.flink.table.catalog.FunctionLanguage
org.apache.flink.table.catalog.CatalogFunction
org.apache.flink.table.catalog.CatalogFunctionImpl
org.apache.flink.table.catalog.ObjectIdentifier

Example 4 with CatalogFunctionImpl

Use of org.apache.flink.table.catalog.CatalogFunctionImpl in the Apache Flink project.

Class HiveCatalog, method alterFunction:

@Override
public void alterFunction(ObjectPath functionPath, CatalogFunction newFunction, boolean ignoreIfNotExists) throws FunctionNotExistException, CatalogException {
    checkNotNull(functionPath, "functionPath cannot be null");
    checkNotNull(newFunction, "newFunction cannot be null");
    try {
        // check if function exists
        getFunction(functionPath);
        Function hiveFunction;
        if (newFunction instanceof CatalogFunctionImpl) {
            hiveFunction = instantiateHiveFunction(functionPath, newFunction);
        } else {
            throw new CatalogException(String.format("Unsupported catalog function type %s", newFunction.getClass().getName()));
        }
        client.alterFunction(functionPath.getDatabaseName(), functionPath.getObjectName(), hiveFunction);
    } catch (FunctionNotExistException e) {
        if (!ignoreIfNotExists) {
            throw e;
        }
    } catch (TException e) {
        throw new CatalogException(String.format("Failed to alter function %s", functionPath.getFullName()), e);
    }
}
Imports used:
org.apache.flink.table.catalog.exceptions.FunctionNotExistException
org.apache.thrift.TException
org.apache.flink.table.catalog.CatalogFunction
org.apache.hadoop.hive.metastore.api.Function
org.apache.flink.table.catalog.exceptions.CatalogException
org.apache.flink.table.catalog.CatalogFunctionImpl

Example 5 with CatalogFunctionImpl

Use of org.apache.flink.table.catalog.CatalogFunctionImpl in the Apache Flink project.

Class HiveCatalogUdfITCase, method testFlinkUdf:

@Test
public void testFlinkUdf() throws Exception {
    final TableSchema schema = TableSchema.builder().field("name", DataTypes.STRING()).field("age", DataTypes.INT()).build();
    final Map<String, String> sourceOptions = new HashMap<>();
    sourceOptions.put("connector.type", "filesystem");
    sourceOptions.put("connector.path", getClass().getResource("/csv/test.csv").getPath());
    sourceOptions.put("format.type", "csv");
    CatalogTable source = new CatalogTableImpl(schema, sourceOptions, "Comment.");
    hiveCatalog.createTable(new ObjectPath(HiveCatalog.DEFAULT_DB, sourceTableName), source, false);
    hiveCatalog.createFunction(new ObjectPath(HiveCatalog.DEFAULT_DB, "myudf"), new CatalogFunctionImpl(TestHiveSimpleUDF.class.getCanonicalName()), false);
    hiveCatalog.createFunction(new ObjectPath(HiveCatalog.DEFAULT_DB, "mygenericudf"), new CatalogFunctionImpl(TestHiveGenericUDF.class.getCanonicalName()), false);
    hiveCatalog.createFunction(new ObjectPath(HiveCatalog.DEFAULT_DB, "myudtf"), new CatalogFunctionImpl(TestHiveUDTF.class.getCanonicalName()), false);
    hiveCatalog.createFunction(new ObjectPath(HiveCatalog.DEFAULT_DB, "myudaf"), new CatalogFunctionImpl(GenericUDAFSum.class.getCanonicalName()), false);
    testUdf(true);
    testUdf(false);
}
Imports used:
org.apache.flink.table.catalog.ObjectPath
org.apache.flink.table.api.TableSchema
java.util.HashMap
org.apache.flink.table.catalog.CatalogTableImpl
org.apache.flink.table.catalog.CatalogTable
org.apache.flink.table.catalog.CatalogFunctionImpl
org.junit.Test
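The test registers each UDF by passing its class name string, obtained via getCanonicalName(), to CatalogFunctionImpl; the catalog can later reload the class reflectively from that stored string. A small sketch of that round trip (using java.util.HashMap as a convenient stand-in class; for a top-level class the canonical name equals the fully qualified name, though nested classes would need the binary '$'-separated name for Class.forName):

```java
public class CanonicalNameRoundTrip {
    public static void main(String[] args) throws Exception {
        // What the test stores in the catalog entry.
        String stored = java.util.HashMap.class.getCanonicalName();
        System.out.println(stored); // java.util.HashMap

        // What a catalog would do later to instantiate the function class.
        Class<?> reloaded = Class.forName(stored);
        System.out.println(reloaded == java.util.HashMap.class); // true
    }
}
```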

Aggregations

CatalogFunctionImpl (org.apache.flink.table.catalog.CatalogFunctionImpl): 8
CatalogFunction (org.apache.flink.table.catalog.CatalogFunction): 7
ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier): 3
Test (org.junit.Test): 3
HashMap (java.util.HashMap): 2
ValidationException (org.apache.flink.table.api.ValidationException): 2
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 2
FunctionLanguage (org.apache.flink.table.catalog.FunctionLanguage): 2
UnresolvedIdentifier (org.apache.flink.table.catalog.UnresolvedIdentifier): 2
CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException): 2
CreateCatalogFunctionOperation (org.apache.flink.table.operations.ddl.CreateCatalogFunctionOperation): 2
SqlNode (org.apache.calcite.sql.SqlNode): 1
TableSchema (org.apache.flink.table.api.TableSchema): 1
CatalogTableImpl (org.apache.flink.table.catalog.CatalogTableImpl): 1
ObjectPath (org.apache.flink.table.catalog.ObjectPath): 1
DatabaseNotExistException (org.apache.flink.table.catalog.exceptions.DatabaseNotExistException): 1
FunctionAlreadyExistException (org.apache.flink.table.catalog.exceptions.FunctionAlreadyExistException): 1
FunctionNotExistException (org.apache.flink.table.catalog.exceptions.FunctionNotExistException): 1
PartitionAlreadyExistsException (org.apache.flink.table.catalog.exceptions.PartitionAlreadyExistsException): 1
FunctionDefinition (org.apache.flink.table.functions.FunctionDefinition): 1