Example 1 with CatalogFunction

Use of org.apache.flink.table.catalog.CatalogFunction in project flink by apache.

From the class HiveCatalogGenericMetadataTest, method testFunctionCompatibility:

@Test
public void testFunctionCompatibility() throws Exception {
    catalog.createDatabase(db1, createDb(), false);
    // create a function with old prefix 'flink:' and make sure we can properly retrieve it
    ((HiveCatalog) catalog)
            .client
            .createFunction(
                    new Function(
                            path1.getObjectName().toLowerCase(),
                            path1.getDatabaseName(),
                            "flink:class.name",
                            null,
                            PrincipalType.GROUP,
                            (int) (System.currentTimeMillis() / 1000),
                            FunctionType.JAVA,
                            new ArrayList<>()));
    CatalogFunction catalogFunction = catalog.getFunction(path1);
    assertEquals("class.name", catalogFunction.getClassName());
    assertEquals(FunctionLanguage.JAVA, catalogFunction.getFunctionLanguage());
}
Also used: CatalogFunction (org.apache.flink.table.catalog.CatalogFunction), Function (org.apache.hadoop.hive.metastore.api.Function), ArrayList (java.util.ArrayList), Test (org.junit.Test)
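
The test above exercises HiveCatalog through its internal metastore client. As a point of comparison, here is a minimal sketch of the same CatalogFunction round trip against Flink's built-in GenericInMemoryCatalog, so no Hive metastore is needed; the catalog name, database, and function path are placeholders, not values from the test.

import java.util.Collections;

import org.apache.flink.table.catalog.Catalog;
import org.apache.flink.table.catalog.CatalogDatabaseImpl;
import org.apache.flink.table.catalog.CatalogFunction;
import org.apache.flink.table.catalog.CatalogFunctionImpl;
import org.apache.flink.table.catalog.FunctionLanguage;
import org.apache.flink.table.catalog.GenericInMemoryCatalog;
import org.apache.flink.table.catalog.ObjectPath;

public class CatalogFunctionRoundTrip {

    public static void main(String[] args) throws Exception {
        // Purely in-memory catalog; names below are illustrative.
        Catalog catalog = new GenericInMemoryCatalog("my_catalog");
        catalog.open();
        catalog.createDatabase("db1", new CatalogDatabaseImpl(Collections.emptyMap(), null), false);

        // Register a JAVA function by its fully qualified class name.
        ObjectPath path = new ObjectPath("db1", "my_func");
        catalog.createFunction(path, new CatalogFunctionImpl("class.name", FunctionLanguage.JAVA), false);

        // Read it back; class name and language survive the round trip.
        CatalogFunction fn = catalog.getFunction(path);
        System.out.println(fn.getClassName());        // class.name
        System.out.println(fn.getFunctionLanguage()); // JAVA

        catalog.close();
    }
}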

Example 2 with CatalogFunction

Use of org.apache.flink.table.catalog.CatalogFunction in project flink by apache.

From the class HiveCatalog, method createFunction:

// ------ functions ------
@Override
public void createFunction(ObjectPath functionPath, CatalogFunction function, boolean ignoreIfExists) throws FunctionAlreadyExistException, DatabaseNotExistException, CatalogException {
    checkNotNull(functionPath, "functionPath cannot be null");
    checkNotNull(function, "function cannot be null");
    Function hiveFunction;
    if (function instanceof CatalogFunctionImpl) {
        hiveFunction = instantiateHiveFunction(functionPath, function);
    } else {
        throw new CatalogException(String.format("Unsupported catalog function type %s", function.getClass().getName()));
    }
    try {
        client.createFunction(hiveFunction);
    } catch (NoSuchObjectException e) {
        throw new DatabaseNotExistException(getName(), functionPath.getDatabaseName(), e);
    } catch (AlreadyExistsException e) {
        if (!ignoreIfExists) {
            throw new FunctionAlreadyExistException(getName(), functionPath, e);
        }
    } catch (TException e) {
        throw new CatalogException(String.format("Failed to create function %s", functionPath.getFullName()), e);
    }
}
Also used: TException (org.apache.thrift.TException), CatalogFunction (org.apache.flink.table.catalog.CatalogFunction), Function (org.apache.hadoop.hive.metastore.api.Function), FunctionAlreadyExistException (org.apache.flink.table.catalog.exceptions.FunctionAlreadyExistException), AlreadyExistsException (org.apache.hadoop.hive.metastore.api.AlreadyExistsException), PartitionAlreadyExistsException (org.apache.flink.table.catalog.exceptions.PartitionAlreadyExistsException), CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException), NoSuchObjectException (org.apache.hadoop.hive.metastore.api.NoSuchObjectException), DatabaseNotExistException (org.apache.flink.table.catalog.exceptions.DatabaseNotExistException), CatalogFunctionImpl (org.apache.flink.table.catalog.CatalogFunctionImpl)
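
Seen from the caller's side, ignoreIfExists decides whether the AlreadyExistsException branch above stays silent or surfaces as FunctionAlreadyExistException. Below is a hedged sketch of driving this through the generic Catalog interface; the helper class, method names, and UDF class name are illustrative only.

import org.apache.flink.table.catalog.Catalog;
import org.apache.flink.table.catalog.CatalogFunction;
import org.apache.flink.table.catalog.CatalogFunctionImpl;
import org.apache.flink.table.catalog.FunctionLanguage;
import org.apache.flink.table.catalog.ObjectPath;
import org.apache.flink.table.catalog.exceptions.FunctionAlreadyExistException;

public final class UdfRegistration {

    /** Registers a JAVA UDF, treating "already registered" as success. */
    public static void registerIfAbsent(Catalog catalog, ObjectPath path, String className) throws Exception {
        CatalogFunction function = new CatalogFunctionImpl(className, FunctionLanguage.JAVA);
        // ignoreIfExists = true: an existing function at the same path is left untouched.
        catalog.createFunction(path, function, true);
    }

    /** Same registration, but failing loudly when the function already exists. */
    public static void registerStrict(Catalog catalog, ObjectPath path, String className) throws Exception {
        CatalogFunction function = new CatalogFunctionImpl(className, FunctionLanguage.JAVA);
        try {
            catalog.createFunction(path, function, false);
        } catch (FunctionAlreadyExistException e) {
            throw new IllegalStateException("UDF already registered at " + path.getFullName(), e);
        }
    }
}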

Example 3 with CatalogFunction

Use of org.apache.flink.table.catalog.CatalogFunction in project flink by apache.

From the class TableEnvironmentImpl, method alterCatalogFunction:

private TableResultInternal alterCatalogFunction(AlterCatalogFunctionOperation alterCatalogFunctionOperation) {
    String exMsg = getDDLOpExecuteErrorMsg(alterCatalogFunctionOperation.asSummaryString());
    try {
        CatalogFunction function = alterCatalogFunctionOperation.getCatalogFunction();
        if (alterCatalogFunctionOperation.isTemporary()) {
            throw new ValidationException("Alter temporary catalog function is not supported");
        } else {
            Catalog catalog = getCatalogOrThrowException(alterCatalogFunctionOperation.getFunctionIdentifier().getCatalogName());
            catalog.alterFunction(alterCatalogFunctionOperation.getFunctionIdentifier().toObjectPath(), function, alterCatalogFunctionOperation.isIfExists());
        }
        return TableResultImpl.TABLE_RESULT_OK;
    } catch (ValidationException e) {
        throw e;
    } catch (FunctionNotExistException e) {
        throw new ValidationException(e.getMessage(), e);
    } catch (Exception e) {
        throw new TableException(exMsg, e);
    }
}
Also used: FunctionNotExistException (org.apache.flink.table.catalog.exceptions.FunctionNotExistException), TableException (org.apache.flink.table.api.TableException), ValidationException (org.apache.flink.table.api.ValidationException), CatalogFunction (org.apache.flink.table.catalog.CatalogFunction), Catalog (org.apache.flink.table.catalog.Catalog), GenericInMemoryCatalog (org.apache.flink.table.catalog.GenericInMemoryCatalog), FunctionCatalog (org.apache.flink.table.catalog.FunctionCatalog), FunctionAlreadyExistException (org.apache.flink.table.catalog.exceptions.FunctionAlreadyExistException), DatabaseNotExistException (org.apache.flink.table.catalog.exceptions.DatabaseNotExistException), TableAlreadyExistException (org.apache.flink.table.catalog.exceptions.TableAlreadyExistException), IOException (java.io.IOException), ExecutionException (java.util.concurrent.ExecutionException), CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException), DatabaseNotEmptyException (org.apache.flink.table.catalog.exceptions.DatabaseNotEmptyException), DatabaseAlreadyExistException (org.apache.flink.table.catalog.exceptions.DatabaseAlreadyExistException), SqlParserException (org.apache.flink.table.api.SqlParserException), TableNotExistException (org.apache.flink.table.catalog.exceptions.TableNotExistException)
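
For non-temporary functions the method above delegates to Catalog.alterFunction, where the operation's isIfExists() flag maps to the catalog's ignoreIfNotExists argument. Here is a minimal sketch of that call made directly against a Catalog, assuming the function was registered earlier; the class and method names are placeholders, not Flink API.

import org.apache.flink.table.catalog.Catalog;
import org.apache.flink.table.catalog.CatalogFunctionImpl;
import org.apache.flink.table.catalog.FunctionLanguage;
import org.apache.flink.table.catalog.ObjectPath;
import org.apache.flink.table.catalog.exceptions.FunctionNotExistException;

public final class UdfUpdate {

    /** Points an existing catalog function at a new implementation class. */
    public static void repoint(Catalog catalog, ObjectPath path, String newClassName)
            throws FunctionNotExistException {
        // ignoreIfNotExists = false: a missing function raises FunctionNotExistException,
        // which alterCatalogFunction above rewraps as a ValidationException.
        catalog.alterFunction(path, new CatalogFunctionImpl(newClassName, FunctionLanguage.JAVA), false);
    }
}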

Example 4 with CatalogFunction

Use of org.apache.flink.table.catalog.CatalogFunction in project flink by apache.

From the class SqlToOperationConverter, method convertCreateFunction:

/**
 * Convert CREATE FUNCTION statement.
 */
private Operation convertCreateFunction(SqlCreateFunction sqlCreateFunction) {
    UnresolvedIdentifier unresolvedIdentifier = UnresolvedIdentifier.of(sqlCreateFunction.getFunctionIdentifier());
    if (sqlCreateFunction.isSystemFunction()) {
        return new CreateTempSystemFunctionOperation(
                unresolvedIdentifier.getObjectName(),
                sqlCreateFunction.getFunctionClassName().getValueAs(String.class),
                sqlCreateFunction.isIfNotExists(),
                parseLanguage(sqlCreateFunction.getFunctionLanguage()));
    } else {
        FunctionLanguage language = parseLanguage(sqlCreateFunction.getFunctionLanguage());
        CatalogFunction catalogFunction = new CatalogFunctionImpl(sqlCreateFunction.getFunctionClassName().getValueAs(String.class), language);
        ObjectIdentifier identifier = catalogManager.qualifyIdentifier(unresolvedIdentifier);
        return new CreateCatalogFunctionOperation(identifier, catalogFunction, sqlCreateFunction.isIfNotExists(), sqlCreateFunction.isTemporary());
    }
}
Also used: CreateCatalogFunctionOperation (org.apache.flink.table.operations.ddl.CreateCatalogFunctionOperation), UnresolvedIdentifier (org.apache.flink.table.catalog.UnresolvedIdentifier), FunctionLanguage (org.apache.flink.table.catalog.FunctionLanguage), CatalogFunction (org.apache.flink.table.catalog.CatalogFunction), CreateTempSystemFunctionOperation (org.apache.flink.table.operations.ddl.CreateTempSystemFunctionOperation), CatalogFunctionImpl (org.apache.flink.table.catalog.CatalogFunctionImpl), ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier)
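
The two branches correspond to two DDL shapes: a plain CREATE FUNCTION becomes a CreateCatalogFunctionOperation qualified against the current catalog and database, while CREATE TEMPORARY SYSTEM FUNCTION becomes a CreateTempSystemFunctionOperation. Below is a hedged end-to-end sketch through executeSql; the UDF class names under com.example.udf are invented and would need to be real UserDefinedFunction implementations on the classpath.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class CreateFunctionDdl {

    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());

        // Catalog function: converted into a CreateCatalogFunctionOperation and stored
        // in the current catalog/database; the class is resolved lazily at call time.
        tEnv.executeSql(
                "CREATE FUNCTION IF NOT EXISTS my_lower AS 'com.example.udf.MyLower' LANGUAGE JAVA");

        // Temporary system function: converted into a CreateTempSystemFunctionOperation;
        // it lives only for this session and may require the class to be loadable up front.
        tEnv.executeSql(
                "CREATE TEMPORARY SYSTEM FUNCTION IF NOT EXISTS my_upper AS 'com.example.udf.MyUpper'");
    }
}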

Example 5 with CatalogFunction

Use of org.apache.flink.table.catalog.CatalogFunction in project flink by apache.

From the class SqlToOperationConverter, method convertAlterFunction:

/**
 * Convert ALTER FUNCTION statement.
 */
private Operation convertAlterFunction(SqlAlterFunction sqlAlterFunction) {
    if (sqlAlterFunction.isSystemFunction()) {
        throw new ValidationException("Alter temporary system function is not supported");
    }
    FunctionLanguage language = parseLanguage(sqlAlterFunction.getFunctionLanguage());
    CatalogFunction catalogFunction = new CatalogFunctionImpl(sqlAlterFunction.getFunctionClassName().getValueAs(String.class), language);
    UnresolvedIdentifier unresolvedIdentifier = UnresolvedIdentifier.of(sqlAlterFunction.getFunctionIdentifier());
    ObjectIdentifier identifier = catalogManager.qualifyIdentifier(unresolvedIdentifier);
    return new AlterCatalogFunctionOperation(identifier, catalogFunction, sqlAlterFunction.isIfExists(), sqlAlterFunction.isTemporary());
}
Also used: ValidationException (org.apache.flink.table.api.ValidationException), AlterCatalogFunctionOperation (org.apache.flink.table.operations.ddl.AlterCatalogFunctionOperation), UnresolvedIdentifier (org.apache.flink.table.catalog.UnresolvedIdentifier), FunctionLanguage (org.apache.flink.table.catalog.FunctionLanguage), CatalogFunction (org.apache.flink.table.catalog.CatalogFunction), CatalogFunctionImpl (org.apache.flink.table.catalog.CatalogFunctionImpl), ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier)
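
The DDL counterpart is ALTER FUNCTION, which this converter turns into the AlterCatalogFunctionOperation executed in Example 3. A short sketch via executeSql follows; as before, the function and implementation class names are placeholders and must resolve to real UDF classes before the function is actually invoked in a query.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class AlterFunctionDdl {

    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());

        // Assumes my_lower exists as a catalog function (see the Example 4 sketch).
        tEnv.executeSql("CREATE FUNCTION IF NOT EXISTS my_lower AS 'com.example.udf.MyLower'");

        // Re-points the existing catalog function at a new implementation class.
        // IF EXISTS maps to isIfExists() above; without it a missing function is an error.
        tEnv.executeSql("ALTER FUNCTION IF EXISTS my_lower AS 'com.example.udf.MyLowerV2' LANGUAGE JAVA");
    }
}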

Aggregations

CatalogFunction (org.apache.flink.table.catalog.CatalogFunction): 10
CatalogFunctionImpl (org.apache.flink.table.catalog.CatalogFunctionImpl): 7
Test (org.junit.Test): 4
ValidationException (org.apache.flink.table.api.ValidationException): 3
ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier): 3
CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException): 3
Catalog (org.apache.flink.table.catalog.Catalog): 2
FunctionLanguage (org.apache.flink.table.catalog.FunctionLanguage): 2
UnresolvedIdentifier (org.apache.flink.table.catalog.UnresolvedIdentifier): 2
DatabaseNotExistException (org.apache.flink.table.catalog.exceptions.DatabaseNotExistException): 2
FunctionAlreadyExistException (org.apache.flink.table.catalog.exceptions.FunctionAlreadyExistException): 2
FunctionNotExistException (org.apache.flink.table.catalog.exceptions.FunctionNotExistException): 2
CreateCatalogFunctionOperation (org.apache.flink.table.operations.ddl.CreateCatalogFunctionOperation): 2
Function (org.apache.hadoop.hive.metastore.api.Function): 2
IOException (java.io.IOException): 1
ArrayList (java.util.ArrayList): 1
HashMap (java.util.HashMap): 1
ExecutionException (java.util.concurrent.ExecutionException): 1
SqlNode (org.apache.calcite.sql.SqlNode): 1
SqlParserException (org.apache.flink.table.api.SqlParserException): 1